Questions about the HD4870/HD4850


  • #16
    There is work going on now to add OpenGL features -- most of the discussion is on the "#radeon" channel right now. MostAwesomeDude and nha are the primary guys pushing the current support ahead, and they (nha mostly, I think) have also been rewriting some of the code to make it more extensible and to make transition to Gallium easier.

    Going much further with OpenGL also needs a good in-drm memory manager and that work is just happening now. I think memory management, DRI2 and Gallium stabilizing will all happen about the same time (fairly soon) and at that point everyone will hop over to using Gallium for the HW-specific bits of Mesa. It's not that GL2.0 "comes for free" with Gallium, just that implementing the support over Gallium seems like a good idea.

    re: vsync support for textured video, the general feeling seems to be that having the compositor sync to vblank is the best approach. I *think* that needs a compositor change (Compiz, Metacity, Kwin etc..) first but I'm not sure. The problem is that the playback chain is quite different depending on whether or not you are running through a compositor -- in one case the TexVid driver should do the syncing while in the other the compositor does the syncing and TexVid has to handle flow control and frame dropping/doubling to deal with refresh rate mismatches.

    Vblank support in drm is still at a pre-production stage as well -- you might want to follow the #dri-devel logs for more details there.
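
    For anyone following along, here is a rough idea of what waiting for vblank through libdrm looks like from a client's point of view. This is just a minimal sketch for illustration, not code from the driver -- the "radeon" node name and the lack of error handling are assumptions:

    Code:
    /* Minimal sketch: block until the next vblank via libdrm.
     * Illustration only -- not actual driver or compositor code. */
    #include <stdio.h>
    #include <string.h>
    #include <xf86drm.h>

    int main(void)
    {
        int fd = drmOpen("radeon", NULL);       /* driver name assumed */
        if (fd < 0)
            return 1;

        drmVBlank vbl;
        memset(&vbl, 0, sizeof(vbl));
        vbl.request.type = DRM_VBLANK_RELATIVE; /* relative to current count */
        vbl.request.sequence = 1;               /* wait for the next vblank */

        if (drmWaitVBlank(fd, &vbl) == 0)
            printf("reached vblank %u\n", vbl.reply.sequence);

        drmClose(fd);
        return 0;
    }

    A compositor (or the textured video path) would do something along these lines once per frame before presenting, which is the syncing described above.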

    • #17
      Originally posted by bridgman View Post
      There is work going on now to add OpenGL features -- most of the discussion is on the "#radeon" channel right now.
      Great news. I hadn't realized so much was happening behind the scenes. It's a pity that there is no dedicated AMD/ATI open source page that compiles everything that's been happening in a given week and posts a summary, or at least gives a good overview of the current state of things. That might also draw new developers to the project and help devs and users find the information they are looking for. And besides, it would be a nice showcase for AMD/ATI.

      Originally posted by bridgman View Post
      re: vsync support for textured video, the general feeling seems to be that having the compositor sync to vblank is the best approach.
      I understand that there are still vsync issues while using a compositor. I don't know if nvidia is any better there. Nevertheless I could live with that until they are fixed because there is a good workaround: don't use a compositor. But I was more interested in whether there is work going on to add proper vsync support to the textured video engine when not running through a compositor, because I've heard people complain about video tearing even without a compositor. I guess it's now some shader code that replaced the legacy 2D and video engine, right?

      • #18
        We are going to set up some kind of page (probably a wiki on x.org) for this, but I don't want to spend time on it until we have the 6xx 3D engine work a bit further along. It would definitely help. Then again, tirdc and Phoronix do a pretty good job too.

        We probably need vsync on both composite and non-composite paths -- pity the implementations are completely different ;(

        Yeah, textured video basically replaces the video processing block built into the older overlay with shader code (actually it's really the texture engine doing all the work; the shader just says "fill this triangle with this texture"). The nice thing is that the output can go through a compositor rather than being mixed in by the display block, which is what happens with the overlay.
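
        If it helps to picture it, the textured path is conceptually just "upload the decoded frame as a texture, then draw a quad with it". A very rough sketch of that idea in plain OpenGL (an illustration only, not the actual Xv/driver code; it assumes a current GL context and an RGB frame in memory):

        Code:
        /* Rough illustration of the textured-video idea: the texture
         * units do the scaling/filtering work, the draw call just
         * fills a quad (two triangles) with the frame texture. */
        #include <GL/gl.h>

        void draw_video_frame(const unsigned char *rgb, int w, int h)
        {
            static GLuint tex;
            if (!tex)
                glGenTextures(1, &tex);

            glBindTexture(GL_TEXTURE_2D, tex);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, w, h, 0,
                         GL_RGB, GL_UNSIGNED_BYTE, rgb);   /* upload frame */

            glEnable(GL_TEXTURE_2D);
            glBegin(GL_QUADS);
            glTexCoord2f(0, 1); glVertex2f(-1, -1);
            glTexCoord2f(1, 1); glVertex2f( 1, -1);
            glTexCoord2f(1, 0); glVertex2f( 1,  1);
            glTexCoord2f(0, 0); glVertex2f(-1,  1);
            glEnd();
            glDisable(GL_TEXTURE_2D);
        }

        Because the result is just ordinary rendered output, a compositor can pick it up like any other window instead of having it mixed in at scanout the way the old overlay was.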

        • #19
          Bridgman, a simple question: under Windows, some players use hardware acceleration for decoding H.264.
          Is this functionality implemented for Linux in some way? If yes, how do I use it? If not, is this implementation "under construction" or not?
          I'm just talking about decoding H.264. I have an AVCHD camcorder and I'd like to know if someday I'll be able to view my rushes on my PC. I'm not talking about cracking the Blu-ray protection, which I know has already been done.

          • #20
            Fixxer_Linux, we are not accelerating video decode under Linux today. As for the future, sorry but that has to fall into the class of "we do not comment on unreleased features or functionality".

            Is the problem today that even with Xv accelerating the rendering your CPU is not able to keep up with the decoding effort (I smell laptop ) ?

            • #21
              Originally posted by bridgman View Post
              Fixxer_Linux, we are not accelerating video decode under Linux today. As for the future, sorry but that has to fall into the class of "we do not comment on unreleased features or functionality".
              OK, I can understand that. Just take note that if you did do that, you would make your Linux customers happy!


              Originally posted by bridgman View Post
              Is the problem today that even with Xv accelerating the rendering your CPU is not able to keep up with the decoding effort (I smell laptop ) ?
              No, a desktop: a 2.8 GHz Pentium 4C... (I'll replace it when UT3 comes out with a Linux client)

              Many thanks anyway for your answers.

              • #22
                my experience

                Hello all.

                It seemed only fair that, after starting this thread, I would also write a few bits and pieces about the decision I made and what my experience with it has been so far. So I hope nobody minds.

                I got myself a shiny new HD4870 from MSI, which I quickly tested under Vista because I wanted to hear how loud it would be and how hot it would get in my system before I started downgrading Xorg and so on. Well... unfortunately the card was way too loud. Admittedly it was the OC edition, which I had read had better temperatures, but even for an OC card it was just unbearable, even at idle. Every now and then you could hear the fan spinning up and down, which was also quite unnerving. On the plus side, the card stayed pretty "cold" for an HD4870, even though I noticed a temperature increase in my system. To make a long story short: I returned it.

                After more reading, I bought the Sapphire HD4850 Toxic, which is basically one hell of a card. It's damn quiet and stays cool... and besides that, it sits somewhere between an HD4850 and an HD4870 because it's factory overclocked. This time I was sure I would keep it, so I downgraded Xorg and made the necessary adjustments, but was soon struck by how unstable the fglrx driver really is:
                • logging off from X: system lockup
                • video playback with kaffeine, xine or mplayer: after the video
                  ended, the system locked up.
                • random lockups as well which I couldn't reproduce

                This I didn't expect. Naturally there were things like really bad video tearing and Wine not working, which I can sadly confirm now, but I was willing to endure those until they were fixed. However, I still have to be able to work with my system without fearing a random lockup every now and then.

                Conclusion for me: even though I really would have liked to keep the card, because on paper it was everything I wanted and more, there was no way I could deal with those instabilities, especially with absolutely no clue as to when they were going to be addressed. So the card is already on its way back. For the near future I'll closely watch what happens on the AMD/ATI front, but in the meantime I will stay with NVidia: even though their 2D performance is crappy and they are taking ages to fix it, at least it's stable.

                Nevertheless... I truly hope that sooner or later fglrx is on par with the NVidia driver with regard to stability!

                Thanks everyone for your great help... especially bridgman.
                Matthew

                PS. Just for the record: the instabilities were caused by fglrx and not by my hardware. A 650 Watt PSU should be more than enough for an HD4850. Besides, my system is rock solid with my NVidia card. Sorry... :-)
                Last edited by RobotMarvin; 08-06-2008, 03:04 AM.

                • #23
                  Thanks for the report, I'm still on the fence myself.

                  Could you answer a couple of questions?
                  1) Were you using Catalyst 8.7? I've read at least one user reporting that the "system lockup after video/X" bug was solved in 8.7 - bad news if not!
                  2) What's your distro/architecture/kernel?
                  3) What NVIDIA card have you finally ended up with, and are you happy with it?

                  Thanks again!

                  • #24
                    Originally posted by Spanner View Post
                    Could you answer a couple of questions?
                    Sure. :-)

                    Originally posted by Spanner View Post
                    1) Were you using Catalyst 8.7? I've read at least one user reporting that the "system lockup after video/X" bug was solved in 8.7 - bad news if not!
                    Sorry to disappoint you but I was using the latest Catalyst (8.7) and I had all those issues with it.

                    Originally posted by Spanner View Post
                    2) What's your distro/architecture/kernel?
                    Distro: Gentoo (up2date ~amd64)
                    Arch : x86_64 (Intel Core2Duo E8400 on a X48 chipset based board)
                    Kernel: 2.6.26.1

                    Originally posted by Spanner View Post
                    3) What NVIDIA card have you finally ended up with, and are you happy with it?
                    Currently I am still on my old NVidia 8600 GTS card. I am pretty much undecided about what to do next. I am looking into a 9800 GTX+ or a GTX 260, but there are pros and cons to each. Basically the 9800 GTX+ will suffer the same 2D problems as my 8600, but is cheaper and not too bad in performance and power consumption. According to what I've read, the GTX 260 should already have better 2D performance, but on the other hand it usually has loud fans, higher temps, drains your PSU like hell and is currently still priced a bit too high, considering that the HD4870 is cheaper and scores almost as well, and sometimes even better. To keep things short. :-)

                    Maybe I will just wait a bit longer and see how things develop. NVidia has to react to the serious AMD/ATI competition and will do so, and we, the users, will benefit from it. :-)

                    Ah, one more thing: the HD48x0 cards are not too widescreen-friendly if you have such a TFT (I do). For example, in text mode (VT) you get a 4:3 picture and everything looks rather "edgy", whereas NVidia uses all the available space and things look like they have been antialiased. It's a matter of taste, but I like the NVidia way more in this regard. :-)

                    Originally posted by Spanner View Post
                    Thanks again!
                    You're welcome!
                    Last edited by RobotMarvin; 08-06-2008, 06:10 AM.

                    • #25
                      Originally posted by RobotMarvin View Post
                      Distro: Gentoo (up2date ~amd64)
                      Arch : x86_64 (Intel Core2Duo E8400 on a X48 chipset based board)
                      Kernel: 2.6.26.1
                      Okay, I think I know how to "fix" your "logging off from X: system lockup" problem: remove atieventsd from the list of programs to start.

                      The Wine problems: switch to the xorg-x11 backend for OpenGL. Most programs will work perfectly fine this way. So far I have only encountered two programs that have problems: etqw and fgl_glxgears. This is also described in some other thread.

                      Yes, kaffeine is unstable as hell. It has been ever since they introduced textured video. Its xine backend does not like textured video the way it is currently implemented in fglrx at all. To get this working, switch to the OpenGL render backend; only the Xv interface is problematic here.

                      • #26
                        Originally posted by ivanovic View Post
                        Okay, I think I know how to "fix" your "logging off from X: system lockup" problem: remove atieventsd from the list of programs to start.
                        I already read your suggestion yesterday in some other thread and I have to agree with the folks there: no atieventsd here either, sorry. I haven't checked whether it was installed in the first place, but since I did not add it to any runlevel, and the Gentoo devs would not automatically add something like this to the default runlevel, you might have placed it there yourself. Like I said, it was not in any runlevel here.

                        Originally posted by ivanovic View Post
                        The Wine problems: switch to the xorg-x11 backend for OpenGL. Most programs will work perfectly fine this way. So far I have only encountered two programs that have problems: etqw and fgl_glxgears. This is also described in some other thread.
                        Using software rendering is no solution or fix, sorry. Besides, try running some demanding app like etqw at 1920x1200 with software rendering.

                        Originally posted by ivanovic View Post
                        Yes, kaffeine is unstable as hell.
                        But that's the fault of the AMD/ATI driver. Kaffeine works fine on every other system.

                        Originally posted by ivanovic View Post
                        Its xine backend does not like textured video the way it is currently implemented in fglrx at all. To get this working, switch to the OpenGL render backend; only the Xv interface is problematic here.
                        It's not only xine... try VLC or MPlayer and you get the same. I really like xine, but its OpenGL backend is not very sophisticated. I planned to work on it, but due to a lack of time... well. Try mplayer with the OpenGL vo instead for this one.

                        • #27
                          Originally posted by RobotMarvin View Post
                          I already read your suggestion yesterday in some other thread and I have to agree with the folks there: no atieventsd here either, sorry. I haven't checked whether it was installed in the first place, but since I did not add it to any runlevel, and the Gentoo devs would not automatically add something like this to the default runlevel, you might have placed it there yourself. Like I said, it was not in any runlevel here.
                          Yes, I had once placed it there by hand. But when it runs, it (almost always) causes this problem when shutting down X.org. Do you have ACPI active? What else along those lines are you using? Over here there are no problems left with shutting down X.org with fglrx as the driver, as long as I make sure atieventsd is not started.

                          It might be that you are facing some completely different problem, but with the info you have given there is no way for me to track down what it is.

                          Originally posted by RobotMarvin View Post
                          Using software rendering is no solution or fix, sorry. Besides, try running some demanding app like etqw at 1920x1200 with software rendering.
                          Trust me, it is *not* using software rendering. It is using hardware accelerated rendering. Or how would I be able to play ut2004 in 1920x1200 with all details maxed and the same fps as with "normal" ATI OpenGL backend selected?

                          Originally posted by RobotMarvin View Post
                          But that's the fault of the AMD/ATI driver. Kaffeine works fine on every other system.
                          Yes, it is. Even on AMD/ATI based systems with radeonhd or radeon, kaffeine works fine. It's just that, due to the missing 2D acceleration (and video acceleration), it is not possible to scale the images in real time to my fullscreen resolution, so I have to stick to the OpenGL render output to have kaffeine working (e.g. for my DVB-C card).

                          Originally posted by RobotMarvin View Post
                          It's not only xine... try VLC or MPlayer and you get the same. I really like xine, but its OpenGL backend is not very sophisticated. I planned to work on it, but due to a lack of time... well. Try mplayer with the OpenGL vo instead for this one.
                          Hehe, I just know that xine is hit especially hard on my system. That is, I basically get an instant system freeze when trying to watch TV (DVB-C based) with textured video active and the Xv output being used.

                          • #28
                            Sounds like the trouble I just had with my Sapphire 4870. That card is currently being shipped back to be replaced (bridgman claims it's a defective card... let's see). Did Windows crash too? You said you used Vista. Could you play a game there, or did it also lock up randomly (as it did for me, though I was on XP SP2)?

                            • #29
                              Originally posted by Dragonlord View Post
                              Did Windows crash too? You said you used Vista. Could you play a game there, or did it also lock up randomly (as it did for me, though I was on XP SP2)?
                              I only did a few short tests under Vista with the Toxic but it worked flawlessly there, no crashes.

                              Originally posted by Dragonlord View Post
                              Do you have acpi set active? What else in those regards are you using? Over here there are no problems with shutdown of X.org with fglrx as driver left as long as I do make sure that I don't have atieventsd started.
                              Thanks for your offered help, it's really appreciated. But the card is already on its way back, so I don't have it anymore (I returned it the same day I got it) and cannot do any more tests with it. Besides, I've seen indications from others that this could be related to x86_64 incompatibilities with fglrx. I don't know. Fact is, I have been using Linux for over 10 years now or so, and I remember the early NVidia days when things like that happened; honestly, I prefer my system to be stable. I can live with broken features for a while, but not with random crashes.

                              Originally posted by ivanovic View Post
                              It might be that you are facing some completely different problem
                              Maybe, but it's 100% fglrx-related, because without it my system is rock solid. And others are suffering from the same problems. Just search the forums...

                              Originally posted by ivanovic View Post
                              Trust me, it is *not* using software rendering. It is using hardware accelerated rendering. Or how would I be able to play ut2004 in 1920x1200 with all details maxed and the same fps as with "normal" ATI OpenGL backend selected?
                              I don't know how that is possible. If you eselect the x11 OpenGL implementation, the libGL symlinks point to the X11-delivered libs, which do not make use of any hardware-accelerated 3D... at least not with fglrx. But I haven't checked that, so I don't know. bridgman?
