ATI R600/700 OSS 3D Driver Reaches Gears Milestone


  • Originally posted by lucky_ View Post
    a software-rendered OpenGL backend would be even faster than X11/XRender.
    You mean a software-rendered OpenGL backend would be faster than hardware-rendered XRender? Sounds like an interesting claim.
    Edit: The important bit here seems to be to analyze which of the two spends more time in the software-fallback state, and why. Does Qt perhaps use superfluous features whose removal would reduce the time spent in software fallback while having a negligible impact on the user-friendliness of the UI? This is what I meant by bloat.
    Last edited by nanonyme; 17 August 2009, 08:01 AM.
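
    A minimal way to probe that (a sketch, not from the original post; it assumes Qt 4.5+'s -graphicssystem switch): time an operation that commonly triggers fallbacks, once under the native (XRender) system and once under raster, and compare.

    // Fallback probe sketch (Qt 4.5+): run as
    //   ./probe -graphicssystem native   (X11/XRender path)
    //   ./probe -graphicssystem raster   (pure software path)
    // and compare the timings.
    #include <QApplication>
    #include <QPainter>
    #include <QPixmap>
    #include <QTime>
    #include <cstdio>

    int main(int argc, char **argv)
    {
        QApplication app(argc, argv);

        QPixmap target(1024, 1024);              // X pixmap under "native", plain memory under "raster"
        QPainter p(&target);
        p.setRenderHint(QPainter::Antialiasing); // antialiased drawing is a classic fallback trigger

        QTime timer;
        timer.start();
        for (int i = 0; i < 500; ++i)
            p.drawEllipse(20, 20, 900, 900);
        p.end();
        QApplication::syncX();                   // X11 is asynchronous; wait for the server to finish
        std::printf("500 antialiased ellipses: %d ms\n", timer.elapsed());
        return 0;
    }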

    Comment


    • Originally posted by nanonyme View Post
      You mean a software-rendered OpenGL backend would be faster than hardware-rendered XRender? Sounds like an interesting claim.
      Edit: The important bit here seems to be to analyze which of the two spends more time in the software-fallback state, and why. Does Qt perhaps use superfluous features whose removal would reduce the time spent in software fallback while having a negligible impact on the user-friendliness of the UI? This is what I meant by bloat.
      Note that a fallback on XRender involves a VRAM->RAM download (extremely slow), software rendering, and a final RAM->VRAM upload; add the overhead of X11 on top and it's not inconceivable that software OpenGL can be faster than software XRender.

      Obviously, hardware XRender will be faster than software OpenGL, but the point is optimizing the worst case. Besides, hardware OpenGL is likely to be faster than XRender on modern hardware, which lacks a dedicated 2D engine.

      I wonder if OpenGL could be used to add cross-platform hardware acceleration to Qt. Right now, the developers have to test and optimize at least three different code paths (GDI, XRender, Quartz), not to mention whatever mobile devices ship with. Using OpenGL and OpenGL ES, on the other hand, would allow acceleration on most modern devices -- that sounds like a worthy goal.
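
      For what it's worth, Qt 4 can already route a view's QPainter calls through its OpenGL paint engine per widget; a minimal sketch of the idea (assuming the QtOpenGL module and a GL-capable driver):

      // Sketch: back a QGraphicsView with a GL viewport so all QPainter
      // work goes through the OpenGL paint engine instead of the platform
      // engine (XRender/GDI/Quartz). Needs QT += opengl in the .pro file.
      #include <QApplication>
      #include <QGraphicsScene>
      #include <QGraphicsView>
      #include <QGLWidget>

      int main(int argc, char **argv)
      {
          QApplication app(argc, argv);

          QGraphicsScene scene;
          scene.addText("GL-accelerated canvas");

          QGraphicsView view(&scene);
          view.setViewport(new QGLWidget(QGLFormat(QGL::SampleBuffers)));
          view.show();
          return app.exec();
      }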

      Comment


      • Originally posted by nanonyme View Post
        You mean a software-rendered OpenGL backend would be faster than hardware-rendered XRender? Sounds like an interesting claim.
        Actually, since not every driver provides the same set of features, when Qt is "forced to use" some of them it falls back to its raster engine, because it doesn't trust the underlying platform. This is where the penalty gets big, because of the VRAM-to-RAM transfer, which is a huge pain.
        And according to Zack's and other Qt devs' blogs, there is a gap between XRender's theoretical efficiency and what drivers actually deliver.
        XRender ends up being slower than their pure software engine, so assuming that OpenGL, even running only in software, can be quicker than XRender is not too far-fetched.
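
        One practical consequence (a sketch, assuming Qt 4.5+'s setGraphicsSystem API): an application can opt out of the XRender path entirely and stay in the raster engine, so nothing ever bounces between VRAM and system RAM mid-paint.

        #include <QApplication>
        #include <QLabel>

        int main(int argc, char **argv)
        {
            // Must be called before the QApplication is constructed (Qt 4.5+).
            // "raster" keeps every paint operation in software; the finished
            // frame is pushed to the server once per update.
            QApplication::setGraphicsSystem("raster");

            QApplication app(argc, argv);
            QLabel label("Rendered entirely by Qt's raster engine");
            label.show();
            return app.exec();
        }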

        Comment


        • Pure software 3D engine or software OpenGL rendering: which one is faster? I'll give you one guess.

          The same applies to any framework that provides abstracted hardware acceleration: there is always overhead in handling the API, so if you write it all in pure software with tight coupling you can beat any of them. The downside is a much larger codebase, which is a lot worse than slightly slower performance.
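
          A toy illustration of that trade-off (a sketch; the comparison is deliberately unfair to the generic path, which handles clipping, blending and painter state that the flat fill ignores):

          // Toy benchmark: the same 1024x1024 fill through QPainter's
          // generic machinery vs. a tightly coupled flat memset. The
          // direct version wins, but it can only do this one thing.
          #include <QImage>
          #include <QPainter>
          #include <QTime>
          #include <cstdio>
          #include <cstring>

          int main()
          {
              QImage img(1024, 1024, QImage::Format_ARGB32_Premultiplied);
              const int bytes = img.bytesPerLine() * img.height();

              QTime t;
              t.start();
              for (int i = 0; i < 200; ++i) {
                  QPainter p(&img);
                  p.fillRect(img.rect(), Qt::white);    // abstracted path
              }
              int viaApi = t.restart();

              for (int i = 0; i < 200; ++i)
                  std::memset(img.bits(), 0xff, bytes); // 0xff per byte = opaque white
              int direct = t.elapsed();

              std::printf("QPainter fill: %d ms, memset fill: %d ms\n", viaApi, direct);
              return 0;
          }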

          Comment


          • Originally posted by lucky_ View Post
            This is where the penalty gets big, because of the VRAM-to-RAM transfer, which is a huge pain.
            And according to Zack's and other Qt devs' blogs, there is a gap between XRender's theoretical efficiency and what drivers actually deliver.
            I was under the impression that fglrx was the only driver lacking an efficient XRender implementation. *shrug*

            Comment


            • Originally posted by nanonyme View Post
              I was under the impression that fglrx was the only driver lacking an efficient XRender implementation. *shrug*
              Well, maybe you're right, but according to this post, http://zrusin.blogspot.com/2009/08/2d-in-kde.html, it seems that going the X11 way is the worst path you can follow. Hence the discussion I put forward.
              Maybe he only considered the open-source stack, where there are certainly many differences between the feature sets of drivers.

              Comment


              • Originally posted by nanonyme View Post
                I was under the impression that fglrx was the only driver lacking an efficient XRender implementation. *shrug*
                Have people already forgotten the nVidia disaster with KDE 4.0? Users with $600 of graphics hardware were getting laughable framerates because the driver was so bad at simple 2D.

                Comment


                • Originally posted by Ant P. View Post
                  Have people already forgotten the nVidia disaster with KDE 4.0? Users with $600 of graphics hardware were getting laughable framerates because the driver was so bad at simple 2D.
                  I didn't forget it -- especially since support for my chipset was discontinued, so my laptop never benefited from all the patches nVidia made to fix the problems.

                  Comment


                  • NVidia: 1 or 2 fsck-ups.
                    AMD: Dozens and counting.

                    Comment


                    • nvidia fucked up so hard I switched over to AMD.

                      Comment
