ATI R300g / R600g Unify Their Vertex Buffer Manager

  • ATI R300g / R600g Unify Their Vertex Buffer Manager

    Phoronix: ATI R300g / R600g Unify Their Vertex Buffer Manager

    Hitting the Mesa tree this weekend were the commit messages "r600g: use the new vertex buffer manager" and "r300g: use the new vertex buffer manager"...

    http://www.phoronix.com/vr.php?view=OTA4Mw

  • #2
    Code reuse is awesome and so is Marek. I wish I had something more to add :\ I'm insomniacking.



    • #3
      excuse my ignorance but...
      what changes for normal users?



      • #4
        Originally posted by NomadDemon View Post
        excuse my ignorance but...
        what changes for normal users?
        Today? Nothing much, maybe an integration bug or two that'll need to be flushed out by testers.

        Tomorrow? Unifying the code bases means you avoid having to do double the work to add features/optimisations to each driver, you avoid having the two branches of the code diverge to the point where dual patching becomes a pain, and all the time gained can be used to improve other parts of the drivers (or, heaven forbid, allow the devs to have a life outside of open-source ATI drivers).
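
        To make the reuse point concrete, here is a rough, hypothetical sketch (the names are invented for illustration; this is not Mesa's actual API): both drivers call into one shared manager, so the upload path only has to be written, debugged and optimised once.

        /* Hypothetical sketch of a shared vertex buffer manager; the names are
         * made up for illustration and are not Mesa's real API. */
        #include <assert.h>
        #include <stdio.h>
        #include <string.h>

        /* Driver-independent manager: written and maintained once. */
        struct vbuf_mgr {
            unsigned char buf[1024];
            size_t        used;
        };

        /* Common upload path -- a fix or optimisation here helps both drivers. */
        static size_t vbuf_upload(struct vbuf_mgr *m, const void *verts, size_t n)
        {
            size_t offset = m->used;
            assert(m->used + n <= sizeof(m->buf));
            memcpy(m->buf + offset, verts, n);
            m->used += n;
            return offset;
        }

        /* Each driver keeps only a thin, hardware-specific wrapper. */
        static void r300_draw(struct vbuf_mgr *m, const float *v, size_t n)
        {
            printf("r300g: draw from offset %zu\n", vbuf_upload(m, v, n));
        }

        static void r600_draw(struct vbuf_mgr *m, const float *v, size_t n)
        {
            printf("r600g: draw from offset %zu\n", vbuf_upload(m, v, n));
        }

        int main(void)
        {
            float tri[6] = { 0, 0, 1, 0, 0, 1 };
            struct vbuf_mgr m = { .used = 0 };

            r300_draw(&m, tri, sizeof(tri));
            r600_draw(&m, tri, sizeof(tri));
            return 0;
        }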



        • #5
          Not only does this do nothing for end users, I'm pretty sure most people have moved on to newer GPUs already. I mean, the r300 is absolutely prehistoric. I have an r700, which is barely ever mentioned on Phoronix, and I'm already thinking of replacing it with something more up to date.



          • #6
            Originally posted by unimatrix View Post
            Not only does this do nothing for end users, I'm pretty sure most people have moved on to newer GPUs already. I mean, the r300 is absolutely prehistoric. I have an r700, which is barely ever mentioned on Phoronix, and I'm already thinking of replacing it with something more up to date.
            Refactoring code to ease maintainability doesn't have a direct end-user benefit. It helps the devs by making their lives easier and allowing a bugfix in R600g to help R300g too. Nobody loses here.

            R700 cards are driven by the R600{c|g} driver, so news that affects your card is fairly regular.



            • #7
              Originally posted by unimatrix View Post
              Not only does this do nothing for end users, I'm pretty sure most people have moved on to newer GPUs already. I mean, the r300 is absolutely prehistoric. I have an r700, which is barely ever mentioned on Phoronix, and I'm already thinking of replacing it with something more up to date.
              Having a 5-year-old laptop with an X600 card, I'm glad to see there is still work going on in the r300 driver. If it weren't for Linux, I would have been forced to throw it away 3 years ago after a partial hardware failure (corrupted ACPI support).

              Considering that there's no more official ATI support for r300 hardware, I'm happy with this kind of news. Since I'm still able to use the latest software, I really do much more with my computer now than a few years ago.



              • #8
                Originally posted by DanL View Post
                Code reuse is awesome and so is Marek. I wish I had something more to add :\
                Agreed.

                Thanks a million, Marek. Keep up the good work, and don't forget my promise: if we ever meet, all the beer you can drink is on me.



                • #9
                  [Rant] Are these commits supposed to have higher priority for older hardware (R300)?

                  First of all... I'm really happy there's a lot of development in the OS community toward proper out-of-the-box OS support for AMD/ATI hardware, and I'd like to send some kudos to Marek and Bridgman for their amazing work on the OS ATI drivers. I also hope that in the near future I can replace Catalyst with this lighter driver on my computer systems (3 computers with ATI HD cards). (Btw, will we someday have proper HW video acceleration on r600+ cards with xf86-ati...?)

                  But, at the same time, I think not everything is bright. I don't want to rant, but why does the main development of the OS drivers happen on old hardware? Is it supposed to be like that?
                  That way I think Linux will never be competitive against the Windows/Mac OS video driver stacks... But that's just my personal opinion...

                  Btw, keep up the good work, Xorg community!

                  Cheers



                  • #10
                    The driver stack is still playing catch-up as the devs learn how to write a Gallium3D driver on hardware they already understand. Don't worry, support for new hardware is arriving closer and closer to launch. I hope in a few years we'll start getting release-day support.



                    • #11
                      In this case, a rising tide lifts both boats... and things like this will make it easier to get new hardware working quickly.

                      Release-day drivers really depend on the vendors - Intel completely blew Sandy Bridge (although the platform has even worse problems!), and the 6900 series has a new VLIW design that will take a while to adapt to, i.e. a new/changed shader compiler.

                      Looking forward, Linux on mobile devices (Android kinda counts) is going to become more important, and getting PowerVR specs (reverse-engineered or from the vendor) is the next big hurdle.



                      • #12
                        Originally posted by Chad Page View Post
                        the 6900 series has a new VLIW design that will take a while to adapt to, i.e. a new/changed shader compiler.
                        The 6900 series is actually easier to program for. A 5870 shader block internally has two kinds of execution units, one big and four small ones, i.e. a 5-wide VLIW, while a 6900 has four of the big VLIW units per shader group, and those four are the same as the big one among the five in the 5850. So the compiler just has to drop the code for the small SIMD-VLIW units and use the big one four times per group.
                        The 6900 is not really a new architecture; it's a cut-down design that removes the harder-to-program small SIMD units, and the big SIMD-VLIW units are much easier to use because a developer doesn't have to deal with the limitations of the small ones.
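
                        As a rough illustration only (this is not the r600g shader compiler, and the real slot rules are more involved), here is a toy bundle counter contrasting a five-lane bundle where one lane has special rules with a four-lane bundle of identical lanes. The op names and the packing policy are invented for the example.

                        /* Toy model: count how many VLIW bundles a stream of independent ops
                         * needs.  "VLIW5" = four plain lanes + one restricted lane that alone
                         * may take a transcendental op; "VLIW4" = four identical lanes.
                         * Deliberately simplified -- not real Evergreen/Cayman scheduling. */
                        #include <stdbool.h>
                        #include <stdio.h>

                        enum op_kind { OP_PLAIN, OP_TRANSCENDENTAL };

                        static int count_bundles(const enum op_kind *ops, int n, bool vliw5)
                        {
                            int bundles = 0, i = 0;

                            while (i < n) {
                                int  plain_slots = 4;      /* four ordinary lanes in both designs */
                                bool t_slot_free = vliw5;  /* VLIW5's extra, restricted lane      */

                                bundles++;
                                for (; i < n; i++) {
                                    if (vliw5 && ops[i] == OP_TRANSCENDENTAL) {
                                        if (!t_slot_free)
                                            break;         /* only the restricted slot may take it */
                                        t_slot_free = false;
                                    } else {
                                        if (plain_slots == 0)
                                            break;         /* bundle is full */
                                        plain_slots--;
                                    }
                                }
                            }
                            return bundles;
                        }

                        int main(void)
                        {
                            /* Transcendental-heavy code shows the restricted-slot bottleneck... */
                            enum op_kind heavy[4] = { OP_TRANSCENDENTAL, OP_TRANSCENDENTAL,
                                                      OP_TRANSCENDENTAL, OP_TRANSCENDENTAL };
                            /* ...while four plain ops plus one transcendental fit VLIW5's five lanes. */
                            enum op_kind plain[5] = { OP_PLAIN, OP_PLAIN, OP_PLAIN, OP_PLAIN,
                                                      OP_TRANSCENDENTAL };

                            printf("transcendental-heavy: VLIW5 %d, VLIW4 %d bundles\n",
                                   count_bundles(heavy, 4, true), count_bundles(heavy, 4, false));
                            printf("mostly plain:         VLIW5 %d, VLIW4 %d bundles\n",
                                   count_bundles(plain, 5, true), count_bundles(plain, 5, false));
                            return 0;
                        }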



                        • #13
                          I'm very pleased with the current speed of development. Maybe half a year ago, r600g on my Radeon HD 4770 was just not usable for anything except gears. r600c was also terribly slow and had a lot of issues. I remember playing UT2004 with a friend at an insanely low resolution with most detail options reduced to the lowest.

                          Today r600g has no trouble running the game with all options cranked up. If multisample AA support lands in r600g, maybe I'll be able to see performance differences again.

                          Currently the biggest issues for me are missing texture compression support and the aforementioned multisample support.



                          • #14
                            I love the smell of DRY in the code.

                            Smells like... Victory.



                            • #15
                              Originally posted by LiquidAcid View Post
                              Currently the biggest issues for me are missing texture compression support and the aforementioned multisample support.
                              Yeah, I think there are a lot of people who would like to see r600g support S3TC.

