Reasons Mesa 9.1 Is Still Disappointing For End-Users


  • #21
    Originally posted by wargames View Post
    Exactly what I was thinking... If OpenGL ES 3.0 is a modern API with good features (including patent-free texture compression) and compatibility with both mobile and desktop, ... why have two OpenGL "branches"?
    GL|ES 3.0 is not in any way a modern API, like all of Khronos' crap. They gave up on a modern API when they abandoned Longs Peak. Their APIs are still based on the ancient C-based crap of OpenGL 1.0, just with more features bolted on in sometimes very awkward ways.

    The feature set of GL|ES 3.0 is still lacking compared to GL 4.3. There are some important additions over GL|ES 2.0, but there's a reason they're still separate standards.

    Patent-free texture compression is worthless if the hardware doesn't support it. The compression is not as good as contemporary compressed formats, so it's a bad choice for minimizing download time, and it can't be loaded directly into the hardware and used, so it's a poor choice for using VRAM efficiently. However, implementation needs to happen soon, as delaying it just pushes back the date we can actually use it. Drivers and hardware have to support it today so that the market starts getting saturated with capable devices 4-5 years from now.
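    Concretely, "if the hardware doesn't support it" means every application ends up doing a probe-and-fallback dance. A minimal sketch, assuming a current GL 4.3 / ES 3.0 context with entry points loaded (build with -DGL_GLEXT_PROTOTYPES or use a GL loader); decompress_etc2_to_rgba() is a hypothetical CPU decoder, not a real library call:

    Code:
    /* Probe for ETC2 support, fall back to an uncompressed upload. */
    #include <GL/glcorearb.h>   /* or <GLES3/gl3.h> on ES */
    #include <stdbool.h>
    #include <stdlib.h>

    /* Hypothetical CPU decoder supplied by the application. */
    unsigned char *decompress_etc2_to_rgba(const unsigned char *etc2,
                                           int w, int h);

    static bool etc2_is_advertised(void)
    {
        GLint count = 0;
        glGetIntegerv(GL_NUM_COMPRESSED_TEXTURE_FORMATS, &count);
        if (count <= 0)
            return false;
        GLint *formats = malloc((size_t)count * sizeof *formats);
        if (!formats)
            return false;
        glGetIntegerv(GL_COMPRESSED_TEXTURE_FORMATS, formats);
        bool found = false;
        for (GLint i = 0; i < count; i++)
            if (formats[i] == GL_COMPRESSED_RGB8_ETC2)
                found = true;
        free(formats);
        return found;
    }

    void upload_texture(const unsigned char *etc2, int w, int h, int nbytes)
    {
        if (etc2_is_advertised()) {
            /* The driver advertises ETC2: hand over the compressed
             * blocks directly. Note it may still decompress them
             * internally, which is exactly the VRAM concern above. */
            glCompressedTexImage2D(GL_TEXTURE_2D, 0,
                                   GL_COMPRESSED_RGB8_ETC2,
                                   w, h, 0, nbytes, etc2);
        } else {
            /* No ETC2: decode on the CPU and upload raw RGBA instead. */
            unsigned char *rgba = decompress_etc2_to_rgba(etc2, w, h);
            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                         GL_RGBA, GL_UNSIGNED_BYTE, rgba);
            free(rgba);
        }
    }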

    In any case, real OpenGL with all the patented crap is important. Without it, Mesa is mostly just good for running ancient Quake 3 derivatives and simple compositors, not modern AAA games or high-end visualizations. GL 3.3 makes GL tolerable to use (it still needs waaay better dev tools, better error reporting, and a real test suite for less buggy drivers from the big vendors, and so on outside of what Khronos considers its responsibility). GL 4.3 adds some critical features that D3D has had for over 3 years, though it still has some catching up to do. GL|ES is a compromise for limited hardware, not the future of professional graphics on PCs.
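    On the error-reporting point specifically, GL 4.3 did at least standardize debug output (promoted from KHR_debug): the driver pushes human-readable messages to a callback instead of making you poll glGetError() after every call. A minimal sketch, assuming a context created with the debug flag:

    Code:
    /* Route GL debug messages to stderr as they happen. */
    #include <GL/glcorearb.h>
    #include <stdio.h>

    #ifndef APIENTRY
    #define APIENTRY
    #endif

    static void APIENTRY on_gl_debug(GLenum source, GLenum type, GLuint id,
                                     GLenum severity, GLsizei length,
                                     const GLchar *message, const void *user)
    {
        (void)source; (void)id; (void)severity; (void)length; (void)user;
        fprintf(stderr, "GL %s: %s\n",
                type == GL_DEBUG_TYPE_ERROR ? "error" : "debug", message);
    }

    void enable_gl_debug_output(void)
    {
        glEnable(GL_DEBUG_OUTPUT);
        glEnable(GL_DEBUG_OUTPUT_SYNCHRONOUS); /* fire at the offending call */
        glDebugMessageCallback(on_gl_debug, NULL);
    }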



    • #22
      Originally posted by elanthis View Post
      GL|ES 3.0 is not in any way a modern API, like all of Khronos' crap. They gave up on a modern API when they abandoned Longs Peak. Their APIs are still based on the ancient C-based crap of OpenGL 1.0, just with more features bolted on in sometimes very awkward ways.

      The feature set of GL|ES 3.0 is still lacking compared to GL 4.3. There are some important additions over GL|ES 2.0, but there's a reason they're still separate standards.

      Patent-free texture compression is worthless if the hardware doesn't support it. The compression is not as good as contemporary compressed formats, so it's a bad choice for minimizing download time, and it can't be loaded directly into the hardware and used, so it's a poor choice for using VRAM efficiently. However, implementation needs to happen soon, as delaying it just pushes back the date we can actually use it. Drivers and hardware have to support it today so that the market starts getting saturated with capable devices 4-5 years from now.

      In any case, real OpenGL with all the patented crap is important. Without it, Mesa is mostly just good for running ancient Quake 3 derivatives and simple compositors, not modern AAA games or high-end visualizations. GL 3.3 makes GL tolerable to use (it still needs waaay better dev tools, better error reporting, and a real test suite for less buggy drivers from the big vendors, and so on outside of what Khronos considers its responsibility). GL 4.3 adds some critical features that D3D has had for over 3 years, though it still has some catching up to do. GL|ES is a compromise for limited hardware, not the future of professional graphics on PCs.
      I must've missed your reply, but do you have any idea why Valve's Left4Dead was running fast under OGL drivers?



      • #23
        Originally posted by duby229 View Post
        I couldn't possibly disagree more. There will always be a market for high-end GPUs... The only real question is how big that market will be...
        That market will shrink. But the vendors will increasingly sanitize their hardware (as they already partly had to in order to enable reliable GPGPU), perhaps to the extent that they end up doing a bit of an ARM: designing better hardware that works in a low-power envelope first, and only then scaling it up to unlimited power draw. Sensible hardware that scales from mobile to desktop will get us more reliable hardware designs and will make things easier on us driver writers.

        Originally posted by duby229 View Post
        It's -EXACTLY- that same attitude that fucked AMD in the CPU market... It was believed that competing with Intel at the very high end wasn't a good idea because the market was shrinking... But in the end, since they failed to develop a high-end design, they didn't have anything to scale down and offer to mid- and low-end customers... Now they are being forced to scale up... And that is infinitely harder than scaling down. I sure as hell hope they don't make the same mistake in the GPU market.
        Really? So killing off the Geode right before the Atom created that market, and then selling Imageon to Qualcomm for peanuts, hasn't hurt AMD's business at all, right?



        • #24
          Originally posted by smitty3268 View Post
          /end rant.

          Yeah, we get it, you don't like ATI or NVIDIA.

          Anyway, I'm not going to argue with you. I'll just point out that both companies have sunk a lot more money into optimizing their hardware and drivers for performance than the ARM manufacturers have. ARM manufacturers have optimized for cost and power use, which obviously makes sense for their market. But it means that trying to draw conclusions about drivers for one and assuming they are equally true for the other is a risky proposition.
          I am sorry, but you do know that I was one of the key developers who pushed ATI open with radeonhd, right? I have had the misfortune of being forced to try to get any information out of ATI in order to write proper drivers, and I have seen how badly organized ATI really was, and probably still is, as AMD lost control of it.

          You do also know that I am RE-ing the ARM Mali, that I actually have experience with bringing that hardware up from scratch, and that my research driver matches the performance of the binary driver in the general case. And I got that performance while doing very little optimization work.

          I have actual hands-on experience with these things, so yes, I think I do know what I am talking about in this situation.



          • #25
            I don't think anybody here wants to argue with you about your level of experience.... I think many of us here could argue that some of your past projects were epic wastes of time.... But the fact is, we all have to admit that you have contributed a great deal.

            My only contention is with your stance on hardware, not your achievements.



            • #26
              Originally posted by duby229 View Post
              I don't think anybody here wants to argue with you about your level of experience.... I think many of us here could argue that some of your past projects were epic wastes of time.... But the fact is, we all have to admit that you have contributed a great deal.

              My only contention is with your stance on hardware, not your achievements.
              What past projects were such a waste of time, then?
              * My modesetting pioneering on unichrome? It turned out, several years later, that I was bang on with my views and direction.
              * Freeing ATI. Without radeonhd, and the amazing work we did in getting a working driver out that solidly and that quickly despite ATI's opposition, you would not still have some semblance of an open driver today. If we hadn't done that, ATI would've buried it in August/September 2007, no matter what AMD was demanding.
              * The flashrom and coreboot work I pushed forward. Sure, providing the code for VGA and full graphics hardware enablement on a single chip on a single motherboard was not useful for everyone, but that's the nature of the beast. I proved that it was doable, and that there were few technical barriers besides just doing it.
              * Unified graphics driver stacks will happen, eventually. The pain of maintaining the current spread-out crap is going to become too much. But again, I had to go pioneer it, and people had to go shoot it down. But it will happen.
              * Freeing Mali and kickstarting ARM GPU driver development. Yeah, sure, an absolute waste of time.

              So yeah, epic wastes of time. All of it.

              Now, what did you do in the last 10 years?



              • #27
                Originally posted by elanthis View Post
                The feature set of GL|ES 3.0 is still lacking compared to GL 4.3. There are some important additions over GL|ES 2.0, but there's a reason they're still separate standards.

                Patent-free texture compression is worthless if the hardware doesn't support it. The compression is not as good as contemporary compressed formats, so it's a bad choice for minimizing download time, and it can't be loaded directly into the hardware and used, so it's a poor choice for using VRAM efficiently. However, implementation needs to happen soon, as delaying it just pushes back the date we can actually use it. Drivers and hardware have to support it today so that the market starts getting saturated with capable devices 4-5 years from now.

                In any case, real OpenGL with all the patented crap is important. Without it, Mesa is mostly just good for running ancient Quake 3 derivatives and simple compositors, not modern AAA games or high-end visualizations. GL 4.3 adds some critical features that D3D has had for over 3 years, though it still has some catching up to do. GL|ES is a compromise for limited hardware, not the future of professional graphics on PCs.
                1) I fully agree that GL ES 3.0 is a compromise. But it's a good compromise, and you can do a lot with it.
                2) Again, I agree.
                3) Again, I agree. And modern OpenGL also makes porting from DX easier (not to mention that it makes Linux a more appealing market if the porting can be done once for OSX and reused for Linux afterward!)
                4) But OpenGL ES 3.0 is good enough for games. Very good, in fact. Mobile is a big market and will get even bigger (the prediction is 2 billion smartphones sold annually, against less than 1 billion PCs sold annually, maybe as few as 500 million).
                For one, it will have a better development ecosystem than desktop OpenGL, simply because Android development is a bigger market than the OSX/Linux/Windows (OGL) markets combined.
                And because OpenGL ES 3.0 can be added to "desktop" drivers quite easily.

                But for content creation / visualization / other demanding tasks, I agree that OpenGL ES 3.0 is too little in the long run. That is a smaller market, though.

                OpenGL ES 3.0 matters more because it represents a bigger market, which will mean better tools. And because it's possible to support OGL ES on top of "desktop" OGL, while the reverse is not true (see the sketch below this list).
                5) What functionality does OpenGL lack compared to DX? For that matter, what did OpenGL 4.2 lack? (Apart from a GPGPU solution closely tied to the 3D API?)
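                As a sketch of what that layering looks like from the application side (an assumption-laden illustration, not Mesa code): a desktop GL 3.x+ context that exposes GL_ARB_ES3_compatibility can consume ES 3.0 shaders and formats directly, and an application can probe for that like so:

                Code:
                /* Check whether this desktop GL context advertises
                 * GL_ARB_ES3_compatibility. Assumes a current GL 3.x+
                 * context with entry points loaded. */
                #include <GL/glcorearb.h>
                #include <stdbool.h>
                #include <string.h>

                bool has_es3_compatibility(void)
                {
                    GLint n = 0;
                    glGetIntegerv(GL_NUM_EXTENSIONS, &n);
                    for (GLint i = 0; i < n; i++) {
                        const char *ext =
                            (const char *)glGetStringi(GL_EXTENSIONS, (GLuint)i);
                        if (ext && strcmp(ext, "GL_ARB_ES3_compatibility") == 0)
                            return true;
                    }
                    return false;
                }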



                • #28
                  Originally posted by libv View Post
                  What past projects were such a waste of time, then?
                  * My modesetting pioneering on unichrome? It turned out, several years later, that I was bang on with my views and direction.
                  * Freeing ATI. Without radeonhd, and the amazing work we did in getting a working driver out that solidly and that quickly despite ATI's opposition, you would not still have some semblance of an open driver today. If we hadn't done that, ATI would've buried it in August/September 2007, no matter what AMD was demanding.
                  * The flashrom and coreboot work I pushed forward. Sure, providing the code for VGA and full graphics hardware enablement on a single chip on a single motherboard was not useful for everyone, but that's the nature of the beast. I proved that it was doable, and that there were few technical barriers besides just doing it.
                  * Unified graphics driver stacks will happen, eventually. The pain of maintaining the current spread-out crap is going to become too much. But again, I had to go pioneer it, and people had to go shoot it down. But it will happen.
                  * Freeing Mali and kickstarting ARM GPU driver development. Yeah, sure, an absolute waste of time.

                  So yeah, epic wastes of time. All of it.

                  Now, what did you do in the last 10 years?
                  You write good code. There is no doubt about that... But the usefulness of some of your past projects is questionable. Why waste time writing good code that few or no people use? Your experience and skill are obvious.... But you use them on projects that don't matter.... When you began modularizing Mesa, it was excellent code, but it had no chance of ever being adopted. When you implemented radeonhd and insisted on banging the modesetting hardware directly, you should have just used AtomBIOS from the beginning.

                  I don't want to insult your skill... You have it... But I don't like your stance on hardware philosophy. You are in a position where people take you seriously, and your stance on GPUs has the potential to hurt the high-end desktop market.
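                  To make the disagreement concrete, here is a rough sketch of the two approaches being argued over. The register offset, table index, and atom_exec() helper are hypothetical stand-ins for illustration, not real radeonhd or radeon code:

                  Code:
                  #include <stdint.h>

                  /* Hypothetical MMIO offset and AtomBIOS table index,
                   * for illustration only. */
                  #define FAKE_CRTC_H_TOTAL          0x6000
                  #define FAKE_SET_CRTC_TIMING_TABLE 3

                  /* Hypothetical AtomBIOS interpreter entry point. */
                  struct atom_ctx;
                  void atom_exec(struct atom_ctx *ctx, int table, uint32_t *params);

                  /* Approach A (radeonhd): program the mode-setting registers
                   * directly. Documented, debuggable behavior, but every new
                   * ASIC revision means fresh register-level work. */
                  static void set_htotal_direct(volatile uint32_t *mmio,
                                                uint32_t htotal)
                  {
                      mmio[FAKE_CRTC_H_TOTAL / 4] = htotal;
                  }

                  /* Approach B (AtomBIOS): execute the interpreted command
                   * tables the vendor ships in the video BIOS. Far less work
                   * per ASIC, at the cost of depending on an opaque,
                   * vendor-controlled bytecode blob. */
                  static void set_timing_via_atombios(struct atom_ctx *ctx,
                                                      uint32_t *params)
                  {
                      atom_exec(ctx, FAKE_SET_CRTC_TIMING_TABLE, params);
                  }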
                  Last edited by duby229; 22 February 2013, 09:10 PM.



                  • #29
                    Originally posted by libv View Post
                    I am sorry, but you do know that I was one of the key developers who pushed ATI open with radeonhd, right? I have had the misfortune of being forced to try to get any information out of ATI in order to write proper drivers, and I have seen how badly organized ATI really was, and probably still is, as AMD lost control of it.

                    You do also know that I am RE-ing the ARM Mali, that I actually have experience with bringing that hardware up from scratch, and that my research driver matches the performance of the binary driver in the general case. And I got that performance while doing very little optimization work.

                    I have actual hands-on experience with these things, so yes, I think I do know what I am talking about in this situation.
                    Yes, I think we all know who you are and what you've done. You are quite vocal about it at times.

                    It also seems like most of your peers (other Mesa/X developers) tend to disagree with you about a lot of what you say. Just because you've been involved in a lot of these projects doesn't mean you are automatically right about everything even tangentially related.



                    • #30
                      Originally posted by duby229 View Post
                      You write good code. There is no doubt about that... But the usefulness of some of your past projects is questionable. Why waste time writing good code that few or no people use? Your experience and skill are obvious.... But you use them on projects that don't matter.... When you began modularizing Mesa, it was excellent code, but it had no chance of ever being adopted. When you implemented radeonhd and insisted on banging the modesetting hardware directly, you should have just used AtomBIOS from the beginning.

                      I don't want to insult your skill... You have it... But I don't like your stance on hardware philosophy. You are in a position where people take you seriously, and your stance on GPUs has the potential to hurt the high-end desktop market.
                      Lots of harsh words here...

                      Whether the work done was a waste of time is for only two people to judge: the person paying Luc, and Luc himself. I think banging the registers on the Radeon was the proper way of doing things. Would it be slower in development time? Very probably. Would it result in a freer design? Absolutely. And if liberty is one of your main concerns, then you are doing it right.

                      Sure, libv may sit on a loud and high horse at times, but he has also earned it a little, don't you think?

                      As for 'the other Mesa developers disagree'... Just because many shout one thing doesn't mean the one is wrong. By that standard, all of us open-source fanatics must be wrong, right? Many more companies and people don't care about open source, or worse, shun it. Sometimes it works out for the better, sometimes not. You can't know for sure until you've tried.

