
Mesa 9.0 Officially Released, Supports OpenGL 3.1


  • #11
    Originally posted by przemoli View Post
    If codecs are patented then they will not land in Mesa.
    How do you explain MPEG-2 being there? It's also patented.

    This isn't about patents, it's about one simple thing: GPUs aren't actually suitable for video decoding. A lot of effort is required for not much gain. See this post for example; there's also a bit more discussion later in the thread.



    • #12
      I do think the Xvid and x264 efforts are patented. Besides, I would disagree that there is not much to gain with hardware decoding. There are plenty of devices that can only cope with decoding Full HD material if it is done by the graphics driver; many low-powered devices fall into that category, including the AMD E-350, of which I am a happy owner.

      IMHO it would be a great gain for Linux in general if one could use a device like that to its full extent, so I believe Mesa should extend its VDPAU implementation or add a way to use an external VA-API implementation, like the one the Intel guys have done so well.



      • #13
        OpenGL 3.1 yes, but not for Radeons

        The title says it all. If a multimillion-dollar company can't make a decent driver, then I guess I'll buy an NVIDIA card next time. Yes, their driver is closed source, but at least it works and is environmentally friendly.



        • #14
          So what would I need to start playing with OpenCL and Clover? I guess I'd need a newer card than an HD 3650. Is Clover complete enough to compile GEGL and have an OpenCL-enabled GIMP?



          • #15
            *sigh*

            Originally posted by wargames View Post
            The title says it all. If a multimillion-dollar company can't make a decent driver, then I guess I'll buy an NVIDIA card next time. Yes, their driver is closed source, but at least it works and is environmentally friendly.
            End the F.U.D.

            FGLRX supports OGL 3.1. FGLRX supports up into the OGL 4.x range. You're mixing up things that shouldn't be mixed.

            We're talking about the open-source drivers here, with which AMD has in many ways started over. Who knows when, but the OSS driver will eventually reach feature parity with AMD's closed driver.
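
            (As an aside: if you want to check for yourself which version string a given driver hands out, you can create a context and read back GL_VERSION and GL_RENDERER. Below is a rough, untested sketch using plain GLX; the file name and build line are assumptions, e.g. gcc glcheck.c -lGL -lX11. Keep in mind that Mesa's 3.1 support is core-profile only, so a legacy context like this one may still report 3.0 even where 3.1 is advertised.)

            /* glcheck.c: print the GL version/renderer strings the current
             * driver reports through a legacy GLX context. */
            #include <stdio.h>
            #include <X11/Xlib.h>
            #include <GL/gl.h>
            #include <GL/glx.h>

            int main(void)
            {
                Display *dpy = XOpenDisplay(NULL);
                if (!dpy)
                    return 1;

                int attribs[] = { GLX_RGBA, GLX_DOUBLEBUFFER, None };
                XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
                if (!vi)
                    return 1;

                /* A small unmapped window with the chosen visual is enough to
                 * make the context current. */
                XSetWindowAttributes swa;
                swa.colormap = XCreateColormap(dpy, RootWindow(dpy, vi->screen),
                                               vi->visual, AllocNone);
                swa.border_pixel = 0;
                Window win = XCreateWindow(dpy, RootWindow(dpy, vi->screen),
                                           0, 0, 16, 16, 0, vi->depth, InputOutput,
                                           vi->visual, CWColormap | CWBorderPixel, &swa);

                GLXContext ctx = glXCreateContext(dpy, vi, NULL, True);
                glXMakeCurrent(dpy, win, ctx);

                printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));
                printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));

                glXMakeCurrent(dpy, None, NULL);
                glXDestroyContext(dpy, ctx);
                XCloseDisplay(dpy);
                return 0;
            }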



            • #16
              Originally posted by ssam View Post
              So what would I need to start playing with OpenCL and Clover? I guess I'd need a newer card than an HD 3650. Is Clover complete enough to compile GEGL and have an OpenCL-enabled GIMP?
              Not sure how complete it is, but you'll need a Radeon HD 5000-class card or higher.
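
              If you want to poke at it, a first sanity check is whether the OpenCL runtime enumerates your GPU at all. Here's a rough, untested sketch (plain OpenCL 1.x API, nothing Clover-specific; the file name and build line are assumptions, e.g. gcc clcheck.c -lOpenCL) that lists the platforms and GPU devices the installed drivers expose; if Clover is built and your card is supported, it should show up in the list.

              /* clcheck.c: list OpenCL platforms and their GPU devices. */
              #include <stdio.h>
              #include <CL/cl.h>

              int main(void)
              {
                  cl_platform_id platforms[8];
                  cl_uint nplat = 0;
                  cl_uint i, j;

                  if (clGetPlatformIDs(8, platforms, &nplat) != CL_SUCCESS || nplat == 0) {
                      fprintf(stderr, "No OpenCL platforms found\n");
                      return 1;
                  }

                  for (i = 0; i < nplat; i++) {
                      char name[256];
                      clGetPlatformInfo(platforms[i], CL_PLATFORM_NAME,
                                        sizeof(name), name, NULL);
                      printf("Platform: %s\n", name);

                      cl_device_id devs[8];
                      cl_uint ndev = 0;
                      if (clGetDeviceIDs(platforms[i], CL_DEVICE_TYPE_GPU,
                                         8, devs, &ndev) != CL_SUCCESS)
                          continue;

                      for (j = 0; j < ndev; j++) {
                          char dname[256];
                          clGetDeviceInfo(devs[j], CL_DEVICE_NAME,
                                          sizeof(dname), dname, NULL);
                          printf("  GPU device: %s\n", dname);
                      }
                  }
                  return 0;
              }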



              • #17
                Originally posted by ryszardzonk View Post
                I do think the Xvid and x264 efforts are patented. Besides, I would disagree that there is not much to gain with hardware decoding. There are plenty of devices that can only cope with decoding Full HD material if it is done by the graphics driver; many low-powered devices fall into that category, including the AMD E-350, of which I am a happy owner.

                IMHO it would be a great gain for Linux in general if one could use a device like that to its full extent, so I believe Mesa should extend its VDPAU implementation or add a way to use an external VA-API implementation, like the one the Intel guys have done so well.
                I think (please correct me if I'm wrong :P) that what is being talked about here is shader-based acceleration, not the dedicated video decoding unit in AMD/NVIDIA GPUs (for the OSS drivers). So an E-350 would have a much harder time decoding than a desktop-grade GPU with more shaders.



                • #18
                  Originally posted by zeealpal View Post
                  I think (please correct me if I'm wrong :P) that what is being talked about here is shader-based acceleration, not the dedicated video decoding unit in AMD/NVIDIA GPUs (for the OSS drivers). So an E-350 would have a much harder time decoding than a desktop-grade GPU with more shaders.
                  Yes, you are right. I tend to confuse the two, as I look at things more from the perspective of a user who occasionally does some bug reports and translation work, not a developer. Programming seems too technical for me, but what I am getting at is that, one way or another, users in general would welcome hardware/GPU decoding in the open-source drivers to the extent it is done in their closed-source counterparts, and if I am not mistaken there is no shader-based or other acceleration in the r600 driver that would let it decode 1080p sources.



                  • #19
                    Originally posted by ryszardzonk View Post
                    I do think the Xvid and x264 efforts are patented.
                    Yeah, they're patented. But that's irrelevant; MPEG-2 is patented too, and yet Gallium has a decoder for it.

                    Originally posted by ryszardzonk View Post
                    Besides, I would disagree that there is not much to gain with hardware decoding. There are plenty of devices that can only cope with decoding Full HD material if it is done by the graphics driver; many low-powered devices fall into that category, including the AMD E-350, of which I am a happy owner.
                    Your E-350 decodes Full HD using a dedicated hardware decoder (UVD) that the fglrx driver has access to. But I wasn't talking about a dedicated decoder; I was talking about the GPU, and the GPU is bad at decoding.

                    Originally posted by ryszardzonk View Post
                    IMHO it would be a great gain for Linux in general if one could use a device like that to its full extent, so I believe Mesa should extend its VDPAU implementation or add a way to use an external VA-API implementation, like the one the Intel guys have done so well.
                    Adding VA-API to Gallium wouldn't change anything when it comes to AMD. It's not about the API, it's about getting access to UVD. We don't have the documentation to do that, so Gallium can only use the GPU (shaders) for decoding. And writing a shader-based H.264 decoder would be a lot of effort for little gain; it's not worth it.
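
                    (For what it's worth, you can see what the Gallium VDPAU tracker actually exposes by querying decoder capabilities through the standard libvdpau API, which is essentially what the vdpauinfo tool does. Below is a rough, untested sketch checking only the H.264 High profile as an example; the file name and build line are assumptions, e.g. gcc vdpcheck.c -lvdpau -lX11. On the open radeon driver you'd expect it to report "no", since the shader-based decoder currently only handles MPEG-2.)

                    /* vdpcheck.c: ask the VDPAU driver whether it can decode H.264 High. */
                    #include <stdio.h>
                    #include <X11/Xlib.h>
                    #include <vdpau/vdpau.h>
                    #include <vdpau/vdpau_x11.h>

                    int main(void)
                    {
                        Display *dpy = XOpenDisplay(NULL);
                        if (!dpy)
                            return 1;

                        VdpDevice dev;
                        VdpGetProcAddress *get_proc;
                        if (vdp_device_create_x11(dpy, DefaultScreen(dpy), &dev, &get_proc)
                                != VDP_STATUS_OK) {
                            fprintf(stderr, "No VDPAU device (driver not loaded?)\n");
                            return 1;
                        }

                        /* Fetch the capability-query entry point through the dispatcher. */
                        VdpDecoderQueryCapabilities *query;
                        get_proc(dev, VDP_FUNC_ID_DECODER_QUERY_CAPABILITIES, (void **)&query);

                        VdpBool supported = VDP_FALSE;
                        uint32_t max_level, max_macroblocks, max_width, max_height;
                        query(dev, VDP_DECODER_PROFILE_H264_HIGH, &supported,
                              &max_level, &max_macroblocks, &max_width, &max_height);

                        printf("H.264 High decode: %s\n", supported ? "yes" : "no");
                        if (supported)
                            printf("max %ux%u, level %u\n", max_width, max_height, max_level);

                        XCloseDisplay(dpy);
                        return 0;
                    }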
                    Last edited by Gusar; 09 October 2012, 11:08 AM.



                    • #20
                      Originally posted by dogsleg View Post
                      What is the state of radeonsi? When is it planned to be ready?
                      It currently runs piglit, lots of 3D demos, and basic games. I think the biggest thing left is shaking out the remaining bugs in the flow-control code in the shader compiler. Once that's working properly, we can enable more glamor features, and more advanced games should start working.

                      For the current todo list see:

