Open source Linux driver for Bobcat?

  • #11
    Originally posted by Qaridarium
    they will build a shader based video acceleration solution for the radeon driver be sure.
    That is still overrated... UVD is MUCH better at the job. Plus, what happens if you want to decode movies AND play 3D games at the same time?!? Seriously though, being able to use UVD would be greatly appreciated.

    @Kano: TOTALLY useless. The other thing to think about is that if they put it in X.Org, they might not want to use a proprietary video format. They may only include Theora and Dirac...

    Comment


    • #12
      Originally posted by LinuxID10T View Post
      Plus, what happens if you want to decode movies AND play 3D games at the same time?!?
      More than one application can use the GPU at the same time, you know.

      Games are already offloading more and more non-graphics work to the GPU, such as physics, animation, and even parts of AI. They're not expecting to get 100% utilization out of it for graphics. Or for anything, really.

      If the game always pegged the GPU to 100% just to maintain playable framerates then the game would be unplayable on even slightly weaker/older hardware. That would in turn mean that your hardware is nowhere close to the game's minimum requirements (which are set far above the "just barely able to play it" mark), and you need to either upgrade your machine or play a different game.

      Comment


      • #13
        Originally posted by elanthis View Post
        More than one application can use the GPU at the same time, you know.

        Games are already offloading more and more non-graphics work to the GPU, such as physics, animation, and even parts of AI. They're not expecting to get 100% utilization out of it for graphics. Or for anything, really.

        If the game always pegged the GPU to 100% just to maintain playable framerates then the game would be unplayable on even slightly weaker/older hardware. That would in turn mean that your hardware is nowhere close to the game's minimum requirements (which are set far above the "just barely able to play it" mark), and you need to either upgrade your machine or play a different game.
        I do realize that I can use the GPU for more than one app at a time. I did run benchmarks on GPU usage, though. Even using UVD, decoding still uses some shader power; a fully shader-based decoder would use a great deal of it. BTW, most games I play require 100% GPU usage...

        Comment


        • #14
          Basically, VP8 is now obsolete. Google bought it because, starting next year, H.264 would have cost money for web streaming (commercial use already costs money). Last week this changed: H.264 will not require licenses for that kind of use. So VP8 is there and nobody needs it anymore - well, maybe you do.

          Also, even if it worked, your idea is completely stupid. When a codec needs the power of two cores above 2.5 GHz to play a 48 Mbit/s stream, do you really think you can accelerate that with a budget card? Without dedicated hardware that's impossible. If you want to use OpenCL with a high-performance card instead, everybody would just laugh at you, because that's even more inefficient than using the CPU for the task. BTW, Blu-ray DEcryption is always done in software; only HDCP is done in hardware. But HDCP != Blu-ray encryption... As there are no certified Linux Blu-ray software players, nobody knows how to use HDCP with Linux anyway - maybe some OEMs?
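          A quick back-of-the-envelope sketch of why such a stream is heavy (the stream parameters here are assumptions: 48 Mbit/s compressed, 1080p at 30 fps, 4:2:0 chroma subsampling, i.e. 12 bits per decoded pixel):

```python
# Assumed stream parameters: 48 Mbit/s compressed, 1080p at 30 fps,
# 4:2:0 chroma subsampling (12 bits per decoded pixel).
compressed_bps = 48_000_000
raw_bps = 1920 * 1080 * 12 * 30   # bits of decoded pixel data per second
ratio = raw_bps / compressed_bps  # compression ratio the decoder must undo
print(round(ratio, 2))            # roughly 15.55:1
```

          Every one of those ~750 Mbit of decoded pixels per second comes out of entropy decoding, inverse transforms, and motion compensation, which is why general-purpose shaders struggle without dedicated hardware blocks.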

          Comment


          • #15
            Originally posted by Kano View Post
            Basically, VP8 is now obsolete. Google bought it because, starting next year, H.264 would have cost money for web streaming (commercial use already costs money). Last week this changed: H.264 will not require licenses for that kind of use. So VP8 is there and nobody needs it anymore - well, maybe you do.
            You are wrong; please read here:

            Comment


            • #16
              Well, did you ever pay for x264? I told you already: for commercial use (including encoders) you already have to pay. But as you can compile x264 on your own, this does not really affect you as a home user. Why would you use VP8 when you can get better quality from H.264 (via x264)?

              Comment


              • #17
                Originally posted by Kano View Post
                Well, did you ever pay for x264? I told you already: for commercial use (including encoders) you already have to pay. But as you can compile x264 on your own, this does not really affect you as a home user. Why would you use VP8 when you can get better quality from H.264 (via x264)?
                x264 is only an encoder, and yes, you should pay for it even if you compile it from source. Compiling from source merely exempts the x264 developers from paying the license fees to distribute binaries themselves; the hurdle is passed on to end users.

                The decoder side is no different: you, Mozilla, Ubuntu, or any other Firefox distributor would have to pay to integrate an H.264 decoder. From the linked article:

                First, the H.264-format video needs to be created - but that isn't free under this move. Then it needs to be served up for streaming - but that isn't free under this move. There then needs to be support for decoding it in your browser - but adding that isn't free under this move. Finally it needs to be displayed on your screen. [...] The only part of this sequence being left untaxed is the final one. Importantly, they are not offering to leave the addition of support for H.264 decoding in your browser untaxed. In particular, this means the Mozilla Foundation would have to pay to include the technology in Firefox.

                Comment


                • #18
                  Originally posted by LinuxID10T View Post
                  BTW, most games I play require 100% GPU usage...
                  I'm intrigued. How exactly are you measuring your GPU usage?
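                  For reference, one way to sample it: newer kernels expose a busy-percent file via sysfs for the amdgpu driver. The path and its availability are assumptions (the radeon driver of this era does not provide it), so treat this as a sketch:

```python
import time

# Assumed sysfs path; provided by the amdgpu kernel driver on newer
# kernels, and absent on older hardware/drivers.
BUSY_FILE = "/sys/class/drm/card0/device/gpu_busy_percent"

def sample_gpu_busy(read=lambda: open(BUSY_FILE).read(),
                    samples=5, interval=0.2):
    """Average several instantaneous busy-percent readings,
    since a single read is too noisy to mean much."""
    readings = []
    for _ in range(samples):
        readings.append(int(read().strip()))
        time.sleep(interval)
    return sum(readings) / len(readings)
```

                  The `read` parameter is injectable so the sampler can be exercised without the hardware present.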

                  Comment


                  • #19
                    Originally posted by Kano View Post
                    Also, even if it worked, your idea is completely stupid. When a codec needs the power of two cores above 2.5 GHz to play a 48 Mbit/s stream, do you really think you can accelerate that with a budget card? Without dedicated hardware that's impossible.
                    The developer of the Gallium3D video decoder stated this in an update (the latest one, unfortunately) dating from January 2009:

                    [...] AthlonXP 1.5 GHz + GeForce 6200 machine handle[s] 720p with plenty of CPU to spare.
                    http://www.bitblit.org/gsoc/g3dvl/
                    Is it really fair to call 1080p decoding an impossible task when this kind of performance is already possible for 720p? A difficult task? Certainly. Challenging? No doubt - I wouldn't hesitate to agree with you there. But impossible? Clearly it won't be easy to improve the driver from 720p-capable to, as you say yourself, ~50 Mbit/s 1080p, but I can't help thinking it is at least possible.
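                    To put a rough number on the 720p-to-1080p gap (assuming, as a first approximation, that decode cost scales with pixel throughput):

```python
def pixel_rate(width, height, fps):
    """Pixels the decoder must produce per second."""
    return width * height * fps

r720 = pixel_rate(1280, 720, 30)    # what the 2009 demo already handled
r1080 = pixel_rate(1920, 1080, 30)  # the target
print(r1080 / r720)                 # 2.25x more pixels per second
```

                    So the jump is a 2.25x workload increase at the same framerate - a big step, but hardly the same thing as impossible.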

                    Comment
