Gallium3D VDPAU On Radeon Starts Working


  • agd5f
    replied
    Before we had UVD (r1xx-r5xx), we used the 3D engine for video decode, so it's definitely viable.

  • RealNC
    replied
    Originally posted by brent View Post
    It's easy to decode video with the ASIC (if you know how to), but that is not what is being used here. The Gallium implementation uses shaders, which is less efficient, and it's not possible to offload the complete decode pipe. I'm sceptical; I doubt this is going to work well on low-end GPUs.
    That doesn't sound too good :-/

    Moreover, even the cheapest CPUs you can get nowadays (save for some of the remaining single-core surplus and netbook CPUs) are capable of decoding HD video at Blu-ray bitrates just fine. You don't need to spend 200 bucks.
    Currently not, though. Distros don't ship multithreaded players; that stuff is still locked inside experimental Git and SVN branches. And with only one core, not even a Core i7 Extreme can decode high-bitrate H.264.
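    (For context, a rough sketch of what a threaded player has to ask libavcodec for. The names follow the current FFmpeg API and are an assumption here, not what any distro shipped at the time; it presumes a libavcodec build that includes the ffmpeg-mt threading work.)

    /* Minimal sketch: request frame-threaded H.264 decoding from libavcodec. */
    #include <libavcodec/avcodec.h>
    #include <stdio.h>

    int main(void)
    {
        const AVCodec *dec = avcodec_find_decoder(AV_CODEC_ID_H264);
        if (!dec)
            return 1;

        AVCodecContext *ctx = avcodec_alloc_context3(dec);
        if (!ctx)
            return 1;

        ctx->thread_count = 0;               /* 0 = one thread per detected core */
        ctx->thread_type  = FF_THREAD_FRAME; /* frame threading is what scales H.264 */

        if (avcodec_open2(ctx, dec, NULL) < 0) {
            fprintf(stderr, "failed to open decoder\n");
            avcodec_free_context(&ctx);
            return 1;
        }

        printf("decoder opened with %d thread(s)\n", ctx->thread_count);
        avcodec_free_context(&ctx);
        return 0;
    }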

  • brent
    replied
    It's easy to decode video with the ASIC (if you know how to), but that is not what is being used here. The Gallium implementation uses shaders, which is less efficient, and it's not possible to offload the complete decode pipe. I'm sceptical; I doubt this is going to work well on low-end GPUs.

    Moreover, even the cheapest CPUs you can get nowadays (save for some of the remaining single-core surplus and netbook CPUs) are capable of decoding HD video at Blu-ray bitrates just fine. You don't need to spend 200 bucks.
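    (For reference, a minimal sketch of how a VDPAU client can ask the driver which decoder profiles it actually offloads and at what size. This is the generic libvdpau API; whether the Gallium state tracker advertises H.264 this way on a given card is an assumption, not something stated in this thread.)

    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <vdpau/vdpau.h>
    #include <vdpau/vdpau_x11.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy)
            return 1;

        VdpDevice dev;
        VdpGetProcAddress *get_proc;
        if (vdp_device_create_x11(dpy, DefaultScreen(dpy), &dev, &get_proc) != VDP_STATUS_OK)
            return 1;

        /* Look up the decoder capability query entry point. */
        VdpDecoderQueryCapabilities *query = NULL;
        if (get_proc(dev, VDP_FUNC_ID_DECODER_QUERY_CAPABILITIES,
                     (void **)&query) != VDP_STATUS_OK)
            return 1;

        VdpBool supported;
        uint32_t max_level, max_mbs, max_w, max_h;
        if (query(dev, VDP_DECODER_PROFILE_H264_HIGH, &supported,
                  &max_level, &max_mbs, &max_w, &max_h) == VDP_STATUS_OK && supported)
            printf("H.264 High offloaded, up to %ux%u\n", max_w, max_h);
        else
            printf("H.264 High not offloaded by this driver\n");

        XCloseDisplay(dpy);
        return 0;
    }

    A player would typically run a probe like this before choosing its vdpau codec path, so a shader-backed driver that only accelerates part of the pipe can still report the profile as supported.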

  • RealNC
    replied
    Originally posted by brent View Post
    This is pretty cool, but I still doubt the usefulness of this. Power consumption aspects aside, if your GPU is powerful enough for accelerating HD playback, it's likely your CPU will be powerful enough to do it alone.
    You don't need a powerful GPU to accelerate HD video. You can get a cheap one for 50 bucks that does it. And then it's likely that your CPU is also on the cheap side, and therefore no good for HD video. Which is the whole point: you spend 50 bucks on a GPU that can play HD video instead of 200 bucks on a CPU that could do it.

  • brent
    replied
    This is pretty cool, but I still doubt the usefulness of this. Power consumption aspects aside, if your GPU is powerful enough for accelerating HD playback, it's likely your CPU will be powerful enough to do it alone. Also keep in mind that with Gallium's partial acceleration, a fair bit of CPU power is still needed, especially with H.264.

  • droidhacker
    replied
    Originally posted by pingufunkybeat View Post
    Yeah, if he already has a CrystalHD, then why not keep using it?

    I'm happy about the shader decoding, it's a nice tool to have, and it's a flexible solution. Personally, given the power of my CPU, I don't find it all that important for me, but it might save some people with weaker CPUs.
    The point isn't to quit using it, it's to not need one to begin with. They aren't exactly free.

  • pingufunkybeat
    replied
    Yeah, if he already has a CrystalHD, then why not keep using it?

    I'm happy about the shader decoding, it's a nice tool to have, and it's a flexible solution. Personally, given the power of my CPU, I don't find it all that important for me, but it might save some people with weaker CPUs.

  • curaga
    replied
    @pingufunkybeat

    I think we agree here; I was just replying because I do not understand a switch away from a superior solution that's also FOSS. Perhaps droidhacker cares about the firmware on the device, I don't know.

  • pingufunkybeat
    replied
    CPU cracking Blu-ray encryption under Linux = even more watts.
    The system + Wi-Fi power needed to download a 9 GB HD movie = more than a kWh.

    The extra 20 W or whatever needed to decode the movie in software is a red herring, IMHO.

    The shaders are not the biggest problem you have if you want to legally watch HD content. The whole video decoding problem is not technological, it is legal, and Linux will ALWAYS consume more power than Windows doing this.

    There is a law which stipulates that you should be punished for watching movies on Linux. If you defeat that law, there are no technological issues left. It's not a G3D/AMD/Nvidia/UVD/H.264 discussion; you are using more watts because the media industry made you.
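    (As a rough sanity check of that kWh figure, with all numbers assumed purely for illustration: 9 GB is about 72 Gbit; at 2 Mbit/s that is roughly 10 hours of downloading, and a desktop plus router drawing ~100 W for 10 hours comes to about 1 kWh, whereas an extra 20 W over a two-hour movie is only about 0.04 kWh.)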

  • curaga
    replied
    CrystalHD - 1 watt
    GPU running all its shaders - a two- to three-digit number of watts
