Gallium3D VDPAU On Radeon Starts Working
-
Originally posted by brent:
    It's easy to decode video with the ASIC (if you know how to), but that is not what is being used here. The Gallium implementation uses shaders, which is less efficient, and it's not possible to offload the complete decode pipe. I'm sceptical; I doubt this will work well on low-end GPUs.
    Moreover, even the cheapest CPUs you can get nowadays (save for some of the remaining single-core surplus and netbook CPUs) can decode HD video at Blu-ray bitrates just fine. You don't need to spend 200 bucks.
Before we had UVD (r1xx-r5xx), we used the 3D engine for video decode, so it's definitely viable.
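For anyone who wants to try the Gallium VDPAU path discussed above, backend selection is a one-line environment setting. A minimal sketch, assuming a Radeon where the Gallium driver is named r600 (the driver name depends on your GPU and Mesa build), using the standard vdpauinfo tool and MPlayer's VDPAU output:

```shell
# Pick the Gallium VDPAU backend explicitly.
# "r600" is an assumption here; the name depends on your GPU and Mesa build.
export VDPAU_DRIVER=r600

# List the decoder profiles and surface limits the driver actually exposes.
vdpauinfo

# Play back through VDPAU (any VDPAU-capable player works).
mplayer -vo vdpau -vc ffh264vdpau movie.mkv
```

If vdpauinfo shows no decoder profiles, playback will fall back to software decode, which is exactly the partial-acceleration situation debated in this thread.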
-
Originally posted by brent:
    This is pretty cool, but I still doubt the usefulness of this. Power consumption aspects aside, if your GPU is powerful enough to accelerate HD playback, your CPU is likely powerful enough to do it alone.
It's easy to decode video with the ASIC (if you know how to), but that is not what is being used here. The Gallium implementation uses shaders, which is less efficient, and it's not possible to offload the complete decode pipe. I'm sceptical; I doubt this will work well on low-end GPUs.
Moreover, even the cheapest CPUs you can get nowadays (save for some of the remaining single-core surplus and netbook CPUs) can decode HD video at Blu-ray bitrates just fine. You don't need to spend 200 bucks.
-
Originally posted by pingufunkybeat:
    Yeah, if he already has a CrystalHD, then why not keep using it?
    I'm happy about the shader decoding; it's a nice tool to have, and it's a flexible solution. Personally, given the power of my CPU, I don't find it all that important for me, but it might save some people with weaker CPUs.
This is pretty cool, but I still doubt the usefulness of this. Power consumption aspects aside, if your GPU is powerful enough to accelerate HD playback, your CPU is likely powerful enough to do it alone. Also keep in mind that with Gallium's partial acceleration, a fair bit of CPU power is still needed, especially with H.264.
-
Yeah, if he already has a CrystalHD, then why not keep using it?
I'm happy about the shader decoding; it's a nice tool to have, and it's a flexible solution. Personally, given the power of my CPU, I don't find it all that important for me, but it might save some people with weaker CPUs.
-
@pingufunkybeat
I think we agree here; I was just replying because I don't understand a switch away from a superior solution that's also FOSS. Perhaps droidhacker cares about the firmware on the device; I don't know.
-
CPU cracking Blu-ray encryption under Linux = even more watts.
The system + Wi-Fi power needed to download a 9 GB HD movie = more than a kWh.
The extra 20 W or whatever needed to decode the movie in software is a red herring, IMHO.
The shaders are not the biggest problem you have if you want to legally watch HD content. The whole video decoding problem is not technological, it is legal, and Linux will ALWAYS consume more power than Windows doing this.
There is a law which stipulates that you should be punished for watching movies on Linux. If you defeat that law, there are no technological issues left. It's not a G3D/AMD/NVIDIA/UVD/H.264 discussion; you are using more watts because the media industry made you.
-
CrystalHD - 1 watt.
GPU running all its shaders - a two- to three-digit number of watts.
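Putting the thread's wattage figures in per-movie terms makes the comparison concrete. A quick sketch using the posters' rough estimates, not measurements: the 1 W CrystalHD and "extra 20 W" software-decode figures come from the thread, while the 50 W shader figure and the two-hour movie length are assumptions for illustration:

```python
# Energy drawn over one movie, using the thread's rough wattage estimates.
MOVIE_HOURS = 2.0  # assumed movie length

def movie_energy_wh(extra_watts, hours=MOVIE_HOURS):
    """Extra energy in watt-hours drawn for the duration of one movie."""
    return extra_watts * hours

crystalhd = movie_energy_wh(1)    # dedicated decoder card, ~1 W (thread figure)
cpu_decode = movie_energy_wh(20)  # "extra 20 W or whatever" software decode (thread figure)
shaders = movie_energy_wh(50)     # GPU shaders; 50 W is an assumed mid-range value

print(f"CrystalHD:  {crystalhd} Wh")   # 2.0 Wh
print(f"CPU decode: {cpu_decode} Wh")  # 40.0 Wh
print(f"Shaders:    {shaders} Wh")     # 100.0 Wh
```

Even the worst case here is 0.1 kWh per movie, which is why one poster calls the decode wattage a red herring next to the claimed >1 kWh cost of downloading the file in the first place.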