06-28-2010, 05:27 PM
Of course I meant to downgrade to 2.6.31 (loving this time-to-edit limit... Just wondering how many times it needs to be repeated to fix this *BUG*)
06-28-2010, 05:34 PM
Guess what? Downgrading helped. This driver now supports 2.6.31, 2.6.34, 2.6.35... Wait, where is 2.6.32?
The second possibility: Ubuntu patched 2.6.32 to improve it, and in doing so broke it with the binary blob.
Yes, I won't stop complaining about this mess until the FOSS r600 driver *works* (not only for basic examples); an up-to-date r600g would satisfy me for now.
06-28-2010, 05:42 PM
I had problems installing the fglrx_8.741 package from the Ubuntu-X PPA when running the 2.6.34 kernel. I've downgraded for now, since 10.6 doesn't help any of my issues, but I suspect there's something wrong with the 2.6.34 patch in build 2 of sarvatt's package.
Originally Posted by Wielkie G
For interested parties:
06-29-2010, 08:30 AM
Just wanted to add my thoughts to this thread since I have the excellent 4850 card.
The 10.6 drivers are, at least to me, a significant change as far as 2D is concerned.
I had an NVIDIA card for 3 years and hated Compiz performance on it, and likewise with the ATI card, which I have had for some 6 months now. Of course, even without Compiz the ATI card was quite bad.
But in 4 years of using Ubuntu, this is the first time I have kept Compiz on for more than a week with ATI!
I highly recommend it to users of newer ATI cards.
Strangely, I never had video tearing issues with either card; I wonder why that is.
Right now the only issue I see is that the window borders have a distinct tear when using Compiz effects. That started with the newer themes in Ubuntu, I would say.
Also, I never cared about GPU video decoding; I can play Big Buck Bunny on a Core 2 Duo laptop without issues, and I want my CPU at 100% always, I paid a lot of money for it.
My GPUs are busy with games anyway, and they don't last as long as a CPU anyway, so screw GPU decoding. Core 2 Duos and an Athlon X3 do fine with videos.
06-29-2010, 08:37 AM
BBB is definitely not a reference; it's low bitrate, just like all YouTube HD videos.
06-29-2010, 08:40 AM
It's a bit strange that a CPU can play 1080p video without stutter. Is it because it's running at a lower resolution?
I think it's independent of that, am I correct? I think video decode was moved onto the GPU many years ago, since single-core CPUs weren't that efficient at it.
I have a bunch of Core 2 Duo laptops here with crappy Intel graphics, and they all seem to play the bunny video fine. Maybe animations take less rendering effort?
06-29-2010, 09:06 AM
It always depends on the BITRATE. There is a nice tool to check:
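The tool itself isn't named in the post above, but bitrate is just bits per second, so a rough average can be estimated from file size and duration alone (the numbers below are illustrative, not from the thread):

```shell
# Rough average bitrate in kbit/s from file size and duration.
# Example: a 100 MB file that is 60 seconds long.
size_bytes=100000000
duration_s=60
echo "$(( size_bytes * 8 / duration_s / 1000 )) kbit/s"
```

A genuinely high-bitrate 1080p clip will land far above typical YouTube HD numbers, which is the point being made here.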
06-30-2010, 02:26 AM
Thanks for this, I will use it for reference.
Originally Posted by Kano
How about 720p videos? I have a couple of those, and they look and play fine on my desktop. I have an X3 at 2.9 GHz on that one.
I'm assuming there is no GPU decode happening, after reading all the UVD threads in this forum. So a CPU can decode 720p comfortably?
06-30-2010, 05:16 AM
06-30-2010, 06:55 AM
I already TESTED WebM, and with multichannel Ogg Vorbis sound the playback was impossible. My VLC VA-API script installs an ffmpeg compiled against vpx, which can be used to create WebM. Did YOU test WebM yet? I don't think you ever produced a test file! When I talk about things, I usually have tested them before. You talk about things you have never used at all! That only shows your complete incompetence. Also, it is very unlikely that your "shaders" will accelerate anything. It is just that YouTube will use such low bitrates that even a slow netbook will be able to decode them using the CPU. When you look at YouTube H.264 bitrates you get an idea of what I mean, but you never analyse anything. You just write crap and more crap without any verification. All you do is dreaming.
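For anyone who wants to produce a WebM test file themselves, a transcode with an ffmpeg built against libvpx looks roughly like this (the input filename and bitrate are illustrative, not taken from the script mentioned above):

```shell
# Transcode to WebM: VP8 video via libvpx, Vorbis audio (illustrative settings).
ffmpeg -i input.mp4 -c:v libvpx -b:v 1M -c:a libvorbis output.webm
```

Pushing `-b:v` well above YouTube-like bitrates gives a more honest stress test of CPU decoding than BBB or YouTube HD clips.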