For standard content, just use opengl as the output and amdcccle to force quality mode and vsync on. Only 1080p is interesting.
AMD Radeon HD 5750/5770
-
Hi Yall & Kano,
openGL does get rid of the video tearing and works fine on my new build. I think I tried it with my "old build," and the hardware was too slow.
VLC barfs. Xine and SMP are good. But with .mkv, SMP will sometimes shrink the video playback size at 16:9 aspect, which is odd.
Again, thanks for the heads-up. :-)
Greekgeek. :-)
-
Well, I often use VLC 1.0.2. You can set it to opengl output as well. My favorite player is mplayer, however, as I don't know how to enable something similar to -af volnorm with VLC (Xine has a way to enable that too, just differently). Xine works with opengl with an additional override in .xine/config:
video.output.opengl_renderer:2D_Tex
It took a while till I found that out - otherwise I had no GUI for VDR. But with 1080p even fast systems get out of sync sometimes when rendering to opengl. Sadly, mplayer has that problem too when the input is not 100% optimal: a/v sync is lost and it does not resync. Windows players do resync, and I really want that feature on Linux too; it would fix lots of problems. It is useless to test 1080p documentaries when you cannot see the speaker to spot those issues, however.
Last edited by Kano; 16 October 2009, 08:28 AM.
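For reference, a minimal sketch of applying that override from a shell. The config path ~/.xine/config is taken from the post; the XINE_CONF variable is just a convenience for this sketch, and the mplayer invocation in the comment combines the opengl output and -af volnorm options mentioned above:

```shell
# Append the xine OpenGL renderer override from the post, if not already set.
XINE_CONF="${XINE_CONF:-$HOME/.xine/config}"
mkdir -p "$(dirname "$XINE_CONF")"
# Idempotent: only add the line when no opengl_renderer entry exists yet.
if ! grep -q '^video.output.opengl_renderer:' "$XINE_CONF" 2>/dev/null; then
    echo 'video.output.opengl_renderer:2D_Tex' >> "$XINE_CONF"
fi
# The mplayer equivalent discussed in the thread: OpenGL video output
# plus volume normalisation, e.g.
#   mplayer -vo gl -af volnorm video.mkv
```

Re-running the snippet leaves the config unchanged, so it is safe to drop into a setup script.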
-
Originally posted by rohcQaH: The ability to connect three monitors is the reason I'd choose this over the 4770. No more crawling under the desk to attach the projector.
If you have more than 1 monitor connected, the idle power consumption goes from 18W to 50W!
Source (in German):
News on computers, hardware and software, plus professional reviews and graphics card benchmarks of the latest hardware and consumer electronics.
Unfortunately, this is not ATi-specific. Nvidia has the same problem (the 275 and 295 are named in the article).
Thank you, graphics vendors, for lying to your advanced customers! Your idle power is NOT the specified idle power!
And thank you to all the websites doing so-called reviews! You ALL failed!
-
Originally posted by Hasenpfote: And you know what's REALLY funny:
If you have more than 1 monitor connected, the idle power consumption goes from 18W to 50W!
Source (in German):
News on computers, hardware and software, plus professional reviews and graphics card benchmarks of the latest hardware and consumer electronics.
Unfortunately, this is not ATi-specific. Nvidia has the same problem (the 275 and 295 are named in the article).
Thank you, graphics vendors, for lying to your advanced customers! Your idle power is NOT the specified idle power!
And thank you to all the websites doing so-called reviews! You ALL failed!
For the single-monitor case, you can use very low clocks quite safely. The more monitors you add, the higher the memory clock (and possibly the engine clock) will need to be raised. Nothing sinister, just fundamental resource constraints.
Regards,
Matthew
-
Originally posted by Hasenpfote: And you know what's REALLY funny:
If you have more than 1 monitor connected, the idle power consumption goes from 18W to 50W!
I'd hope the drivers are smart enough to power down when the monitors are off or not connected, though (for example, when the computer is doing some simulations while I'm AFK).
As that's a technical problem that won't change, I'd still pick the 5770. It's not 13W with two monitors, but still less than other comparable cards. And as I just bumped my head on the desk when switching connectors, I really want that third output.
As soon as phoronix says "drivers are ready", I'm buying.
-
Originally posted by mtippett: For the single-monitor case, you can use very low clocks quite safely. The more monitors you add, the higher the memory clock (and possibly the engine clock) will need to be raised.
So, to quote rohcQaH: are the drivers (now or in the future) intelligent enough to determine whether a monitor is active or not and adjust the clocks as needed (with steps like 50% for two monitors, 75% for three, etc.)? And are there any plans to let the user decide whether he wants lower clocks with problems (on 3dcenter.org they mention flickering) or higher clocks with no problems?
Thank you in advance for an answer.
-
Originally posted by mtippett: The more monitors you add, the higher the memory clock (and possibly the engine clock) will need to be raised. Nothing sinister, but fundamental resource constraints.
-
Originally posted by Hasenpfote: But why not be honest with the customer and say "Our cards need 18W idle when you only use one monitor! Otherwise it's 50W (or 60W or whatever)!"?
ATi releases idle wattage and TDP; those are two interesting baseline values that are useful enough to compare different cards (even against Nvidia's cards, no cheating there). Marketing presentations just won't have huge tables of wattages under different conditions. Very few people would even read them.
-
Originally posted by rohcQaH: Because that isn't the full truth either. Since memory bandwidth has to match output bandwidth, I'd expect a single 2560x1600 monitor (dual-link) to draw more power than your tiny 10" netbook display (if you can fit the 5870 into your netbook - or the other way round).
ATi releases idle wattage and TDP; those are two interesting baseline values that are useful enough to compare different cards (even against Nvidia's cards, no cheating there). Marketing presentations just won't have huge tables of wattages under different conditions. Very few people would even read them.
Well, in the end the difference in power consumption is not a really relevant buying factor if it affects both ATi and Nvidia. But I was quite shocked that this was not discussed earlier (or I just overlooked it).
Last edited by Hasenpfote; 17 October 2009, 01:59 PM.