AMD Radeon HD 5750/5770

  • #61
    For standard content, just use OpenGL as output and amdcccle to force quality mode and vsync on. Only 1080p is interesting.



    • #62
      Hi y'all & Kano,

      OpenGL does get rid of the video tearing, and it works fine on my new build. I think I tried it with my old build, and the hardware was too slow.

      VLC barfs. Xine and SMP are good. But with .mkv, SMP will sometimes shrink the video playback size in 16:9 aspect, which is odd.

      Again, thanx for the heads up. :-)

      Greekgeek. :-)



      • #63
        Well, I often use VLC 1.0.2. You can set it to OpenGL output as well. My favorite player is mplayer, however, as I don't know how to enable something similar to -af volnorm with VLC (Xine has a way to enable that too, just differently). Xine works with OpenGL with an additional override in .xine/config:

        video.output.opengl_renderer:2D_Tex

        It took a while until I found that out; otherwise I had no GUI for VDR. But for 1080p, even fast systems sometimes get out of sync when rendering to OpenGL. Sadly, mplayer has that problem too when the input is not 100% optimal: a/v sync is lost and it does not resync. Windows players do that, and I really want that feature on Linux too; it would fix a lot of problems. It is useless to test 1080p documentaries when you cannot see the speaker to spot those issues, however.
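        For reference, a minimal command-line sketch of the setup described above (file names are placeholders, and the VLC module name is an assumption on my part; the xine key is the one quoted above):

        # mplayer with OpenGL output and volume normalization
        mplayer -vo gl -af volnorm video.mkv
        # VLC 1.0.x, selecting the OpenGL video output module (module name assumed)
        vlc --vout opengl video.mkv
        # add the xine override once to ~/.xine/config
        echo "video.output.opengl_renderer:2D_Tex" >> ~/.xine/config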
        Last edited by Kano; 16 October 2009, 08:28 AM.



        • #64
          Originally posted by rohcQaH View Post
          The ability to connect three monitors is the reason I'd choose this over the 4770. No more crawling under the desk to attach the projector.
          And you know what's REALLY funny:
          If you have more than one monitor connected, the idle power consumption goes from 18 W to 50 W!

          Source (in German):
          News from the fields of computers, hardware, and software, plus professional reviews and graphics-card benchmarks of the latest hardware and consumer electronics.


          Unfortunately this is not ATi-specific. Nvidia has the same problem (the 275 and 295 are named in the article).

          Thank you, graphics vendors, for lying to your advanced customers! Your idle power is NOT the specified idle power!

          And thank you to all the websites doing so-called reviews! You ALL failed!



          • #65
            Originally posted by Hasenpfote View Post
            And you know what's REALLY funny:
            If you have more than one monitor connected, the idle power consumption goes from 18 W to 50 W!

            Source (in German):
            News from the fields of computers, hardware, and software, plus professional reviews and graphics-card benchmarks of the latest hardware and consumer electronics.


            Unfortunately this is not ATi-specific. Nvidia has the same problem (the 275 and 295 are named in the article).

            Thank you, graphics vendors, for lying to your advanced customers! Your idle power is NOT the specified idle power!

            And thank you to all the websites doing so-called reviews! You ALL failed!
            To drive displays you need a certain amount of memory bandwidth, so that the display controllers can read out memory and still leave headroom for 3D and video clients to do initial rendering (until the clocks ramp up). This is immutable.

            For the single-monitor case, you can use very low clocks quite safely. The more monitors you add, the higher the memory clock (and possibly the engine clock) needs to be raised. Nothing sinister, just fundamental resource constraints.
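            As a rough back-of-the-envelope illustration (my own numbers, not from the post, assuming 32-bit scanout at 60 Hz): a single 1920x1080 head needs about 1920 × 1080 × 60 × 4 B ≈ 0.5 GB/s of scanout bandwidth just to keep the screen refreshed, so three such heads already reserve roughly 1.5 GB/s of memory bandwidth before any rendering happens, which is why the memory clock cannot drop as far in a multi-monitor idle state.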

            Regards,

            Matthew



            • #66
              Originally posted by Hasenpfote View Post
              And you know what's REALLY funny:
              If you have more than one monitor connected, the idle power consumption goes from 18 W to 50 W!
              I was expecting it to increase, but that's more than I imagined. Thanks for the link! And thanks for the clarification, mtippett!

              I'd hope the drivers are smart enough to power down when the monitors are off or not connected, though (for example while the computer is running some simulations and I'm AFK).

              As that's a technical problem that won't change, I'd still pick the 5770. It's not 13 W with two monitors, but it's still less than other comparable cards. And since I just bumped my head against the desk while switching connectors, I really want that third output.

              As soon as Phoronix says "the drivers are ready", I'm buying.



              • #67
                Originally posted by mtippett View Post
                For the single-monitor case, you can use very low clocks quite safely. The more monitors you add, the higher the memory clock (and possibly the engine clock) needs to be raised.
                But why not be honest with the customer and say, "Our cards need 18 W at idle when you only use one monitor; otherwise it's 50 W (or 60 W or whatever)"? All the reviews I read about the 5700/5800 series said how great the idle power consumption is (which is true!). Especially with a feature like Eyefinity and prices for a 22" monitor starting at 125 euros, one can assume that people are using more than one monitor (if the girlfriend/wife doesn't have a different opinion). And suddenly it isn't so great anymore, and people ask why their cards are so loud when they were silent in the reviews (in the link they say the fans run at 1400 RPM). Then it's "the reviews are bought by AMD, AMD sucks, never again AMD, buy Nvidia," and so on.

                In the comments on heise.de people are still complaining about driver issues with AMD (on my side I have never had any problems since my 9500 Pro). So some things can become a kind of long-lived urban legend, which can cost customers.

                So, to quote rohcQaH: are the drivers (now or in the future) intelligent enough to determine whether a monitor is active and to adjust the clocks as needed (with steps like 50% for two monitors, 75% for three, etc.)? Are there any plans to let the user decide whether he wants lower clocks and problems (on 3dcenter.org they mention something about flicker) or higher clocks and no problems?

                Thank you in advance for an answer.



                • #68
                  Originally posted by mtippett View Post
                  The more monitors you add, the higher the memory clock (and possibly the engine clock) needs to be raised. Nothing sinister, just fundamental resource constraints.
                  What if it were possible to lower the refresh rate when it's not needed? I assume there are at least some LCDs out there that can run at 24 Hz for 1080p24 and the like. It could be an extra timeout, the same way the DPMS stuff works.
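                  Something like this can already be done by hand with xrandr; a minimal sketch, assuming the monitor and driver actually expose a 24 Hz mode and that the output is called DVI-0 (the output name is a placeholder, check "xrandr -q" for yours):

                  # drop the refresh rate while the machine is idle
                  xrandr --output DVI-0 --mode 1920x1080 --rate 24
                  # restore the normal rate afterwards
                  xrandr --output DVI-0 --mode 1920x1080 --rate 60

                  The automatic timeout part would still need driver or desktop support, as suggested above.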



                  • #69
                    Originally posted by Hasenpfote View Post
                    But why not be honest with the customer and say, "Our cards need 18 W at idle when you only use one monitor; otherwise it's 50 W (or 60 W or whatever)"?
                    Because that isn't the full truth either. Since memory bandwidth has to match output bandwidth, I'd expect a single 2560x1600 monitor (dual-link) to draw more power than your tiny 10" netbook display (if you can fit the 5870 into your netbook, or the other way round).
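                    To put rough numbers on that (my own estimate, assuming 32-bit scanout at 60 Hz): 2560 × 1600 × 60 × 4 B ≈ 0.98 GB/s of scanout bandwidth, versus roughly 1024 × 600 × 60 × 4 B ≈ 0.15 GB/s for a typical 10" netbook panel, so the big panel alone keeps more than six times as much memory bandwidth busy at idle.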

                    ATi releases idle watts and TDP; those are two interesting baseline values that are useful enough to compare different cards (even against Nvidia's cards, no cheating there). Marketing presentations just won't have huge tables of wattages under different conditions. Very few people would even read them.



                    • #70
                      Originally posted by rohcQaH View Post
                      Because that isn't the full truth either. Since memory bandwidth has to match output bandwidth, I'd expect a single 2560x1600 monitor (dual-link) to draw more power than your tiny 10" netbook display (if you can fit the 5870 into your netbook, or the other way round).
                      OK, that is a valid argument.

                      ATi releases idle watts and TDP; those are two interesting baseline values that are useful enough to compare different cards (even against Nvidia's cards, no cheating there). Marketing presentations just won't have huge tables of wattages under different conditions. Very few people would even read them.
                      But marketing can prepare slides showing how much Nvidia currently sucks (http://tinyurl.com/yjzx42s), and AMD does prepare huge tables of technical information. Review sites and advanced users are interested in that technical information. Of course, if this was released in public documents, it's not ATi's fault but that of the reviewing sites and the users.

                      Well, in the end the difference in power consumption is not a really relevant buying factor if it affects both ATi and Nvidia. But I was quite shocked that this was not discussed earlier (or I just overlooked it).
                      Last edited by Hasenpfote; 17 October 2009, 01:59 PM.

