AMD Radeon HD 5750/5770


  • #31
    5770 Practical desktop usage with compiz?

    Any users of any of the 57xx or 58xx cards under linux out there?

    Obviously these cards are cheap, plenty fast, a good deal... especially if you care about noise/power/heat.

    But does HD playback work with compiz? Google Earth? Games?

    I'm just hoping for something that works better than my old cheap 8600 GT.



    • #32
      I'm happy with AMD/ATI performance across the board

      I'm really happy with the AMD/ATI hardware under Linux.
      I plan on getting a 5XXX card in the near future, probably after the first of the year. It's not just the hardware: I find CC works well under Linux, and the driver release cycle has gotten better -- definitely better than nVidia's. As of late I find nVidia has been slipping: their cards are hot, overpriced, and their Linux driver release schedule is poor.

      Windows will have Eyefinity first, but it should be released in the November update. I am really looking forward to that; it's my main reason to upgrade.



      • #33
        Originally posted by jstokes View Post
        I'm really happy with the AMD/ATI hardware under Linux.
        I plan on getting a 5XXX card in the near future, probably after the first of the year. It's not just the hardware: I find CC works well under Linux, and the driver release cycle has gotten better -- definitely better than nVidia's. As of late I find nVidia has been slipping: their cards are hot, overpriced, and their Linux driver release schedule is poor.

        Windows will have Eyefinity first, but it should be released in the November update. I am really looking forward to that; it's my main reason to upgrade.
        So do HD playback, games, Google Earth and the like work with your setup and compiz?



        • #34
          Besides offering performance competitive with -- and in some cases superior to -- its predecessors, the Radeon HD 5700 series also has the advantages of the 40nm fabrication process, dual-link DVI plus DisplayPort and HDMI with Eyefinity support, reduced power consumption, a single 6-pin PCI-E power connector per card, the TeraScale 2 Unified Processing Architecture, PCI Express 2.1, and support for DirectX 11 (if it matters to you).
          Nice summary! Let me comment on it:
          1. What is the advantage of 40nm for me? I don't care about 40nm! They can produce at 130nm if the card is cheap, fast, silent and has low power consumption. So overall: no advantage.
          2. Dual-link DVI: Does it work under Linux without problems alongside DisplayPort and HDMI? Did you run any tests?
          3. Eyefinity: As you already mentioned: not usable under Linux -> no advantage.
          4. Reduced power consumption: Does the reduction of clocks and speeds when idle work under Linux? What is the consumption under Linux? Where are the TESTS(!!!)?
          5. Only a 6-pin power connector! WOOT!!!! FANTASTIC! I always have problems finding a power connector. For the rest, see point 1.
          6. TeraScale 2 Unified Processing Architecture: yeah, right... hmmm. Can it be used under Linux?
          7. PCI Express 2.1: What an advantage... Unfortunately the cards are so slow under Linux, due to bad drivers, that they can't take advantage of the bigger bandwidth. Correct me with TESTS(!!!) if I am wrong.
          8. DirectX 11: Yeah! What a big advantage under Linux!

          So the summary in reality is:
          Nice card, nothing new besides the name and some internal stuff no real-world Linux user would see. If you are a Windows user, check reviews on other sites!

          Congratulations for such a nice and really useful review!



          • #35
            Originally posted by BillBroadley View Post
            So do HD playback, games, Google Earth and the like work with your setup and compiz?
            Well, if you patch xorg-server, you will get wonderful performance.

            I've got an HD 3650 in a laptop.

            - I can play HD movies with Xv without problems.
            - Wine works fine (there was a lot of trouble with Wine when using fglrx < 9.7).
            - kwin (KDE4) works very well.

            The only thing I can put my finger on is the 2D performance compared to the OSS drivers. I use the OSS driver right now with experimental 3D support. It is very stable, and compiz with 2D is like, YEAH!
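            For anyone wanting to reproduce the Xv claim on their own driver, here's a minimal sketch (assuming mplayer and the xvinfo utility are installed; "movie.mkv" is a placeholder file name):

            ```shell
            # Does the X driver expose an Xv adaptor? "no adaptors present" means no Xv.
            xvinfo | head -n 5

            # Force the player to render through Xv; smooth playback here means Xv works.
            mplayer -vo xv movie.mkv
            ```

            If the Xv path fails, -vo x11 is the unaccelerated fallback.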



            • #36
              Originally posted by Hasenpfote View Post
              Nice summary! Let me comment on it:
              1. What is the advantage of 40nm for me? I don't care about 40nm! They can produce at 130nm if the card is cheap, fast, silent and has low power consumption. So overall: no advantage.
              2. Dual-link DVI: Does it work under Linux without problems alongside DisplayPort and HDMI? Did you run any tests?
              3. Eyefinity: As you already mentioned: not usable under Linux -> no advantage.
              4. Reduced power consumption: Does the reduction of clocks and speeds when idle work under Linux? What is the consumption under Linux? Where are the TESTS(!!!)?
              5. Only a 6-pin power connector! WOOT!!!! FANTASTIC! I always have problems finding a power connector. For the rest, see point 1.
              6. TeraScale 2 Unified Processing Architecture: yeah, right... hmmm. Can it be used under Linux?
              7. PCI Express 2.1: What an advantage... Unfortunately the cards are so slow under Linux, due to bad drivers, that they can't take advantage of the bigger bandwidth. Correct me with TESTS(!!!) if I am wrong.
              8. DirectX 11: Yeah! What a big advantage under Linux!

              So the summary in reality is:
              Nice card, nothing new besides the name and some internal stuff no real-world Linux user would see. If you are a Windows user, check reviews on other sites!

              Congratulations for such a nice and really useful review!
              1. This implies point 4.

              2. Never had any problems with DVI or HDMI on Linux (be it fglrx, nvidia or the OSS drivers). I doubt 5xx0 will change this.

              3. FUD. AMD has already showcased X-Plane running on Linux with Eyefinity.

              4. Obviously yes. My X1950 and 4850 show *identical* idle power consumption on Linux and Windows.

              5. Yes, the single 6-pin connector is significant. Other cards in the same performance bracket need more than that.

              6. This is marketing speak.

              7. OpenGL performance is identical on Windows and Linux (they use the same driver, duh). 2D performance is worse on Linux, but quite usable now (unlike, say, 6 months ago).

              8. Shader Model 5.0 is available through OpenGL, so yes this is a big advantage on Linux.

              You have no idea what you are talking about and it shows.



              • #37
                Originally posted by Hasenpfote View Post
                Nice summary! Let me comment on it:
                1. What is the advantage of 40nm for me? I don't care about 40nm! They can produce at 130nm if the card is cheap, fast, silent and has low power consumption. So overall: no advantage.
                Lower-nm production generally comes hand-in-hand with lower power or higher clock speeds (better performance). They can also produce more chips from the same wafer, allowing lower costs. So overall: advantage.

                2. Dual-link DVI: Does it work under Linux without problems alongside DisplayPort and HDMI? Did you run any tests?
                3. Eyefinity: As you already mentioned: not usable under Linux -> no advantage.
                Dunno about the dual-link DVI stuff, but Eyefinity doesn't work under Linux yet - at least not for the consumer. AMD has it running internally, I believe (someone correct me if I'm wrong there). Either way, it will likely run on Linux eventually, so that's an advantage.

                4. Reduced power consumption: Does the reduction of clocks and speeds when idle work under Linux? What is the consumption under Linux? Where are the TESTS(!!!)?
                5. Only a 6-pin power connector! WOOT!!!! FANTASTIC! I always have problems finding a power connector. For the rest, see point 1.
                6. TeraScale 2 Unified Processing Architecture: yeah, right... hmmm. Can it be used under Linux?
                Unified processing architecture will make it better for shaders and OpenCL. So yes, it can be used under Linux.

                7. PCI Express 2.1: What an advantage... Unfortunately the cards are so slow under Linux, due to bad drivers, that they can't take advantage of the bigger bandwidth. Correct me with TESTS(!!!) if I am wrong.
                8. DirectX 11: Yeah! What a big advantage under Linux!

                So the summary in reality is:
                Nice card, nothing new besides the name and some internal stuff no real-world Linux user would see. If you are a Windows user, check reviews on other sites!

                Congratulations for such a nice and really useful review!
                Some of the DX11 features are also available as OpenGL extensions, and you can bet it will support all of OpenGL 3.2 and the associated GLSL version.



                • #38
                  Originally posted by Hasenpfote View Post
                  3. Eyefinity: As you already mentioned: not usable under Linux -> no advantage.
                  What is usable on Linux is the number of connectors on the back. The ability to connect three monitors is the reason I'd choose this over the 4770. No more crawling under the desk to attach the projector.
                  As others said, the full Eyefinity stuff will work too, eventually. But since the Eyefinity version of the hardware isn't out yet, that'll have to wait (on both Windows and Linux).
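                  Driving those extra connectors is plain RandR work, no Eyefinity required. A sketch of a three-display layout (the output names DVI-0, DVI-1 and DP-0 are placeholders; run plain xrandr first to see what your driver actually calls them):

                  ```shell
                  xrandr                                         # list connected outputs and their modes
                  xrandr --output DVI-0 --auto \
                         --output DVI-1 --auto --right-of DVI-0 \
                         --output DP-0  --auto --right-of DVI-1  # three displays side by side
                  ```

                  How many outputs can be driven simultaneously still depends on the driver, so treat this as the shape of the command rather than a guarantee.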

                  Originally posted by Hasenpfote View Post
                  4. Reduced power consumption: Does the reduction of clocks and speeds when idle work under Linux?
                  The review didn't say anything about power management, just power consumption. That's a hardware feature and a valid comparison to the other tested cards, which utilize the same driver and thus the same power management code.

                  Originally posted by Hasenpfote View Post
                  Congratulations for such a nice and really useful review!
                  Congratulations for such a nice and really useful review of the nice and useful review.

                  For the record, the performance comparison on linux is something you don't see on any other site, and the results are useful to anyone who considers buying a new ATI card in the near future.



                  • #39
                    Originally posted by BlackStar View Post
                    1. This implies point 4.
                    No! As I wrote, I don't care about the fabrication process as long as the card is fast and consumes little power! I can remember cases where a new fabrication process did not bring any advantage in power consumption. Or did you ever see an AMD Athlon X2 with 35W at 65nm? I only own a 90nm one with 35W! And you can only buy 45W CPUs!

                    2. Never had any problems with DVI or HDMI on Linux (be it fglrx, nvidia or the OSS drivers). I doubt 5xx0 will change this.
                    Nice. Me neither. But I don't have a DisplayPort monitor. But I would assume a __REVIEW__ site has one and could do some TESTS with all kinds of combinations. Did they do that in the review?

                    3. FUD. AMD has already showcased X-Plane running on Linux with Eyefinity.
                    Yeah. And for consumers?

                    4. Obviously yes. My X1950 and 4850 show *identical* idle power consumption on Linux and Windows.
                    Obviously? Well, obviously a new card is faster than an old card. So why do a review at all?

                    5. Yes, the single 6-pin connector is significant. Other cards in the same performance bracket need more than that.
                    And other cards don't need it, but only use it for safety or to reduce the load on a single line from the power supply. But yeah, you are right! This is a HUGE advantage over other cards.

                    6. This is marketing speak.
                    So I count this as an advantage and a big point FOR the 57x0 series.

                    7. OpenGL performance is identical on Windows and Linux (they use the same driver, duh). 2D performance is worse on Linux, but quite usable now (unlike, say, 6 months ago).
                    I don't believe this, but as long as I don't have proof to the contrary, you are right.

                    8. Shader Model 5.0 is available through OpenGL, so yes this is a big advantage on Linux.
                    Which games under Linux use Shader Model 5? Which games under Windows use DirectX 11?

                    You have no idea what you are talking about and it shows.
                    Well... you may be right. But then don't call this a review, but a benchmark marathon without deeper investigation!

                    Originally posted by mirv
                    Lower-nm production generally comes hand-in-hand with lower power or higher clock speeds (better performance). They can also produce more chips from the same wafer, allowing lower costs. So overall: advantage.
                    Well, lower power is wrong! (Athlon X2 EE SFF: 35W at 90nm, 45W at 65nm!) The fact that ATI can cut more chips out of a wafer is not an advantage for the customer.
                    Unified processing architecture will make it better for shaders and OpenCL. So yes, it can be used under Linux. ... Some of the DX11 features are also available as OpenGL extensions, and you can bet it will support all of OpenGL 3.2 and the associated GLSL version.
                    Is it usable NOW? If not, don't name it as an advantage but as a future feature or something like that. I bought a 3870 2.5 years ago with the advantage of having open source drivers. Well, by the time the drivers are there, the card is old and I can buy a new one. So IMO future features don't count!

                    Originally posted by rohcQaH
                    What is usable on linux is the amount of connectors on the back. The ability to connect three monitors is the reason I'd choose this over the 4770. No more crawling under the desk to attach the projector.
                    Does it work? Normally this should be __tested__ in a so-called review! Not just copied from marketing slides!
                    The review didn't say anything about power management, just power consumption. That's a hardware feature and a valid comparison to the other tested cards, which utilize the same driver and thus the same power management code.
                    Sorry, but it seems I missed the table comparing the power consumption of all the tested cards. Yeah, I think it's on page 9 or 10, which are missing, btw...
                    Congratulations for such a nice and really useful review of the nice and useful review.
                    You're all welcome
                    For the record, the performance comparison on linux is something you don't see on any other site, and the results are useful to anyone who considers buying a new ATI card in the near future.
                    For the record: it is only useful if you already own an ATI card. I missed some cards from Nvidia. But fortunately, Nvidia plays no significant role in the graphics market.



                    • #40
                      Shader Model 5.0 and DX11 are new - who seriously expects a great range of games to use them just yet? Battleforge, btw, does use something from DX11 (dxcl I think), and yes, the new Radeon cards shine when that's used (only in high-detail settings, if I remember rightly).
                      Low power generally comes from a smaller fabrication process - but they can also pack more features in (which, in turn, may increase power usage). You really can't expect them to just make the same thing at a lower nm - there's no point; it doesn't move the tech forward.
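                      As a back-of-the-envelope illustration of the wafer argument (a sketch only - it ignores yield, pad area and design changes): die area scales with the square of the linear shrink, so 55nm -> 40nm roughly halves the die.

                      ```shell
                      # Relative die area after a 55nm -> 40nm linear shrink, and how many
                      # more dies that fits on the same wafer (first-order estimate only).
                      awk 'BEGIN {
                          ratio = (40/55) * (40/55)   # area scales with the square of the shrink
                          printf "relative die area: %.2f\n", ratio
                          printf "dies per wafer:    %.1fx\n", 1/ratio
                      }'
                      ```

                      That ~1.9x figure is why a shrink can lower cost per chip even before any redesign.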
                      OpenGL 3.2 is probably already supported by the latest Catalyst drivers - I don't know, I don't use them yet. All the extensions are in place already, however, so you can program with it now.
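                      Easy enough to check rather than guess - assuming the glxinfo utility (mesa-utils) is installed and an X session is running:

                      ```shell
                      glxinfo | grep "OpenGL version string"   # GL version the installed driver reports
                      glxinfo | grep -c "GL_ARB_"              # rough count of exposed ARB extensions
                      ```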
                      Unified, as far as I understood it, means the same calculation unit can be used for vertex or pixel (or geometry) shaders. That means it should be able to balance loads better, or re-use calc units for later pipeline stages - but that's more a hardware thing and is already exploited simply by using the card.



                      • #41
                        Originally posted by Hasenpfote View Post
                        Nice. Me neither. But I don't have a DisplayPort monitor.
                        So it just happens that Michael doesn't have a DisplayPort monitor, either. Do you expect him to spend big money just to verify an advertised feature?
                        I've read quite a few Windows reviews of the card; not a single one actually mentioned whether DisplayPort works or not. Why are your standards for a small site like Phoronix significantly higher?

                        Originally posted by Hasenpfote View Post
                        Which games under Linux use Shader Model 5? Which games under Windows use DirectX 11?
                        So what if you personally don't care about SM5 right now? Does that mean that nobody else is allowed to care, either? Does that mean that a review mustn't mention the card's capabilities?
                        Developers care about SM5, they want cards that can run their newest code. It's a real advantage to a graphics developer, right now.
                        If you really need to complain about little semantic details: DX11 support is not a "future" thing; the card actually supports DX11 today. It doesn't matter to you today, but this isn't an article about DX11 - it's about the GPU.
                        In this case the feature was even qualified with the phrase "if it matters to you". Duh.

                        Originally posted by Hasenpfote View Post
                        The fact that ATI can cut more chips out of a wafer is not an advantage for the customer.
                        40nm is not a direct advantage for the customer. But the results of 40nm are: lower power consumption (and fewer cooling problems), smaller cards and cheaper production. You get more GPU power for less money.
                        Sure, you can say "But a GPU at 55nm with equal features is exactly as good". Right. But you're missing the point that an equal GPU at 55nm simply isn't possible.

                        Originally posted by Hasenpfote View Post
                        For the record: it is only useful if you already own an ATI card. I missed some cards from Nvidia. But fortunately, Nvidia plays no significant role in the graphics market.
                        Fun fact: I own a nvidia card, and I still think the review was useful.

                        From previous reviews I know how different cards compare (including nVidia cards). I have a basic understanding of the speed of an HD 4870. With this comparison, I now have a basic understanding of the speed of the HD 5770. That's all I wanted to know.


                        If you want to know more, you could try to educate yourself by reading more reviews. You could also try asking nicely, that usually gets you an answer. Or you can just make some unrealistic demands and then complain that they weren't met. Which is obviously way more fun.



                        • #42
                          Originally posted by rohcQaH View Post
                          If you want to know more, you could try to educate yourself by reading more reviews. You could also try asking nicely, that usually gets you an answer. Or you can just make some unrealistic demands and then complain that they weren't met. Which is obviously way more fun.
                          I was only criticizing (one could say in an unfriendly way) that the advantages from the summary are mostly not tested in the review.
                          The whole review was just benchmarking. When I read the review, I know the card is as fast as a 4870. Nothing more. For the rest, I have to trust the advertising or other reviews, which are mostly done on Windows, and have to trust that the behaviour on Windows and Linux is the same.

                          So a company gets a Phoronix Award for bringing out new midrange hardware which is as fast as last generation's performance hardware. Ok. So please give AMD and Nvidia awards for their previous generations of hardware, too (just kidding)

                          PS: And of course it's fun to make AMD fanboys upset (Kano is becoming too friendly...)



                          • #43
                            Originally posted by jstokes View Post
                            As of late I find nVidia has been slipping: their cards are hot, overpriced, and their Linux driver release schedule is poor.
                            Come on, they released a new version every week.



                            • #44
                              Originally posted by Apopas View Post
                              Isn't opengl performance good enough?
                              Actually... it's important to have good Xv support, because many video playback apps use Xv instead of the alternatives, since it works right (and better than the other options...) on other devices/drivers. Kano's right in his gripe: it DOES need to be fixed, however it's arrived at on that one.



                              • #45
                                The biggest question I would have is whether or not fglrx supports the OpenGL ARB extensions covering the bulk of DirectX 10/11 functionality under OpenGL... I know that there were already extensions proposed by the ARB at the time DirectX 10 came out...

