AMD Radeon HD 5750/5770


  • #41
    Originally posted by Hasenpfote View Post
    Nice. Me neither. But I don't have a DisplayPort monitor.
    So it just happens that Michael doesn't have a DisplayPort monitor, either. Do you expect him to spend big money just to verify an advertised feature?
    I've read quite a few Windows reviews of the card, and not a single one mentioned whether DisplayPort actually works. Why are your standards for a small site like Phoronix so much higher?

    Originally posted by Hasenpfote View Post
    Which games under Linux use Shader Model 5? Which games under Windows use DirectX 11?
    So what if you personally don't care about SM5 right now? Does that mean that nobody else is allowed to care, either? Does that mean that a review mustn't mention the card's capabilities?
    Developers care about SM5; they want cards that can run their newest code. It's a real advantage to a graphics developer right now.
    If you really need to complain about little semantic details: DX11 support is not a "future" thing; the card actually supports DX11 today. It may not matter to you today, but this isn't an article about DX11, it's about the GPU.
    In this case the feature was even qualified with the phrase "if it matters to you". Duh.

    Originally posted by Hasenpfote View Post
    The fact that ATi can cut more out of a wafer is not an advantage for the customer.
    40nm is not a direct advantage for the customer, but its results are: lower power consumption (and fewer cooling problems), smaller cards, and cheaper production. You get more GPU power for less money.
    Sure, you can say "But a 55nm GPU with equal features is exactly as good". Right. But you're missing the point that an equal GPU at 55nm simply isn't possible.

    Originally posted by Hasenpfote View Post
    For the record: it is only useful if he already owned an ATi card. I missed some cards from Nvidia. But fortunately, Nvidia plays no significant role in the graphics market.
    Fun fact: I own an Nvidia card, and I still think the review was useful.

    From previous reviews I know how different cards compare (including nVidia cards). I have a basic understanding of the speed of an HD4870. With this comparison, I now have a basic understanding of the speed of the HD5770. That's all I wanted to know.


    If you want to know more, you could try to educate yourself by reading more reviews. You could also try asking nicely; that usually gets you an answer. Or you can just make some unrealistic demands and then complain that they weren't met. Which is obviously way more fun.

    Comment


    • #42
      Originally posted by rohcQaH View Post
      If you want to know more, you could try to educate yourself by reading more reviews. You could also try asking nicely; that usually gets you an answer. Or you can just make some unrealistic demands and then complain that they weren't met. Which is obviously way more fun.
      I was only criticizing (one could say in an unfriendly way) that the advantages from the summary are mostly not tested in the review.
      The whole review was just benchmarking. After reading it, I know the card is as fast as a 4870. Nothing more. For the rest, I have to trust the advertising or other reviews, which are mostly done on Windows, and trust that the behaviour on Windows and Linux is the same.

      So a company gets a Phoronix Award for bringing out new midrange hardware that is as fast as last-generation performance hardware. OK. So please give AMD and Nvidia awards for their previous generations of hardware, too (just kidding).

      PS: And of course it's fun to make AMD fanboys upset (Kano is becoming too friendly...)

      Comment


      • #43
        Originally posted by jstokes View Post
        As of late I find that nVidia has been slipping: their cards are hot and overpriced, and their Linux driver release schedule is poor.
        Come on, they released a new version every week.

        Comment


        • #44
          Originally posted by Apopas View Post
          Isn't OpenGL performance good enough?
          Actually... it's important to have good Xv support because many video playback apps use Xv instead of the alternatives; it works right (and better than the other options) on other devices/drivers. Kano's right in his gripe: it DOES need to be fixed, however that's arrived at.
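
          For illustration, here's a minimal C sketch of what "using Xv" means from the application side; this is my own example, not from any post, and the file name and build line are assumptions (needs the libX11 and libXv development headers):

          /* xv_check.c - list the Xv (XVideo) adaptors the driver exposes.
           * Hypothetical example; build: gcc xv_check.c -lX11 -lXv -o xv_check */
          #include <stdio.h>
          #include <X11/Xlib.h>
          #include <X11/extensions/Xvlib.h>

          int main(void)
          {
              Display *dpy = XOpenDisplay(NULL);
              if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

              unsigned int n = 0;
              XvAdaptorInfo *ad = NULL;
              /* A player falls back to slower output paths when this fails
               * or returns zero adaptors. */
              if (XvQueryAdaptors(dpy, DefaultRootWindow(dpy), &n, &ad) != Success) {
                  fprintf(stderr, "Xv extension not available\n");
                  XCloseDisplay(dpy);
                  return 1;
              }

              printf("%u Xv adaptor(s)\n", n);
              for (unsigned int i = 0; i < n; i++)
                  printf("  %s (%lu port(s))\n", ad[i].name, ad[i].num_ports);

              XvFreeAdaptorInfo(ad);
              XCloseDisplay(dpy);
              return 0;
          }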

          Comment


          • #45
            The biggest question I would have is whether fglrx supports the ARB extensions on top of OpenGL 2.1 that provide the bulk of DirectX 10/11 functionality under OpenGL... I know the ARB had already proposed extensions by the time DirectX 10 came out...
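
            One way to answer that directly on a given fglrx install is to create a context and inspect what the driver reports. A minimal sketch, assuming freeglut is available; GL_ARB_geometry_shader4 is just one example of a DX10-level extension to probe for:

            /* gl_caps.c - print GL version and probe for one extension.
             * Hypothetical example; build: gcc gl_caps.c -lglut -lGL -o gl_caps */
            #include <stdio.h>
            #include <string.h>
            #include <GL/glut.h>

            int main(int argc, char **argv)
            {
                glutInit(&argc, argv);
                glutCreateWindow("caps");  /* glGetString needs a current context */

                printf("GL_VERSION : %s\n", glGetString(GL_VERSION));
                printf("GL_RENDERER: %s\n", glGetString(GL_RENDERER));

                const char *ext = (const char *)glGetString(GL_EXTENSIONS);
                const char *wanted = "GL_ARB_geometry_shader4"; /* example DX10-level feature */
                /* note: strstr is a rough check; it can match name prefixes */
                printf("%s: %s\n", wanted,
                       (ext && strstr(ext, wanted)) ? "present" : "missing");
                return 0;
            }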

            Comment


            • #46
              Originally posted by Svartalf View Post
              The biggest question I would have is whether fglrx supports the ARB extensions on top of OpenGL 2.1 that provide the bulk of DirectX 10/11 functionality under OpenGL... I know the ARB had already proposed extensions by the time DirectX 10 came out...
              Fglrx supports 3.1 + most of the 3.2 spec (full support is bound to come soon). This covers ~99% of the DX10 functionality.

              Some DX11 features are already supported via extensions (and have been for a long time; tessellation comes to mind). However, there's a lot of new stuff in DX11 that will take time to be formalized into OpenGL extensions or core features. My guess is that we will see a new OpenGL 3.3 release around Q1 or Q2 2010 that brings OpenGL into rough feature parity with DX11.

              Most DX11-level features should become available as extensions before that; the issue is that Nvidia does not have DX11 hardware yet, so they have no incentive to create new OpenGL extensions for those features right now. The actual 3.3 spec will likely be delayed until Nvidia has hardware ready for release (sad but true).
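              To illustrate how an application gates on this today, here's a sketch of checking the reported context version and AMD's pre-DX11 tessellation extension (GL_AMD_vertex_shader_tessellator); my own example, with the same freeglut assumption and caveats as the snippet above:

              /* gl_gate.c - pick a feature path from GL version + extensions.
               * Hypothetical example; build: gcc gl_gate.c -lglut -lGL -o gl_gate */
              #include <stdio.h>
              #include <string.h>
              #include <GL/glut.h>

              #ifndef GL_MAJOR_VERSION      /* these enums arrived with GL 3.0 */
              #define GL_MAJOR_VERSION 0x821B
              #define GL_MINOR_VERSION 0x821C
              #endif

              int main(int argc, char **argv)
              {
                  GLint major = 0, minor = 0;
                  const char *ext;

                  glutInit(&argc, argv);
                  glutCreateWindow("gate");

                  /* on pre-3.0 drivers this query errors out and leaves 0.0 behind */
                  glGetIntegerv(GL_MAJOR_VERSION, &major);
                  glGetIntegerv(GL_MINOR_VERSION, &minor);
                  printf("context: %d.%d\n", major, minor);

                  ext = (const char *)glGetString(GL_EXTENSIONS);
                  if (ext && strstr(ext, "GL_AMD_vertex_shader_tessellator"))
                      printf("tessellation path: AMD extension\n");
                  else
                      printf("tessellation path: none (wait for a core feature)\n");
                  return 0;
              }
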

              I was only criticizing (one could say in an unfriendly way) that the advantages from the summary are mostly not tested in the review.
              The whole review was just benchmarking. After reading it, I know the card is as fast as a 4870. Nothing more. For the rest, I have to trust the advertising or other reviews, which are mostly done on Windows, and trust that the behaviour on Windows and Linux is the same.
              Well, starting with Catalyst 9.x, the behavior between Windows and Linux pretty much *is* the same (2d and video performance excepted, but Michael hinted that the latter may finally be resolved soon). How do I know? I have tested Ati cards on both operating systems: 3d performance, power consumption and OpenGL driver bugs are practically identical.

              4. Obviously yes. My X1950 and 4850 show *identical* idle power consumption on Linux and Windows.
              Obviously? Well, obviously a new card is faster than an old card. So why do a review at all?
              Do you have any indication that power consumption may differ between OSes on the 5xx0? Because all the indications *I* have are that it will be identical.

              5. Yes, the single 6-pin connector is significant. Other cards in the same performance bracket need more than that.
              And other cards don't need it but only use it for safety, or to reduce the load on one power supply line. But yeah, you are right! This is a HUGE advantage over other cards.
              Are you really implying that the 4870 has 2x 6-pin connectors just for fun? The GT260 also has 2x 6-pin connectors. Is that just for fun too?

              The fact that the 5770 gets you the same or better performance with a single connector indicates a HUGE advantage in power consumption over both cards - which most reviews confirm.

              You can twist the facts all you like, but the numbers speak for themselves. The 5770 *is* more power efficient and the single 6-pin connector is the direct consequence of that.
              Last edited by BlackStar; 14 October 2009, 10:35 AM.

              Comment


              • #47
                Direct3D 11 brings compute shader support. While the 48xx can already work with OpenCL, the 5xxx will probably do better.

                In the Gallium3D era the extra shader flexibility will probably be even *more* useful under Linux. I think we'll see very interesting applications in a year or two...
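
                To make "work with OpenCL" concrete: the first thing a compute application does is enumerate what the runtime exposes. A minimal sketch against the OpenCL 1.0 C API; my own example, and the build line assumes a vendor ICD that ships libOpenCL (e.g. AMD's Stream SDK):

                /* cl_devices.c - list OpenCL platforms and their GPU devices.
                 * Hypothetical example; build: gcc cl_devices.c -lOpenCL -o cl_devices */
                #include <stdio.h>
                #include <CL/cl.h>

                int main(void)
                {
                    cl_platform_id platforms[8];
                    cl_uint num_platforms = 0;
                    if (clGetPlatformIDs(8, platforms, &num_platforms) != CL_SUCCESS) {
                        fprintf(stderr, "no OpenCL runtime found\n");
                        return 1;
                    }

                    for (cl_uint p = 0; p < num_platforms; p++) {
                        char name[256];
                        clGetPlatformInfo(platforms[p], CL_PLATFORM_NAME,
                                          sizeof(name), name, NULL);
                        printf("platform: %s\n", name);

                        cl_device_id devices[8];
                        cl_uint num_devices = 0;
                        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU,
                                           8, devices, &num_devices) != CL_SUCCESS)
                            continue;

                        for (cl_uint d = 0; d < num_devices; d++) {
                            char dev[256], ver[64];
                            clGetDeviceInfo(devices[d], CL_DEVICE_NAME,
                                            sizeof(dev), dev, NULL);
                            clGetDeviceInfo(devices[d], CL_DEVICE_VERSION,
                                            sizeof(ver), ver, NULL);
                            printf("  GPU: %s (%s)\n", dev, ver);
                        }
                    }
                    return 0;
                }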

                Comment


                • #48
                  So has any documentation been released? Is there any start on the open-source driver? It sounds like nice hardware (minus the fan and extra power connector), but it seems like it will be 2 or 3 years before it has a stable open-source driver. I am still waiting for r600 and r700 to become stable enough that I would buy.

                  matt

                  Comment


                  • #49
                    Originally posted by Chad Page View Post
                    Direct3D 11 brings compute shader support. While the 48xx can already work with OpenCL, the 5xxx will probably do better.

                    In the Gallium3D era the extra shader flexibility will probably be even *more* useful under Linux. I think we'll see very interesting applications in a year or two...
                    I believe the r700 cards support DirectCompute 4.1 and OpenCL 1.0, while the r800 cards will give full DirectCompute 5 and a to-be-created OpenCL 1.1 version.

                    Comment


                    • #50
                      The 5xxx is close enough to the 4xxx that it won't be too far behind, especially since AMD people are working on the open-source drivers.

                      I figure it'll be a few weeks or months until the documentation's released, though. Legal can be odd.

                      Comment
