AMD Radeon HD 5750/5770


  • I'm willing to consider an ATI card when I upgrade my current 7950 GT. I plan to stick my Nvidia card into a future budget build but in the meantime it can be my backup if I have to wait for future improvements in the drivers.

    I have a question, though it's not too technical. I am comparing the new Evergreen HD 5770 cards with the Radeon HD 4890 (R700/RV790XT?) cards. The older 4890 seems to be the performance leader in that price category (<$200). The only negatives I see are the lack of DirectX 11 support and the higher power draw (wattage). I think that card draws around 190W, comparable to its Nvidia competitor, the GTX 260, but in benchmarks it is usually ahead of that card in most games/categories at all resolutions.

    Which ATI card would you choose, then, for sub-$200? The older one or the newer? From my reading, the older card exceeds the newer one in performance, but also in power consumption.

    Comments?

    It does sound like good progress with support.
    Last edited by Panix; 12-16-2009, 07:27 AM.

    Comment


    • Originally posted by Panix View Post
      I'm willing to consider an ATI card when I upgrade my current 7950 GT. I plan to stick my Nvidia card into a future budget build but in the meantime it can be my backup if I have to wait for future improvements in the drivers.

      I have a question, though it's not too technical. I am comparing the new Evergreen HD 5770 cards with the Radeon HD 4890 (R700/RV790XT?) cards. The older 4890 seems to be the performance leader in that price category (<$200). The only negatives I see are the lack of DirectX 11 support and the higher power draw (wattage). I think that card draws around 190W, comparable to its Nvidia competitor, the GTX 260, but in benchmarks it is usually ahead of that card in most games/categories at all resolutions.

      Which ATI card would you choose, then, for sub-$200? The older one or the newer? From my reading, the older card exceeds the newer one in performance, but also in power consumption.

      Comments?

      It does sound like good progress with support.
      Personally, I'd go with the 5770, mainly because I'm a developer and I value the newer capabilities more than raw performance. I also value acoustics a lot (I've modded my 4850 to remove its fan) and the 5770 surpasses both competitors on that front.

      Finally, the 5770 can run everything at max quality settings at 1680x1050, so it's not exactly strapped for performance. Even 1920x1200 is doable without AA and maybe some lowered quality settings on the latest games.

      Comment


      • Yes, the 4890 is faster in many situations. On Linux, DX11 isn't worth the shiny sticker they put on the box. And the 4xxx series is already supported by the open-source drivers, while support for the 5xxx series seems to be low priority for now: although they never officially said it, they seem to be focusing on getting 4xxx as bug-free as possible before moving on. Even fglrx doesn't officially support 57xx yet (but should with the next release sometime this month).

        Then again, you mentioned power consumption, which correlates with heat dissipation, noise and higher power bills (an extra 80W over 3 years at 5h/day at €0.22 per kWh comes to roughly €97; a quick sketch of the maths is at the end of this post). Whether DX11 will matter to you during the next 3 years is anyone's guess. It probably will if you're dual-booting to Windows for games. Otherwise, probably not.


        Which card would I choose? I ordered a 5770 a few days ago. If I had chosen a 4xxx chip, it would have been a 4770, because quiet operation and room temperatures below 40°C in the summer are more important to me than ultra-high-detail settings. The 5770 won simply because I can connect three displays instead of two.
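
        For what it's worth, a minimal sketch of that power-cost estimate (the 80W delta, 5h/day usage and €0.22/kWh price are the assumptions above, not measured values):

        ```python
        # Back-of-the-envelope cost of an extra 80 W of GPU power draw.
        # All inputs are the assumptions from the paragraph above, not measurements.
        extra_watts = 80        # additional draw vs. a more frugal card
        hours_per_day = 5
        years = 3
        price_per_kwh = 0.22    # EUR

        extra_kwh = extra_watts / 1000 * hours_per_day * 365 * years
        cost = extra_kwh * price_per_kwh
        print(f"{extra_kwh:.0f} kWh over {years} years -> ~{cost:.0f} EUR")
        # prints: 438 kWh over 3 years -> ~96 EUR
        ```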

        Comment


        • I just checked the video card benchmarks at the Passmark site. There are a bunch of cards ahead of the Radeon 5770, including the HD 4890 and even the GeForce GTX 260. Considering the 5770 is almost the same price as both cards, it really boils down to whether you prefer performance for your dollar or the energy/temperature benefits. Tough call, imho. The higher temps and power are probably only a factor if you max all settings and are running a demanding game or application.

          I doubt I'd need DirectX 11. I am not a gamer, and although I'd probably want to get a few games if I buy such a high-level card, I think I'd be satisfied with the games that are out now. My monitor is 22", although I have an extra 20" LCD, too. That gives some options, right? Both resolutions are 1680x1050. I wouldn't be going past that.

          I don't know much about games but if I consider getting a few, these are probably ones I'd look at:
          WOW (to try only)
          Call of Duty
          NHL Hockey (NHL 10?)
          Need for Speed and/or Grand Theft Auto
          Flight Simulator

          As you can tell, I have no preference for or lean towards any particular type. But the $200 price is probably quite fixed. I like the performance specs of the Radeon 5850, but the price is too high to justify. That seems to be strictly a gamer's card! I'd only be playing games occasionally, but it's nice to have the capability of good performance.

          Which is the better card for the price/performance?

          I can wait for driver maturity as long as I can watch movies/videos with the card without problems. I need clear, watchable video.

          Comment


          • I can wait for driver maturity as long as I can watch movies/videos with the card without problems. I need clear, watchable video.
            I have an Nvidia 9500GT and an ATI 4850. Both play videos fine as long as you don't use Compiz. Turn on Compiz and you lose vsync on videos (which may or may not be an issue for you - I keep Compiz disabled on my Nvidia HTPC because of that).

            Nvidia cards can decode HD videos - in theory at least, because I've yet to find a stable program to do that for me (MPlayer crashes like crazy, XBMC loses sync all the time...). I haven't tried to enable xvba on my 4850, but I've been able to watch up to 1080p on my 2.66GHz Core 2, so it's not that big a deal to me.

            In short, don't set your expectations too high; there's no perfect solution at this time. If you can deal with "good enough", both Nvidia and ATI cards are there.
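
            For anyone who wants to reproduce the MPlayer attempts, a minimal sketch (wrapped in Python; it assumes an MPlayer build with VDPAU support, and ffh264vdpau/ffmpeg12vdpau are MPlayer's VDPAU decoders for H.264 and MPEG-1/2):

            ```python
            import subprocess

            # Try hardware (VDPAU) decoding first; if MPlayer exits with an error
            # (crash, no VDPAU support, unsupported codec), fall back to plain
            # software decoding. The trailing comma after the -vc list lets
            # MPlayer fall back to its other codecs for everything else.
            def play(path):
                hw = ["mplayer", "-vo", "vdpau", "-vc", "ffh264vdpau,ffmpeg12vdpau,", path]
                if subprocess.call(hw) != 0:
                    subprocess.call(["mplayer", path])

            play("/path/to/some-1080p-clip.mkv")  # placeholder path
            ```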

            Comment


            • About DX11 being useless on Linux: you are not thinking far enough ahead. These capabilities directly affect OpenCL performance and OpenGL capabilities (for future OpenGL versions).

              Comment


              • For me, MPlayer decodes 1080p videos just fine with my 8500GT at 20% CPU.
                Even Xine with xv doesn't use over 70% CPU for Big Buck Bunny.

                Comment


                • Originally posted by Apopas View Post
                  For me, MPlayer decodes 1080p videos just fine with my 8500GT at 20% CPU.
                  Even Xine with xv doesn't use over 70% CPU for Big Buck Bunny.
                  What CPU do you have? For me, XBMC decodes 1080p with <10% CPU, but MPlayer crashes very very quickly (does it try to use CPU opcodes that I don't have? No idea). I use an Athlon64 3200+ Venice core (SSE2-capable).

                  Comment


                  • Originally posted by BlackStar View Post
                    Nvidia cards can decode HD videos - in theory at least, because I've yet to find a stable program to do that for me (MPlayer crashes like crazy, XBMC loses sync all the time...)
                    I have been running XBMC with vdpau for a long time now, with multiple systems and vdpau cards, and haven't had any sync issues at all across many types of video formats. This sounds more like PulseAudio resampling the stream and causing sync to go out of whack.

                    Comment


                    • I have a similar CPU to yours, an Athlon64 3000+. As I mentioned before, Big Buck Bunny runs at 20% CPU maximum with vdpau and my crappy VGA card. No crashes or anything.
                      Never tried XBMC or any other similar app, though.

                      Comment


                      • Originally posted by deanjo View Post
                        I have been running XBMC with vdpau for a long time now, with multiple systems and vdpau cards, and haven't had any sync issues at all across many types of video formats. This sounds more like PulseAudio resampling the stream and causing sync to go out of whack.
                        Why would Pulse resample a 48kHz stream when the sink is also 48kHz? (S/PDIF out)

                        To me this looks more like a network buffering issue, since XBMC sometimes decides to stop and buffer some more, fixing the issue. (Unfortunately, this usually happens only after 15-30 minutes of bad playback). MPlayer is better in this regard, but it crashes a lot so that's no solution. Totem works, but HD video causes 100% CPU usage so that's not an option either.

                        Comment


                        • Originally posted by BlackStar View Post
                          I have an Nvidia 9500GT and an ATI 4850. Both play videos fine as long as you don't use Compiz. Turn on Compiz and you lose vsync on videos (which may or may not be an issue for you - I keep Compiz disabled on my Nvidia HTPC because of that).

                          Nvidia cards can decode HD videos - in theory at least, because I've yet to find a stable program to do that for me (MPlayer crashes like crazy, XBMC loses sync all the time...). I haven't tried to enable xvba on my 4850, but I've been able to watch up to 1080p on my 2.66GHz Core 2, so it's not that big a deal to me.

                          In short, don't set your expectations too high; there's no perfect solution at this time. If you can deal with "good enough", both Nvidia and ATI cards are there.
                          Oh, I won't set them too high, as long as I see dedication to working on it and don't have to worry about dropped support.

                          If you disable Compiz, does that mean all 3D capability is disabled, too?

                          I guess it's tolerable as long as there is some option to get clear video, whether that's using one driver instead of another or modifying a setting.

                          I only need 720p for now but I guess I should expect 1080p capability for a $200 card.

                          I was comparing the GTX 260, since Nvidia is said to still be good enough and driver updates seem quick, against the ATI 5770/4890, since they are very close in price and of course have the open-source potential; plus, if fglrx development has some dedication, there is more than one option, which should, in theory, be good.

                          I like low temps/power consumption too, but at the expense of performance? Well, maybe; it depends. Video playing without issues is probably *my* priority overall if the choices are comparable or too close to call.

                          Comment


                          • Originally posted by BlackStar View Post
                            About DX11 being useless on Linux: you are not thinking far enough ahead. These capabilities directly affect OpenCL performance and OpenGL capabilities (for future OpenGL versions).
                            Well, one idea might be to get a cheaper DX10 card and then upgrade again later. Video cards are still very much in demand, so it's probably easy to sell a recent one or recycle it into a '2nd system.' I thought of that, but I'm not sure I'll be building a 2nd system any time soon, and if I do, it will be a budget system that probably won't need a high-end card.

                            My computer is a quad-core Q6600 on a P35 mobo, so still on the LGA 775 socket. I'm not bottlenecking the video card at all, right? It should still be good hardware for a while, eh?

                            Oh yeah, the 4890 and 5770 are both around 9.5" long, right? That is also good, since the case I upgraded to is an Antec 300 (from an Antec Solo). I understand an Nvidia GeForce GTX 260 will also fit, but it needs the drive placement organized.
                            Last edited by Panix; 12-16-2009, 01:15 PM.

                            Comment


                            • Originally posted by BlackStar View Post
                              Why would Pulse resample a 48kHz stream when the sink is also 48kHz? (S/PDIF out)
                              Did you change the default PulseAudio sample rate in the /pulse/daemon.conf? By default, Pulse resamples to 44.1kHz IIRC.

                              Comment


                              • http://linux.die.net/man/5/pulse-daemon.conf

                                Default Device Settings

                                Most drivers try to open the audio device with these settings and then fall back to lower settings. The default settings are CD quality: 16bit native endian, 2 channels, 44100 Hz sampling.

                                default-sample-format= The default sampling format. Specify one of u8, s16le, s16be, float32le, float32be, ulaw, alaw. Depending on the endianness of the CPU, the formats s16ne, s16re, float32ne, float32re (for native, resp. reverse endian) are available as aliases.

                                default-sample-rate= The default sample frequency.

                                default-sample-channels= The default number of channels.
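
                                As a quick way to test the resampling theory, here is a minimal sketch: it assumes the per-user config at ~/.pulse/daemon.conf and only looks for an uncommented default-sample-rate line; a result of 44100 while the stream and sink are 48kHz would point at resampling.

                                ```python
                                import os, re

                                # Report the default-sample-rate configured for the PulseAudio daemon.
                                # The path is an assumption (per-user config); if the setting is commented
                                # out or missing, PulseAudio falls back to its documented 44100 Hz default.
                                def configured_sample_rate(path=os.path.expanduser("~/.pulse/daemon.conf")):
                                    try:
                                        with open(path) as conf:
                                            for line in conf:
                                                match = re.match(r"\s*default-sample-rate\s*=\s*(\d+)", line)
                                                if match:
                                                    return int(match.group(1))
                                    except IOError:
                                        pass  # no per-user daemon.conf at all
                                    return 44100

                                print(configured_sample_rate())
                                ```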

                                Comment
