GeForce 700 vs. Radeon Rx 200 Series With The Latest Linux Drivers

  • #51
    I have an HD 4670. It still works like it did when I first got it. Open-source drivers are pretty good for that card from what I'm reading here, but I'm running Debian Wheezy, and don't want to go into testing or unstable, because that's made a big mess of things for me before.
    I always ran Catalyst on it since I got it in late 2009. One bug that never got fixed: X11 would sometimes freeze when I tried to wake xscreensaver. I also could never see the unlock-screen dialog box; the screensaver would just stop, and I'd have to press Num Lock to see if it had locked up.
    And it stuttered in games.

    I bought a used GeForce 6800 from an electronics junk store for a different computer, and installed nVidia's legacy drivers. The actual computing experience was comparable or superior to the HD 4670, outside of modern games.

    The difference was the reason I went with a GTX 770.

    It's using nVidia's very shiny (expensive!) reference cooler, and is very quiet. No, that's not a mistype: the GTX 770's reference cooler is actually very quiet until the GPU temp goes above 80C (after which it's still reasonably quiet), and most of the heat goes out the back of my case. Of course, you can drive the blower faster by enabling Coolbits, and it won't be quiet for you if you do that.
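    Since Coolbits came up: on the proprietary driver it's an xorg.conf option, and nvidia-xconfig can add it for you. A sketch (the exact bit values and the fan attribute names have shifted between driver versions, so check the driver README for yours):

```shell
# Adds Option "Coolbits" "4" to the Device section of xorg.conf
# (bit 2 = manual fan control; takes effect after restarting X)
sudo nvidia-xconfig --cool-bits=4

# Then drive the blower by hand from nvidia-settings
nvidia-settings -a "[gpu:0]/GPUFanControlState=1" \
                -a "[fan:0]/GPUTargetFanSpeed=60"
```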

    The big thing I don't like about the 770 is this: if your desktop isn't OpenGL-accelerated, it will tear things like window moves and browser scrolling. [url=https://devtalk.nvidia.com/default/topic/543305/?comment=4192942]nVidia knows it's there[/url], and thinks it's a problem with the Kepler architecture. You also have to make sure your movie player is set for OpenGL output, or it will tear your movie something awful.
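    For what it's worth, forcing the OpenGL output in common players looks something like this (option names vary by player and version, so treat these as a starting point):

```shell
# mpv: older versions call the GL output "opengl"
# (newer ones renamed it "gpu")
mpv --vo=opengl video.mkv

# MPlayer: the gl video output
mplayer -vo gl video.mkv
```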



    • #52
      Originally posted by A Laggy Grunt:
      The big thing I don't like about the 770 is this: if your desktop isn't OpenGL-accelerated, it will tear things like window moves and browser scrolling. [url=https://devtalk.nvidia.com/default/topic/543305/?comment=4192942]nVidia knows it's there[/url], and thinks it's a problem with the Kepler architecture. You also have to make sure your movie player is set for OpenGL output, or it will tear your movie something awful.
      Not a Kepler arch problem, as I see it on my old GTS 450. That's why I run compton instead of Xfwm's own compositor.
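      In case anyone wants to try the same setup: once Xfwm's own compositor is off (Settings → Window Manager Tweaks → Compositor), running compton with the GLX backend is what avoids the tearing. Flags per the compton manpage:

```shell
# GLX backend + OpenGL vsync, daemonized into the background
compton --backend glx --vsync opengl-swap -b
```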



      • #53
        Xfwm is terribly slow. I managed to double my frame rate and remove all graphics lag on Catalyst by disabling compositing.
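        If you'd rather do it from a terminal than dig through the settings dialog, the same toggle is a single xfconf-query call (this just flips the xfwm4 setting live, no restart needed):

```shell
# turn xfwm4's compositor off
xfconf-query -c xfwm4 -p /general/use_compositing -s false

# and back on again:
# xfconf-query -c xfwm4 -p /general/use_compositing -s true
```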

        Xfce hasn't received much development in a long time, the rendering side is poor, and it comes down to either using compton or, even better (apart from all the related packages), KWin with Xfce.

        I'm not sure why people ship Xfce given all the need for multimedia and games these days. It's stable, I like that, but it has such a poor compositor and no effects to speak of; it's all a bit outdated without mods.



        • #54
          Originally posted by phill1978:
          I'm not sure why people ship Xfce given all the need for multimedia and games these days. It's stable, I like that, but it has such a poor compositor and no effects to speak of; it's all a bit outdated without mods.
          I'm sure it baffles the mind but not everybody shares your enthusiasm for desktop effects (and not every graphics driver is equally bad at XRender acceleration). Personally I feel Xfce provides a very efficient and productive working environment. Multimedia and games seem to be just fine as well. The only reason I run an external compositor is to reduce tearing in the very occasional web video I happen to watch, as all my local media has been running tear free with XBMC for years now.

          We Linux users are a very diverse bunch with very diverse needs. There will never be a single DE everybody's happy with. (The same goes for Windows and Mac users, but they have to bend over and make do with whatever MS/Apple decide is best for them.) I know this shouldn't be news to anyone and has been repeated ad nauseam, but there you go anyway.
