NVIDIA's Proprietary Driver Is Moving Closer With Kernel Mode-Setting


  • #41
    Originally posted by duby229 View Post
    Well, I suppose it raises a question.... How well does the OSS driver stack's power management fare in comparison to Catalyst? I suspect it's about the same.
    I did not see a big difference compared with Catalyst on Windows.
    Fan speed and temperature on radeonsi Linux vs Windows - http://www.gearsongallium.com/?p=2131

    • #42
      Originally posted by duby229 View Post
      You guys are nuts. If I looked and saw 60C and 70C, I'd be doing everything I could to raise the fan speed. I know GPUs are designed for higher operating temps, but there is no way in hell I would let mine run that hot.
      It's at 44C right now (more or less idle), with a huge cooler and an open case. If I turned the fan speed up to double, it would probably drop to ~35C, but then the fans would be audible. The card is just power-hungry; there's not much I can do about it. (At full load it doesn't get much hotter.)

      But yes, chips can in theory run at up to 150C and still work. Chip life expectancy falls roughly with the square of the current and with temperature, and at low frequencies the chip draws less current, so running a bit hotter than it strictly could is not a big deal.

      Another thing is that those temperature readings are probably not that precise: the thermal sensor (NTC) is not between the transistors themselves but a bit off to the side, so under load I'd guess the hotspots inside the chip are around 10C hotter than it reports (there are load spikes and much more going on).

      So yes, it depends on your card. A friend had a card that ran games at a bit over 90C, and that was normal for that card. For newer (28nm) cards I'd get a bit worried, but not much.
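To put rough numbers on that lifetime intuition, here is a small sketch based on Black's equation for electromigration; the "current^2 * temperature" rule of thumb above is a simplification of it. The activation energy and exponent below are typical textbook values chosen for illustration, not measurements for any particular GPU:

```python
import math

# Black's equation: MTTF = (A / J**n) * exp(Ea / (k * T)), with n ~= 2.
# Median time-to-failure falls with the square of current density and
# exponentially with absolute temperature.
BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def relative_mttf(j_ratio, t_ref_c, t_hot_c, ea_ev=0.7, n=2):
    """Lifetime under the 'hot' condition relative to the reference one.

    j_ratio: current density relative to the reference (1.0 = unchanged)
    t_ref_c, t_hot_c: reference and new die temperatures in Celsius
    ea_ev: assumed activation energy (illustrative value)
    """
    t_ref = t_ref_c + 273.15
    t_hot = t_hot_c + 273.15
    current_term = j_ratio ** -n  # doubling the current quarters the life
    thermal_term = math.exp(ea_ev * (1 / t_hot - 1 / t_ref) / BOLTZMANN_EV)
    return current_term * thermal_term

# Same current, 44C idle vs 70C under load:
print(relative_mttf(1.0, 44, 70))
```

With these assumed constants, a 44C-to-70C jump at unchanged current cuts the expected lifetime to a fraction of the reference, which matches the point above: temperature matters, but lower current at low clocks buys some of that margin back.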

      • #43
        Originally posted by ciupenhauer View Post
        Glad this conversation is open. I was just about to upgrade to an AMD M290X hoping to get away from the horrible DE performance I have with my HD 5870M and pretty much any desktop environment. But now I'm worried that the radeonSI-based M290 is going to be just as problematic. Does anyone here recommend I switch to an NVIDIA card instead? With this latest news, it sounds more interesting. I'm not interested in gaming performance, but desktop performance, browser, etc.
        Ps: yeah, Firefox sucks for me right now; I thought that was just Firefox, but someone mentioned it's the driver...
        I use Linux exclusively for work and play. For desktop use, nothing beats NVIDIA's closed drivers + hardware. They just work, and significantly outperform AMD in the same price range. For example, a 750 Ti ($150) generally beats a 290X ($300) at less than 1/3 the power, and the NVIDIA driver is much more stable and feature-complete.

        Things get a little muddier on laptops. There I have stuck with Intel and am generally pleased with the performance. NVIDIA's drivers are some trouble due to the "Optimus" situation, where the driver switches between the iGPU and dGPU depending on load. However, there are solutions, and the Maxwell architecture is so efficient that the benefit of switching is less dramatic.

        • #44
          Originally posted by ciupenhauer View Post
          Glad this conversation is open. I was just about to upgrade to an AMD M290X hoping to get away from the horrible DE performance I have with my HD 5870M and pretty much any desktop environment. But now I'm worried that the radeonSI-based M290 is going to be just as problematic. Does anyone here recommend I switch to an NVIDIA card instead? With this latest news, it sounds more interesting. I'm not interested in gaming performance, but desktop performance, browser, etc.
          Ps: yeah, Firefox sucks for me right now; I thought that was just Firefox, but someone mentioned it's the driver...
          I've never noticed 2D lag with the open-source AMD drivers. Of course, the only r600g graphics cards I have are the 4870 and 6850, which both work just fine without visible lag, but my radeonSI-based APUs (A4-5400, A6-6600K and A8-7600) and 7950/280X run even better. GLAMOR is actually really great now with the latest version of Xorg.

          You will, however, find Chromium/Chrome displays much more smoothly than Firefox with GPU acceleration on Linux right now. By default, both Firefox and Chromium have GL acceleration disabled on Linux, but you can override that in the options. You can also try the latest nightly of Firefox, which features much better multithreaded processing with Electrolysis, giving each tab its own process much like Chrome does.
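For reference, the overrides mentioned above look roughly like this; the pref and flag names are from 2015-era browsers and may differ in later versions:

```
# Firefox: in about:config, force GPU-accelerated layers
layers.acceleration.force-enabled = true

# Chromium/Chrome: override the software-rendering blacklist at launch
chromium --ignore-gpu-blacklist
```

Forcing acceleration past the blacklist can expose driver bugs (that is why it is off by default on Linux), so it is worth toggling back if rendering glitches appear.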
          Last edited by mmstick; 24 May 2015, 03:47 PM. Reason: Added additional information about web browser support on Linux.

          • #45
            Originally posted by blackout23 View Post
            I think NVIDIA gained OpenCL 2.0 support a few releases ago, but it did not get much attention. Someone mentioned it on the forums a few days ago.
            What they did was finally add OpenCL 1.2 support in their 350.05 driver early last month, over three years after the spec was released. No word about OpenCL 2.0 yet.

            • #46
              Originally posted by duby229 View Post
              You guys are nuts. If I looked and saw 60C and 70C, I'd be doing everything I could to raise the fan speed. I know GPUs are designed for higher operating temps, but there is no way in hell I would let mine run that hot.
              I used to run my old GTX 275 at 96C through summer. I don't think the fan could have gone any faster, because it sounded like a hair dryer. No air conditioning but lots of sun in this room, unfortunately, so ambient temps of 30C are not uncommon in summer.

              • #47
                Originally posted by randomizer View Post

                I used to run my old GTX 275 at 96C through summer. I don't think the fan could have gone any faster, because it sounded like a hair dryer. No air conditioning but lots of sun in this room, unfortunately, so ambient temps of 30C are not uncommon in summer.
                It's not always possible to set up in ideal conditions. But in your case I'd definitely have been looking at aftermarket coolers. Most of the time the stock cooler is fine, but 96C is beyond nuts, so a giant copper cooler is the only answer I can think of. It's almost summer again, so it's time to consider it.

                EDIT: I'm using the non-LED version of this cooler.

                Last edited by duby229; 25 May 2015, 09:43 AM.

                • #48
                  I'm glad to see a discussion of GPU temperatures -- I signed up just to say the following:

                  I'm using a GTX 970 on the proprietary drivers. During a regular desktop session, with ~20 Firefox tabs, 5-7 large PDFs, numerous gvim and terminal windows, Thunderbird, a VM in qemu-kvm using the SPICE server, the Steam client, etc., I get these GPU temps in each DE/WM:

                  -----
                  GNOME 3.14 (Fedora 21; openSUSE Tumbleweed): ~60-65C
                  GNOME 3.16 (openSUSE Tumbleweed): ~60-65C
                  Openbox alone, i3 alone (openSUSE Tumbleweed): ~35-40C
                  XFCE 4.12 (openSUSE Tumbleweed): ~60-65C
                  KDE Plasma 5.3.0, Qt 5.4.1 (openSUSE Tumbleweed): ~40-50C

                  I also have a Windows partition (not proud of it), where the GPU sits at ~35-40C under similar loads. Running GTA V for a couple of hours, it goes up to ~65-70C.
                  -----

                  I don't know what to say about XFCE; I used it for a very long time on my laptop (openSUSE 13.1 & 13.2 + XFCE was the only combination that would consistently keep temperatures inside my System76 gazp9 at sane levels), and I used to like it a lot better than MATE. But I can't see why on earth it would heat up the GPU as much as an AAA open-world game!

                  I've spent a fair bit of time using and tweaking Openbox and various tiling window managers; and while they're all great, there's always one or two nagging inconveniences that pull me back into a full-featured DE.

                  I used to be a GNOME 3 'fan' of sorts, since 3.12, IIRC. Since Tumbleweed rolled out Plasma 5, though, I've become a KDE convert -- and the GPU temperatures played no small part in that. I'm not 'technical' enough to tell exactly why, but obviously the KDE & Qt folks are doing something right where the GNOME & Gtk people seem to be falling short.

                  I'm wondering what your experiences are, with respect to GPU temperatures in various DE/WMs.
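For anyone who wants to compare their own numbers, the temperatures above can be read from the command line; a quick sketch, assuming the NVIDIA proprietary driver (or lm-sensors for the open drivers):

```
# NVIDIA proprietary driver:
nvidia-smi --query-gpu=temperature.gpu --format=csv,noheader
# or, via the X driver:
nvidia-settings -q gpucoretemp -t

# radeon/radeonsi and other open drivers (sensor names vary by card):
sensors
```

Sampling these in a loop while switching DEs is an easy way to check whether a compositor is keeping the GPU out of its idle clocks.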
                  Last edited by wvstolzing; 26 May 2015, 11:02 AM.

                  • #49
                    That is an awesome post. It raises a number of questions that are well worth looking into.
