Nouveau Driver Remains Much Slower Than NVIDIA's Official Driver


  • #31
    Originally posted by Sonadow View Post
    Just because they deliver performance 15% slower than the blob when underclocked does not (and will not) mean that these numbers will scale accordingly when reclocking support is finally (if ever) implemented.
    The benchmark has one card that does clock properly. And, unfortunately, it does suggest that it doesn't scale well.

    Comment


    • #32
      Originally posted by liam View Post
      Perhaps I missed something but why doesn't Michael run the Nouveau drivers at their highest speeds when possible?
      As stated in the article's introduction, Michael does run the Nouveau drivers at the highest speeds when possible, i.e. on the 9800 GTX. It doesn't work on the other two cards.

      Originally posted by liam View Post
      I've a Quadro FX 570M that has three clock levels. The highest level is core 475 shader 950 memory 700, while stock level is 275, 550, 300, respectively. A rather massive difference.
      If I used this for gaming I'd always put it at the highest speed levels. I'd imagine most gamers would do the same.
      I'd imagine most gamers would not run their cards at the highest speed levels if it crashes immediately, or is unstable, or produces rendering errors.

      I note that your Quadro FX 570M has an older NV84 (G84) chipset, while the 9800 GT and 9800 GTX have the NV92 (G92) chipset. The GT 220 has the NVA5 (GT216) chipset. If you have been successful in reclocking the Quadro FX 570M please let us know which kernel version you are using as this would be good news and might also be useful to someone else.

      If Michael had tested with a GT 240 (I can't remember if he owns one), which has the NVA3 (GT215) chipset, things might have been different, as better NVA3 memory re-clocking was introduced in the 3.5 kernel. I'm not sure how much difference that would make without also reclocking the core though.
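For anyone wanting to check what reclocking interface their own card exposes, Nouveau's performance levels have historically been readable through sysfs or debugfs, with the exact path varying by kernel version. A minimal sketch, assuming a single-GPU setup at `card0`/`dri/0` (both paths are hypothetical for your machine and may not exist on your kernel):

```shell
#!/bin/sh
# Probe the Nouveau reclocking interfaces. Older kernels exposed a sysfs
# "performance_level" file; newer ones expose a debugfs "pstate" table.
# Fall back gracefully if neither is present.

SYSFS=/sys/class/drm/card0/device/performance_level
DEBUGFS=/sys/kernel/debug/dri/0/pstate

if [ -r "$SYSFS" ]; then
    echo "sysfs performance level:"
    cat "$SYSFS"
elif [ -r "$DEBUGFS" ]; then
    echo "debugfs pstate table:"
    cat "$DEBUGFS"   # selecting a level would be: echo <id> > $DEBUGFS (as root)
else
    echo "no Nouveau reclocking interface found on this kernel"
fi
```

On cards where reclocking is unstable (as discussed above), reading the table is safe, but actually writing a level is what risks hangs or rendering errors.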

      Comment


      • #33
        Originally posted by asdx
        Nouveau will have optimus support in distros *first* and by *default* real soon, and Nvidia won't.

        Ha ha, take that Nvidia.
        And Nvidia's blob has SLI, which Nouveau cannot even hope to implement any time soon.

        And Nvidia has the resources to re-implement a DMA-BUF solution into the blob if they want to. And they might just do that, which will make DMA-BUF irrelevant.

        And Nvidia has supported Optimus on Windows for...what, 3 years? While Linux still hasn't even got it? Oh dear...
        Last edited by Sonadow; 05 January 2013, 05:08 AM.

        Comment


        • #34
          Originally posted by asdx
          Nouveau will have optimus support in distros *first* and by *default* real soon, and Nvidia won't.

          Ha ha, take that Nvidia.
          I wouldn't bet my ass on that. Nvidia has been working hard for some months already to get Optimus working in their blob.
          Honestly speaking, imo, Optimus is useless with Nouveau, because the performance gain from using the Nouveau driver + Nvidia GPU instead of the Intel GPU is likely too low to make gaming on it worthwhile.

          Comment


          • #35
            Originally posted by christian_frank View Post
            I wouldn't bet my ass on that. Nvidia has been working hard for some months already to get Optimus working in their blob.
            Honestly speaking, imo, Optimus is useless with Nouveau, because the performance gain from using the Nouveau driver + Nvidia GPU instead of the Intel GPU is likely too low to make gaming on it worthwhile.
            Good point that.

            If memory serves, the whole point of Optimus was to have the figurative best of both worlds by wiring the display to the Intel GPU which is just good enough for everything that is not games or CAD related, and swap over to the dedicated chip when more crunching power is needed.

            With the Nouveau driver not coming close to delivering the expected performance from the Nvidia chip, it makes little sense to even bother with switching if the performance gains do not match up with the increase in power consumption.
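On the open-source stack, that switching is what DRI PRIME offloading provides: the Intel GPU drives the display, and individual applications can be pushed to the Nvidia chip per-process. A quick way to check whether the setup works, assuming a Mesa stack with both drivers loaded and the standard `xrandr`/`glxinfo` tools installed:

```shell
# List the render providers X knows about (Intel + Nouveau on an Optimus laptop)
xrandr --listproviders 2>/dev/null || echo "xrandr not available"

# Render one application on the discrete GPU; everything else stays on Intel
if command -v glxinfo >/dev/null 2>&1; then
    DRI_PRIME=1 glxinfo 2>/dev/null | grep "OpenGL renderer"
else
    echo "glxinfo not available"
fi
```

If the second command reports the Nouveau/Gallium renderer instead of the Intel one, offloading is active, and the performance question above can be measured directly rather than argued.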

            Comment


            • #36
              Originally posted by gens View Post
              8800 was a revolution in gpu design, at least as far as nvidia is concerned
              The steam engine was a revolution in engineering, but I don't think many use them anymore...

              Originally posted by gens View Post
              next step was in the 400 series
              next step is in the 600 series, and only the high end part
              but nothing revolutionary in them
              "only the high end", yes. You think 8800GTX was a low end card when it was released? Compare apples and apples please.

              Originally posted by gens View Post
              also the low end cards in the newer series are utter crap
              Again with the apples. And were low end cards ever anything but crap?

              The point is that doing a "benchmark" of 5+ year old GPUs is a pretty daft idea in the first place. Nobody cares if you happen to think it's the best thing evarrr.

              Comment


              • #37
                Originally posted by netrage View Post
                The steam engine was a revolution in engineering, but I don't think many use them anymore..
                As far as I know, almost all power plants use steam engines, from coal to atomic energy. There are of course some exceptions, for instance natural-gas-driven Otto engines which drive the dynamos directly.

                Comment


                • #38
                  Originally posted by thofke View Post
                  As far as I know, almost all power plants use steam engines... From coal to atomic energy. There are of course some exceptions, for instance the natural gas-driven otto engines which drive the dynamos directly.
                  A steam turbine != a steam engine. But let's not nitpick any further...

                  Comment


                  • #39
                    Fascinating how all the nvidia trolls come out of the woodwork to troll about nouveau and insult its developers.

                    Nobody is forcing you to use the open source driver, Nvidiots!

                    Comment


                    • #40
                      Originally posted by GT220 View Post
                      Nouveau developers are retarded, trying to reinvent the wheel, wasting valuable resource and time instead of working on other parts of Linux that needs improvement.

                      Time to take this garbage out and throw it into the dustbin of history where it belongs.
                      Well, if they are retarded, you are braindead.

                      You HAVE to have control over your hardware. It wasn't just once that people have found hair-raising covert security holes in binary blobs, and nVidia has been mentioned more than once in that context.

                      While I understand some of the woes with the open-source effort, I have to acknowledge the difficulty of working with such complex hardware under conditions of a chronic and critical lack of documentation.

                      Just look at AMD's documentation (since they publicly claim to support open source and open documentation) and try to make heads or tails of it.

                      nVidia and AMD get plenty of cash from hardware. Compared to that, the open-source developers' effort lives almost on public mercy.

                      And still (while I don't know about Nouveau) the open-source radeon driver is not half bad. Eyefinity works superbly for me (3 x 1600x1200). 3D is not that bad. 2D is great. It works without a problem on all kernels I cared to try, even the freshest ones, compiled a few hours after release or from git copies. The only thing still missing for me on radeon is a good GPGPU interface.

                      Not so with closed-source fglrx. It's full of bugs, some of which persist across many versions over a multi-year timespan and are quite annoying. It is not always easy (or even possible) to nail the right combination of kernel, driver, libraries and xorg server to make it work.

                      I switched recently from radeon to fglrx, only to choke on bugs and instabilities. For the moment I am persisting because I want to play with GPGPU stuff, but the very first day radeon includes support for it, I am switching back.
                      Last edited by Brane215; 05 January 2013, 08:39 AM.

                      Comment
