
Thread: NVIDIA 340.17 Linux Beta Driver Brings Initial G-SYNC Support

  1. #1
    Join Date
    Jan 2007
    Posts
    14,378

    Default NVIDIA 340.17 Linux Beta Driver Brings Initial G-SYNC Support

    Phoronix: NVIDIA 340.17 Linux Beta Driver Brings Initial G-SYNC Support

    The first beta driver in NVIDIA's forthcoming "Release 340" driver series for blob-using Linux users is now available...

    http://www.phoronix.com/vr.php?view=MTcxNDk

  2. #2
    Join Date
    Apr 2014
    Posts
    70

    Default Great support!

    Regardless of how anyone feels about NVidia or GSync, I think we have to admit that they are treating Linux as a top priority. Over the past few years, their driver support for new hardware has at worst trailed the Windows drivers by a week. Compare that to other GPU vendors, which often lag by many months or years (if they support Linux at all).

  3. #3
    Join Date
    Jul 2013
    Posts
    2

    Default

    Wayland, NOW OR NEVER (most likely never for pre-Fermi...)!

  4. #4
    Join Date
    Jul 2012
    Posts
    709

    Default

    Quote Originally Posted by deppman View Post
    Regardless of how anyone feels about NVidia or GSync, I think we have to admit that they are treating Linux as a top priority. Over the past few years, their driver support for new hardware has at worst trailed the Windows drivers by a week. Compare that to other GPU vendors, which often lag by many months or years (if they support Linux at all).
    Doesn't matter. Some people will still desperately try to find ways to badmouth NVIDIA.

  5. #5
    Join Date
    Apr 2013
    Posts
    97

    Default

    Quote Originally Posted by blackout23 View Post
    Doesn't matter. Some people will still desperately try to find ways to badmouth NVIDIA.
    They have top support with great performance, but their driver doesn't support newer technologies like KMS or dma-buf, so it will never work with Wayland and will never support Optimus. I own an Optimus laptop and I can say it's quite a bummer, as my video card is shut down 100% of the time and the Intel GPU is used instead. If they would just release some documentation to help nouveau developers (which isn't that hard), it would greatly improve the NVIDIA experience and every GPU would work out of the box. I owned a laptop with a GTS 360M and the screen was corrupted, so it was impossible to boot a live CD with hardware acceleration. So yes, their drivers are well supported, but they are part of a bigger ecosystem and they don't bother to work with us, which is no better.

  6. #6
    Join Date
    Jul 2012
    Posts
    709

    Default

    Quote Originally Posted by gufide View Post
    They have top support with great performance, but their driver doesn't support newer technologies like KMS or dma-buf, so it will never work with Wayland and will never support Optimus. I own an Optimus laptop and I can say it's quite a bummer, as my video card is shut down 100% of the time and the Intel GPU is used instead. If they would just release some documentation to help nouveau developers (which isn't that hard), it would greatly improve the NVIDIA experience and every GPU would work out of the box. I owned a laptop with a GTS 360M and the screen was corrupted, so it was impossible to boot a live CD with hardware acceleration. So yes, their drivers are well supported, but they are part of a bigger ecosystem and they don't bother to work with us, which is no better.
    1) KMS and DMA-BUF are not "latest technologies". 2) You don't need KMS for Wayland. 3) Have you tried this: http://us.download.nvidia.com/XFree8...E/optimus.html?

  7. #7
    Join Date
    Feb 2012
    Posts
    424

    Default

    Quote Originally Posted by gufide View Post
    They have top support with great performance, but their driver doesn't support newer technologies like KMS or dma-buf, so it will never work with Wayland and will never support Optimus.
    KMS is not a prerequisite for Wayland. It's just one possible method of doing modesetting, but it's not "the one true way". Why does this KMS/Wayland thing have to be corrected over and over again? /rant

    Also, they do have access to dma-buf - they were blocked by some symbols being GPL-only, but they bypassed that by creating their own hooks into dma-buf. The reason there's no proper Optimus support (just running everything on the Nvidia GPU) is that you can have only one GL implementation active at the same time, something Nvidia is working on with their vendor-neutral GL library. Well, they were working on it; there hasn't been activity for many months now - https://github.com/NVIDIA/libglvnd

  8. #8
    Join Date
    Apr 2013
    Posts
    97

    Default

    Quote Originally Posted by blackout23 View Post
    1) KMS DMA-BUF are not "latest technologies" 2) You don't need KMS for Wayland. 3) Have you tried this: http://us.download.nvidia.com/XFree8...E/optimus.html?
    As far as I know, the Optimus support is quite limited and doesn't allow dynamic switching, which means a huge battery drain. Additionally, the driver can't sync the framebuffer, so there's tearing across the whole screen. I think I'm better off with primusrun from Bumblebee, unless there's a significant improvement, like the ability to select which applications you want to run on the GPU. Only Bumblebee allows that. In a very hackish way, but at least it's usable.

    EDIT: thanks Gusar for clarifying, it's much better.
    Last edited by gufide; 06-09-2014 at 03:34 PM.

  9. #9
    Join Date
    May 2011
    Posts
    1,450

    Default

    Quote Originally Posted by Shiba View Post
    Wayland, NOW OR NEVER (most likely never for pre-Fermi...)!
    Isn't there already support for Wayland? What else is needed?

  10. #10
    Join Date
    Feb 2012
    Posts
    424

    Default

    Quote Originally Posted by johnc View Post
    Isn't there already support for Wayland? What else is needed?
    Nvidia's EGL implementation currently only works with X, as far as I know.

    Also, there's the issue of modesetting. Compositors are tied to KMS currently. Nvidia doesn't expose any API that compositors could hook into, but even if they did, it would be quite inefficient if each compositor had to implement support for different modesetting APIs. Some standard is needed in this area, something that abstracts away the low-level details. I recall Nvidia talking about having modesetting in EGL.
