
Thread: NVIDIA Optimus On Ubuntu 13.10 Linux vs. Windows 8.1

  1. #11
    Join Date
    Oct 2011
    Posts
    224

    Default

    Quote Originally Posted by marccollin View Post
    Except using Bumblebee, is there no other way to run an application on the GPU you want?
    What do you mean? Run an application normally on a Bumblebee system and the integrated GPU is used. Run it under the primusrun (or the older optirun) command and the dedicated GPU is powered on and the application runs there.

    With nvidia-prime you can choose to use the NVIDIA GPU all the time. You can switch to the Intel GPU, but that requires a reboot AFAIK (or maybe just ending your session and restarting X? Still a huge PITA).
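    For example, on a working Bumblebee setup it looks roughly like this (glxgears is just a stand-in for whatever application you want to run):

        glxgears              # runs on the integrated Intel GPU
        primusrun glxgears    # runs on the NVIDIA GPU via the primus bridge
        optirun glxgears      # older VirtualGL-based bridge, same idea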

  2. #12
    Join Date
    May 2013
    Posts
    10

    Default

    Quote Originally Posted by n3wu53r View Post
    What do you mean? Run an application normally on a Bumblebee system and the integrated GPU is used. Run it under the primusrun (or the older optirun) command and the dedicated GPU is powered on and the application runs there.

    With nvidia-prime you can choose to use the NVIDIA GPU all the time. You can switch to the Intel GPU, but that requires a reboot AFAIK (or maybe just ending your session and restarting X? Still a huge PITA).
    I would like the desktop to be rendered on the Intel GPU while certain applications are rendered on the NVIDIA GPU and piped to the Intel chip for display, without Bumblebee.

  3. #13
    Join Date
    Oct 2011
    Posts
    224

    Default

    Quote Originally Posted by marccollin View Post
    I would like the desktop to be rendered on the Intel GPU while certain applications are rendered on the NVIDIA GPU and piped to the Intel chip for display
    This is exactly what Bumblebee does. It also powers off the NVIDIA card when not in use.

    So if you want to do it without Bumblebee, go write your own project from scratch that does exactly the same thing. NVIDIA does not officially have anything comparable on Linux yet.
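    If your Bumblebee setup uses bbswitch (the usual case), you can watch the power-off behaviour yourself; this is just a sketch and assumes the bbswitch module is loaded:

        cat /proc/acpi/bbswitch    # reports OFF while the NVIDIA card is idle
        primusrun glxgears &       # start an app on the NVIDIA GPU in the background
        cat /proc/acpi/bbswitch    # now reports ON; it goes back to OFF after the app exits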

  4. #14
    Join Date
    Aug 2012
    Posts
    28

    Default

    ...dynamic switching "out of the box" like found on Windows...
    I wish people would stop making this mistake. It's not a switchable graphics solution; you can use both GPUs at the same time.

  5. #15
    Join Date
    Aug 2012
    Posts
    28

    Default

    Quote Originally Posted by aigarius View Post
    At least from this test it looks like there is simply no point in even having an NVidia 620 video card in a laptop - the Intel GPU is almost as fast.
    In my experience with Bumblebee, using either Primus or VGL, there's a serious performance hit on the nVidia GPU compared to Windows. Something is wrong with the drivers under Linux.

    However, I can't test with nvidia-prime or by manually editing xorg.conf, because the new features that support rendering on (only) the nVidia GPU under Linux with the proprietary drivers don't seem to support my system.
    Last edited by blacqwolf; 12-17-2013 at 07:13 PM.

  6. #16
    Join Date
    Apr 2011
    Posts
    82

    Default

    It would be nice if you tested this with Mesa 10.1; according to users/developers on the Bumblebee IRC channel, there is a bug in Mesa 9.2 that creates a bottleneck. My NVIDIA card's performance under Bumblebee increased by about 29% after I updated to 10.1.
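    If anyone wants to check which Mesa version they are actually running before and after the upgrade, something like this should do (glxinfo comes from the mesa-utils package):

        glxinfo | grep "OpenGL version"
        # prints the driver/Mesa version string, e.g. "... Mesa 10.1.0" after the upgrade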

  7. #17
    Join Date
    Oct 2009
    Posts
    343

    Default

    When we test FPS in games, who gives a fck about energy saving? No, I'm not an idiot, but it looks like you are.
    It's faster than Bumblebee, and the rendering is correct.

    This test was about SPEED, not SWITCHING GPUs.

  8. #18
    Join Date
    Oct 2011
    Posts
    224

    Default

    Quote Originally Posted by NomadDemon View Post
    When we test FPS in games, who gives a fck about energy saving? No, I'm not an idiot, but it looks like you are.
    It's faster than Bumblebee, and the rendering is correct.

    This test was about SPEED, not SWITCHING GPUs.
    Except it wouldn't matter how fast it is when it's unusable in the real world. Genius.

  9. #19
    Join Date
    Oct 2009
    Posts
    343

    Default

    Sure it is. Have you even used it?
    The new version of nvidia-prime allows switching.

    So stop insulting me and read more.

  10. #20
    Join Date
    Dec 2013
    Posts
    21

    Default

    Well, it now has switching support: http://www.webupd8.org/2013/12/more-...imus.html#more

    This is for Trusty, but I've backported it to Saucy: https://launchpad.net/~joern-schoeny...prime-backport
    (There are also packages for Precise, but the blob is still missing and they are not tested yet.)

    I can switch easily - okay, okay, I have to log out and back in again. But that's totally fine for me. Michael, would you be so kind as to benchmark this, too? :-)
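    For anyone who hasn't tried it: the switch itself is done with the prime-select tool from the nvidia-prime package (this is just a sketch of the usual Ubuntu commands; a log-out/log-in is still needed for the change to take effect):

        prime-select query          # show which GPU is currently selected
        sudo prime-select intel     # switch to the integrated Intel GPU
        sudo prime-select nvidia    # switch back to the NVIDIA GPU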
