
Optimus/Primus Regresses On Latest Mesa 10.5.5 Release


  • Optimus/Primus Regresses On Latest Mesa 10.5.5 Release

    Phoronix: Optimus/Primus Regresses On Latest Mesa 10.5.5 Release

    A Phoronix reader has pointed out that a regression slipped into the Mesa 10.5.5 point release that negatively affects dual-GPU laptop owners with NVIDIA Optimus technology who are using the open-source "Primus" code for running OpenGL games on the alternate graphics processor...

    http://www.phoronix.com/scan.php?pag...10.5.5-Regress

  • #2
    I'm running Mesa 10.6 with Ubuntu Wily (15.10 alpha + xorg-edgers) and can confirm. Primus lost about 100-150fps on glxspheres, right out of the gate.
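For anyone wanting to reproduce the comparison above, a rough sketch (binary names are assumptions; glxspheres64 ships with VirtualGL on most distros):

```shell
# Compare framerates with and without Primus offloading.
# vblank_mode=0 disables vsync so the FPS numbers are meaningful.
vblank_mode=0 glxspheres64              # render on the integrated GPU
vblank_mode=0 primusrun glxspheres64    # offload to the NVIDIA GPU via Primus
```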



    • #3
      Intel/AMD Prime is still working great for me here - git mesa, kernel, xorg etc



      • #4
        Does someone know if Bumblebee is still useful? Or is this stuff supposed to replace that? I've never been able to wrap my head around the complexity of the Linux graphics stack, and I'm an OpenGL graphics programmer.

        I'm one of the unlucky souls to have bought a laptop with two GPUs. Works fine on Windows, but on Linux I'm stuck on the crappy Intel GPU...
        Last edited by Remdul; 05 June 2015, 12:55 PM.



        • #5
          Originally posted by Remdul:
          Does someone know if Bumblebee is still useful? Or is this stuff supposed to replace that? I've never been able to wrap my head around the complexity of the Linux graphics stack, and I'm an OpenGL graphics programmer.

          I'm one of the unlucky souls to have bought a laptop with two GPUs. Works fine on Windows, but on Linux I'm stuck on the crappy Intel GPU...
          primusrun _needs_ Bumblebee to work; there is no replacement for powering the card on/off depending on the app.
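A minimal sketch of what that dynamic power management looks like in practice, assuming the bumblebee and bbswitch packages are installed (the proc path and output are illustrative):

```shell
cat /proc/acpi/bbswitch     # bbswitch reports the card's power state, e.g. "... OFF"
primusrun glxgears          # Bumblebee powers the NVIDIA card on for this app
# once glxgears exits, bumblebeed should power the card back off
cat /proc/acpi/bbswitch
```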



          • #6
            Originally posted by Remdul:
            Does someone know if Bumblebee is still useful? Or is this stuff supposed to replace that? I've never been able to wrap my head around the complexity of the Linux graphics stack, and I'm an OpenGL graphics programmer.

            I'm one of the unlucky souls to have bought a laptop with two GPUs. Works fine on Windows, but on Linux I'm stuck on the crappy Intel GPU...
            Bumblebee is needed for dynamic usage; nvidia-prime works in Ubuntu, but you need to log out to switch from Intel to NVIDIA.
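On Ubuntu, the switching described above is done with the prime-select tool from the nvidia-prime package (a log-out/log-in is still needed after switching):

```shell
prime-select query          # shows the currently selected GPU, e.g. "intel"
sudo prime-select nvidia    # switch to the discrete NVIDIA GPU
sudo prime-select intel     # switch back to the integrated Intel GPU
```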



            • #7
              Originally posted by Remdul:
              Does someone know if Bumblebee is still useful? Or is this stuff supposed to replace that? I've never been able to wrap my head around the complexity of the Linux graphics stack, and I'm an OpenGL graphics programmer.

              I'm one of the unlucky souls to have bought a laptop with two GPUs. Works fine on Windows, but on Linux I'm stuck on the crappy Intel GPU...

              Nvidia-prime is your best bet. It requires logging out, but I leave my system in NVIDIA mode 24/7 to avoid this. In my experience, nvidia-prime has much higher performance and stability than Bumblebee. I forget that I even have two cards, since everything just works.



              • #8
                Originally posted by dh04000:


                Nvidia-prime is your best bet. It requires a log out, but I leave my system in nvidia mode 24/7 to avoid this. In my experience, nvidia-prime has much higher performance and stability than bumblebee. I forget that I even have two cards, since everything just works.
                It also kills the battery and causes tearing. On my old laptop, after I'd been using it for a while, it greatly lowered the battery life; now it doesn't even last 30 minutes.



                • #9
                  Originally posted by Remdul:
                  Does someone know if Bumblebee is still useful? Or is this stuff supposed to replace that? I've never been able to wrap my head around the complexity of the Linux graphics stack, and I'm an OpenGL graphics programmer.

                  I'm one of the unlucky souls to have bought a laptop with two GPUs. Works fine on Windows, but on Linux I'm stuck on the crappy Intel GPU...
                  It's not that complex; AFAIK it's every bit as complex as everywhere else, you just get exposed to it here.

                  To understand how stuff works, I suggest you experiment with only the OSS drivers, the blobs bring in a completely new world of hurt...
                  With OSS drivers, to render on the dedicated GPU, simply set the xrandr options:
                  --setprovideroutputsource provider source
                  --setprovideroffloadsink provider sink

                  Then launch an app with DRI_PRIME environment variable set.
                  Read about PRIME, vgaswitcheroo and dma_buf to understand how it works.

                  Then introduce the hacks needed for blobs into your knowledge base. At least that worked for me as a learning method...
                  Last edited by Serafean; 06 June 2015, 06:24 AM.
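The steps above can be sketched as a short session; the provider names here ("radeon", "Intel") are examples only, check `xrandr --listproviders` for the ones on your machine:

```shell
xrandr --listproviders                          # list the available GPU providers
xrandr --setprovideroffloadsink radeon Intel    # dGPU renders, iGPU displays
DRI_PRIME=1 glxinfo | grep "OpenGL renderer"    # should now name the dedicated GPU
```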



                  • #10
                    Originally posted by Serafean:
                    It's not that complex; AFAIK it's every bit as complex as everywhere else, you just get exposed to it here.

                    To understand how stuff works, I suggest you experiment with only the OSS drivers, the blobs bring in a completely new world of hurt...
                    With OSS drivers, to render on the dedicated GPU, simply set the xrandr options:
                    --setprovideroutputsource provider source
                    --setprovideroffloadsink provider sink

                    Then launch an app with DRI_PRIME environment variable set.
                    Read about PRIME, vgaswitcheroo and dma_buf to understand how it works.

                    Then introduce the hacks needed for blobs into your knowledge base. At least that worked for me as a learning method...
                    With some of the open source drivers you don't even need any xrandr options IIRC, just DRI_PRIME.

