PRIME Synchronization & Double Buffering Land In The X.Org Server

  • #1

    Phoronix: PRIME Synchronization & Double Buffering Land In The X.Org Server

    For those making use of DRI PRIME for multi-GPU systems (mainly in the context of iGPU + dGPU notebooks), the xorg-server's PRIME code now has synchronization support and double buffering...

    http://www.phoronix.com/scan.php?pag...-Double-Buffer

  • #2
    Awesome!

    I recently got Optimus hardware. It was tricky to set up, but it's fairly manageable now that I know what's going on. The tearing is definitely not nice.

    • #3
      First the input lag patch and now this: the upcoming Xorg 1.19 looks very interesting, but it doesn't seem likely to arrive soon.

      • #4
        Sorry if this wasn't explained for an amateur like me, but does this mean that switching from Intel to NVIDIA is automatic, or can it use Intel and NVIDIA at once for different applications?

        So, play a game on NVIDIA but run the desktop on Intel?

        • #5
          Originally posted by Goddard View Post
          Sorry if this wasn't explained for an amateur like me, but does this mean that switching from Intel to NVIDIA is automatic, or can it use Intel and NVIDIA at once for different applications?

          So, play a game on NVIDIA but run the desktop on Intel?
          It'll never be "automatic"; there are environment variables and window hints that can be used to declare "this app needs to run on the dedicated GPU."

          This is about eliminating the tearing that currently happens when you DO use the dedicated GPU.
          All opinions are my own, not those of my employer, if you know who they are.
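[For the open-source driver stack, the environment variable in question is `DRI_PRIME`; a minimal sketch of how it is used, assuming `glxinfo` (from mesa-utils) is available on the machine:]

```shell
# With the open-source drivers, a single process is routed to the
# discrete GPU by prefixing the command with DRI_PRIME=1.
# On a real multi-GPU machine you would compare:
#   glxinfo | grep "OpenGL renderer"                # integrated GPU
#   DRI_PRIME=1 glxinfo | grep "OpenGL renderer"    # discrete GPU
#   DRI_PRIME=1 ./some-game                         # offload one game
# The mechanism is an ordinary per-process environment variable: it is
# visible only to the one command it prefixes, not to the whole session.
seen=$(DRI_PRIME=1 sh -c 'printf %s "$DRI_PRIME"')
echo "child sees DRI_PRIME=$seen"   # prints: child sees DRI_PRIME=1
```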

          • #6
            So does this mean it'll be possible to use the NVIDIA dGPU with the binary drivers for specific apps by simply setting an environment variable? Last time I checked, one needed to mess around with hacks like bumblebee/primusrun, which spawned a secondary X server and did not yield optimal performance.
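[A side note on the binary-driver case: at the time of this thread, bumblebee/primusrun was indeed the usual workaround, but NVIDIA much later (in the 435.xx driver series) added its own "PRIME render offload" mode, driven by environment variables. A sketch, assuming a driver recent enough to support it:]

```shell
# NVIDIA's PRIME render offload (435.xx and later) routes one GL process
# to the discrete GPU with two environment variables. On a real machine:
#   __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep vendor
# No secondary X server is involved; like DRI_PRIME, these are plain
# per-process variables visible only to the command they prefix.
seen=$(__NV_PRIME_RENDER_OFFLOAD=1 sh -c 'printf %s "$__NV_PRIME_RENDER_OFFLOAD"')
echo "child sees offload=$seen"   # prints: child sees offload=1
```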

            • #7
              Originally posted by Ericg View Post

              It'll never be "automatic"; there are environment variables and window hints that can be used to declare "this app needs to run on the dedicated GPU."

              This is about eliminating the tearing that currently happens when you DO use the dedicated GPU.
              Oh yeah? Do you know how this can be done and verified to work? I have a 2015 Razer Blade 14, and I like running on Intel to save battery, but it would be nice not to have to log in and out to run games.

              • #8
                Code:
                v1: N/A
                v2: N/A
                v3: N/A
                v4: Initial commit
                v5: Move disabling of reverse PRIME on sink to sink commit
                v6: Rebase onto ToT
                v7: Unchanged
                oO

                Anyway, I suppose the situation is unchanged and this does NOT apply to intel + radeon with dri3, right?

                edit: Relevant issue: https://bugs.freedesktop.org/show_bug.cgi?id=95472

                • #9
                  So... would it be possible to run the desktop on the Intel HD 530 and switch to the GTX 970 for GPU-intensive apps? That way I could take advantage of the open-source Intel drivers, since NVIDIA has issues with graphics corruption on state changes and on wakeup after a night's sleep.

                  My connection is HDMI-1. I'm guessing the answer is no, but I wanted to ask anyway.
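[One way to ground this question in what the X server actually sees: each GPU shows up as a "provider", and render offload is only possible when the discrete GPU advertises a Source Offload capability. A sketch that parses sample `xrandr --listproviders` output (the provider ids and names here are hypothetical; run the command on the actual machine):]

```shell
# On a real machine, list the GPUs the X server knows about with:
#   xrandr --listproviders
# Sample output from a hypothetical Intel + nouveau laptop:
providers='Providers: number : 2
Provider 0: id: 0x46 cap: 0xb, Source Output, Sink Output, Sink Offload crtcs: 3 outputs: 4 associated providers: 1 name:Intel
Provider 1: id: 0x1b8 cap: 0x5, Source Output, Source Offload crtcs: 0 outputs: 0 associated providers: 1 name:nouveau'

# Two providers means offloading is at least plumbed through:
echo "$providers" | grep -c '^Provider '   # prints 2

# With the open-source stack, the offload pairing would then be set up with:
#   xrandr --setprovideroffloadsink nouveau Intel
```

[If the HDMI port is wired directly to the discrete GPU, the relevant mode is reverse PRIME (`xrandr --setprovideroutputsource`) rather than render offload.]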

                  • #10
                    Originally posted by haagch View Post
                    Code:
                    v1: N/A
                    v2: N/A
                    v3: N/A
                    v4: Initial commit
                    v5: Move disabling of reverse PRIME on sink to sink commit
                    v6: Rebase onto ToT
                    v7: Unchanged
                    oO

                    Anyway, I suppose the situation is unchanged and this does NOT apply to intel + radeon with dri3, right?

                    edit: Relevant issue: https://bugs.freedesktop.org/show_bug.cgi?id=95472
                    Correct; it's only for use with the NVIDIA binaries, from what I've been led to believe by the creator of the patches.
