
RandR 1.5 Works: GPU Offloading, USB Hotplugging


  • RandR 1.5 Works: GPU Offloading, USB Hotplugging

    Phoronix: RandR 1.5 Works: GPU Offloading, USB Hotplugging

    RandR 1.5 paired with the latest X.Org Server advancements allows for GPU offloading and USB GPU hot-plugging under Linux. Here are some demos of this technology finally working...

    http://www.phoronix.com/vr.php?view=MTEyMjc

  • #2
    I thought that Optimus was GPU switching rather than offloading... or are these two terms being used interchangeably?
    It was my understanding that switching would allow one GPU or the other to be actually turned off (thus using no power while doing nothing), whereas offloading would be more like parallel computing, where different tasks could be executed on different GPUs.

    • #3
      Go hotplugging!

      This is very necessary, since there is a lot of common functionality between hotplugging, using multiple GPUs, and switching between them on systems with an IGP and a discrete card!

      Shouldn't there be APIs, implemented in a general way, for doing this kind of thing for any kind of device?

      • #4
        Originally posted by droidhacker View Post
        I thought that Optimus was GPU switching rather than offloading... or are these two terms being used interchangeably?
        It was my understanding that switching would allow one GPU or the other to be actually turned off (thus using no power while doing nothing), whereas offloading would be more like parallel computing, where different tasks could be executed on different GPUs.
        Optimus is just offloading. Switching is what was done before, when there was some mechanism to choose which GPU got the display.
        With Optimus, you normally use the weak integrated GPU that is connected to the display (usually Intel), and, as needed, the discrete GPU renders games to a buffer, which is then displayed by the weak GPU (which is still doing work: displaying the pre-rendered frames).
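
A sketch of driving this offload path by hand with the RandR provider commands. The provider indices (1 as the discrete GPU, 0 as the IGP that owns the panel) and the presence of glxinfo are assumptions about a typical Intel + NVIDIA laptop, not part of the article:

```shell
#!/bin/sh
# Requires a running X server and an xrandr client with RandR 1.4+
# provider support; falls through cleanly when neither is available.
if [ -n "$DISPLAY" ] && command -v xrandr >/dev/null 2>&1 \
   && command -v glxinfo >/dev/null 2>&1; then
    # Show the render/display providers the X server knows about.
    xrandr --listproviders

    # Make provider 1 (assumed: discrete GPU) render on behalf of
    # provider 0 (assumed: the IGP connected to the panel).
    xrandr --setprovideroffloadsink 1 0

    # Per-application offload: DRI_PRIME=1 asks Mesa to render this
    # client on the offload source GPU instead of the default one.
    DRI_PRIME=1 glxinfo | grep "OpenGL renderer"
    STATUS=ran
else
    echo "no X display available; commands shown for reference"
    STATUS=skipped
fi
```

`xrandr --setprovideroutputsource` is the complementary call for the USB hotplug case, where the plugged-in GPU owns the output and the main GPU does the rendering.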

        • #5
          Originally posted by droidhacker View Post
          I thought that Optimus was GPU switching rather than offloading... or are these two terms being used interchangeably?
          It was my understanding that switching would allow one GPU or the other to be actually turned off (thus using no power while doing nothing), whereas offloading would be more like parallel computing, where different tasks could be executed on different GPUs.
          There are two types of Optimus: one where you actually switch between the cards, and one where the dedicated card just renders the content and the integrated card is used to actually display it. The latter is the "evil" one, since until now it only worked with hacks like starting a second X server in the background for the dedicated card to render to, and then somehow displaying the rendered content on the first X server. I'm glad there is now a proper (as proper as such hardware hackery can be) solution for this.

          From my layman's point of view, with both methods you should be able to electrically turn the dedicated card off while it is not rendering or displaying anything.

          • #6
            Originally posted by ChrisXY View Post
            There are two types of Optimus: one where you actually switch between the cards, and one where the dedicated card just renders the content and the integrated card is used to actually display it. The latter is the "evil" one, since until now it only worked with hacks like starting a second X server in the background for the dedicated card to render to, and then somehow displaying the rendered content on the first X server. I'm glad there is now a proper (as proper as such hardware hackery can be) solution for this.

            From my layman's point of view, with both methods you should be able to electrically turn the dedicated card off while it is not rendering or displaying anything.
            Right, but only with switching can you power off the IGP.
            Of course, what is much more interesting than GPU switching or offloading is GPGPU on the dedicated unit with graphics on the integrated one.

            • #7
              Does XRandR support multiple video cards these days? A quick search turned up only complaints about v1.2 and claims that 1.3 was supposed to deliver but didn't.

              • #8
                I didn't even realize 1.4 was out already.

                • #9
                  Nice. I'll probably never use any of these, but nice.

                  • #10
                    Originally posted by droidhacker View Post
                    Right, but only with switching can you power off the IGP.
                    Of course, what is much more interesting than GPU switching or offloading is GPGPU on the dedicated unit with graphics on the integrated one.
                    You can't really power the IGP off in a lot of cases, at least not to the same extent as you can with the discrete GPU, since the IGP is part of the chipset or CPU.

                    But Optimus doesn't allow for ever powering off the IGP; on most systems the IGP is connected to the laptop panel and the NVIDIA GPU isn't. On Apple hardware they can reduce IGP power all right, but I have no idea what they do when they turn it off.

                    You can already do GPGPU on the discrete GPU and graphics on the IGP: just load the NVIDIA binary driver on the discrete card and use CUDA or its OpenCL implementation.

                    Dave.
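
A quick way to see Dave's point that the compute stack enumerates the discrete card independently of any display setup. The clinfo and nvidia-smi tools are assumptions here (neither ships by default):

```shell
#!/bin/sh
# List compute devices without touching RandR at all: OpenCL (and CUDA)
# enumerate GPUs directly, so the discrete card is usable for GPGPU
# even while the IGP drives the screen.
if command -v clinfo >/dev/null 2>&1; then
    clinfo | grep -E "Platform Name|Device Name"
    STATUS=ran
elif command -v nvidia-smi >/dev/null 2>&1; then
    nvidia-smi -L
    STATUS=ran
else
    echo "no OpenCL/CUDA tools installed; listing shown for reference"
    STATUS=skipped
fi
```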

                    • #11
                      I haven't seen many plans for PRIME power management outside of maybe upstreaming bbswitch into the kernel.

                      Is this something that won't be tackled until after on-the-fly switching?
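
For reference, bbswitch (mentioned above) exposes the discrete card's power state through a simple procfs file; the PCI address in the comment is an example, not a fixed value:

```shell
#!/bin/sh
# Read and toggle the discrete GPU's power via the bbswitch module.
BB=/proc/acpi/bbswitch
if [ -f "$BB" ]; then
    cat "$BB"                    # e.g. "0000:01:00.0 ON"
    echo OFF | sudo tee "$BB"    # cut power to the discrete GPU
    cat "$BB"
    STATUS=ran
else
    echo "bbswitch module not loaded; interface shown for reference"
    STATUS=skipped
fi
```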

                      • #12
                        Originally posted by LLStarks View Post
                        I haven't seen many plans for PRIME power management outside of maybe upstreaming bbswitch into the kernel.

                        Is this something that won't be tackled until after on-the-fly switching?
                        The plan is just to have nouveau turn the GPU off when certain conditions are met, after a small timeout.

                        No CRTCs active and no recent activity on channels from userspace? Turn it off. Any activity from userspace? Block and turn it back on.

                        Dave.
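
The conditions described here map onto the kernel's generic runtime-PM machinery, whose state is visible in sysfs once a driver wires it up. Treating card1 as the discrete GPU is an assumption:

```shell
#!/bin/sh
# Inspect runtime power management for a DRM device. Writing "auto"
# to power/control lets the driver suspend the device when idle;
# power/runtime_status then reads "suspended" once the GPU powers down.
DEV=/sys/class/drm/card1/device
if [ -d "$DEV/power" ]; then
    cat "$DEV/power/control" "$DEV/power/runtime_status"
    STATUS=ran
else
    echo "no card1 runtime-PM node; paths shown for reference"
    STATUS=skipped
fi
```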

                        • #13
                          Originally posted by airlied View Post
                          You can't really power the IGP off in a lot of cases, at least not to the same extent as you can with the discrete GPU, since the IGP is part of the chipset or CPU.

                          But Optimus doesn't allow for ever powering off the IGP; on most systems the IGP is connected to the laptop panel and the NVIDIA GPU isn't. On Apple hardware they can reduce IGP power all right, but I have no idea what they do when they turn it off.

                          You can already do GPGPU on the discrete GPU and graphics on the IGP: just load the NVIDIA binary driver on the discrete card and use CUDA or its OpenCL implementation.

                          Dave.
                          I only mentioned Optimus because the article talks about it. I'm on Linus's side when it comes to NVIDIA. It is my understanding that the AMD A6+ APUs actually can switch off the IGP when not in use, although I admit that the reference materials I've looked at may be incorrect for the sake of simplicity.

                          • #14
                            Nice job, Dave.

                             It's nice to see the fruits of your labor starting to pay off. Like someone else pointed out, I probably won't be using this anytime soon, but it is something that is wanted and needed by others.

                             Great stuff.

                            • #15
                              NVIDIA driver?

                              First of all, I'd like to give a heartfelt thanks and congratulations to everyone involved in this. It is a big achievement, and it is extremely important to me and many other users of notebooks with Optimus graphics.

                              I'd like to know whether this solution will work with the NVIDIA driver. I know there was a big licensing issue, and I think this is what it was about. From what I could tell, the eventual decision was to allow the NVIDIA driver to use PRIME. Will a simple update to the NVIDIA driver be enough, or is this something that won't be working for a considerable while?

                              What would I need to install to get this running with nouveau in the meantime?

                              Best regards.
