RADV Vulkan Driver's PRIME Code Rewritten


  • #11
    Originally posted by funfunctor View Post
    iame6162013 Maybe you should read the Vk spec before spewing stuff verbatim, then you'd understand what I mean.

    bug77 seems very unwise to me.
    Well, it's pretty obvious: not every GPU has display hardware, so when one GPU does the rendering, no matter which one it is, the result always needs to be sent to the GPU that has the display connected to it. I don't think display hardware configurations are something the Vk spec should ever try to cover.
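
    To make that concrete, here is a minimal sketch of the core idea, assuming Vulkan in C. This is not RADV's actual code and the names are mine: the render GPU copies its tiled render target into a linear image whose memory can be shared (e.g. exported as a dma-buf) so the display GPU can scan it out.

    /* Hedged sketch of a PRIME-style copy: blit the tiled render target
     * into a linear, shareable image. Image creation, layout transitions
     * and synchronization are omitted for brevity. */
    #include <vulkan/vulkan.h>

    void record_prime_copy(VkCommandBuffer cmd,
                           VkImage rendered,      /* optimal (tiled) image */
                           VkImage linear_shared, /* VK_IMAGE_TILING_LINEAR */
                           uint32_t width, uint32_t height)
    {
        const VkImageCopy region = {
            .srcSubresource = { VK_IMAGE_ASPECT_COLOR_BIT, 0, 0, 1 },
            .dstSubresource = { VK_IMAGE_ASPECT_COLOR_BIT, 0, 0, 1 },
            .extent = { width, height, 1 },
        };
        vkCmdCopyImage(cmd,
                       rendered, VK_IMAGE_LAYOUT_TRANSFER_SRC_OPTIMAL,
                       linear_shared, VK_IMAGE_LAYOUT_TRANSFER_DST_OPTIMAL,
                       1, &region);
    }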



    • #12
      Ok, ty. Googling fail, sorry.



      • #13
        And 200 lines is a very small amount as far as C code goes. If the PRIME support only requires that, it sounds like very good engineering.



        • #14
          Originally posted by funfunctor View Post
          I don't understand why you would ever want this? The application is meant to choose the best GPU to use, as per the Vk spec.
          This is not about deciding which GPU does the rendering; it's about sending the rendered content to the GPU that has display control.
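
          (For what it's worth, "choosing the best GPU per the Vk spec" just means enumerating physical devices and picking one; none of that tells the application which GPU drives the display. A minimal sketch in C, with error handling trimmed and the helper name mine:)

          #include <vulkan/vulkan.h>

          /* Sketch: prefer a discrete GPU among the enumerated physical
           * devices; PRIME handles getting its output to the display GPU. */
          VkPhysicalDevice pick_discrete_gpu(VkInstance instance)
          {
              uint32_t count = 0;
              vkEnumeratePhysicalDevices(instance, &count, NULL);
              if (count == 0)
                  return VK_NULL_HANDLE;
              if (count > 16)
                  count = 16; /* cap to fit the stack array below */

              VkPhysicalDevice devs[16];
              vkEnumeratePhysicalDevices(instance, &count, devs);

              for (uint32_t i = 0; i < count; i++) {
                  VkPhysicalDeviceProperties props;
                  vkGetPhysicalDeviceProperties(devs[i], &props);
                  if (props.deviceType == VK_PHYSICAL_DEVICE_TYPE_DISCRETE_GPU)
                      return devs[i];
              }
              return devs[0]; /* fall back, often the iGPU on a PRIME laptop */
          }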



          • #15
            Originally posted by cj.wijtmans View Post

            More or less how it should be. I still want to use my IGP for my desktop and activate my gpu only as needed for my games.
            Even that is debatable. Considering how little power an idle dGPU sips, this whole switching thing looks like an obsolete contraption to me.



            • #16
              Originally posted by bug77 View Post

              Even that is debatable. Considering how little power an idle dGPU sips, this whole switching thing looks like an obsolete contraption to me.
              I have some recent experience with that, and it shows your statement is false.

              I have a Dell m3800 for work, which has an Nvidia dGPU and a Haswell iGPU. With recent Linux kernels, Wayland, gnome-shell, and force-enabling PSR, FBC, and RC6, it gets down to 8 watts while viewing a web page.
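
              (For reference, on kernels of that era force-enabling those features meant i915 module parameters, e.g. on the kernel command line; parameter names from memory, so double-check against your kernel's documentation:)

              i915.enable_psr=1 i915.enable_fbc=1 i915.enable_rc6=1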

              I also have a Razer Pro (2016) with an Nvidia 1080 as its only GPU. It idles at considerably more than 8 watts. I don't have exact numbers because it's running Windows and I don't know a good power analysis tool for it. But it has a 99 Wh battery and only gets 4 hours of light web browsing, while the Dell, which is older, with an aged battery now rated at 66 Wh (originally 90 Wh), can do 5 hours.

              Both laptops have 4K displays and SSD storage, which should eliminate those variables.



              • #17
                Originally posted by Zan Lynx View Post

                I have some recent experience with that, and it shows your statement is false.

                I have a Dell m3800 for work, which has an Nvidia dGPU and a Haswell iGPU. With recent Linux kernels, Wayland, gnome-shell, and force-enabling PSR, FBC, and RC6, it gets down to 8 watts while viewing a web page.

                I also have a Razer Pro (2016) with an Nvidia 1080 as its only GPU. It idles at considerably more than 8 watts. I don't have exact numbers because it's running Windows and I don't know a good power analysis tool for it. But it has a 99 Wh battery and only gets 4 hours of light web browsing, while the Dell, which is older, with an aged battery now rated at 66 Wh (originally 90 Wh), can do 5 hours.

                Both laptops have 4K displays and SSD storage, which should eliminate those variables.
                "Considerably more than 8 watts" is suspicious.

                According to this, a desktop GTX 1080 idles at 8W: https://www.techpowerup.com/reviews/...dition/28.html
                Though from that graph we can also see what a factory overclock will do to idle power usage. A laptop idling as high as yours doesn't sound right.

                So I'm going to repeat myself: when the top desktop GPU idles at 8W, do we really need all the headaches from GPU switching?



                • #18
                  Originally posted by bug77 View Post

                  "Considerably more than 8 watts" is suspicious.

                  According to this, a desktop GTX 1080 idles at 8W: https://www.techpowerup.com/reviews/...dition/28.html
                  Though from that graph we can also see what a factory overclock will do to idle power usage. A laptop idling as high as yours doesn't sound right.

                  So I'm going to repeat myself: when the top desktop GPU idles at 8W, do we really need all the headaches from GPU switching?
                  That is 8W just for the GPU. My Dell laptop uses 8W total on the Intel GPU, including the screen, backlight, CPU, Wi-Fi, Bluetooth, and RAM. If we assume 6W of that 8W is everything else, then the Razer has to be using 6 + 8 = 14W at minimum. And considering the Razer's 4-hour runtime (could be more, but I plug it in at 10-15%), it's drawing at least 20W overall: 99 Wh at ~85-90% used over 4 hours works out to roughly 21-22W.

                  8W to 20W is "considerably more", yeah.



                  • #19
                    Originally posted by bug77 View Post
                    So I'm going to repeat myself: when the top desktop GPU idles at 8W, do we really need all the headaches from GPU switching?
                    Considering that Intel desktop processors + GPU usually idle at around 2 watts, and with a good desktop board you can stay within 5 watts of idle power, yeah, why the fuck not.

                    Also please note that "top desktop GPU" is also the one using the best manufacturing process and most modern design. Whatever bullshit they repackage as laptop GPUs in midrange and low-midrange is not as efficient.

                    I agree, though, that dual-GPU designs will become less and less relevant as time goes on. But not now.



                    • #20
                      Originally posted by starshipeleven View Post
                      Considering that Intel desktop processors + GPU usually idle at around 2 watts, and with a good desktop board you can stay within 5 watts of idle power, yeah, why the fuck not.

                      Also please note that "top desktop GPU" is also the one using the best manufacturing process and most modern design. Whatever bullshit they repackage as laptop GPUs in midrange and low-midrange is not as efficient.

                      I agree, though, that dual-GPU designs will become less and less relevant as time goes on. But not now.
                      WTH are you talking about? Pascal is 16nm across the board. Here's how it works: https://www.techpowerup.com/reviews/...rix_OC/27.html
                      You get 4W for a GTX1050 or 5W for a GTX1060. Overclocking still wreaks havoc. The selected power profile probably makes a lot of difference, too. From what I have seen (a casual look at Nvidia's CP, nothing scientific), the default "balanced" profile tends to keep the GPU at a few hundred MHz, but will happily boost video memory to the max as soon as you start something as trivial as scrolling.

                      Also see this: https://www.extremetech.com/gaming/2...e-form-factors
                      If you're looking for "bullshit" repackaged as mid- and low-end cards, you know you have to look at the other camp.

