RADV Vulkan Driver's PRIME Code Rewritten
Originally posted by funfunctor View Post
I don't understand why you would ever want this? The application is meant to choose the best GPU to use, as per the Vk spec.
Originally posted by cj.wijtmans View Post
More or less how it should be. I still want to use my IGP for my desktop and activate my gpu only as needed for my games.
Originally posted by bug77 View Post
Even that is debatable. Considering how little power an idle dGPU sips, this whole switching thing looks like an obsolete contraption to me.
Originally posted by Zan Lynx View Post
I have some recent experience with that which shows your statement is false.
I have a Dell m3800 for work, which has an Nvidia dGPU and a Haswell iGPU. With recent Linux kernels, Wayland, gnome-shell, and force enabling PSR, FBC and RC6 it gets down to 8 watts while viewing a web page.
I also have a Razer Pro (2016) with an Nvidia 1080 as its only GPU. It idles at considerably more than 8 watts. I don't have exact numbers because it's running Windows and I don't know a good power analysis tool. But it has a 99 Wh battery and only gets 4 hours doing light web browsing, while the Dell, which is older with an aged battery now rated at 66 Wh (originally 90 Wh), can do 5 hours.
Both laptops have 4K displays and SSD storage which should eliminate those variables.
Originally posted by bug77 View Post
"Considerably more than 8 watts" is suspicious.
According to this, a desktop GTX 1080 idles at 8W: https://www.techpowerup.com/reviews/...dition/28.html
Though from that graph we can also see what a factory overclock will do to idle power usage. A laptop idling as high as yours doesn't sound right.
So I'm going to repeat myself: when the top desktop GPU idles at 8W, do we really need all the headaches from GPU switching?
8W to 20W is "considerably more", yeah.
Originally posted by bug77 View Post
So I'm going to repeat myself: when the top desktop GPU idles at 8W, do we really need all the headaches from GPU switching?
Also please note that "top desktop GPU" is also the one using the best manufacturing process and most modern design. Whatever bullshit they repackage as laptop GPUs in midrange and low-midrange is not as efficient.
Although I agree that dual-GPU designs will be less and less relevant as time goes on. But not now.
Originally posted by starshipeleven View Post
Considering that Intel desktop processors + GPU usually idle at around 2 watts, and with a good desktop board you can stay within 5 watts of idle power, yeah, why the fuck not.
You get 4W for a GTX 1050 or 5W for a GTX 1060. Overclocking still wreaks havoc. Probably the selected power profile makes a lot of difference, too. From what I have seen (a casual look at Nvidia's control panel, nothing scientific), the default "balanced" profile tends to keep the GPU at a few hundred MHz, but will happily boost video memory to the max as soon as you start something as trivial as scrolling.
Also see this: https://www.extremetech.com/gaming/2...e-form-factors
If you're looking for "bullshit" repackaged as mid and low-end cards, you know you have to look at the other camp.