Originally posted by aufkrawall
Whether you like it or not, birdie is largely correct here. Windows has, more or less, solved GPU acceleration for everything you would sanely want accelerated. It's not perfect, of course, but considering the backwards compatibility Windows also has to maintain, it's miles ahead of Linux, whose only GPU acceleration success story seems to be terminal rendering (which is hilarious in a tragic way).
Actually, just yesterday I was in a meeting with a colleague who runs Fedora, and he had to switch from Wayland to X11/GNOME to get OBS to work (under Wayland his recordings had a delay between sound and video). The switch to X11/GNOME fixed that problem, but now when he uses the blur filter in Google Meet with Firefox (which is meant to be GPU accelerated), the video freezes for ~5-6 seconds at a time because it's actually running on the CPU rather than the GPU.
And before anyone blames NVIDIA: this is an Intel laptop using the discrete GPU on Fedora...
So yeah, sorry for the reality check, but the GPU acceleration story on Linux is a complete and utter shambles. It was personally my last straw: it convinced me to get a MacBook Pro for work, because I was sick and tired of stuttering in video conferencing caused by everything running on the CPU (and since I'm not so young anymore, I don't really have the time to fuck around with all of the browser flags/settings/GPU decoder APIs to get it to work).
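For anyone who does still want to try those flags: on Linux the usual knobs for hardware video decode in Firefox are the VA-API prefs in about:config (or a `user.js`), plus checking the driver side with `vainfo`. This is only a sketch of the commonly cited prefs; exact names and defaults vary by Firefox version and distro, so treat it as a starting point, not a guaranteed fix:

```
// user.js — hedged example: prefs often used to enable VA-API decode in Firefox on Linux
// (pref availability/behavior depends on the Firefox version and GPU driver)
user_pref("media.ffmpeg.vaapi.enabled", true);                  // use VA-API via FFmpeg for video decode
user_pref("media.hardware-video-decoding.force-enabled", true); // force HW decode even on non-allowlisted setups
```

Before blaming the browser, verify the driver actually exposes the codecs: running `vainfo` (from libva-utils) should list entrypoints like `VAEntrypointVLD` for H.264/VP9/AV1; if it errors out, no amount of browser flags will help.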