I currently use an NVIDIA card because of CUDA and TensorFlow.
As soon as AMD's ROCm software ecosystem and the major ML frameworks support it, I'll switch to an AMD Vega card.
NVIDIA's attitude towards Linux, forcing the community to support them instead of working with the community, has led me to say f**k you to NVIDIA. I can accept that the EGLStreams API suits them better, and I believed in their efforts to work on a new API that would replace both EGLStreams and GBM. But I see no progress, and it seems that NVIDIA lost its sense of urgency once Red Hat added support for their proprietary API in Mutter, which should only have been a stopgap solution.

They definitely lost me when they announced that they had no plans to support Xwayland. Wine, as well as most games and proprietary apps, does not support Wayland and runs only under Xwayland. I doubt that many ever will, so no acceleration for those, ever. Using NVIDIA under Wayland is thus not an option.
They can get away with their binary driver, but not supporting Wayland/Xwayland is a total f**k you, NVIDIA.