Originally posted by JPFSanders:
Your thoughts are a carbon copy of mine.
Once AMD released their open driver, there was no way back to the proprietary blob.
What NVidia did back in the day was commendable and technically great (I was an NVidia user for years), but at the end of the day their solution was a closed driver bolted on top of the kernel to work around Linux's lack of proper graphics infrastructure and video device management. As Linux evolved and the kernel's graphics infrastructure improved, that bolt-on technology became both a kludge and a serious handicap to wider Linux adoption.
The people who like NVidia liked it because it worked well (and, bugs, kludge, and Linux handicap aside, it still works well for most user applications), but the majority of people don't understand how much of a drag the current NVidia software ecosystem built on the closed driver is for Linux. NVidia is, in my opinion, along with Gnome (another story for another day), responsible for delaying the adoption of Linux desktop technologies by at least 5 to 10 years. Yes, it is that bad, but oh well, muh 50 FPS using RTX while wasting 800 watts will improve gameplay so much.
The sad part is that there is no reason for this. NVidia could have made their kernel driver completely open source and maintained their proprietary user-space stack; they could have had their market dominance without choking Linux development. Their proprietary compute infrastructure is great, for example, but there was no reason whatsoever they couldn't pair an open-source kernel driver with proprietary user-space stacks. They could have made everybody happy and had fewer headaches themselves with an upstream kernel driver.
The good news is that they seem to have begun to correct this, and in a couple of years, once their new open-source driver matures, I will be first in line to buy an NVidia card and test it.
Originally posted by WannaBeOCer:
Ray tracing isn't a gimmick; Nvidia's 3D Vision and AMD's HD3D were gimmicks. Ray tracing has been in development for the last two decades. I'd argue that ray tracing has been the second major graphical step toward photorealism in games since tessellation was introduced.
Last I checked, ray tracing isn't locked to Nvidia's hardware. Games use either Microsoft's DXR or Vulkan's VK_KHR_ray_tracing; Nvidia's hardware currently just does it better.
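For what it's worth, checking for that cross-vendor support at the API level is simple. Below is a minimal C sketch (an illustration, not production code) that asks a Vulkan device whether it advertises the VK_KHR_ray_tracing_pipeline and VK_KHR_acceleration_structure extensions; it assumes you have already created an instance and picked a VkPhysicalDevice, and it elides error handling.

[CODE]
/* Sketch: returns nonzero if the device advertises the cross-vendor
 * ray tracing extensions. Caller supplies an already-selected
 * VkPhysicalDevice; error handling is intentionally minimal. */
#include <stdlib.h>
#include <string.h>
#include <vulkan/vulkan.h>

int device_supports_ray_tracing(VkPhysicalDevice phys)
{
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(phys, NULL, &count, NULL);

    VkExtensionProperties *exts = malloc(count * sizeof(*exts));
    vkEnumerateDeviceExtensionProperties(phys, NULL, &count, exts);

    int has_pipeline = 0, has_accel = 0;
    for (uint32_t i = 0; i < count; i++) {
        if (!strcmp(exts[i].extensionName,
                    VK_KHR_RAY_TRACING_PIPELINE_EXTENSION_NAME))
            has_pipeline = 1;
        if (!strcmp(exts[i].extensionName,
                    VK_KHR_ACCELERATION_STRUCTURE_EXTENSION_NAME))
            has_accel = 1;
    }
    free(exts);
    return has_pipeline && has_accel;
}
[/CODE]

Any vendor whose driver exposes both extensions can run the same ray tracing code path, which is the whole point of the KHR extensions over a vendor-specific API.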
Originally posted by finalzone:
Ray tracing is a much older technology, dating back to the 1950s. The reason it was not used on consumer devices until recently is mainly the power requirement, notably for the CPU and later the GPU. Rasterization was developed to address that shortcoming. Even the 4090 needs more power to render real-time ray tracing, and needs upscaling techniques to compensate. The current method is a brute-force approach to beautiful lighting effects that are easily reproducible with rasterization. Full real-time ray tracing is still a long way off for the consumer market.
Nobody is saying ray tracing is locked to Nvidia. The issue is applying Nvidia's specific method (clever use of the driver) to AMD hardware. Real-time ray tracing in gaming will only gain adoption when it is available on mainstream hardware (currently both the PlayStation 5 and Xbox Series X/S) with an effective approach.
Nvidia's method was creating dedicated cores; AMD followed with RDNA2, introducing their "Ray Accelerators." Are you saying AMD's 2nd-generation Ray Accelerators match Nvidia's 3rd-generation RT cores but AMD's drivers are lacking?
I understand what's causing the lack of new technologies going mainstream. The other issue is that games are still being developed for the PS4/Xbox One as well.
Personally, I'm only purchasing DX12/Vulkan titles with ray tracing and HDR. If a title lacks the latest API, HDR, or ray tracing, I won't buy it. The next title I'll be buying is Forspoken.
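As an aside, finalzone's "brute force" description is easy to put in numbers. The hypothetical C sketch below counts only the primary rays: one per pixel, each tested against scene geometry (a single sphere here). At 4K that is already about 8.3 million intersection tests per frame, before any bounces, shadows, or denoising; the scene and names are illustrative, not taken from any real engine.

[CODE]
/* Illustrative only: the per-pixel cost of primary rays at 4K,
 * against a trivially simple scene (one sphere). */
#include <math.h>
#include <stdio.h>

typedef struct { double x, y, z; } vec3;

static double dot(vec3 a, vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

/* Returns 1 if a ray from orig along dir hits a sphere at c with radius r. */
static int hit_sphere(vec3 orig, vec3 dir, vec3 c, double r)
{
    vec3 oc = { orig.x - c.x, orig.y - c.y, orig.z - c.z };
    double a = dot(dir, dir);
    double b = 2.0 * dot(oc, dir);
    double k = dot(oc, oc) - r * r;
    return b * b - 4.0 * a * k >= 0.0;   /* discriminant test only */
}

int main(void)
{
    const int w = 3840, h = 2160;        /* one 4K frame */
    vec3 cam = {0, 0, 0}, sphere = {0, 0, -3};
    long hits = 0;

    for (int y = 0; y < h; y++)
        for (int x = 0; x < w; x++) {
            /* map the pixel to a direction on a simple image plane */
            vec3 dir = { 2.0*x/w - 1.0, 1.0 - 2.0*y/h, -1.0 };
            hits += hit_sphere(cam, dir, sphere, 1.0);
        }

    printf("%d x %d = %ld primary rays, %ld hit the sphere\n",
           w, h, (long)w * h, hits);
    return 0;
}
[/CODE]

Real engines test every ray against millions of triangles via acceleration structures, and at 60 FPS the primary rays alone multiply to roughly half a billion per second, which is why dedicated RT hardware and upscaling exist at all.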
Originally posted by Grinness:
rx 6800, dual monitor, YouTube playing in Firefox: 15-16 W ...
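For anyone wanting to reproduce numbers like these: on Linux, amdgpu reports average board power through the kernel hwmon interface, in microwatts. A minimal C sketch follows; the hwmon1 index is an assumption and varies per machine, so check which hwmonN directory belongs to amdgpu on your system.

[CODE]
#include <stdio.h>

int main(void)
{
    /* Assumed path: the hwmon index varies per machine; find the hwmonN
     * directory whose "name" file reads "amdgpu" and use that instead. */
    const char *path = "/sys/class/hwmon/hwmon1/power1_average";
    FILE *f = fopen(path, "r");
    if (!f) {
        perror(path);
        return 1;
    }

    long microwatts = 0;
    if (fscanf(f, "%ld", &microwatts) != 1) {
        fclose(f);
        return 1;
    }
    fclose(f);

    /* amdgpu exposes power1_average in microwatts */
    printf("GPU average power: %.1f W\n", microwatts / 1e6);
    return 0;
}
[/CODE]

Tools like sensors and nvtop read the same file, so this is just the raw source of the wattage being quoted.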
Originally posted by dimko:
I have this crazy idea: HOW ABOUT DEVS STOP SCRATCHING THEIR ARSES and start developing better games with better engines, so we don't have to upgrade every year with 2k video cards.
Also, Nvidia will happily create newer gimmicks that only work on their overpriced hardware. It's a race which you, as a customer, WILL NEVER WIN.
Edit: And by "never" I mean "since I started playing on PCs in the early 90s".