AMD Radeon RX 5600 XT Linux Gaming Performance
-
Originally posted by smitty3268 View Post
Their cards right now are very competitive with NVidia's current lineup, in price, performance, and power usage. Not winning, but competitive. Certainly not 2 years behind.
They are two years behind. Right now.
Comment
-
Originally posted by bridgman View Post
What about something like the RX 550? IIRC it's a bit faster than the GT 1030 and in the same price range.
- Likes 2
Comment
-
Originally posted by abott View Post
What card are they competing with? The 2070? Sure, but that card is two years old.
They are two years behind. Right now.
That'd be like saying the Ryzen 2 CPUs suck because they're competing with a 5-year-old CPU architecture from Intel. If that's all your competition has, that's all they have.
Last edited by smitty3268; 22 January 2020, 03:15 AM.
- Likes 8
Comment
-
Originally posted by betam4x View Post
It works fine resuming from sleep?
The Quadro K3000M is handled by the NVIDIA legacy driver, so that driver itself might be the issue, as NVIDIA has included some kind of support service in the newer drivers.
Last edited by blacknova; 22 January 2020, 03:28 AM.
Comment
-
With the 5700 coming in at $300, it makes me wonder why the Vega 64 is mostly still around $350. A price drop seems to be expected here as well.
I'll stick with the Vega 64 until Big Navi gets released, because the Linux driver is just perfect for it at the moment.
Little side note: in Shadow of the Tomb Raider I get about 10 FPS more on a watercooled Vega 64 with the latest amdgpu-pro 18.50 (just the Vulkan ICD extracted from the package, the rest on Mesa 20-devel).
Last edited by ntropy; 22 January 2020, 03:46 AM.
Comment
-
Total War and Strange Brigade drop to unacceptable minimum FPS levels (sub-25) on almost every card.
I wonder whether those are game design errors, where poorly optimized scenes happen to be hit during the test, or whether those minimums are frequent stuttering events that a player would actually experience.
If I were developing a game, I'd set up tooling to flag whenever the game dips below AVERAGE_FPS/2. I'd consider it a bug, either in the code or in the scene/assets.
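The dip-detection tooling described above could be sketched roughly like this: record per-frame times, compute the run's average FPS, and flag any frame whose instantaneous FPS falls below half of that average. This is only an illustrative sketch, not any engine's real profiler; the function name and the sample frame times are made up for the example.

```python
def find_fps_dips(frame_times_ms):
    """Return (average_fps, dips) where dips is a list of
    (frame_index, instantaneous_fps) for every frame whose
    FPS falls below AVERAGE_FPS / 2.

    frame_times_ms: per-frame render times in milliseconds.
    """
    if not frame_times_ms:
        return 0.0, []

    avg_frame_ms = sum(frame_times_ms) / len(frame_times_ms)
    avg_fps = 1000.0 / avg_frame_ms
    threshold = avg_fps / 2.0

    dips = []
    for i, ms in enumerate(frame_times_ms):
        fps = 1000.0 / ms
        if fps < threshold:
            dips.append((i, fps))
    return avg_fps, dips

# Example: 59 frames at ~16.7 ms (~60 FPS) plus one 50 ms stutter (20 FPS).
# The stutter frame falls below half the ~58 FPS average, so it is flagged.
times = [16.7] * 59 + [50.0]
avg, dips = find_fps_dips(times)
```

In a real engine this check would run on a rolling window rather than the whole run, so a bad scene shows up while it is on screen instead of only in a post-run report.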
- Likes 2
Comment
-
Originally posted by birdie View Post
...As for playing shenanigans: almost no commercial company is immune to that. I vividly remember how AMD released $1000 AMD64 CPUs when Intel was stuck with their Pentium 4 architecture. AMD fans forget that so easily.
btw Nvidia... 3.5GB = 4GB
- Likes 1
Comment
-
Originally posted by birdie View Post
I see you can't live without personal insults. Nice! That's endemic to fanboys from every camp, be it open source, Mac, Windows, Android, or iOS. It shows really nothing except that you're a hateful, mentally challenged individual who's incapable of having a quality conversation.
A random, mostly empty article pans NVIDIA. Wow. A revelation. And it contains mostly outdated info, but since NVIDIA users simply don't care and AMD fans love to see the criticism, we have what we have here. Meanwhile, the games using NVIDIA GameWorks feature hair, fur, and grass that most other games still don't have in any capacity or quality.
..
What a poor discussion we have here but I'm not surprised.
Then you might also provide higher-quality posts.
TressFX is under the MIT license; HairWorks, as part of GameWorks, is closed. Just open-sourcing some code samples on Git doesn't make it an open implementation.
- Likes 6
Comment