The Radeon RX Vega Performance With AMDGPU DRM-Next 4.21 vs. NVIDIA Linux Gaming
-
Originally posted by tildearrow View Post
If that is the case, then how does NVIDIA perform better than AMD in some compute benchmarks?
Last edited by shmerl; 02 December 2018, 05:53 PM.
- Likes 4
Comment
-
Originally posted by debianxfce View Post
All amdgpu users should use a rolling release system so that they do not make bug reports with old kernels and Mesa. When you use Debian Sid Xfce, Oibaf ppa and latest kernels, you have the best system and do not care about point release software.
We all know you were suggesting Debian Testing a month or two ago, but now Testing is going stale so you're back to Unstable until the next Debian point release and then you'll switch back to Testing...been there, done that, why I use Antergos.
Antergos contains all the packages you suggest, they're all in centralized repositories, and there's easy access to AMDVLK and other AUR goodies.
- Likes 2
Comment
-
Originally posted by skeevy420 View Post
We all know you were suggesting Debian Testing a month or two ago, but now Testing is going stale so you're back to Unstable until the next Debian point release and then you'll switch back to Testing...been there, done that, why I use Antergos.
- Likes 1
Comment
-
Originally posted by ryao View Post
That is a Windows thing from 15 years ago. Nvidia's first Direct3D 9 graphics card was the GeForce FX 5800 Ultra. They made it by bolting Direct3D 9 onto their GeForce 4 series in a way that was not performant, presumably under the assumption that it would never matter. When this made their hardware perform terribly, well before the GeForce 6 series was ready, they resorted to cheating to improve the performance of Direct3D games. As far as I know, the practice died out due to backlash. I have never heard of them doing this in their Linux drivers, or in OpenGL games for that matter.
For what it is worth, ATI has been caught cheating on numerous occasions:
https://forums.anandtech.com/threads...eating.638328/
https://forums.anandtech.com/threads...-2003.1087045/
http://www.tomshardware.com/forum/75...2001se-figures
http://www.tomshardware.com/forum/80...aught-cheating
http://www.tomshardware.com/forum/298524-33-cheating-benchmarks-degrading-game-quality-nvidia
Anyway, all of this nonsense is Windows specific. It was never a factor on Linux as far as I know. If anything, the lack of cheating on Linux is yet another reason why performance has always been lower on Wine than on Windows.
And yet WINE is also a cheat: it is not Windows.
Linux is also cheating: it is not UNIX, just Unix-like.
GNU's Not Unix, you know.
No, I wouldn't continue, as birdie does this best.
I don't think ATi cheated; they just make app profiles to improve this or that, or to work around this or that, or whatever. They do the same still: games carry profiles, and sometimes some profiles are just broken at some point in time.
And then someone spots this and makes a beautiful clickbait story, but nothing really unusual happened.
Last edited by dungeon; 02 December 2018, 06:53 PM.
- Likes 1
Comment
-
Originally posted by shmerl View Post
I've seen the opposite. AMD is better in compute benchmarks, because they support asynchronous compute in the hardware, and Nvidia do not. If anything, AMD cards are more compute cards than gaming ones, while Nvidia are more gaming cards than compute ones.
Some developers attempted to implement compute methods in their code before being overridden by the publisher. And let's not forget how some benchmarks were caught intentionally cheating: deliberately ignoring hardware capable of certain features, using specific code paths that harmed it.
Overall, the ball is in the PC gaming industry's court to change its approach, otherwise its market share will keep shrinking.
Comment
-
Great to see Vega 64 Vulkan performance improving on Linux. Well done, AMD. Once it gets to the point where it's beating the GTX 1080 in Vulkan, we will know it's pretty much where it should be.
Originally posted by tuxd3v View Post
AMD has a lot of things to do before bringing more hardware to the table without full feature support. They need to stabilize the drivers first, bring speed and features on par with Windows, and preferably (IMO) share code with the Windows driver, so it will be easier for them to manage the projects. After that, new hardware is a good option.
Also, new hardware and software development happens continuously in parallel, with various teams of different expertise working on both. You don't put one on hold for the other. Anyway, the primary target when launching new hardware is Windows users, where AMD already has fantastic drivers; it doesn't make sense for them to hold up anything because feature X is not yet working on Linux.
Last edited by humbug; 02 December 2018, 10:16 PM.
- Likes 2
Comment
-
Originally posted by pal666 View Post
they use same blob on all systems. so cheating on windows is enough of evidence
btw, you sound like butthurt nvidiot
Seriously, can you point to an actual instance where anything was shown to happen on Linux in the last 10 years? "It might have happened" is pretty BS if you have no evidence. It also might not have happened.
Also, if they've never been caught because no one can see any difference, that's called optimization, not cheating. It's only cheating if there's a difference in the output.
Last edited by smitty3268; 02 December 2018, 11:15 PM.
- Likes 2
Comment