NVIDIA's Proprietary Driver Is Moving Closer With Kernel Mode-Setting
-
Last edited by carewolf; 24 May 2015, 07:11 AM.
-
Nope, nvidia is awful for 2D... I guess people tend to use intel for 2D and a dedicated nvidia or amd chip for 3D on laptops. The other way around: since you are not a gamer, a dedicated card is not really needed.
For radeon, I guess you tried glamor on a newer distro; glamor from xserver 1.16, or better yet 1.17, should be the best... but I can't say the same for 1.15, where the separate glamor module is awful in comparison.
Last edited by dungeon; 24 May 2015, 07:01 AM.
-
Originally posted by dungeon View Post
If you are not interested in gaming, well, you should go for intel or amd APUs... those have better 2D performance than the mentioned dedicated chips by nature, and the intel driver even has the most optimized 2D.
If you use the radeon driver you might try glamor acceleration on that HD 5870M; that might now be better than EXA, which is the default for those chips.
Believe it or not, glamor is still not as good as EXA for me. Overall performance is OK, but it really turns my fan on for the simplest window resize; it's impossibly annoying.
Recently I switched on HyperZ, which also turns the fan on too much, but DE performance is better indeed. Nothing but the standard EXA is usable, and that one is kinda slow.
So nvidia is not a better option?
Last edited by ciupenhauer; 24 May 2015, 07:00 AM.
-
Originally posted by ciupenhauer View Post
Does anyone here recommend I should switch to an nvidia card instead? With this latest news, it sounds more interesting. I'm not interested in gaming performance, but desktop performance, browser, etc.
If you use the radeon driver you might try glamor acceleration on that HD 5870M; that might now be better than EXA, which is the default for those chips.
Last edited by dungeon; 24 May 2015, 06:34 AM.
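For reference, switching the radeon DDX from EXA to glamor is typically done with an AccelMethod option in an X.Org config snippet; a minimal sketch (the Identifier string and file name below are illustrative, the option name itself is the documented one):

```
Section "Device"
    Identifier "Radeon"
    Driver     "radeon"
    Option     "AccelMethod" "glamor"    # default for these chips is "EXA"
EndSection
```

Dropped into e.g. /etc/X11/xorg.conf.d/20-radeon.conf, it takes effect on the next X restart; removing the Option line reverts to the driver default.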
-
Glad this conversation is open. I was just about to upgrade to an AMD M290X, hoping to get away from the horrible DE performance I have with my HD 5870M and pretty much any desktop environment. But now I'm worried that the radeonSI-based M290X is going to be just as problematic. Does anyone here recommend I should switch to an nvidia card instead? With this latest news, it sounds more interesting. I'm not interested in gaming performance, but desktop performance, browser, etc.
PS: yeah, Firefox sucks for me right now; I thought that was just Firefox, but someone mentioned it's the driver..
-
Originally posted by dungeon View Post
Nothing and everything. But the answer to why they shouldn't do it is simple - engines should be properly multithreaded, not the driver working around some (not all) of those that aren't
Oh wait, this happens all the time on Windows, on both sides: drivers are released at game launch with advertised optimisations for that game. And now that we have like 2 games that benefit from that one feature (BTW, Metro 2033/LL Redux have a performance LOSS with this optimisation, FYI), you say that it should not be done, because driver/game programmers are supposed to be gods, and once they release the first driver/game version, that is the one and only version, and they'll only amend new card IDs and nothing more.
Or maybe drivers should only behave correctly for ideal programs, because games/applications that are used in the real (not perfect) world should just STFU or "be properly multithreaded"?
And I do agree they should "be properly multithreaded", but I also understand this will not happen any time soon, hence any driver tricks that help me as a consumer are welcome.
And no, this is not a GameWorks vs AMD thing like we now have on Windows with Witcher 3/Project Cars.
Last edited by Licaon; 24 May 2015, 05:43 AM.
-
Originally posted by Licaon View Post
So what is keeping AMD from implementing their own variable that activates the same (threaded optimisations) for their driver?
With that non-default "optimization" variable, the nvidia driver moves some draw calls out of the main thread, which helps those not-properly-multithreaded games AFAIK... that is not an "optimization" but a hack which works for their driver, and it mostly helps only big chips too. On other drivers, moving exactly those *some* draw calls might behave entirely differently, and game porters just tested the nvidia variable etc... So it helps just the nvidia driver; there is no guarantee it will work, behave fine, or gain the same performance for any other driver in the same games.
Last edited by dungeon; 24 May 2015, 04:55 AM.
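The variable being discussed is presumably NVIDIA's __GL_THREADED_OPTIMIZATIONS environment variable; a minimal sketch of scoping it to a single process, where a trivial child shell stands in for the actual game binary:

```shell
# Enable NVIDIA's threaded GL optimizations for one process only; the child
# shell here is a stand-in for the game binary, so the variable does not
# leak into the rest of the session.
__GL_THREADED_OPTIMIZATIONS=1 sh -c 'echo "threaded opts: $__GL_THREADED_OPTIMIZATIONS"'
```

For a real launch the same prefix would go in front of the game executable (or a Steam launch option); unsetting it, or simply not setting it, returns the driver to its default single-threaded dispatch.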
-
Originally posted by birdie View Post
the hell froze over
Originally posted by dungeon View Post
So you can't say the fglrx driver is guilty in that case (nor expect AMD will somehow fix it), but those game porters are lazy because they just add the nvidia variable for those games... which is easy for them and good for nvidia driver users... but it can't be that every other driver is guilty because they do that
-
Bridgman, are there any plans to pour some real love into fglrx? It really needs it. As you've seen in past benchmarks, many OpenGL 4 games seem to hit a driver bottleneck and get FPS-capped even up to the R9 290.
http://www.phoronix.com/scan.php?pag...-preview&num=3
Then there are things like GL_EXT_texture_compression_rgtc being broken on TeraScale-class GPUs that would be nice to get fixed.
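As an aside, whether a driver advertises that extension at all can be checked from a terminal; a minimal sketch, assuming the glxinfo tool (from mesa-utils or equivalent) is installed:

```shell
# Count occurrences of the RGTC extension string in the GL extension list;
# 0 means the driver does not advertise it at all (being advertised but
# broken, as described above, is a separate problem).
glxinfo | grep -c 'GL_EXT_texture_compression_rgtc'
```

This only confirms the extension is exposed; actually exercising the compressed formats requires uploading RGTC textures and checking the rendered result.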