GNOME X.Org vs. Wayland Performance + Power Usage On Fedora 32 With AMD Renoir Laptop

  • nomadewolf
    replied
    Originally posted by pranav View Post
    Does that mean I should stop using Wayland?
    Why is there no improvement yet?
    Is this mainly because of Firefox not being so great on Wayland?
    Even though Wayland itself is getting production-ready, all the apps will still need porting to Wayland over time.
    That means that even though you're running Wayland natively, X will still need to be 'emulated' under Wayland (via XWayland) to run any X-only app; a quick way to check which path you're on is sketched below.
    The fact that there is barely any increase running Wayland under these conditions is a very good sign.
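
    A minimal sketch of that check, assuming only the standard WAYLAND_DISPLAY and DISPLAY environment variables that compositors and X servers set:

    ```c
    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        /* WAYLAND_DISPLAY is set inside a Wayland session; DISPLAY is set
           both by plain X11 and by the XWayland server a compositor spawns
           for X-only apps. */
        const char *wayland = getenv("WAYLAND_DISPLAY");
        const char *x11 = getenv("DISPLAY");

        if (wayland && x11)
            printf("Wayland session (%s), XWayland reachable via %s\n", wayland, x11);
        else if (wayland)
            printf("Pure Wayland session (%s)\n", wayland);
        else if (x11)
            printf("Plain X11 session (%s)\n", x11);
        else
            printf("No graphical session detected\n");
        return 0;
    }
    ```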



  • ArchLinux
    replied
    https://web.basemark.com/
    Browser  Driver   X       XWayland  Wayland
    Chrome   Nvidia   895     Freeze    -
    Chrome   Nouveau  Freeze  Freeze    -
    Firefox  Nvidia   610     225       240
    Firefox  Nouveau  385     375       375



  • ArchLinux
    replied
    Originally posted by Volta View Post

    The point was that Wayland is quite young. It also matters when Gnome, KDE and Firefox started implementing it. X has been with us for 30 years and it's still far from perfect.
    Well, it fundamentally can't be perfect; hence Mir & Wayland.



  • ArchLinux
    replied
    Originally posted by royce View Post

    Wayland is not a thing that works more fluently than some other thing. It's a protocol.
    So they've done everything wrong, and it's just up to the compositor + WM implementer to fix it?
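
    That split is visible even in a minimal client. A sketch, assuming only libwayland-client (link with -lwayland-client): the protocol itself is just message exchange over a socket, and everything users perceive as fluency lives in whichever compositor answers.

    ```c
    #include <stdio.h>
    #include <wayland-client.h>

    int main(void) {
        /* Connect to the compositor named by $WAYLAND_DISPLAY (NULL = default). */
        struct wl_display *display = wl_display_connect(NULL);
        if (!display) {
            fprintf(stderr, "No Wayland compositor found\n");
            return 1;
        }
        /* A round-trip proves the protocol exchange works end to end; how
           smoothly frames get composited afterwards is an implementation
           property of the compositor, not of the protocol. */
        wl_display_roundtrip(display);
        printf("Connected to a Wayland compositor\n");
        wl_display_disconnect(display);
        return 0;
    }
    ```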



  • ArchLinux
    replied
    Originally posted by raster View Post

    If you had read the text rather than just some numbers, you wouldn't need that reply, as I clearly said it's not a brilliant benchmark but an indicator... perhaps you should read a bit more? Like the bits where I say I work on said code, so you can be sure this indicator is not just an "on startup" snapshot but a general trend over a longer runtime too, which I clearly indicate, as this is just an indicator of where things are going. I actually profile my code regularly in gory detail from a range of angles, and an app is running - a terminal, not empty. You didn't read the bit where I said "I spent all of 5 minutes on this and I need to go to sleep now", which is why it's clearly not exhaustive. I clearly qualified and detailed the situation already - thus the text to go read. It'd be up to a much more involved benchmark to do that, and I had and still have many other more important things to do, but I do know Wayland can easily do better than X11.

    I've been doing this for over 25 years. Wayland is the first X alternative that has gotten any traction, and it is decently designed. I've watched the hopefuls come and go and leave X11 still standing, until now. X11 is still standing, but Wayland has not gone... it's growing. I was totally "Oh, ANOTHER X11 replacement? Do you want to join the corpses of Y-windows, Berlin and DirectFB and possibly some other lesser-knowns? Umm, no thanks" for the first few years because of this. I changed my mind as time went on, for good reasons.
    Still way too much text.



  • duby229
    replied
    Originally posted by kravemir View Post

    Nvidia doesn't care about the 1% of non-users... not much lost for them. Nvidia does cover and support their target customers. This screaming Phoronix minority is very much unimportant; don't fancy yourself... Don't act like an entitled child who thinks Nvidia should listen to you... You're not that important, and Nvidia shows it right to your face. Deal with it.
    Just plain retarded....



  • Guest
    replied
    Originally posted by duby229 View Post

    Ok, so nVidia doesn't care about its users... And that's a good thing how???
    Nvidia doesn't care about the 1% of non-users... not much lost for them. Nvidia does cover and support their target customers. This screaming Phoronix minority is very much unimportant; don't fancy yourself... Don't act like an entitled child who thinks Nvidia should listen to you... You're not that important, and Nvidia shows it right to your face. Deal with it.



  • mdedetrich
    replied
    Originally posted by duby229 View Post

    Nope, you're wrong. The OSS drivers have had working multi-monitor support for decades, they have had working XRender acceleration since at least 2007, and they have had working and standards-compliant (though incomplete) OpenGL acceleration since at least 2010. -ONLY- nVidia stands out with these bugs and problems.
    My laptops say otherwise.

    Also note that XRender was experimental and buggy for some time before it stabilized, just like any new X extension (see https://forum.kde.org/viewtopic.php?f=111&t=88842). Also, some bugs were due to KWin and not NVidia (e.g. https://forum.kde.org/viewtopic.php?f=111&t=83835, where even Intel had issues with XRender). Broken XOrg configs also caused issues.

    You have to forgive me for being skeptical of what you are complaining about, because historically people just blamed NVidia for everything simply because it's NVidia, even though there were actual bugs in compositing engines and/or XOrg. AMD has also had issues (fglrx or otherwise).

    Do note that I was talking about video hardware acceleration; that is not XRender. XRender is hardware acceleration for compositing, which is something else entirely. You kind of derailed the discussion.
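
    For context, XRender is an ordinary X extension that clients query at runtime, so its presence (and version) is easy to verify. A minimal sketch, assuming libX11 and libXrender (link with -lX11 -lXrender):

    ```c
    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <X11/extensions/Xrender.h>

    int main(void) {
        Display *dpy = XOpenDisplay(NULL);  /* connect to $DISPLAY */
        if (!dpy) {
            fprintf(stderr, "Cannot open X display\n");
            return 1;
        }
        int event_base, error_base;
        if (XRenderQueryExtension(dpy, &event_base, &error_base)) {
            int major = 0, minor = 0;
            XRenderQueryVersion(dpy, &major, &minor);
            printf("XRender %d.%d supported by this server\n", major, minor);
        } else {
            printf("XRender not supported by this server\n");
        }
        XCloseDisplay(dpy);
        return 0;
    }
    ```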

    Originally posted by duby229 View Post
    And yet RadeonSI has had standards-compliant OpenGL from its very inception. Why do -you- think -only- nVidia drivers can't implement standards-compliant OpenGL? And why do you defend that? Your stance on nVidia is totally asinine.
    It's also the same reason why AMD drivers are shit on Windows when it comes to games, whereas NVidia drivers just work (almost all of the time). You can blame game developers for not following the OpenGL spec properly, so you either have standards-compliant drivers that don't work with a lot of games (including AAA ones) or drivers that have been designed to work with games.

    In the end, this is the argument that standards end up being defined by how people use the software, not how it's specified on paper. Vulkan solves this problem entirely, BTW (Vulkan was deliberately designed not to work unless you use it properly, because of this specific issue).
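
    A minimal sketch of what that strictness looks like in practice, assuming only the standard Vulkan loader (link with -lvulkan): every structure must be spelled out explicitly, and failures come back as hard VkResult codes rather than the driver quietly papering over misuse; deeper spec violations get surfaced by the validation layers during development.

    ```c
    #include <stdio.h>
    #include <vulkan/vulkan.h>

    int main(void) {
        /* Vulkan requires every struct to be tagged and filled in explicitly. */
        VkApplicationInfo app = {
            .sType = VK_STRUCTURE_TYPE_APPLICATION_INFO,
            .pApplicationName = "spec-strict-demo",
            .apiVersion = VK_API_VERSION_1_0,
        };
        VkInstanceCreateInfo info = {
            .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
            .pApplicationInfo = &app,
        };
        VkInstance instance;
        /* Errors are explicit return codes (e.g. VK_ERROR_INCOMPATIBLE_DRIVER),
           not silent fallbacks the way permissive GL drivers behaved. */
        VkResult res = vkCreateInstance(&info, NULL, &instance);
        if (res != VK_SUCCESS) {
            fprintf(stderr, "vkCreateInstance failed: %d\n", (int)res);
            return 1;
        }
        printf("Vulkan instance created\n");
        vkDestroyInstance(instance, NULL);
        return 0;
    }
    ```
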
    Last edited by mdedetrich; 17 June 2020, 05:44 PM.



  • duby229
    replied
    Originally posted by mdedetrich View Post

    That's because they were mainly X11 bugs; AMD also had similar issues.

    Regarding the OpenGL spec, they don't follow it completely because a lot of games don't follow it either. I mentioned it in some other thread.

    Even the open source drivers currently have a lot of issues. Note that I have been running X11 + Nvidia on a laptop with multiple displays for the past decade. The biggest issue I have experienced is actually Optimus/render offload.

    Note that I typically ran KDE, if that makes a difference.
    Nope, you're wrong. The OSS drivers have had working multi-monitor support for decades, they have had working XRender acceleration since at least 2007, and they have had working and standards-compliant (though incomplete) OpenGL acceleration since at least 2010. -ONLY- nVidia stands out with these bugs and problems.

    And yet RadeonSI has had standards-compliant OpenGL from its very inception. Why do -you- think -only- nVidia drivers can't implement standards-compliant OpenGL? And why do you defend that? Your stance on nVidia is totally asinine.



  • duby229
    replied
    Originally posted by kravemir View Post

    Nvidia doesn't need to care about that... They focus on gamers and CUDA solutions. Gamers (mostly) use only a single monitor to get the best performance. And CUDA users don't care that much, as they just want to get acceleration working for their projects.

    However, I hope AMD gets a much better position in the market. I like AMD's OSS support. I wouldn't go with an Nvidia GPU.
    Ok, so nVidia doesn't care about its users... And that's a good thing how???

