Ubuntu 22.04 LTS Changes Default For NVIDIA Driver Back To Using X.Org Rather Than Wayland


  • mdedetrich
    replied
    Originally posted by wertigon View Post

    Look man, it's perfectly simple.

    Get an AMD laptop. Install Linux on it. Play around with it. Understand what leaps have been made in the last couple of years, especially with respect to user experience. Then get an Intel laptop and do the same thing there. Then go to an AMD+Nvidia or Intel+Nvidia laptop. That experience leaves a lot to be desired.

    Is the AMD / Intel laptop a bug-free experience? No. Is the experience, objectively speaking, a hell of a lot better than Nvidia? Why, yes! Yes it is. And it's getting better with every new distro release.

    It's the Betamax vs VHS debate all over again. Are you sure Nvidia wants to keep being stubborn about it?
    I have all three laptops (AMD with integrated graphics, Intel with an iGPU, and an Intel laptop with NVidia discrete graphics, a Quadro actually). With the Manjaro distribution, which set up NVIDIA Optimus/PRIME for me, I had no real problems with it. It seems that, unlike other people on Phoronix, I don't make my own problems. I also have a desktop with an Intel graphics card and a 1080 Ti; again, it works fine.

    So yes, it's perfectly simple: if you have an NVidia card, use X11 and a non-shit distribution that knows how to set it up (after all, that is the job of a distro). And if you are wondering, I do classify Ubuntu as a shit distribution (in this regard) because evidently they don't care about usability and they break things all the time, e.g. VNC in the latest stable version of Ubuntu (21.04), which has Wayland as the default for non-NVidia machines, is completely broken (and I know this because I am currently using it and will end up having to switch to X11).
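
    For what it's worth, here is a minimal sketch of how that default can be forced on a GDM-based distribution such as Ubuntu: GDM falls back to X.Org sessions when WaylandEnable is set to false under [daemon] in its config. The path /etc/gdm3/custom.conf matches Ubuntu's GDM3 packaging; the script itself is illustrative only and needs root (note that rewriting the file this way drops its comments).

        # Sketch: force GDM to offer X.Org sessions by setting
        # WaylandEnable=false under [daemon] in /etc/gdm3/custom.conf.
        import configparser

        GDM_CONF = "/etc/gdm3/custom.conf"

        config = configparser.ConfigParser()
        config.optionxform = str  # GDM's keys are case-sensitive; keep them as-is
        config.read(GDM_CONF)

        if not config.has_section("daemon"):
            config.add_section("daemon")
        config.set("daemon", "WaylandEnable", "false")

        with open(GDM_CONF, "w") as f:
            config.write(f)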

    Case closed.
    Last edited by mdedetrich; 05 May 2022, 07:22 AM.

    Leave a comment:


  • wertigon
    replied
    Originally posted by mdedetrich View Post
    --- A bunch of baloney justification drivel ---
    Look man, it's perfectly simple.

    Get an AMD laptop. Install Linux on it. Play around with it. Understand what leaps have been made in the last couple of years, especially with respect to user experience. Then get an Intel laptop and do the same thing there. Then go to an AMD+Nvidia or Intel+Nvidia laptop. That experience leaves a lot to be desired.

    Is the AMD / Intel laptop a bug-free experience? No. Is the experience, objectively speaking, a hell of a lot better than Nvidia? Why, yes! Yes it is. And it's getting better with every new distro release.

    It's the Betamax vs VHS debate all over again. Are you sure Nvidia wants to keep being stubborn about it?

    Leave a comment:


  • skeevy420
    replied
    Originally posted by mangeek View Post

    Consider that *my* electricity (New England) comes almost entirely from natural gas, nuclear, and hydroelectric, not coal. Also consider that the efficiency of a power plant vs an individual ICE is very different, enough to overcome transmission and storage losses on the EV side. An EV run entirely off coal electricity likely pollutes less than a gas engine over its lifetime because the power plant is converting more of the coal->heat reaction to power than cars can convert gas->motion.
    I can't argue with any of that, but still, the majority of power in America (and the world) comes from burning fossil fuels. Switching from an Internal Combustion Engine to an External Combustion Engine doesn't change the fact that we're burning fossil fuels to power a Combustion Engine.

    Yes, a fossil-fuel-based power plant is an External Combustion Engine. It may even be a more efficient Combustion Engine, but it still massively pollutes the air because it's a Big Ass Combustion Engine.

    My power is mostly coal. This is my state's info:

    Annual Energy Production
    Electric Power Generation: 65 TWh (2% of U.S. total)
    Coal: 28.4 TWh, 44% [5.5 GW total capacity]
    Petroleum: 0 TWh, 0% [0 GW total capacity]
    Natural Gas: 17.1 TWh, 26% [8.9 GW total capacity]
    Nuclear: 15.5 TWh, 24% [1.8 GW total capacity]
    Hydro: 2.2 TWh, 3% [1.3 GW total capacity]
    Other Renewables: 0 TWh, <1% [0.4 GW total capacity]

    As you can see, The Natural State isn't very naturally powered.

    Fucked up thing is, they could build a geothermal power plant where I live in Hot Springs. It ain't hard to build either: just a closed-loop system. Send water down, the earth boils it, steam comes back up, the steam powers turbines, the steam liquefies, and the process repeats. Once built, all you'd need is maintenance people for upkeep and repairs, and you'd get constant, practically free electricity.

    But that's too simple. Drill, baby, drill.
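
    As a rough sanity check on the closed-loop idea, taking the post's premise at face value, here is a quick estimate with made-up but plausible numbers (the mass flow, enthalpy drop, and efficiency are all illustrative assumptions, not data about Hot Springs):

        # Electrical output ~ mass flow * enthalpy drop across turbine * efficiency
        steam_flow_kg_s = 50.0       # assumed steam mass flow through the turbine
        enthalpy_drop_kj_kg = 800.0  # assumed usable enthalpy drop
        conversion_eff = 0.80        # assumed turbine + generator efficiency

        power_mw = steam_flow_kg_s * enthalpy_drop_kj_kg * conversion_eff / 1000.0
        print(f"Rough electrical output: {power_mw:.0f} MW")  # ~32 MW
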
    Last edited by skeevy420; 03 May 2022, 08:05 AM.

    Leave a comment:


  • mdedetrich
    replied
    Originally posted by wertigon View Post

    It does not matter that Nvidia believes they have a superior technical solution. The reality is, a consumer and/or system integrator can choose between hardware with great driver support and hardware with decent driver support.

    Guess which ones Chromebooks choose? What about System76? Dell? Valve? Lenovo? The list goes on. As more SIs offer Linux-based offerings, hardware support will matter more and more.

    Nvidia is free to delude themselves into thinking that the desktop doesn't matter. But the fact of the matter is, an AMD laptop with AMD graphics is a superior experience to an Nvidia laptop, despite having the inferior render path.

    Feel free to keep deluding yourself that the render path is everything. It isn't; it is only a small piece of the puzzle, and if the end user needs to jump through 15 hoops to get better render performance for no better reason than Nvidia being stubborn about it, let's see how long that market share lasts.

    Like I said, your move, Nvidia. Downgrade or bleed market share: which one is it gonna be?
    You are making a mountain out of a molehill. The current state is this: if you are a distribution and the machine has an NVidia graphics card, just use X.Org until Wayland is ready; it's not hard. Even using Wayland by default for non-NVidia graphics cards is quite controversial right now, considering that things such as VNC are completely broken in Wayland (at least with the latest Ubuntu stable, 21.04, for example).

    Even the Steam Deck (which I personally own) does not use Wayland; it uses X.Org (I verified this by switching the Steam Deck into desktop mode and running Linux commands).
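
    (A quick way to reproduce that check from any desktop session, assuming the login manager sets the usual environment variable: XDG_SESSION_TYPE reads "x11" or "wayland".)

        # Print the current session type; expected output is "x11" or "wayland".
        import os
        print(os.environ.get("XDG_SESSION_TYPE", "unknown"))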

    If distributions switch the default to Wayland when it's clearly not usable, then, at least if those distros care about usability, I am sorry, but they are a shitty distro. The main one I personally use (Manjaro) hasn't switched to Wayland (including for non-NVidia) for exactly this reason; it's simply not ready yet.

    Also, regarding the comment about your hardware, have you heard of the NVidia Shield or the Nintendo Switch? I am pretty sure that NVidia doesn't have any problems in this segment. NVidia just doesn't like going for the ultra-low-cost market that is dominated by things like Chromebooks; there isn't a lot of profit there. And if you are talking about general system integrators, I am sorry to burst your bubble, but they don't give a flying f**k about what you are complaining about.
    Last edited by mdedetrich; 03 May 2022, 07:07 AM.

    Leave a comment:


  • mangeek
    replied
    Originally posted by skeevy420 View Post



    Pounds of CO2 emitted per million British thermal units (Btu) of energy, for various fuels:
    Coal (anthracite): 228.60
    Coal (bituminous): 205.40
    Coal (lignite): 216.24
    Coal (subbituminous): 214.13
    Diesel fuel and heating oil: 163.45
    Gasoline (without ethanol): 155.77
    Propane: 138.63
    Natural gas: 116.65
    Basically, using gasoline and internal combustion engines is more environmentally friendly than EVs and coal plants. And since it's basically established science that gasoline and internal combustion engines are bad for the environment, replacing them with EVs powered by coal that is 50-200% dirtier, no matter how you fudge the numbers, is clearly not the correct solution.

    Using green vehicles without creating green energy first is like putting a fresh coat of paint over black mold spores. You didn't fix anything; you just hid the problem from view.
    Consider that *my* electricity (New England) comes almost entirely from natural gas, nuclear, and hydroelectric, not coal. Also consider that the efficiency of a power plant vs an individual ICE is very different, enough to overcome transmission and storage losses on the EV side. An EV run entirely off coal electricity likely pollutes less than a gas engine over its lifetime because the power plant is converting more of the coal->heat reaction to power than cars can convert gas->motion.
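
    To make that argument concrete, here is a back-of-the-envelope comparison using the emission factors quoted above plus assumed round numbers for everything else (the plant efficiency, grid loss, and vehicle efficiencies are illustrative assumptions, not sourced data):

        # EV charged entirely from subbituminous coal vs. a 30 mpg gasoline car.
        BTU_PER_KWH = 3412
        COAL_LB_CO2_PER_MMBTU = 214.13       # subbituminous, from the table above
        GASOLINE_LB_CO2_PER_MMBTU = 155.77   # from the table above

        plant_eff = 0.38        # assumed coal plant thermal efficiency
        grid_loss = 0.06        # assumed transmission/distribution loss
        ev_kwh_per_mile = 0.30  # assumed EV consumption

        coal_btu_per_mile = ev_kwh_per_mile * BTU_PER_KWH / (plant_eff * (1 - grid_loss))
        ev_lb_per_mile = coal_btu_per_mile / 1e6 * COAL_LB_CO2_PER_MMBTU

        gas_btu_per_mile = 114_000 / 30      # ~114,000 BTU per gallon at 30 mpg
        ice_lb_per_mile = gas_btu_per_mile / 1e6 * GASOLINE_LB_CO2_PER_MMBTU

        print(f"EV on coal: {ev_lb_per_mile:.2f} lb CO2/mile")   # ~0.61
        print(f"30 mpg ICE: {ice_lb_per_mile:.2f} lb CO2/mile")  # ~0.59

    Under these assumptions the two come out roughly even, and the ICE figure excludes refining and distribution overhead, which is what tends to tip lifecycle comparisons toward the EV even on a dirty grid.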

    Leave a comment:


  • wertigon
    replied
    Originally posted by mdedetrich View Post

    Yeah, and the problem is it's an outdated design using old concepts that would have forced NVidia to recode their drivers, because they dropped those old concepts decades ago. Just because it's a standard, or it's in Linux, doesn't automatically mean it should be followed.

    Again, the point is that NVidia has standards, and holding the fort/being dogmatic in the way typical Linux devs are only works when you happen to be technically correct, which doesn't always happen.
    It does not matter that Nvidia believes they have a superior technical solution. The reality is, a consumer and/or system integrator can choose between hardware with great driver support and hardware with decent driver support.

    Guess which ones Chromebooks choose? What about System76? Dell? Valve? Lenovo? The list goes on. As more SIs offer Linux-based offerings, hardware support will matter more and more.

    Nvidia is free to delude themselves into thinking that the desktop doesn't matter. But the fact of the matter is, an AMD laptop with AMD graphics is a superior experience to an Nvidia laptop, despite having the inferior render path.

    Feel free to keep deluding yourself that the render path is everything. It isn't; it is only a small piece of the puzzle, and if the end user needs to jump through 15 hoops to get better render performance for no better reason than Nvidia being stubborn about it, let's see how long that market share lasts.

    Like I said, your move, Nvidia. Downgrade or bleed market share: which one is it gonna be?

    Leave a comment:


  • mdedetrich
    replied
    Originally posted by ssokolow View Post

    Well, nVidia had plenty of opportunity, what with KWin devs sending the EGLStreams guy to go talk to the driver guys.



    That's beside the point. I'm arguing against your "(which btw EGLStreams is an open standard, check Kronos group)" argument from authority.

    Suppose Khronos accepted My Immortal as some kind of standard. That wouldn't change the fact that it's one of the most famous bad fanfics of all time.
    Sure, but even that point about argument from authority is missing the full context, because EGLStreams wasn't just popped out of nowhere from NVidia's behind. It was actually based on Android's graphics stack at the time, which used explicit sync and was the most widely used Linux-based graphics stack, and still is (and this was a graphics stack being used on phones, where performance/battery usage is even more critical).

    In any case, this hypothetical that NVidia should have "contributed the Linux way" to the graphics stack in the past is complete hogwash. Look at it this way: even now there is pushback against explicit sync from within the Linux graphics stack developers (as evidenced earlier). What in your right mind would make you think that NVidia would have had any chance whatsoever 10 years ago? It would have been a completely fruitless exercise and likely would have just exacerbated the tensions between NVidia and the open source community even more (and I wouldn't be surprised if this was the main reason why NVidia just used the "hands off, let's wait" approach).

    At that point in time no one was getting anywhere on this topic, let alone NVidia, and James Jones alluded to this on Mesa's GitLab, i.e.:

    Explicit sync everywhere. Of course, it would help if our driver supported sync FD first. Working on that one. Then, X devs would need to relent and let the present extension support sync FD or similar. I'm not clear why there has been so much pushback there. Present was always designed to support explicit sync, it just unfortunately predated sync FD by a few months. glamor would also need to use explicit sync for internal rendering. I believe it has some code for this, but it uses shmfence IIRC, which in turn relies on implicit sync.
    Agreed that's just for our driver stack, and we are indeed working on patches to add the necessary support to X. We've only been wary of this because similar proposals have died in review before because there didn't seem to be sufficient resolve to close on some of the interaction issues, unless there was other offline conversation not reflected on-list:
    Basically, back then the Linux graphics developers dug their heels in so hard in the "implicit sync is right" camp (mainly because it fit nicely with the Linux/Unix mantra of everything being a file or a process) that they refused to budge on anything, regardless of who it came from. We only started making progress on this about 3 years ago, and that's mainly thanks to Jason Ekstrand.
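
    For readers following along, here is a toy model (purely illustrative, not any real API) of the distinction being argued over: with implicit sync, the driver tracks a fence hidden inside the buffer and every consumer waits on it implicitly; with explicit sync, the producer hands the consumer a fence object (on Linux, a sync file descriptor) alongside the buffer, and the consumer decides where to wait.

        # Toy illustration of implicit vs. explicit sync; threading.Event stands
        # in for a GPU fence / sync FD.
        import threading

        class Fence:
            """Signaled when the (pretend) GPU work completes."""
            def __init__(self):
                self._event = threading.Event()
            def signal(self):
                self._event.set()
            def wait(self):
                self._event.wait()

        class ImplicitBuffer:
            """Implicit sync: the fence is hidden inside the buffer, so every
            read conservatively waits on it, whether it needs to or not."""
            def __init__(self):
                self.fence = Fence()  # tracked invisibly by "the kernel"
            def read(self):
                self.fence.wait()     # forced serialization on every access

        class ExplicitBuffer:
            """Explicit sync: the buffer is just memory; synchronization is
            carried separately by a fence the producer hands over."""
            pass

        def render_explicit(buf: ExplicitBuffer) -> Fence:
            fence = Fence()
            fence.signal()  # pretend the GPU finished; normally a completion callback
            return fence    # handed to the compositor alongside the buffer

        fence = render_explicit(ExplicitBuffer())
        fence.wait()  # the consumer waits exactly where it chooses to
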
    Last edited by mdedetrich; 01 May 2022, 04:35 AM.

    Leave a comment:


  • ssokolow
    replied
    Originally posted by mdedetrich View Post
    Sure, there were issues, but they weren't insurmountable; NVidia could have just added hooks to save the associated state to disk whenever a cleanup happened (as an example)
    Well, nVidia had plenty of opportunity, what with KWin devs sending the EGLStreams guy to go talk to the driver guys.

    Originally posted by mdedetrich View Post
    Yeah, and the problem is it's an outdated design using old concepts that would have forced NVidia to recode their drivers, because they dropped those old concepts decades ago. Just because it's a standard, or it's in Linux, doesn't automatically mean it should be followed.

    Again, the point is that NVidia has standards, and holding the fort/being dogmatic in the way typical Linux devs are only works when you happen to be technically correct, which doesn't always happen.
    That's beside the point. I'm arguing against your "(which btw EGLStreams is an open standard, check Kronos group)" argument from authority.

    Suppose Khronos accepted My Immortal as some kind of standard. That wouldn't change the fact that it's one of the most famous bad fanfics of all time.

    Leave a comment:


  • mdedetrich
    replied
    Originally posted by ssokolow View Post
    My understanding is that EGLStreams's design is pretty fundamentally at odds with things like the KWin developers' plans to spec out a system for compositor crash recovery. I remember reading something to the effect that "with EGLStreams, the resources get cleaned up when the compositor dies and there's no way to opt out of that because it's tied deeply into the shape of the API".
    Sure, there were issues, but they weren't insurmountable; NVidia could have just added hooks to save the associated state to disk whenever a cleanup happened (as an example)

    Originally posted by ssokolow View Post
    It's not about things originally being backed by single companies. It's about nVidia's decision to submit EGLStreams because "Ignore that GBM is a de facto standard. It's not a de jure standard. This is actually a standard!"
    Yeah, and the problem is it's an outdated design using old concepts that would have forced NVidia to recode their drivers, because they dropped those old concepts decades ago. Just because it's a standard, or it's in Linux, doesn't automatically mean it should be followed.

    Again, the point is that NVidia has standards, and holding the fort/being dogmatic in the way typical Linux devs are only works when you happen to be technically correct, which doesn't always happen.
    Last edited by mdedetrich; 30 April 2022, 01:52 PM.

    Leave a comment:


  • mdedetrich
    replied
    Originally posted by wertigon View Post

    Man, are you dense on purpose, or what? :P

    The problem here isn't that the current state sucked. The problem is that Nvidia did not start from a broken system and try to fix it, instead opting to directly inject a mutually exclusive solution from the get-go, one that would have broken the entire Linux ecosystem for at least 20 kernel versions (e.g. 5.0 -> 5.20), and the proposed framework would've heavily discouraged any kind of collaboration on the graphics stack.

    So, kernel devs could choose between going with what they had, which was not perfect but at least produced a usable result, or running with a completely broken graphics stack for at least five years for everyone except Nvidia graphics (those cards would've had an imperfect but usable result). I am not surprised that kernel devs went with what they already had and told Nvidia to stuff it.

    No one is really contesting whether explicit sync is a good idea. What I want to know is: how do we go from the current render model of implicit sync to a render model with explicit sync, without breaking everything for another 10 years?

    Nvidia is invited to discussions about this. So far these invitations have been met with very little enthusiasm. This is the current state of affairs. Can we now please stop beating this dead horse?

    And this is why Nvidia will keep losing market share in the Linux desktop world. Any true engineer knows when there is a time to step back, look at whether the current approach is reasonable, and take the freaking hit if it ain't.
    Again, to be fair, no one took Wayland seriously until somewhat recently (like 5 years ago?); it was evident that NVidia just waited to see how it would go. No major desktop environments bothered to actually implement Wayland until that time period. The reason we are beating a dead horse is that you keep bringing it up. It's clear what happened, so stop bringing it up. Some companies managed to easily adapt their drivers for Wayland; others didn't. And the great irony of why NVidia can't easily implement Wayland is that the concepts Wayland/Linux are built around are 20 years old, concepts that NVidia dropped from their driver ages ago.

    If you actually read the comments on the mailing lists and on Mesa's GitLab, the NVidia devs were very clear about this (even a decade ago): they are not going to recode their driver to reintroduce old concepts just to support Wayland.

    Also, the Linux desktop market share is completely minuscule, and tbh no one really cares about it significantly. As long as it can display a GUI, that's pretty much as far as the care factor goes; and where that's not the case, as explained before, companies have literally rebuilt parts of the stack because of what a joke the graphics stack has historically been (including Wayland).

    Originally posted by wertigon View Post

    This will not happen for at least another 3-4 years, which means at least two generations' worth of Nvidia cards. Meanwhile, Wayland is ready on AMD and Intel systems today.

    If I were Nvidia, I would start implementing a Vulkan Mesa driver as a stopgap measure, use that plus Mesa's Zink and Kopper for the OpenGL implementation, and then focus on getting the explicit render path ready as soon as possible. The current modus operandi of waiting until Linux devs fix their crap is not going to achieve anything meaningful, and Nvidia will keep losing market share on Linux as long as they refuse to properly support Wayland.

    This is the current situation. You can take it or keep bleeding market share. Your move, Nvidia.
    Yeah, and they can't do this without a severe performance hit, hence the problem. We are going around in circles here.
    Last edited by mdedetrich; 30 April 2022, 06:32 PM.

    Leave a comment:
