KDE Is Down To Just One Wayland Showstopper Bug Remaining


  • Originally posted by mSparks View Post
    Good job there is only one bug left in KDE though, or that problem would mean a bug-free Wayland KDE is still decades away, especially since it only works on AMD cards, right?
    There is a very big problem: you have missed something. You seem not to be aware that KDE has a full-time developer from Nvidia working on their code base to make the Wayland side work with Nvidia hardware.

    So going by the personnel provided, KDE should in theory work best with Nvidia, yet Nvidia cards are not where it works best. The problem here is that Nvidia wasted 10 years attempting to get EGLStreams to work. In reality EGLStreams by design can never work correctly: designing a system with zero separation between the memory allocations of different processes is a really bad idea, and that is what EGLStreams is.

    The Nvidia developer at KDE openly admits that Nvidia hardware will only become decent for KDE Wayland once Nvidia has sorted out the DMABUF bugs in their drivers and finished implementing the required GBM bits and pieces. The changeover to DMABUF/GBM in the Nvidia kernel drivers will also fix up some of the Nvidia X server memory leaks and several of the ways you can make the Nvidia kernel driver panic the kernel when using a bare-metal X11 server.

    Nvidia has some quite major bugs that are taking years to fix because they require redoing the complete way the Nvidia driver manages its memory. The EGLStreams limitations were designed so that Nvidia driver developers did not have to fix a set of core stack problems.

    The issue where switching between X11 servers with the Ctrl-Alt-Fx keys results in Nvidia crashing is in fact an Nvidia driver memory-management problem that has existed for as long as Nvidia has made drivers for Linux.

    Originally posted by mSparks View Post
    The problem with AMD is their driver team, while good, is massively overworked and underfunded,
    The broken state of Nvidia drivers with Wayland is basically the same problem, except on the Nvidia side. One of Nvidia's problems is that the person who designed the way Nvidia GPUs organize memory retired from the company over 10 years ago and did not leave behind the best documentation.

    Nvidia with Wayland is more a question of when they will work out how their GPU memory management actually works and write sane interfaces that do maintain process-to-process memory separation. Yes, this is a when, not an if.
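
    For anyone who has not looked at what the DMABUF/GBM path actually involves, here is a minimal sketch (the render-node path and buffer size are placeholder assumptions, error handling is kept minimal): a client allocates a buffer through GBM and exports it as a dma-buf file descriptor, the per-buffer handle a compositor can import, rather than the shared-stream model EGLStreams used.

    /* Minimal sketch of the GBM + dma-buf buffer path (placeholder render node,
     * not a complete client). */
    #include <fcntl.h>
    #include <gbm.h>
    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
        int drm_fd = open("/dev/dri/renderD128", O_RDWR | O_CLOEXEC);
        if (drm_fd < 0) { perror("open render node"); return 1; }

        struct gbm_device *gbm = gbm_create_device(drm_fd);
        if (!gbm) { fprintf(stderr, "gbm_create_device failed\n"); return 1; }

        /* Allocate a 256x256 XRGB8888 buffer the GPU can render into. */
        struct gbm_bo *bo = gbm_bo_create(gbm, 256, 256, GBM_FORMAT_XRGB8888,
                                          GBM_BO_USE_RENDERING);
        if (!bo) { fprintf(stderr, "gbm_bo_create failed\n"); return 1; }

        /* Export the buffer as a dma-buf fd: this per-buffer fd, not a shared
         * stream, is what gets passed between processes, so each process keeps
         * its own allocations. */
        int dmabuf_fd = gbm_bo_get_fd(bo);
        printf("exported dma-buf fd: %d\n", dmabuf_fd);

        if (dmabuf_fd >= 0) close(dmabuf_fd);
        gbm_bo_destroy(bo);
        gbm_device_destroy(gbm);
        close(drm_fd);
        return 0;
    }

    Build with something like gcc gbm_demo.c -lgbm; in a real client that fd would be handed to the compositor over the zwp_linux_dmabuf_v1 Wayland protocol rather than printed.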

    Comment


    • Originally posted by oiaohm View Post

      So going by the personnel provided, KDE should in theory work best with Nvidia, yet Nvidia cards are not where it works best.
      Or
      it just doesn't work on either, and Nvidia users have a much lower tolerance for things not working, because unlike AMD users they are used to them working.

      Comment


      • Originally posted by mSparks View Post
        Or
        it just doesn't work on either, and Nvidia users have a much lower tolerance for things not working, because unlike AMD users they are used to them working.
        That argument really does not hold. It's like you want me to start quoting Nvidia developers at KDE who tell users at this stage to use their AMD GPU if they want performance under Wayland.

        Nvidia users do tolerate particular defects, like not being able to reliably switch between bare-metal X servers, that AMD and Intel users do not tolerate well if switching between X servers has been part of their workflow.

        The reality, like it or not, is that the Nvidia developers themselves tell you their driver is broken at this stage. When you get into the details you find out it is not just broken with Wayland; it is broken with X11 as well. The problem is process memory separation. The lack of memory separation between processes is something Intel and AMD addressed back when DRI3/DMABUF appeared, while Nvidia wrote it into the documentation promoting EGLStreams as something needed, disregarding all the stability problems this fault was causing.

        Basically every Nvidia user who claims the X11 driver is perfect and better than the AMD one is ignoring a set of faults, present for decades, that Nvidia developers themselves now admit to in the Nvidia driver.

        Yes, Nvidia is working on fixing the breakage, but it is taking a lot of time.

        Comment


        • Originally posted by oiaohm View Post

          I will give an example of what Nvidia users overlook. 20 years ago the Nvidia OpenGL driver broke its implicit sync support. This is a bug. Nvidia basically says: if you do not want your application that uses implicit sync to crash, go recode it to use the new explicit sync functions.

          AMD does take a long time to fix bugs, but when what happens is in breach of the OpenGL or Vulkan standard, they do not turn around and say "sorry, we are AMD, do what we say". Yes, Nvidia will do something in breach of the standards and go "we are Nvidia, alter your code".
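
          To make the implicit versus explicit sync point concrete, here is a minimal sketch of the explicit route applications get pushed toward, using the EGL_KHR_fence_sync extension (the helper name wait_for_gpu is made up, a current EGL display and context are assumed, and error handling is minimal): instead of trusting the driver to order GPU work implicitly, the application inserts a fence after its GL commands and waits on it before handing the buffer on.

          /* Sketch: explicit sync with EGL_KHR_fence_sync. Assumes an EGL display
           * with a current context; the function name wait_for_gpu is made up. */
          #include <EGL/egl.h>
          #include <EGL/eglext.h>

          static EGLBoolean wait_for_gpu(EGLDisplay dpy)
          {
              /* These are extension entry points, so fetch them at runtime. */
              PFNEGLCREATESYNCKHRPROC create_sync =
                  (PFNEGLCREATESYNCKHRPROC)eglGetProcAddress("eglCreateSyncKHR");
              PFNEGLCLIENTWAITSYNCKHRPROC wait_sync =
                  (PFNEGLCLIENTWAITSYNCKHRPROC)eglGetProcAddress("eglClientWaitSyncKHR");
              PFNEGLDESTROYSYNCKHRPROC destroy_sync =
                  (PFNEGLDESTROYSYNCKHRPROC)eglGetProcAddress("eglDestroySyncKHR");
              if (!create_sync || !wait_sync || !destroy_sync)
                  return EGL_FALSE;

              /* Insert a fence after the GL commands already issued on this context. */
              EGLSyncKHR fence = create_sync(dpy, EGL_SYNC_FENCE_KHR, NULL);
              if (fence == EGL_NO_SYNC_KHR)
                  return EGL_FALSE;

              /* Explicitly wait for the GPU to finish that work, instead of relying
               * on implicit synchronisation when the buffer is handed on. */
              EGLint status = wait_sync(dpy, fence,
                                        EGL_SYNC_FLUSH_COMMANDS_BIT_KHR,
                                        EGL_FOREVER_KHR);
              destroy_sync(dpy, fence);
              return status == EGL_CONDITION_SATISFIED_KHR;
          }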



          June is the most recent set of benchmarks we have on Linux that was done in controlled testing, yet mSparks was attempting to make a point with even older benchmarks from January. Yes, the current situation has changed.

          The useful point is that between January and June you can see the Nvidia performance disadvantage under Wayland going away. So at some point Wayland and Nvidia will get along. The question is not if Nvidia and Wayland will get along, but how long until they do.




          Be warned that the grass-is-greener effect also works in reverse.
          I sent this to [email protected] about a week ago, but I’m not sure if anyone actually checks that inbox anymore. I apologize if it is still actively monitored. Video of the issue at 60hz, look at the top UFO (epilepsy warning?) The issue happens in all programs, and does not happen on Windows or with nouveau drivers. I believe the stutters happen for a single frame of the application’s framerate, rather than a single refresh cycle. So, the stutters are not perceptible on my 240hz monitor ...


          Nvidia users, when you go looking, are complaining about a lot of the same bugs under Windows and Linux as the ones AMD owners suffer from.

          The reality here is that neither AMD nor Nvidia drivers are perfect.

          Do remember that people like mSparks will claim Nvidia drivers are basically perfect, or, like your claim that AMD takes forever, will completely overlook all the cases where Nvidia takes forever, even 20+ years, to provide a feature or fix a bug. Yes, switching between X11 servers or to a text-based terminal can cause the Linux kernel to panic with Nvidia drivers, and for 20 years Nvidia's instruction has been "just don't do that" instead of fixing the cause of the problem.

          There is a big difference between all the AMD stuttering and issues that do not cause kernel panics and the Nvidia issues that do cause kernel panics. The Nvidia driver has a long list of things you must not do or you will cause the Linux kernel to panic.

          AMD is slower to give you features on Linux than Nvidia, but they class kernel panics caused by their drivers as must-fix.

          AMD versus Nvidia is basically choose your poison.
          If I decide to try an AMD card, what bugs can I expect? One reason I am not leaning AMD is that I think the only worthwhile GPU to consider is the 7900 series - so, that means a 7900 XT or 7900 XTX - and those cards are expensive for me. Even used, they start at $950-$1K. I assume they have their own quirks or bugs in Linux? Also, I don't know what to expect when kernel 6.7 is released - the AMD 'devs' on here and those who are knowledgeable assert that there will be more hardware control options then - the programs that can adjust voltage and fans should work with those cards? That would be good... :-/

          The other reason that makes me reluctant is the support in video editing and 3D modeling - which isn't great - Blender, and video editors like DaVinci Resolve and Premiere Pro - the former works in Linux with some manual tinkering, so I'm told. Sure, I will try programs like Kdenlive, but I want to use the others, for sure.

          As for gaming, I guess those are the bugs that the average Linux user is most concerned about? I noticed on the Nvidia forum a lot of people reporting various bugs - many that remain, or either worsened or showed up in the most recent driver. So, yeah, even with Nvidia's budget and manpower, I'm sure there are a lot of bugs that remain or carry over - and new ones that appear. Theoretically, though, they should be able to fix most? As for AMD, their sketchy history of mostly supporting gaming - especially on Windows - is disconcerting. Although, in theory, they should be able to address bugs more easily due to the Linux ecosystem and the FOSS state of the driver.

          I've said too much - I'd consider AMD since I'm not sure if I'll need CUDA - but the prices of 7900 cards are still really high, and the uncertainty around productivity support and hardware tweaking options makes me hesitant. I'm currently using a 1660 Ti (not mine) so I'll try to see what I experience - although I am on the fence, and I agree that bugs are on both sides - pick your poison, yep.

          Comment


          • Originally posted by Panix View Post
            7900 XTX -
            Is probably the only AMD card I would consider owning.

            But

            A couple of Linux devs I work with had 5700 and 6700 (IIRC) cards, but returned them because they were constantly kernel panicking.... The rest of the AMD cards are either for testing purposes or the users stomach the problems.

            The annoying thing is there are no good choices, because of the AI gold rush and the likes of

            NVIDIA's new Eos supercomputer uses more than 10,000 H100 Tensor Core GPUs to train a 175 billion-parameter GPT-3 model in under four minutes.


            everything from Nvidia is 2 or 3 times the price it should be (pub game: see how many fellow drinkers were disinfo'd into thinking it was cryptominers buying them).

            RTX4090 is a god for VR.

            RTX3050 has generally higher perf than an RTX2090, the leap from 2 to 3 was that big.

            Originally posted by Panix View Post
            1660 Ti -
            Solid card, AIUI. Just missing a lot of the silicon for the new fun stuff.

            1,536 cores

            vs the

            16,384 cores in a 4090 (plus all the tensor and RT cores).

            Basically, go by the specs; none of the benchmarks I've seen are making use of even 1% of the stuff Nvidia started introducing with the 2000 series.

            Comment


            • Originally posted by mSparks View Post

              That Nvidia fixes bugs quickly and AMD doesn't shouldn't be news. Claiming it is the other way round is another lie that gets perpetuated by dishonest commentators.

              But once again, no wall of text is going to change the fact that there are no performance gains to be had moving away from the Xorg server; claiming otherwise is also a lie.

              Just the fact that you can very easily measure Xorg's performance impact (none) is enough to see it's a lie.

              Like, what is the theoretical foundation for Wayland being able to achieve better performance than Xorg? X11 has had no noticeable performance impact on desktops since a time when supercomputers ran at 10 MHz, and it really has not changed all that much since then; if anything we now get much, much higher IPC than back then.
              Sorry to intrude, but I think you're having a pointless discussion.
              Wayland was not created to have better performance; it was created to be more secure, to use modern standards, and because Xorg has become unmanageable.
              Especially when it comes to games, it is known that drivers are optimized and that until recently the reference was Xorg. But today things are changing: Xorg is about to be abandoned by many distros, and therefore we are starting to look at Wayland. Often the problem does not concern Wayland, which is a protocol, but rather the various compositors.
              So now the work is to get a good experience with games on Wayland too. I imagine that performance may vary depending on the compositor used, but we will see the results later.
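
              As a small illustration of the "Wayland is a protocol, the compositors implement it" point, here is a minimal sketch of a client that connects to whatever compositor is running and lists the globals it advertises (libwayland-client is assumed to be installed; nothing in it is compositor-specific): the same code runs against KWin, Mutter, Sway and the rest, and the differences people hit live in those implementations.

              /* Sketch: a Wayland client only speaks the protocol; which compositor
               * answers (KWin, Mutter, Sway, ...) is up to the session. */
              #include <stdint.h>
              #include <stdio.h>
              #include <wayland-client.h>

              static void global_added(void *data, struct wl_registry *registry,
                                       uint32_t name, const char *interface, uint32_t version)
              {
                  /* Each global is a protocol interface the running compositor implements. */
                  printf("global %u: %s (version %u)\n", name, interface, version);
              }

              static void global_removed(void *data, struct wl_registry *registry, uint32_t name)
              {
              }

              static const struct wl_registry_listener registry_listener = {
                  .global = global_added,
                  .global_remove = global_removed,
              };

              int main(void)
              {
                  struct wl_display *display = wl_display_connect(NULL); /* uses $WAYLAND_DISPLAY */
                  if (!display) { fprintf(stderr, "no Wayland compositor found\n"); return 1; }

                  struct wl_registry *registry = wl_display_get_registry(display);
                  wl_registry_add_listener(registry, &registry_listener, NULL);
                  wl_display_roundtrip(display); /* wait for the initial list of globals */

                  wl_registry_destroy(registry);
                  wl_display_disconnect(display);
                  return 0;
              }

              Build with gcc list_globals.c -lwayland-client and run it under a KDE Wayland session versus any other compositor: the protocol is the same, only the advertised interfaces and versions differ.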

              Comment


              • Originally posted by woddy View Post

                Sorry to intrude, but I think you're having a pointless discussion.
                Wayland was not created to have better performance; it was created to be more secure, to use modern standards, and because Xorg has become unmanageable.
                Especially when it comes to games, it is known that drivers are optimized and that until recently the reference was Xorg. But today things are changing: Xorg is about to be abandoned by many distros, and therefore we are starting to look at Wayland. Often the problem does not concern Wayland, which is a protocol, but rather the various compositors.
                So now the work is to get a good experience with games on Wayland too. I imagine that performance may vary depending on the compositor used, but we will see the results later.
                Java and HTML5 already solved that one a very, very long time ago.

                Comment


                • Originally posted by mSparks View Post

                  Is probably the only AMD card I would consider owning.

                  But

                  A couple of Linux devs I work with had 5700 and 6700 (IIRC) cards, but returned them because they were constantly kernel panicking.... The rest of the AMD cards are either for testing purposes or the users stomach the problems.

                  The annoying thing is there are no good choices, because of the AI gold rush and the likes of

                  NVIDIA's new Eos supercomputer uses more than 10,000 H100 Tensor Core GPUs to train a 175 billion-parameter GPT-3 model in under four minutes.


                  everything from Nvidia is 2 or 3 times the price it should be (pub game: see how many fellow drinkers were disinfo'd into thinking it was cryptominers buying them).

                  RTX4090 is a god for VR.

                  RTX3050 has generally higher perf than an RTX2090, the leap from 2 to 3 was that big.



                  Solid card, AIUI. Just missing a lot of the silicon for the new fun stuff.

                  1,536 cores

                  vs the

                  16,384 cores in a 4090 (plus all the tensor and RT cores).

                  Basically, go by the specs; none of the benchmarks I've seen are making use of even 1% of the stuff Nvidia started introducing with the 2000 series.
                  First time I've heard of kernel panicking with the 6000 (6700?) series cards. I thought those had the most mature (and best) support now?

                  Nvidia and AMD are putting all their eggs in the AI basket - the majority of the focus is there now. That's where all the $$ is gonna be. I don't even want to go into the wilderness of the 4090s and China situation - Nvidia cards are already way overpriced even before that. Another excuse to blame it on.

                  I believe the AMD 7900 series has its own can of worms, and in Linux support moves at a snail's pace - you still can't use voltage and fan options, due to something taken out of the kernel - the magic is going back in with kernel 6.7, whenever it's released, apparently.... Also, HDMI 2.1 - nope. So, if you have a TV, what are you supposed to do? Monitor owners can use DisplayPort. There are other quirks I read about regarding stutters or crashes, but I forget those.

                  The Nvidia card is just temporary - I need something with more than 6 GB of VRAM.

                  Comment


                  • Originally posted by Panix View Post

                    I thought those had the most mature (and best) support now?
                    Apologies, that did sound like I was saying they do that now. They were release-day purchases, so "the state of the drivers for them when the cards were released"; obviously as time goes on the older cards have more of their issues fixed, but then you are on old, slow silicon.

                    Comment


                    • Originally posted by mSparks View Post

                      Apologies, that did sound like I was saying they do that now. They were release-day purchases, so "the state of the drivers for them when the cards were released"; obviously as time goes on the older cards have more of their issues fixed, but then you are on old, slow silicon.
                      Oh, right. Well, that is still the situation with AMD GPUs in Linux, AFAIK - the drivers aren't stable, or there are numerous quirks, bugs, or issues, and they take a while to iron them out - whereas new Nvidia cards often more or less work, although there are quite a lot of reports of issues and bugs with recent Nvidia hardware too - like the earlier mention, 'pick your poison' - neither is perfect. I was hoping AMD support for productivity software would improve enough that I could consider one of their cards this time around, though. I think they're about even in games, with advantages and disadvantages for each; it depends what you are willing to tolerate or put up with.

                      Comment
