KDE Plasma Readies Its NVIDIA GBM Support, Fingerprint Authentication Added

  • It's his usual strategy. Write fanboy shit, then declare everyone is harassing him and leave the thread for a while. Then come back with even more ridiculous arguments. Just ignore.

    • Originally posted by tildearrow View Post

      Then why are you here?
      You are an absolute liar. You can't even respect yourself because your evil side has a passion for conflict and discord.

      Seriously, admit it. Are you here just to be annoying?

      He's already admitted that in the past, so there's no need to repeat it. Basically, he says that other people on here are trolls, so he's going to be an even bigger one to even things out, or something like that.

      • Originally posted by birdie View Post

        When you waste countless hours trying to make Windows work, many people gain Stockholm syndrome, only they don't realize it. Considering I've spent probably twenty times as many hours as any user here, I should have gained it as well, only I haven't. There must be something wrong with me. I just want Windows to be stabler and more user-friendly, that's it.

        Another possible explanation is tribalism and elitism. When the OS is used by more than 95% of people in the world, you kinda start feeling superior about your choice and start painting black everything and everyone who hasn't joined your tribe, including companies. Sadly, that has happened to me for some reason.

        Fixed your comment for you. No need to thank me.

        Oh, and about Nvidia vs AMD? I haven't owned an AMD GPU for some years, but when I had one 7-8 years ago, THEY offered me the option to undervolt / underclock the GPU. Nvidia NEVER did.
        Same with AMD vs Intel on the CPU side, by the way.
        It may not be important to you, and with automatic frequency scaling it may seem unnecessary these days, but I find it extremely stupid to allow overclocking and not underclocking (a rough sketch of how that looks on Linux follows at the end of this post).

        Besides that, I never managed to properly install Nvidia acceleration on my laptops, and on Windows at least, Nvidia regularly tries to dig into and spy on your data through more or less subtle ways.
        On top of leaving an ugly icon and a background app in Windows that are utterly useless to me. No idea how AMD handles this these days, but I'm definitely not satisfied with Nvidia.

        Originally posted by TemplarGR View Post

        Every brand-new architecture can have driver problems; Nvidia is not exempt. The real reason Nvidia's "newer" architectures seem not to have problems is that they are not really new architectures, but iterations of the Fermi architecture. Nvidia hasn't made a genuinely new hardware architecture for a decade. Slapping new shit on top of Fermi, using die shrinks, and inventing marketing names for their mostly software gimmicks is not new hardware. RDNA, on the other hand, was new hardware, unlike the previous gens, which were iterations of GCN and had more stable launches. You see, AMD actually attempts to redesign hardware from time to time, and they don't sell marketing BS to their customers at overinflated prices.

        Oh... So they basically behaved exactly as shittily as Intel then. Good to know, I'll be sure to keep it in mind when I can finally buy a new computer.
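
        Since underclocking came up: on Linux the amdgpu driver does expose exactly this through sysfs, and the sketch below shows roughly how it can be driven from Python. It assumes the amdgpu kernel driver, a card exposed as card0, and root privileges; the pp_od_clk_voltage table additionally needs the OverDrive bit in amdgpu.ppfeaturemask, so treat the paths as illustrative rather than guaranteed on every setup.

        #!/usr/bin/env python3
        # Sketch: pin an amdgpu card to its lowest DPM clocks via sysfs.
        # Assumptions: amdgpu driver, the card is card0, running as root.
        from pathlib import Path

        CARD = Path("/sys/class/drm/card0/device")  # assumed card index

        def show(name):
            """Print a sysfs attribute if the driver exposes it."""
            attr = CARD / name
            if attr.exists():
                print(f"--- {name} ---")
                print(attr.read_text().rstrip())

        # Current shader-clock DPM states and, if OverDrive is enabled,
        # the editable clock/voltage table.
        show("pp_dpm_sclk")
        show("pp_od_clk_voltage")

        # Pin the lowest power state; writing "auto" restores default scaling.
        (CARD / "power_dpm_force_performance_level").write_text("low")
        print("Forced lowest DPM performance level ('auto' reverts it).")

        Writing "manual" instead of "low" and then selecting state indices in pp_dpm_sclk is the finer-grained version of the same idea; "low"/"auto" is just the simplest way to pin and unpin the clocks.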

        • Originally posted by birdie View Post

          NVIDIA's binary crap has worked excellently on Linux so far, aside from Optimus (and switchable-graphics laptops work horribly even under Windows) and from support for some fancy new, not-really-stable graphics APIs. Can't complain.

          Of course, if you want to limit yourself to the Linux/DXVK bugfest/slowfest, you could look elsewhere. Sane people game under Windows, where NVIDIA drivers are unsurpassed in terms of quality, performance and features.

          I used to have a laptop with a GTX 860M and never had issues with switchable graphics on that!

          There used to be an issue with VSYNC, since the GPU renders into its own memory and then copies the frame to system RAM for the iGPU to display. That was eventually solved as well; sadly, I drove my car into the canal and drowned my laptop with it.
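
          For what it's worth, the modern per-application way to use the discrete GPU on such a laptop is PRIME render offload, and a launch looks roughly like the Python sketch below. It relies on NVIDIA's documented offload environment variables; the driver generation (435 or newer), an Xorg session with render offload enabled, and the example commands are assumptions about the setup, not something taken from this thread.

          #!/usr/bin/env python3
          # Sketch: run a single program on the discrete NVIDIA GPU via PRIME
          # render offload while the iGPU keeps driving the display.
          import os
          import subprocess
          import sys

          def run_on_nvidia(cmd):
              env = dict(os.environ)
              # NVIDIA's documented offload switches: render on the dGPU and
              # present through the iGPU (the copy described above happens
              # behind these variables).
              env["__NV_PRIME_RENDER_OFFLOAD"] = "1"
              env["__GLX_VENDOR_LIBRARY_NAME"] = "nvidia"
              env["__VK_LAYER_NV_optimus"] = "NVIDIA_only"  # same idea for Vulkan
              return subprocess.call(cmd, env=env)

          if __name__ == "__main__":
              # e.g. `python3 offload.py glxgears`; defaults to a renderer check.
              target = sys.argv[1:] or ["glxinfo", "-B"]
              sys.exit(run_on_nvidia(target))

          The tearing problem mentioned above was addressed separately by the driver's PRIME Synchronization support (visible as an output property in xrandr --prop and requiring nvidia-drm.modeset=1), which is what eventually fixed VSYNC on that copy path.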
