Open-Source "Nouveau" Driver Now Supports NVIDIA Ampere - But Without 3D Acceleration


  • #31
    Originally posted by CochainComplex View Post
    ok then I support your statement. I was just wondering if you were implying that AMD's are bad and Nvidia's are good. With vendors like ASUS, Sapphire etc. one would expect an even distribution of Nvidia vs AMD failures. It's possible that one particular design is more prone to signal-chain impedance issues... but over multiple generations and models there should still be an even distribution.

    ...btw, if the OEM uses cheaper jacks with looser manufacturing tolerances, this might also be an issue... along with corrosion.
    POLARIS was, of the AMD cards, the most likely to throw bad solder, due to the soldering temperature AMD recommended. There have been 4 different Nvidia models where, if the vendor followed Nvidia's recommended soldering temperatures and used particular brands of solder, the same problems appeared. So yes, there are particular generations and models of cards that are worse, and the 4 from Nvidia date from around the same year AMD's Polaris released. Both have learnt since then that recommending too low a soldering temperature is not a good idea: it makes lemons. If a silicon design demands too tight a solder temperature window, redesign it.

    The thing here is that when you reflow cards from those badly soldered Nvidia/AMD generations, you go slightly above the original recommendation, up to the revised recommended solder temperature. In both the AMD and Nvidia cases, the originally allowed solder temperature was too low.

    Cheaper jacks can bring extra contamination into the solder and increase the risk of a dry joint with wacky behaviour. Even with high-quality parts, things can still go wrong when the solder process ends up off temperature. The problem is that, slightly under temperature, everything tests and looks perfect at the factory; it just isn't going to stay that way.

    Of course, not all the parts that make up a GPU card come from the same vendor or share the same maximum recommended temperature, so some parts on the board may be rated for a higher soldering temperature. A part rated for a higher soldering temperature, such as a choke, can soak up heat like a heat sink, with the result that the solder on it and on nearby items never reaches the right temperature.

    Basically, there is a real art to soldering a multi-part board and getting a 100 percent correctly soldered board every single time. There is always a percentage of lemons.

    You are right that the design is a factor, but it's not the biggest factor these days, as most of the design issues were addressed back in the day. The biggest factor is that the solder faults reaching end users now are basically undetectable at the factory, so the maker may only find out 6-8 months later, via return merchandise authorization (RMA) of the cards, if they come back at all, that something went slightly wrong on day X on the production line. And note what I said: users who get these cards and are not pushing the output to the limit are, in most cases, not going to notice the defective solder joints until at least the warranty is over.

    It's very hard to fix this class of problem. A percentage of cards are lemons; we just have to accept that and pray, if we are going to push one to the limit, that we don't get one.

    Comment


    • #32
      Originally posted by d3coder View Post
      I don't get why amd users always are writing toxic posts in nvidia related news
      As an unhappy Nvidia user (too bad I can't upgrade, because AMD cards are currently out of stock or overpriced), I'd say it deserves much more toxicity...

      Comment


      • #33
        Originally posted by d3coder View Post
        I don't get why amd users always are writing toxic posts in nvidia related news
        Just because I am using an AMD GPU now does not mean I have never been an Nvidia user. You also see Nvidia users making toxic posts about AMD that are not based on fact. The 4K@60Hz issue is one of those fun "you bought a GPU, cross your fingers you got a good one" problems. Issues with 4K@60Hz are not confined to a particular brand of GPU.

        There are some genuine faults with the Nvidia closed-source driver that we should not be sugar-coating.

        Comment


        • #34
          Hey, instead of talking shit about Nvidia, how about we give some kudos to these developers who are actually trying to make a difference, eh?

          Comment


          • #35
            Originally posted by tildearrow View Post

            How did you get it to "just work" as soon as you boot into Linux? There is no way this could happen; you need to install the NVIDIA drivers first.
            Check out the Pop!_OS Nvidia spin.

            As a side note, they also have an AMD spin... but I haven't had a chance to use AMD hardware in quite a long time.

            Comment


            • #36
              There is absolutely nothing in the Linux community that sets the world on fire like Nvidia news.

              Anyway, I will grant 100% that Linux + Nvidia on a laptop is pretty much hell on earth to get working, especially on multi-GPU laptops (i.e. with GPU switching and crap like that). To be honest, I don't have any experience with ATI laptops in this regard.

              Maybe I am one of just 4 or 5 super-lucky users in the whole Linux community who has no issues running Nvidia and just gets killer performance. I have servers running AI workloads and virtual desktops on Nvidia M40 and 1070 cards that run for months with no issues. The 1070 in a VM was a huge pain, I'll admit... but it was effectively free hardware. The M40s just work amazingly.

              So yes, I am a strong proponent of Nvidia... sure, I don't get cool splash screens when I boot my OS... but let's be honest... if you are enjoying your splash screen that much, it means you spend an endless amount of time booting, and I would have to ask a few further questions about productivity at that point.

              The last AMD/ATI card I used was a Radeon 9500 Pro, and I will freely admit it was by far the dominant card at the time and an amazing feat of engineering! I really wish I could see another AMD card achieve what the 9xxx series did. At the same time, 90% of my computer work these days requires CUDA, so until ROCm becomes something real, AMD can't touch Nvidia.

              Comment


              • #37
                Originally posted by s_j_newbury View Post

                I'm on POLARIS, I run at 4K@60Hz over HDMI and it works great. If you were complaining about the lack of FreeSync over HDMI I would agree, since it's a big gripe of mine. If you have a card that flickers or particularly artifacts randomly it's probably faulty.
                I had 5 different Polaris GPUs (2 SAPPHIRE RX 580s, 2 XFX RX 580s, and 1 XFX RX 560); all of them were unstable at 4K@60Hz over HDMI, and it happened on Windows, macOS, and Linux, on several different motherboards, with about 6 HDMI 2.0-certified cables, and even in a TB2 eGPU enclosure. There are other reports of this from others on AMD's forum and Hard Forum.

                After the 3rd GPU I was questioning whether I was really that unlucky, but after trying a GTX 1060 I had no issues whatsoever. Then, silly me, I figured maybe it was just XFX being odd and decided to give SAPPHIRE a try. Both SAPPHIRE cards had the same instability, and they still have it today.

                Comment


                • #38
                  Originally posted by Espionage724 View Post

                  Yeah, I'm going to buy the newest NVIDIA graphics cards... to use nouveau. Lmao

                  As long as NVIDIA's proprietary driver supports it, what's the problem? All the mainstream distros have easy installs for NVIDIA's driver.



                  You know what's nice? Not having my 4K display flicker and artifact randomly. 4K@60Hz over HDMI is flawed on AMD Polaris (and, according to other reports, even on RDNA), and neither AMD nor the board vendors want to admit it. It worked fine on the GTX 1060 I had for a bit, though.
                  It's your choice! I don't forbid people from buying Nvidia hardware; I just wrote that Nvidia on Linux is not the best choice if you don't want to spend your days between black screens and assorted problems, and it's enough to open the forum of any distribution to see that it is a problem. After that, if a user wants Nvidia hardware, that's their choice, but don't complain if things don't work as they should.

                  Comment


                  • #39
                    Originally posted by Charlie68 View Post

                    Nvidia on Linux is not the best choice if you don't want to spend your days between black screens and assorted problems, and it's enough to open the forum of any distribution to see that it is a problem.
                    Speaking as a Nouveau user on Gnome Wayland and a GT1030:

                    That's utter BS.

                    Originally posted by Charlie68 View Post
                    After that if a user wants Nvidia hardware it is a choice, of him but don't complain if things don't work as they should.
                    I get a much sharper and clearer display on Nvidia hardware than on AMD hardware on the same monitor. That makes AMD hardware trash-tier from the start. There is no point touting the "awesomeness" of the open drivers for AMD when the output quality from the card is already subpar.

                    Comment


                    • #40
                      Originally posted by zexelon View Post
                      There is absolutely nothing in the Linux community that sets the world on fire like Nvidia news...
                      It is pretty nasty that Nvidia won't release these PMU signatures, both to extend the lifecycle of their products and to let the open-source driver be used fully. Linux is a ridiculously small portion of the desktop market, so releasing them wouldn't affect Nvidia's business, while it would help their reputation; it is just nasty. Having full access to their old GPUs with free software would also matter in the research field and benefit a lot of other areas, while withholding it will, for instance, push many people to stop purchasing their products...

                      Comment
