Open-Source "Nouveau" Driver Now Supports NVIDIA Ampere - But Without 3D Acceleration


  • #21
    Originally posted by Charlie68 View Post
    All the more reason not to buy Nvidia hardware if you want to use Linux. The world is beautiful even without Nvidia... do you know how nice it is not having to worry about Nvidia drivers anymore? That's great, because everything works!
    So the world is beautiful even without Nvidia? Then which graphics card manufacturers do you suggest?

    http://www.dirtcellar.net



    • #22
      Originally posted by zexelon View Post
      You know what I love? I just bought a Gigabyte RTX 3090 a few weeks ago... and when I booted into Linux it just worked! And it works amazingly well
      How did you get it to "just work" as soon as you boot into Linux? There is no way this could happen; you need to install the NVIDIA drivers first.

      Originally posted by zexelon View Post
      You can keep your AMD open source driver (which apparently is also crappy, if you want to go all the way you have to use the AMD Pro closed source driver... but I have a 3090 that works so I wouldn't know about AMD issues).
      ...sure.
      Mesa 19.0 has been nearly 100% stable for me, and it is faster than AMDGPU-PRO (surprisingly, the PRO driver hangs more often...).

      Originally posted by zexelon View Post
      Keep your AMD FOSS GPU... I will stick with the one that crushes it and gives me all the performance I paid for
      Keep your NVIDIA monopoly GPU... I will stick with the one that crushes it and gives me all the ease of use and freedom I paid for



      • #23
        Originally posted by Alexmitter View Post
        - Power management does affect signal quality? Signal quality of a digital signal, 1s and 0s. Are you trying to drive your 4K panel with VGA or composite?

        - Soldering also does not affect digital signal quality as long as the pins are connected.

        If there is one thing that may affect your digital signal quality, it's bad cables and bad termination.
        This is completely not knowing your stuff.


        For any cable, attenuation (measured in dB) will increase with frequency; this attenuation comes from a few factors. Loss to resistance goes up with frequency, because higher frequency signals are able to use less and less of the cross-section of the wire (this is known as "skin effect") and so have less copper to travel through. Losses to reactance -- capacitance and inductance -- also increase with frequency.
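        To put rough numbers on that (a back-of-the-envelope sketch in Python; the 0.5 mm wire diameter and the frequencies are just illustrative picks, not cable specs):

        import math

        RHO_CU = 1.68e-8             # resistivity of copper, ohm*m
        MU0 = 4 * math.pi * 1e-7     # permeability of free space, H/m

        def skin_depth(freq_hz):
            # Depth (m) at which the current density has fallen to 1/e.
            return math.sqrt(RHO_CU / (math.pi * freq_hz * MU0))

        def ac_resistance_per_m(freq_hz, diameter_m=0.5e-3):
            # Crude high-frequency approximation: current confined to a ring
            # one skin depth thick around the conductor's circumference.
            return RHO_CU / (math.pi * diameter_m * skin_depth(freq_hz))

        for f in (297e6, 594e6, 3e9, 6e9):   # ballpark HDMI clock/bit-rate figures
            print(f"{f/1e9:5.3f} GHz: skin depth {skin_depth(f)*1e6:5.2f} um, "
                  f"R_ac ~ {ac_resistance_per_m(f):5.2f} ohm/m")

        Double the frequency and the usable copper cross-section shrinks, so the loss per metre only ever goes up.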

        Some forms of dry solder joints that are still somewhat functional add capacitance, resistance and/or inductance (they can be one of these, or any mix of all three) to the signal path, and the skin effect mentioned above makes solder-joint issues worse as the transfer frequency increases. As you push a higher frequency down the cable, or across a bad solder joint, the signal loss in dB goes up, which shows as worse attenuation. A digital signal is 1s and 0s driven at different frequencies, and that frequency is why 4K at 30Hz will work yet 60Hz dies in many cases.
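        Rough arithmetic for that 30Hz vs 60Hz difference (the 4400x2250 total-with-blanking timing is the standard CTA-861 4K one; the rest is simple multiplication):

        # Per-lane TMDS bit rate: total pixels (active + blanking) * refresh rate,
        # times 10 because TMDS puts 10 bits on the wire for every 8-bit symbol.
        def tmds_bits_per_lane(h_total, v_total, refresh_hz):
            return h_total * v_total * refresh_hz * 10

        H_TOTAL, V_TOTAL = 4400, 2250        # 3840x2160 plus blanking

        for hz in (30, 60):
            rate = tmds_bits_per_lane(H_TOTAL, V_TOTAL, hz)
            print(f"4K@{hz}Hz -> ~{rate / 1e9:.2f} Gb/s per lane")
        # 4K@30Hz -> ~2.97 Gb/s per lane (HDMI 1.4 signalling)
        # 4K@60Hz -> ~5.94 Gb/s per lane (HDMI 2.0 signalling, double the frequency,
        #                                 so every loss source in the path bites harder)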

        So your idea that bad soldering has no effect is wrong.

        Power management affects how much drive strength different GPUs put on the output pins. If you don't have enough gain to overcome the attenuation losses in the cable, the soldering and the board, you will not get that mode. At the lower frequency needed to drive 4K at 30Hz you need less gain than at 4K 60Hz, which uses a higher frequency down the HDMI/DisplayPort link and so has more attenuation to overcome. A fun point here: with the extra capacitance some forms of bad solder joints add, more power/gain can also be bad, because phantom data comes back out of the capacitive discharge.

        Being a digital signal does not make attenuation issues go away. With a high enough dB signal loss (attenuation), a digital signal is not getting through, or will not be stable, and the result is that the output mode does not work.
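        The same point as a toy link-budget check (every dB figure here is made up purely to show the bookkeeping; real HDMI compliance limits are specified differently):

        def link_margin_db(tx_level_db, losses_db, rx_min_db):
            # Positive margin: the receiver still sees enough signal; negative: mode fails.
            return tx_level_db - sum(losses_db) - rx_min_db

        CABLE_LOSS_PER_M = 1.2                   # dB per metre at the 4K60 rate (illustrative)
        GOOD_JOINT_DB, DRY_JOINT_DB = 0.1, 8.0   # a dry joint can dwarf everything else

        for joint_db in (GOOD_JOINT_DB, DRY_JOINT_DB):
            margin = link_margin_db(
                tx_level_db=0.0,
                losses_db=[2 * CABLE_LOSS_PER_M,   # 2 m cable
                           0.5, 0.5,               # two connectors
                           joint_db],              # the solder joint in question
                rx_min_db=-6.0)
            print(f"joint loss {joint_db} dB -> margin {margin:+.1f} dB, "
                  f"{'link up' if margin > 0 else 'mode drops'}")

        Drop back to the 4K30 rate and every loss term shrinks, which is exactly why the lower mode keeps working on marginal hardware.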

        Fun point here: all forms of bad joints where the wires are in fact still somewhat connected may not show themselves if the signal frequency is low enough that the extra capacitance/resistance/inductance caused by the bad joint is not producing enough attenuation to cause signal failure. And yes, you do increase the signal frequency to drive 60Hz instead of 30Hz.

        Attenuation is a core problem that does not only apply to cables. The idea that soldering does not affect digital signals is wrong. Digital signals tolerate far worse attenuation than analog signals, but they are not 100 percent immune.

        Yes, some cases of bad cables are in fact bad soldering inside the cable plug. And bad soldering does not only happen in cables; it can be in the monitor and on the GPU card.

        To be correct, a bad solder joint on a GPU is sometimes overcome by using a higher-grade cable with lower attenuation; you have not fixed the problem, just worked around it.

        Yes, there are known cards out there that need one or two joints touched up to fix them; for some reason they were bad from the factory and no one picked it up. It can also be that the solder has slightly corroded after it left the factory, so at the factory it looked fine, but by the time it got to the end user it is messing up, and only when they actually need to clock the HDMI or DisplayPort output to the max. There will be quite a few people with GPUs with defective solder joints who will never notice, because they will never hook up a monitor requiring the higher frequency or use a long enough cable to pick up the excess loss.

        Manufacturing defects are an ass.



        • #24
          Originally posted by oiaohm View Post

          This is completely not knowing your stuff.
          Let's get some reality in here. You are right, soldering can affect the gain and signal integrity, but on a level so minimal that the cable, its pins, the pins of your GPU's HDMI socket and the pins in the receiving device's HDMI socket make it look meaningless in comparison.



          • #25
            Speckles and line artefacts on a digital video output are most likely caused by a poor HDMI cable. If you're really, really unlucky it's a bad card, e.g. an HDMI connector poorly attached, but that's easily testable with different cables, and then you RMA it. Repeating artefacts, OTOH, are bad memory; again, RMA.

            I even had dust in an HDMI port causing issues once. Blew it out of the port and plug, and it was fine afterwards. I presume the dust (or a poor connection that one time) was causing a bit flip or two via static discharge or similar; the HDMI signal is fairly robust, so it recovers quickly and it just looks like speckles.



            • #26
              Originally posted by oiaohm View Post

              This is a little incorrect. 4K 60Hz, be it a Raspberry Pi, Intel, Nvidia or AMD, is a problem child where defects show themselves.

              Yes, you have people complaining both ways: that Linux is able to do 4K 60Hz when Windows can only do 30, and that Windows is able to do 4K60 while Linux can only do 30.

              The cause is one thing, but there are two parts to it: the first part is why Windows and Linux appear different, the second part is what the fault is.
              1) Differences in power management between Windows and Linux on the GPU result in different signal quality on the ports of the GPU. This is generic hell: it results in either Windows working or Linux working while the other one does not. This one can at times be confirmed by locking the card's performance settings to the same values on Windows and Linux and then magically getting the same results all the time, but that has not fixed the true cause.
              2) Minor defects in cables and soldering on cards show up when you are in 4K 60Hz mode. This is why you have users with the same model of card where one claims 4K 60Hz works fine for them and another says it is broken. Yes, these faults can be overcome by the right level of power, which is how you end up with Windows or Linux working and the other not. Please note that more power does not equal better in all cases either: more power can mean more noise, hence a poor signal and no 4K 60Hz, while not enough power to overcome the defect also equals no 4K 60Hz.

              So OneTimeShot, you have a good card, it's fine. But there are a lot of users who don't luck out with a good card, be it AMD or Nvidia. Yes, you also get people who say Nvidia sucks at 4K 60Hz and AMD is great, because they got a broken Nvidia card. Be it AMD or Nvidia, once you start attempting 4K 60Hz you quickly find out how good your cables, card and monitor really are in construction quality; sometimes the card you have is not the construction quality you would prefer, other times it's the cables, other times it's the monitor port that only works when wiggled the right way because it's broken.
              So if I understand you correctly... solder joints on Nvidia cards are good and on AMD's are bad? (Yes, bad solder joints can have the effects you pointed out.)



              • #27
                Originally posted by Alexmitter View Post
                Let's get some reality in here. You are right, soldering can affect the gain and signal integrity, but on a level so minimal that the cable, its pins, the pins of your GPU's HDMI socket and the pins in the receiving device's HDMI socket make it look meaningless in comparison.
                This is badly underestimating how bad a bad solder joint can be. For good solder joints you are right: the losses in a good joint are so low that the values are basically meaningless in comparison. But a bad solder joint that is still electrically conductive can introduce as much signal loss as 30+ metres of HDMI cable, way past the noise coming from the pins. Solder faults can in fact be many times worse than the pins in the sockets and the cable in the middle.

                The ultra-fun part about bad solder joints is that they can basically be a frequency filter. Yes, the triple-bad joint that adds capacitance, resistance and inductance can basically behave as "I work fine unless you happen to put a signal of frequency X through me, then the signal disappears".
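                To make that concrete (invented component values, just to show the shape of the effect): a resistive dry joint plus a few picofarads of stray capacitance forms a first-order RC low-pass, and its corner frequency decides which video modes survive.

                import math

                def rc_corner_hz(r_ohm, c_farad):
                    # -3 dB corner frequency of a first-order RC low-pass.
                    return 1.0 / (2.0 * math.pi * r_ohm * c_farad)

                JOINTS = {
                    "good joint": (0.05, 0.2e-12),   # 50 milliohm, 0.2 pF
                    "dry joint":  (25.0, 4.0e-12),   # 25 ohm, 4 pF - still conducts at DC
                }

                for name, (r, c) in JOINTS.items():
                    print(f"{name}: -3 dB corner ~ {rc_corner_hz(r, c) / 1e9:.1f} GHz")
                # good joint: corner in the tens of THz, invisible to HDMI
                # dry joint:  corner ~1.6 GHz - fine for a ~1.5 GHz 4K30 lane,
                #             deadly for the ~3 GHz fundamental of a 5.94 Gb/s 4K60 lane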

                Some of this became clear to us here when we were doing reflows on Polaris cards. There were particular makes and batches of them where it appears the reflow temperature across the card when they were made was wrong; not out by much, but enough to create partly functional joints all over the card that progressively get worse until the card does not function at all. The repaired cards were outperforming non-repaired cards that were not from known defective batches, so then we went looking. Note these cards are all the same chipset and PCB design. You would then find that a few pins were not soldered right on the outputs; fix those individually, so no need to reflow the complete thing, and the card's performance would come up, so a card that was not doing 4K 60Hz would, after the fix, do it without issue and without changing cables.

                Yes, these fixes would also see the differences between Linux and Windows on doing 4K 60Hz reduce as well. After we found this we looked at people who had Nvidia cards and were complaining about the same problems, in models where others said 4K 60Hz worked fine; this turned out to be solder faults again, with Nvidia support telling them over and over again to change cables rather than check for bad solder.

                Really, it was surprising to me how the little bit of solder connecting the HDMI/DisplayPort sockets to the GPU could in fact cause this level of interference at times.

                It is really like the story of the straw that broke the camel's back: the minor thing that does not look important can be a very important factor.

                Alexmitter, it's really fun, when you have to explain what is happening, to intentionally make these different bad solder joints so you can get oscilloscope traces of what the different defective joints are up to, and find out what they look like new and what they look like 6-8 months later.

                The good part is that most of the defective solder joints are what are called dry joints: if you can do a visual inspection on them, they have a dull or matt finish instead of the correct shiny finish once they are finally giving trouble. The horrible part is we have cards where we had photographic evidence of joints that at first look shiny and seem to behave right, but they are not right; they were made at the wrong temperature for particular types of flux, so flux chemicals stayed in the solder, and the result is that the joint changes from looking and behaving like a perfectly good joint to, 6-8 months down the track, being a really degraded dry joint in both colour and performance. The cause could be as simple as something in the reflow chamber slightly insulating that section of the board. From testing, it's less than 0.5 of a degree C out in the reflow chamber at that point on the board between having this issue and not having it. These kinds of slow-appearing dry joints turn up on new GPUs more than you would expect, as they pass all testing at the factory and look and behave perfectly there.

                These are luck of the draw: you happen to get one of the cards that did not, for some reason, reach the right temperature to solder correctly. Really horrible when it's the complete card, like some of the Polaris cards were. Yes, there have been historic batches of Nvidia cards with complete soldering failures as well. This is basically the reverse of the silicon lottery: you win yourself a lemon that at first seems to be fine but fairly quickly ceases to be, which you may or may not notice.



                • #28
                  Originally posted by CochainComplex View Post
                  So if I understand you correctly... solder joints on Nvidia cards are good and on AMD's are bad? (Yes, bad solder joints can have the effects you pointed out.)
                  I have had bad solder joints on both AMD and Nvidia cards. Really, a minor temperature fault when the card is being constructed in the factory will create them. If you don't push your HDMI/DisplayPort outputs to the max, you may have a card with defective solder joints on the outputs and never notice.

                  Nvidia tech support will tell people to keep on trying different cables in the hope they find one that works, so they can blame the cable instead of having to admit some cards have solder issues.

                  Raspberry Pi, Intel, Nvidia or AMD << to be correct, I have had defective solder joints on every one listed there causing 4K 60Hz not to work. Yes, your onboard motherboard HDMI header not soldered on right, or the display ports on an Nvidia/AMD card not soldered on right, all cause the same set of problems.

                  This is basically a GPU-vendor-neutral fault. It just happens to be a form of lemon you can get with anything that has a video output or port operating at high frequency.



                  • #29
                    Originally posted by oiaohm View Post

                    I have had bad solder joints on both AMD and Nvidia cards. Really, a minor temperature fault when the card is being constructed in the factory will create them. If you don't push your HDMI/DisplayPort outputs to the max, you may have a card with defective solder joints on the outputs and never notice.

                    Nvidia tech support will tell people to keep on trying different cables in the hope they find one that works, so they can blame the cable instead of having to admit some cards have solder issues.

                    Raspberry Pi, Intel, Nvidia or AMD << to be correct, I have had defective solder joints on every one listed there causing 4K 60Hz not to work. Yes, your onboard motherboard HDMI header not soldered on right, or the display ports on an Nvidia/AMD card not soldered on right, all cause the same set of problems.

                    This is basically a GPU-vendor-neutral fault. It just happens to be a form of lemon you can get with anything that has a video output or port operating at high frequency.
                    OK, then I support your statement. I was just wondering if you were implying that the AMD ones are bad and the Nvidia ones are good. With board vendors like ASUS, Sapphire etc. one can expect an even distribution of Nvidia vs AMD failures. It might be possible that one particular design is more prone to issues with signal-chain impedance... but over multiple generations and models there should also be an even distribution.

                    ...btw, if the OEM uses cheaper jacks with less tight manufacturing tolerances, this might also be an issue... along with corrosion.



                    • #30
                      I don't get why AMD users are always writing toxic posts in Nvidia-related news.

