Blender Developers Find Old Linux Drivers Are Better Maintained Than Windows


  • #11
    Originally posted by thecursedfly View Post
    Still, I remember when Radeon+Vega mobile chipsets came out last year, it took some time before they were well supported on Linux (I don't know exactly at what stage that support is right now since I ended up with Intel), and I believe that was a few months slower than for Windows.
    That is still a bit of a learning-curve problem. AMD ran into issues upstreaming code into the Linux kernel due to real defects found in review. Yes, it was a few months slower than Windows to get deployed, but it was less buggy than the first deployed Windows drivers. Peer review does not always go smoothly, and that does at times delay Linux drivers for new hardware.

    Originally posted by thecursedfly View Post
    Then we don't want to talk about Nvidia, while that is one of the main players, if not THE player if we look at performance. I was "forced" to buy an Intel integrated graphics laptop just because of the Nvidia drivers situation (my previous Nvidia+Linux experience has been terrible: frequent freezes). There was too little choice in AMD laptops.
    The problem with Nvidia is a driver built around Microsoft's driver model (the closed-source Nvidia blob) being mixed into a kernel whose driver model is designed the way Intel's Linux drivers are.

    Originally posted by thecursedfly View Post
    Also, I'm still waiting for OpenCL to be enabled by default for my Intel UHD 620 card; a new NEO Intel driver is being talked about for that, but it's not yet the default under Fedora.
    This is again pretty much the price of development, with the Intel work going through review.

    Originally posted by thecursedfly View Post
    There is also the confusing situation of multiple different drivers for the same devices of each brand, which doesn't help the users and creates confusion; under Windows you have ONE driver and it works well, while under Linux it seems that there are several attempts, none of which work perfectly (not saying it's like that, but it's my experience).
    The reality here is that this is missing something. On Windows you normally don't have one driver: you have the system-provided one and a third-party installed one. And on Windows the system-provided driver is often such crap that it's not usable.

    Also, having multiple different driver versions for a card is because the Linux drivers remain in development, where the Windows drivers normally end up frozen in stone.



    • #12
      I think this thread from earlier this week is illuminating: https://lists.freedesktop.org/archiv...il/218511.html

      In it, a change to some common Mesa code last December broke new Firefox versions on certain old r600 class cards.

      Virtually no "official" AMD help could be found, because their developers are 100% focused on more current hardware.

      However, the fact that the driver is open source allowed the affected user to bisect and find the exact commit that caused the problem. And then an interested third-party dev (thanks Dave Airlie!) was able to find the problem and fix it relatively easily.
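      For anyone who hasn't used it, that bisect workflow is worth knowing: git mechanically narrows the history down to the first bad commit. Below is a toy sketch of the idea in a throwaway repo where "commit 5" plants the regression; the real Mesa case obviously involved rebuilding the driver and retesting Firefox at each step rather than a `grep`:

```shell
# Toy git bisect demo: build a small history where one commit introduces
# a "regression", then let `git bisect run` find it automatically.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name demo

for i in 1 2 3 4 5 6 7 8; do
    echo "$i" > counter                       # something changes every commit
    if [ "$i" -ge 5 ]; then                   # commit 5 introduces the bug
        echo broken > state
    else
        echo ok > state
    fi
    git add counter state
    git commit -q -m "commit $i"
done

git bisect start HEAD HEAD~7 >/dev/null       # bad = tip, good = commit 1
git bisect run grep -qx ok state >/dev/null   # exit 0 = good, non-zero = bad
first_bad_msg=$(git log -1 --format=%s refs/bisect/bad)
echo "first bad: $first_bad_msg"              # prints: first bad: commit 5
git bisect reset >/dev/null
```

      In the real case the `git bisect run` test command is whatever reproduces the bug (e.g. a script that builds Mesa and launches the failing app), which is why having the full source history matters so much.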

      That's the difference between the proprietary drivers and the open source ones - the fact that other people besides the manufacturer can help fix them makes all the difference.


      The other side of things is unfortunately less rosy: for new hardware, proprietary drivers definitely have benefits over the open-source ones, because the hardware isn't available to anyone else yet anyway, and the manufacturer doesn't want to release any private details ahead of time.
      Last edited by smitty3268; 05-01-2019, 09:45 PM.



      • #13
        Windows might have its issues, but macOS has completely shafted Nvidia since Mojave (10.14, released last September).

        Despite Apple supporting eGPUs and having Pascal-generation support in High Sierra (10.13, the previous release), they dropped support for Nvidia GPUs back to Kepler, saying that only GPUs with official support for their proprietary graphics API can be used going forward. Nvidia is more than happy to provide working drivers, but Apple refuses to accept them for Mojave. Apple's current hardware has moved to AMD GPUs.

        So if you want to use an Nvidia GPU newer than Kepler (which is pretty damn old now), you have to avoid updating macOS to Mojave (or newer, assuming they continue to refuse to support Nvidia going forward). High Sierra is also limited to the Pascal generation, nothing newer.



        • #14
          Personally I don't find Linux all that great in this regard. Try running an old video card, like any AGP card: the nVidia MX440 in my laptop, or the 7900GT in my arcade machine. Or even the PCIe 8800GTS 640 in my other arcade machine. The problem is those old nVidia drivers are not kept up to date and don't work with the latest X.org. Don't even get me started on ATI/AMD; it's even worse, so buggy.

          Meanwhile those cards run fine on Windows, except S-Video output gets disabled. Which is a huge, stupid, separate problem in itself, something about copyright, as if people would even use low-quality S-Video for that. The screens in my arcade machines only take S-Video, so I stick to the broken, hacked, patched-up Linux+nVidia setup, but it's a PITA.



          • #15
            Originally posted by thecursedfly View Post
            under Windows you have ONE driver and it works well, while under Linux it seems that there are several attempts, none of which work perfectly (not saying it's like that, but it's my experience).
            A few days ago somebody handed me a crappy laptop to find the best Windows drivers for. For the sound card (some Sigmatel chip, IIRC) I found 3 different drivers:
            1. The one from the manufacturer's website. This one was horribly outdated and had almost no functions.
            2. The one Windows Update wanted to throw in. It looked like a newer but still outdated version of the first, still with basic functionality missing.
            3. The default HDA Windows driver. This one is the newest and offered functionality the others didn't (like per-sink volume adjustment, so you can have a different volume when plugging in headphones, for example), and I couldn't find any downside to using it except Windows warning me that the driver might not be for the hardware.

            So are you sure the situation is better on Windows? In my experience it's just plug & play on Linux (in fact I booted a random Linux CD that was lying around just to see what happens, and the sound driver offered everything needed), while Windows wants you to search for and compare different drivers.



            • #16
              Originally posted by linner View Post
              Personally I don't find Linux all that great in this regard. Try running an old video card, like any AGP card: the nVidia MX440 in my laptop, or the 7900GT in my arcade machine. Or even the PCIe 8800GTS 640 in my other arcade machine. The problem is those old nVidia drivers are not kept up to date and don't work with the latest X.org. Don't even get me started on ATI/AMD; it's even worse, so buggy.

              Meanwhile those cards run fine on Windows, except S-Video output gets disabled. Which is a huge, stupid, separate problem in itself, something about copyright, as if people would even use low-quality S-Video for that. The screens in my arcade machines only take S-Video, so I stick to the broken, hacked, patched-up Linux+nVidia setup, but it's a PITA.
              https://www.x.org/wiki/RadeonFeature/
              I would like to know what you are talking about with the ATI/AMD cards. If you use the radeon driver, all outputs on all generations of ATI/AMD cards work, including S-Video. Yes, there are some features missing in different generations.

              If it's an old ATI AGP card, then apart from the missing features listed in the RadeonFeature table it works perfectly these days, including S-Video output.

              Really, the only reason I can think of for you calling ATI/AMD buggy is if you have attempted to use the old closed-source drivers that AMD stopped supporting, because in 99% of cases the open-source Mesa/radeon stack rendered better and ran faster. I know this because when I am setting up displays using old computers, I look for the machines with ATI/AMD cards, because they will be a lot less of a pain thanks to pretty much working with the stock drivers.
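              For anyone unsure which driver their card is actually using, `lspci -k` reports the kernel driver bound to each device. A small sketch below; the lspci output is canned as a string so the expected shape is visible (the device and subsystem names are made up), and on a real box you would feed in the output of `lspci -k` itself:

```shell
# Extract the "Kernel driver in use" line from `lspci -k`-style output.
# sample_output is canned for illustration; on real hardware use e.g.:
#   driver_in_use "$(lspci -k -s 01:00.0)"
sample_output='01:00.0 VGA compatible controller: AMD/ATI Tahiti XT [Radeon HD 7970]
    Subsystem: Example Vendor Device 3001
    Kernel driver in use: radeon
    Kernel modules: radeon, amdgpu'

driver_in_use() {
    printf '%s\n' "$1" | sed -n 's/^[[:space:]]*Kernel driver in use: //p'
}

driver_in_use "$sample_output"    # prints: radeon
```

              If that prints radeon (or amdgpu) you are on the open-source stack; the old closed-source driver showed up as fglrx instead.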



              • #17
                Originally posted by oiaohm View Post
                https://www.x.org/wiki/RadeonFeature/
                I would like to know what you are talking about with the ATI/AMD cards. If you use the radeon driver, all outputs on all generations of ATI/AMD cards work, including S-Video. Yes, there are some features missing in different generations.
                It's just buggy: poor performance, graphical glitches, kernel panics, etc. I'm using an AMD 280x on the workstation I'm typing on right now, and 2 of the 3 monitors I have hooked up are flickering with glitches as I type (I'm running Arch Linux with the latest everything). Sometimes video will glitch out with lines on it. The only way to fix it is to reboot. About 90% of the time I boot this computer, only 2 of the monitors turn on until I unplug and re-plug the 3rd one, then it comes on. I also develop OpenGL software and run into bugs against the API all the time. ATI sucks.



                • #18
                  Originally posted by linner View Post

                  It's just buggy: poor performance, graphical glitches, kernel panics, etc. I'm using an AMD 280x on the workstation I'm typing on right now, and 2 of the 3 monitors I have hooked up are flickering with glitches as I type (I'm running Arch Linux with the latest everything). Sometimes video will glitch out with lines on it. The only way to fix it is to reboot. About 90% of the time I boot this computer, only 2 of the monitors turn on until I unplug and re-plug the 3rd one, then it comes on. I also develop OpenGL software and run into bugs against the API all the time. ATI sucks.
                  I hate to say this, but I think that's just the 280x. Back in my 260x/CIK days, the 280x always had bugs that my GPU didn't have with AMDGPU. The reason is that even though the 260x is a cheaper, lower-end GPU, it's based on Bonaire (GCN2), while the 280x is Tahiti (GCN1).



                  • #19
                    Originally posted by linner View Post
                    It's just buggy: poor performance, graphical glitches, kernel panics, etc. I'm using an AMD 280x on the workstation I'm typing on right now, and 2 of the 3 monitors I have hooked up are flickering with glitches as I type (I'm running Arch Linux with the latest everything). Sometimes video will glitch out with lines on it. The only way to fix it is to reboot. About 90% of the time I boot this computer, only 2 of the monitors turn on until I unplug and re-plug the 3rd one, then it comes on. I also develop OpenGL software and run into bugs against the API all the time. ATI sucks.
                    I am sorry to say that with the AMD 280x, be it Windows or Linux, you can be fairly screwed once you start plugging lots of monitors into a single card, as this puts more pressure on the card and seems to bring out a construction defect. AMD developers are still working out exactly what is wrong; all these years later it still doesn't work right, because it does not seem to be all AMD 280x cards that have the problem. Hopefully there will be a kernel driver fix for Linux. But this could be a big batch of horrible lemons.

                    The AMD 280x is also odd in that some makes from some vendors never have screen flicker and run a lot more stable. Some messing with the cards' clocking has made the AMD 280x behave, by underclocking the memory. It's as if on some makes of AMD 280x card the memory does not have suitable cooling or power supply, and then all hell breaks loose as the GPU memory turns into random garbage, or performance stalls due to excess heat.

                    I find the older ATI cards quite dependable, like the AGP ones.

                    As for monitors not coming up: I have had hell with Nvidia and Intel as well as AMD, and it was not because of the graphics card as such. Some monitors horribly only send their ID code within a fixed time of power coming up on the monitor cable, so you can get a timeout depending on which port the monitor is in.



                    • #20
                      Originally posted by oiaohm View Post
                      The AMD 280x is also odd in that some makes from some vendors never have screen flicker and run a lot more stable. Some messing with the cards' clocking has made the AMD 280x behave, by underclocking the memory. It's as if on some makes of AMD 280x card the memory does not have suitable cooling or power supply, and then all hell breaks loose as the GPU memory turns into random garbage, or performance stalls due to excess heat.
                      Good catch. I completely forgot about that clocking stuff, since it was something I only ever read about and never had to put into practice. IIRC, sometimes simply forcing dpm to high via the kernel command line could "fix" some of those cards.

                      I've had excess heat issues with two MSI cards now. With my 580, I found that undervolting it stopped all thermal throttling issues, which really helped with gaming performance.

                      Oh, I should add that my 260x ran a lot more stable with the memory underclocked (it ran a few degrees C cooler that way).
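                      For anyone wanting to try the power-level forcing discussed above: the radeon and amdgpu drivers expose it through the power_dpm_force_performance_level file in sysfs (values auto, low, or high, written as root; on a real system the directory is typically /sys/class/drm/card0/device). A sketch with the sysfs directory passed as a parameter, exercised here against a scratch directory instead of real hardware:

```shell
# Force the DPM performance level ("auto", "low" or "high") for a card.
# The sysfs directory is a parameter so this can be tried harmlessly;
# real use (as root): set_dpm_level /sys/class/drm/card0/device low
set_dpm_level() {
    card_dir=$1
    level=$2
    echo "$level" > "$card_dir/power_dpm_force_performance_level"
}

# Exercise it against a scratch directory instead of real sysfs:
scratch=$(mktemp -d)
touch "$scratch/power_dpm_force_performance_level"
set_dpm_level "$scratch" low
cat "$scratch/power_dpm_force_performance_level"   # prints: low
```

                      Forcing "low" keeps the clocks (including memory) pinned at their lowest state, which is effectively the underclock trick; "auto" restores normal dynamic power management.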

                      I hope you get your GPU stuff figured out, linner.

