Blender Developers Find Old Linux Drivers Are Better Maintained Than Windows


  • Blender Developers Find Old Linux Drivers Are Better Maintained Than Windows

    Phoronix: Blender Developers Find Old Linux Drivers Are Better Maintained Than Windows

    To not a lot of surprise compared to the world of proprietary graphics drivers on Windows where once the support is retired the driver releases stop, old open-source Linux OpenGL drivers are found to be better maintained...

    http://www.phoronix.com/scan.php?pag...U-Requirements

  • ms178
    replied
    Originally posted by polarathene View Post
    Windows might have its issues, but macOS has completely shafted Nvidia since Mojave (10.14, released last September).

    Despite Apple supporting eGPUs and having Pascal-generation support in High Sierra (10.13, the previous release), they dropped support for Nvidia GPUs back to Kepler, citing that only GPUs with official support for their proprietary graphics API can be used going forward. Nvidia is more than happy to provide working drivers, but Apple refuses to accept them for Mojave. Apple's current hardware has moved to AMD GPUs.

    So if you want to use an Nvidia GPU newer than Kepler (which is pretty damn old now), you have to avoid updating macOS to Mojave (or newer, assuming they continue to refuse to support Nvidia going forward). High Sierra is also limited to the Pascal generation, nothing newer.
    There is unfortunately just one way around it: Avoid Apple products. This company is even more anti-consumer than Nvidia or Intel.



  • skeevy420
    replied
    Originally posted by oiaohm View Post
    The AMD 280x is also odd in that some makes from some vendors never have screen flicker and run a lot more stable. Some people messing with the card's clocks have made the AMD 280x behave by underclocking the memory. It's as if on some makes of the AMD 280x the memory doesn't have suitable cooling or power delivery, and then all hell breaks loose as the GPU memory turns into random garbage or performance stalls from excess heat.
    Good catch. I completely forgot about that clocking stuff since it was something I only ever read about and never had to put into practice. IIRC, sometimes simply forcing dpm to high via the kernel command line could "fix" some of those cards.

    I've had excess heat issues with two MSI cards now. With my 580 I found that undervolting it stopped all the thermal throttling issues, which really helped with gaming performance.

    Oh, I should add that my 260x ran a lot more stable with the memory underclocked (it ran a few °C cooler that way).

    I hope you get your GPU stuff figured out, linner.
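    For anyone wanting to try the dpm and underclocking tricks discussed above, the radeon/amdgpu drivers expose these knobs through sysfs. A rough sketch — the card index (`card0`), exact file availability, and accepted values vary by driver and kernel version, so treat the paths below as assumptions to verify on your own machine:

```shell
# Force the DPM performance level (values include auto, low, high;
# amdgpu also accepts manual). card0 is an assumption -- check /sys/class/drm/
echo high | sudo tee /sys/class/drm/card0/device/power_dpm_force_performance_level

# On amdgpu, list the available memory-clock (VRAM) states; the active one is marked
cat /sys/class/drm/card0/device/pp_dpm_mclk

# Pin the VRAM to its lowest state (a crude underclock) by selecting index 0
echo manual | sudo tee /sys/class/drm/card0/device/power_dpm_force_performance_level
echo 0 | sudo tee /sys/class/drm/card0/device/pp_dpm_mclk
```

    On the old radeon driver, the boot-time equivalent is enabling dpm with the `radeon.dpm=1` kernel parameter and then using the same `power_dpm_force_performance_level` file.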



  • oiaohm
    replied
    Originally posted by linner View Post
    It's just buggy. Poor performance, graphical glitches, kernel panics, etc. I'm using an AMD 280x on the workstation I'm typing on right now, and 2 of the 3 monitors I have hooked up are flickering and glitching as I type (I'm running Arch Linux with the latest everything). Sometimes video will glitch out with lines on it. Etc... The only way to fix it is to reboot. About 90% of the time I boot this computer, only 2 of the monitors turn on until I unplug and re-plug the 3rd one, then it comes on. I also develop OpenGL software and run into bugs all the time against the API. ATI sucks.
    I am sorry to say that with the AMD 280x, be it Windows or Linux, you can be fairly screwed once you start plugging lots of monitors into a single card, as this puts more pressure on the card and seems to bring out a construction defect. AMD developers are still working out exactly what is wrong; all these years later it still doesn't work right, because it does not seem to be all AMD 280x cards that have the problem. Hopefully there will be a kernel driver fix for Linux. But this could be a big batch of horrible lemons.

    The AMD 280x is also odd in that some makes from some vendors never have screen flicker and run a lot more stable. Some people messing with the card's clocks have made the AMD 280x behave by underclocking the memory. It's as if on some makes of the AMD 280x the memory doesn't have suitable cooling or power delivery, and then all hell breaks loose as the GPU memory turns into random garbage or performance stalls from excess heat.

    I find the older ATI cards, like the AGP ones, quite dependable.

    As for monitors not turning up: I have had hell with Nvidia and Intel as well as AMD, and it was not because of the graphics card as such. Some horrible monitors only send their ID code (EDID) within a short window of power coming on the monitor cable, so you can hit a detection timeout depending on what port the monitor is on.
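    That detection timeout can sometimes be worked around from userspace by forcing the output on instead of replugging the cable. A minimal sketch with xrandr — the output name `DP-1` is an assumption (check `xrandr --query` for your actual names), and the modeline shown is just an example generated with `cvt`:

```shell
# See which outputs X currently believes are connected
xrandr --query

# Force the misbehaving output on at its preferred mode, no replug needed
xrandr --output DP-1 --auto

# If no EDID ever arrived, define and assign a mode by hand
# (modeline generated with: cvt 1920 1080 60)
xrandr --newmode "1920x1080_60.00" 173.00 1920 2048 2248 2576 1080 1083 1088 1120 -hsync +vsync
xrandr --addmode DP-1 "1920x1080_60.00"
xrandr --output DP-1 --mode "1920x1080_60.00"
```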



  • skeevy420
    replied
    Originally posted by linner View Post

    It's just buggy. Poor performance, graphical glitches, kernel panics, etc. I'm using an AMD 280x on the workstation I'm typing on right now, and 2 of the 3 monitors I have hooked up are flickering and glitching as I type (I'm running Arch Linux with the latest everything). Sometimes video will glitch out with lines on it. Etc... The only way to fix it is to reboot. About 90% of the time I boot this computer, only 2 of the monitors turn on until I unplug and re-plug the 3rd one, then it comes on. I also develop OpenGL software and run into bugs all the time against the API. ATI sucks.
    I hate to say this, but I think that's just the 280x. Back in my 260x/CIK days, the 280x always had bugs that my GPU didn't have under AMDGPU. The reason is that even though the 260x is a cheaper, lower-end GPU, it's based on Bonaire (GCN2), while the 280x is Tahiti (GCN1).



  • linner
    replied
    Originally posted by oiaohm View Post
    https://www.x.org/wiki/RadeonFeature/
    I would like to know what you are talking about with the ATI/AMD cards. If you use the radeon driver, all outputs on all generations of ATI/AMD cards work, including S-Video. Yes, there are some features missing in different generations.
    It's just buggy. Poor performance, graphical glitches, kernel panics, etc. I'm using an AMD 280x on the workstation I'm typing on right now, and 2 of the 3 monitors I have hooked up are flickering and glitching as I type (I'm running Arch Linux with the latest everything). Sometimes video will glitch out with lines on it. Etc... The only way to fix it is to reboot. About 90% of the time I boot this computer, only 2 of the monitors turn on until I unplug and re-plug the 3rd one, then it comes on. I also develop OpenGL software and run into bugs all the time against the API. ATI sucks.



  • oiaohm
    replied
    Originally posted by linner View Post
    Personally I don't find Linux all that great in this regard. Try running an old video card like any AGP card: the nVidia MX440 in my laptop, or the 7900GT in my arcade machine. Or even the PCIe 8800GTS 640 in my other arcade machine. The problem is those old nVidia drivers are not kept up to date and don't work with the latest X.org. Don't even get me started on ATI/AMD; it's even worse, so buggy.

    Meanwhile those cards run fine on Windows, except that S-Video output gets disabled. Which is a huge, stupid, separate problem in itself; something about copyright; as if people would even use low-quality S-Video for that. The screens in my arcade machines only take S-Video, so I stick with the broken, hacked, patched-up Linux+nVidia setup, but it's a PITA.
    https://www.x.org/wiki/RadeonFeature/
    I would like to know what you are talking about with the ATI/AMD cards. If you use the radeon driver, all outputs on all generations of ATI/AMD cards work, including S-Video. Yes, there are some features missing in different generations.

    If it's an old ATI AGP card, then other than the missing features listed in the RadeonFeature table, it works perfectly these days, including S-Video output.

    Really, the only reason I can think of for you to call ATI/AMD buggy is if you have attempted to use the old closed-source drivers AMD stopped supporting, because in 99% of cases the open-source Mesa/radeon stack renders better and runs faster. I know this because when I am setting up displays using old computers, I look for the machines with ATI/AMD cards, because they are a lot less pain in the ass due to pretty much working with the stock drivers.
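    A quick way to check which stack is actually in use before blaming the hardware — this sketch assumes `mesa-utils` (for `glxinfo`) and `pciutils` are installed, which is typical but not guaranteed on a given distro:

```shell
# Which OpenGL implementation is rendering (Mesa vs. a proprietary blob)
glxinfo | grep -E "OpenGL (vendor|renderer|version) string"

# Which kernel driver is bound to the GPU (radeon, amdgpu, nouveau, nvidia, ...)
lspci -k | grep -EA3 "VGA|3D"
```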



  • V10lator
    replied
    Originally posted by thecursedfly View Post
    under Windows you have ONE driver and it works well, while under Linux it seems that there are several attempts, none of which work perfectly (not saying it's like that, but it's my experience).
    A few days ago somebody handed me a crappy laptop to find the best Windows drivers for it. For the sound card (some Sigmatel chip, IIRC) I found 3 different drivers:
    1. The one from the manufacturer's website. This one was horribly outdated and had almost no functions.
    2. The one Windows Update wanted to throw in. It looked like a newer but still outdated version of the first, still with basic functionality missing.
    3. The default HDA Windows driver. This one is the newest and offered functionality the others didn't (like per-sink volume adjustment, so you can have a different volume when plugging in headphones, for example), and I couldn't find any downside to using it except Windows warning me that the driver might not be for the hardware.

    So are you sure the situation is better on Windows? In my experience it's just plug&play on Linux (in fact, I booted a random Linux CD that was lying around just to see what happens, and the sound driver offered everything needed), while Windows wants you to search for and compare different drivers.



  • linner
    replied
    Personally I don't find Linux all that great in this regard. Try running an old video card like any AGP card: the nVidia MX440 in my laptop, or the 7900GT in my arcade machine. Or even the PCIe 8800GTS 640 in my other arcade machine. The problem is those old nVidia drivers are not kept up to date and don't work with the latest X.org. Don't even get me started on ATI/AMD; it's even worse, so buggy.

    Meanwhile those cards run fine on Windows, except that S-Video output gets disabled. Which is a huge, stupid, separate problem in itself; something about copyright; as if people would even use low-quality S-Video for that. The screens in my arcade machines only take S-Video, so I stick with the broken, hacked, patched-up Linux+nVidia setup, but it's a PITA.



  • polarathene
    replied
    Windows might have its issues, but macOS has completely shafted Nvidia since Mojave (10.14, released last September).

    Despite Apple supporting eGPUs and having Pascal-generation support in High Sierra (10.13, the previous release), they dropped support for Nvidia GPUs back to Kepler, citing that only GPUs with official support for their proprietary graphics API can be used going forward. Nvidia is more than happy to provide working drivers, but Apple refuses to accept them for Mojave. Apple's current hardware has moved to AMD GPUs.

    So if you want to use an Nvidia GPU newer than Kepler (which is pretty damn old now), you have to avoid updating macOS to Mojave (or newer, assuming they continue to refuse to support Nvidia going forward). High Sierra is also limited to the Pascal generation, nothing newer.

