SteamOS Update 153 Brings Updated AMD/NVIDIA Linux Graphics Drivers

  • SteamOS Update 153 Brings Updated AMD/NVIDIA Linux Graphics Drivers

    Phoronix: SteamOS Update 153 Brings Updated AMD/NVIDIA Linux Graphics Drivers

    While some are pondering the state of SteamOS and Steam Machines rumors, Valve has put out SteamOS Update 153 after being in the Alchemist Beta state for the past week...


  • #2
    If somebody wants fglrx-driver 14.12, it is in Debian experimental or in the Kanotix Dragonfire repository, patched with kernel support up to 3.18 (which does not seem to be needed there). The driver is basically only useful for XBMC (sadly not Kodi); game performance is still bad.
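    For anyone unfamiliar with pulling a package from Debian experimental, it typically looks like the sketch below. This is only a rough outline, not Kano's exact setup: the mirror URL and suite components are assumptions, and experimental packages are never installed unless you ask for them explicitly with `-t`.

    ```shell
    # Add the experimental suite (adjust the mirror for your setup;
    # fglrx lives in non-free, so include contrib and non-free)
    echo 'deb http://ftp.debian.org/debian experimental main contrib non-free' \
      | sudo tee /etc/apt/sources.list.d/experimental.list

    sudo apt-get update

    # experimental has a very low default pin priority, so packages from it
    # are only installed when requested explicitly via the target release
    sudo apt-get -t experimental install fglrx-driver
    ```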



    • #3
      well

      Originally posted by Kano View Post
      If somebody wants fglrx-driver 14.12, it is in Debian experimental or in the Kanotix Dragonfire repository, patched with kernel support up to 3.18 (which does not seem to be needed there). The driver is basically only useful for XBMC (sadly not Kodi); game performance is still bad.
      Get a stronger video card. Game performance is good. I have no problems with any Steam Linux game. Even Witcher 2 runs OK after the patches from the developer.



      • #4
        Phoronix, can you do a performance comparison between Ubuntu and SteamOS, please?



        • #5
          Originally posted by marceel View Post
          Get a stronger video card. Game performance is good. I have no problems with any Steam Linux game. Even Witcher 2 runs OK after the patches from the developer.
          Did I miss something? I read "Kano" and, below it, "Kanotix Developer". Isn't he the developer?


          It is a real question, though.
          Last edited by Paul-L; 15 January 2015, 06:50 AM.



          • #6
            Originally posted by marceel View Post
            Get a stronger video card. Game performance is good. I have no problems with any Steam Linux game. Even Witcher 2 runs OK after the patches from the developer.
            Buying stronger hardware while letting 80% of the potential performance go out the window is not a solution. It is AMD's job to improve its drivers; you can't deny that.



            • #7
              Originally posted by eydee View Post
              Buying stronger hardware while letting 80% of the potential performance go out the window is not a solution. It is AMD's job to improve its drivers; you can't deny that.
              Kano hopes someone will fix the blob driver for a five-year-old chip (and engineers probably sat down to design that hardware about three years before that, so all told we are going eight years back). That can happen, but only very rarely, for any vendor's drivers; such hardware is more likely to be dropped from a driver than to be improved.

              Sad but true: as time goes on, fewer and fewer people look back. I can't deny that.
              Last edited by dungeon; 15 January 2015, 09:31 AM.



              • #8
                Originally posted by dungeon View Post
                Kano hopes someone will fix the blob driver for a five-year-old chip (and engineers probably sat down to design that hardware about three years before that, so all told we are going eight years back). That can happen, but only very rarely, for any vendor's drivers; such hardware is more likely to be dropped from a driver than to be improved.

                Sad but true: as time goes on, fewer and fewer people look back. I can't deny that.
                This isn't a static situation. Let me explain. The rate at which hardware is designed and changed is slowing down, due to technology reaching its limits and architectures stabilizing. (Once you have GPGPU, where GPUs are basically massively parallel processors, you won't see fundamental changes in GPU design under the hood.)

                There will be a large upheaval in the coming years with the next-generation graphics APIs (DirectX 12, OpenGL NG/5, Mantle and Apple's Metal) and PBR (Physically Based Rendering), but otherwise the stabilizing, slowing, lengthening refresh trend will continue. In the future we might have fewer hardware releases, making longer support cycles more realistic.



                • #9
                  Originally posted by plonoma View Post
                  This isn't a static situation. Let me explain. The rate at which hardware is designed and changed is slowing down, due to technology reaching its limits and architectures stabilizing. (Once you have GPGPU, where GPUs are basically massively parallel processors, you won't see fundamental changes in GPU design under the hood.)

                  There will be a large upheaval in the coming years with the next-generation graphics APIs (DirectX 12, OpenGL NG/5, Mantle and Apple's Metal) and PBR (Physically Based Rendering), but otherwise the stabilizing, slowing, lengthening refresh trend will continue. In the future we might have fewer hardware releases, making longer support cycles more realistic.
                  Yeah, longer release cycles have already been happening.

                  The last major architectural overhaul at AMD was GCN; the last at NVIDIA was Fermi. In both cases that was a few years ago.



                  • #10
                    Originally posted by duby229 View Post
                    Yeah, longer release cycles have already been happening.

                    The last major architectural overhaul at AMD was GCN; the last at NVIDIA was Fermi. In both cases that was a few years ago.
                    And both are stupid. Video cards were invented for the very reason of having specialized hardware for 3D graphics. Now they want to go back and turn video cards into generic computing devices. It makes no sense; we already have hardware for that.

