SteamOS Update 153 Brings Updated AMD/NVIDIA Linux Graphics Drivers

  • Kano
    replied
    I have lots of gfx cards, most likely more than 10, and I only bought very few myself. Feel free to send me one, but don't expect me to spend a cent on an AMD gfx card upgrade. The HD 5670 can run fglrx, which is enough to test driver quality. If you need to buy a (former) high-end card to run an 11-year-old engine, then something must be wrong. The OSS drivers are better for really fast cards, but SteamOS has a simple autodetection that enforces fglrx.


  • marceel
    replied
    Originally posted by Kano View Post
    @marceel

    I do not buy gfx cards; feel free to send me something better. fglrx has some basic problems, especially with the Source engine, where there is a weird lag that does not happen with the Nvidia binary. You might be right that an HD 5670 is not a fair comparison to my GTX 650 Ti, but I have Haswell and a GT 630 (Kepler) to compare as well. Witcher 2 is unplayable with my AMD gfx card, but that's no real surprise; it was unstable with my Nvidia card as well, especially if you have lots of savegames.
    I had a 4850 and now I have a 6950, and I never had problems with the Source engine (Dota 2). Right now I'm playing Civ: BE at 1920x1080 on ultra settings and get 25-30 FPS. Even Path of Exile is now playable with the new driver.

    So I think a hardware upgrade might be a good choice for you if you want the blob drivers; otherwise you should use the open driver.


  • The Walking Glitch
    replied
    No, game performance is still unacceptably bad. I get several times the framerate in TF2 with this card under Wangblows, and a comparable Nvidia card gets several times that.

    Originally posted by marceel View Post
    Get a stronger video card. Game performance is good. I have no problems with any Steam Linux game. Even Witcher 2 runs OK after the patches from the developer.


  • bridgman
    replied
    Originally posted by eydee View Post
    And both are stupid. Video cards were invented for the very reason of having specialized hardware for 3D graphics. Now they want to turn that back around and make video cards into generic computing things. It makes no sense; we already have hardware for that.
    Graphics cards still have specialized hardware for 3D graphics. What changed is that the 3D graphics pipeline evolved from fixed-function processing to programmable processing (shaders) with each pipeline stage requiring a lot of floating point processing. The industry response to that was a unified shader core that could be shared between pipeline stages, and a side-effect of *that* was that you ended up with a parallel processing core which could also be used for other things.
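
    (An aside, illustrative rather than from the post: the practical consequence of that unified core is that any GPGPU API can dispatch an arbitrary data-parallel kernel onto the same ALUs that run shaders. A minimal sketch in CUDA; the kernel, sizes and values are made up for illustration.)

    #include <cstdio>
    #include <cuda_runtime.h>

    // y[i] = a * x[i] + y[i] -- graphics ALUs doing generic arithmetic.
    __global__ void saxpy(int n, float a, const float *x, float *y)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            y[i] = a * x[i] + y[i];
    }

    int main()
    {
        const int n = 1 << 20;
        float *x, *y;
        cudaMallocManaged(&x, n * sizeof(float));  // unified memory, CUDA 6+
        cudaMallocManaged(&y, n * sizeof(float));
        for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

        // One thread per element, scheduled across the unified shader core.
        saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
        cudaDeviceSynchronize();

        printf("y[0] = %f\n", y[0]);  // expect 4.0
        cudaFree(x);
        cudaFree(y);
        return 0;
    }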


  • Kano
    replied
    @marceel

    I do not buy gfx cards; feel free to send me something better. fglrx has some basic problems, especially with the Source engine, where there is a weird lag that does not happen with the Nvidia binary. You might be right that an HD 5670 is not a fair comparison to my GTX 650 Ti, but I have Haswell and a GT 630 (Kepler) to compare as well. Witcher 2 is unplayable with my AMD gfx card, but that's no real surprise; it was unstable with my Nvidia card as well, especially if you have lots of savegames.

    @Paul-L

    I founded Kanotix, that's correct. But I am not the only developer. Maybe I'm just the only one who posts here.


  • duby229
    replied
    Originally posted by eydee View Post
    And both are stupid. Video cards were invented for the very reason of having specialized hardware for 3D graphics. Now they want to turn that back around and make video cards into generic computing things. It makes no sense; we already have hardware for that.
    I don't know about that. Maybe you're right.

    But rasterizer technology probably isn't the future of 3D graphics. Well, it will be for a while yet, but not forever. I'm still fascinated by realtime raytracing; I can't wait for GPUs to get there.

    EDIT: Raytracing can look almost like live action. When it can be done in realtime, the fun stuff will start coming out.
    Last edited by duby229; 15 January 2015, 10:37 AM.
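
    (Illustrative, not from the post: the intersection math at the heart of a ray tracer is tiny; what keeps it from realtime is running billions of such tests per frame against full scenes, with very incoherent memory access. A minimal ray-sphere sketch in CUDA, one thread per pixel; the scene values are made up.)

    #include <cstdio>
    #include <cmath>
    #include <cuda_runtime.h>

    struct Vec3 { float x, y, z; };

    __device__ Vec3 sub(Vec3 a, Vec3 b) { Vec3 r = {a.x - b.x, a.y - b.y, a.z - b.z}; return r; }
    __device__ float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

    // One thread per pixel: fire a ray from the origin through the image
    // plane and test it against a single sphere. Real renderers do this
    // against millions of triangles, many bounces deep -- the hard part.
    __global__ void trace(float *depth, int w, int h, Vec3 center, float radius)
    {
        int px = blockIdx.x * blockDim.x + threadIdx.x;
        int py = blockIdx.y * blockDim.y + threadIdx.y;
        if (px >= w || py >= h) return;

        Vec3 o = {0.0f, 0.0f, 0.0f};  // camera at the origin
        Vec3 dir = {(px - w / 2.0f) / h, (py - h / 2.0f) / h, 1.0f};
        float len = sqrtf(dot(dir, dir));
        Vec3 d = {dir.x / len, dir.y / len, dir.z / len};  // normalized

        // Solve |o + t*d - center|^2 = radius^2 for the nearest t.
        Vec3 oc = sub(o, center);
        float b = dot(oc, d);
        float c = dot(oc, oc) - radius * radius;
        float disc = b * b - c;
        depth[py * w + px] = (disc < 0.0f) ? -1.0f : -b - sqrtf(disc);
    }

    int main()
    {
        const int w = 640, h = 480;
        float *depth;
        cudaMallocManaged(&depth, w * h * sizeof(float));

        Vec3 center = {0.0f, 0.0f, 5.0f};
        dim3 block(16, 16), grid((w + 15) / 16, (h + 15) / 16);
        trace<<<grid, block>>>(depth, w, h, center, 1.0f);
        cudaDeviceSynchronize();

        printf("depth at center: %f\n", depth[(h / 2) * w + w / 2]);  // ~4.0
        cudaFree(depth);
        return 0;
    }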


  • eydee
    replied
    Originally posted by duby229 View Post
    Yeah, longer release cycles have already been happening.

    The last major architectural overhaul at AMD was GCN. The last for nVidia was Fermi. In both cases that was a few years ago.
    And both are stupid. Video cards were invented for the very reason of having specialized hardware for 3D graphics. Now they want to turn that back around and make video cards into generic computing things. It makes no sense; we already have hardware for that.


  • duby229
    replied
    Originally posted by plonoma View Post
    This isn't a static situation. Let me explain. The rate at which hardware is designed and changed is slowing down, due to technology reaching its limits and architectures stabilizing. (Once you have GPGPU, where GPUs are basically massively parallel processors, you won't have fundamental changes in GPU design under the hood.)

    There will be a large upheaval in the coming years with the next-generation graphics APIs (DirectX 12, OpenGL NG/5, Mantle and Apple's Metal) and PBR (Physically Based Rendering), but otherwise the stabilizing, slowing, lengthening refresh trend will continue. In the future we might have fewer hardware releases, making longer support cycles more realistic.
    Yeah, longer release cycles have already been happening.

    The last major architectural overhaul at AMD was GCN. The last for nVidia was Fermi. In both cases that was a few years ago.


  • plonoma
    replied
    Originally posted by dungeon View Post
    Kano hopes someone will fix the blob driver for a 5-year-old chip (and engineers probably sat down to design that hardware about 3 years before that, so all in all that's going 8 years back)... That can happen, but really very rarely, for any vendor's drivers; a chip that old is more likely to be dropped from a driver than to be improved.

    Sad but true: as time goes on, fewer and fewer people look back - I can't deny that.
    This isn't a static situation. Let me explain. The rate at which hardware is designed and changed is slowing down, due to technology reaching its limits and architectures stabilizing. (Once you have GPGPU, where GPUs are basically massively parallel processors, you won't have fundamental changes in GPU design under the hood.)

    There will be a large upheaval in the coming years with the next-generation graphics APIs (DirectX 12, OpenGL NG/5, Mantle and Apple's Metal) and PBR (Physically Based Rendering), but otherwise the stabilizing, slowing, lengthening refresh trend will continue. In the future we might have fewer hardware releases, making longer support cycles more realistic.


  • dungeon
    replied
    Originally posted by eydee View Post
    Buying stronger hardware while letting 80% of the potential performance go out the window is not a solution. It is AMD's job to improve its drivers; you can't deny that.
    Kano hopes someone will fix the blob driver for a 5-year-old chip (and engineers probably sat down to design that hardware about 3 years before that, so all in all that's going 8 years back)... That can happen, but really very rarely, for any vendor's drivers; a chip that old is more likely to be dropped from a driver than to be improved.

    Sad but true: as time goes on, fewer and fewer people look back - I can't deny that.
    Last edited by dungeon; 15 January 2015, 09:31 AM.
