
Civilization: Beyond Earth Overcoming Linux GPU Driver Problems


  • #51
    Originally posted by emblemparade View Post
    You are correct -- I oversimplified by using the term "square" because I didn't want to get into the sticky details. But ... note that if you are supporting OpenGL ES before version 2.0 (virtually all current devices?), then this problem still exists.
    Sure, but this is again an issue that D3D could in no way have made easier, because GLES1 devices could not possibly support D3D9-level features. You would end up writing two (completely different!) code paths again.

    Originally posted by emblemparade View Post
    MS did a good thing in standardizing the API together with the hardware. So, when you buy a GPU, you know that it supports "D3D 11", and that means all those features. A game can simply advertise that as a requirement and you know you're good. OpenGL is more fragmented, especially when you add OpenGL ES to the mix. Even the most recent OpenGL 4.4 has some optional extensions (very cool stuff from NVIDIA), so putting an "OpenGL 4.4 capable" label on a GPU is still not enough information about what it's capable of doing.
    Yep, that's true; that's why we have this awkward situation where D3D9-capable hardware supports a mix of both GL2 and GL3 (but not all of 3). However, I don't see the issue with this: it just means I have to write maybe a hundred more lines of code. I can say "I require D3D9-level hardware in my game", look at exactly what that means in terms of capabilities, look for the corresponding GL extensions, and just hard-depend on them. Extensions not present? You're very likely not running a D3D9-capable graphics card in the first place.

    I would actually go a step further and say the situation is worse with D3D. Here's an example: let's say you want to use array textures. In OpenGL, you check for the extension. If it's present, your texture handling code will use it. Otherwise, the texture handling part of your engine will transparently fall back.
    Now, what happens in D3D land? Array textures require D3D 10.0. Oops. Now you either rewrite your entire engine in D3D 10, even the bits that have absolutely nothing to do with textures, effectively cutting out a big (albeit shrinking) XP market, or you maintain two completely different renderer code bases (!!), just because of one piece of functionality that OpenGL can discover via extensions.
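    That discovery step is just a token lookup in the extension string. A minimal sketch (the helper name has_extension and the example extension names are mine, not from any particular engine):

    ```c
    #include <string.h>
    #include <stdbool.h>

    /* Return true if `name` appears as a whole, space-delimited token in
     * `ext_list` (the classic glGetString(GL_EXTENSIONS) format). A plain
     * strstr() is not enough: "GL_EXT_texture" would falsely match inside
     * "GL_EXT_texture_array". */
    static bool has_extension(const char *ext_list, const char *name)
    {
        size_t len = strlen(name);
        const char *p = ext_list;

        while ((p = strstr(p, name)) != NULL) {
            bool starts_token = (p == ext_list) || (p[-1] == ' ');
            bool ends_token   = (p[len] == ' ') || (p[len] == '\0');
            if (starts_token && ends_token)
                return true;
            p += len; /* partial match; keep scanning */
        }
        return false;
    }
    ```

    At startup you would call it once per feature, e.g. has_extension((const char *)glGetString(GL_EXTENSIONS), "GL_EXT_texture_array"), and set a flag that the texture code consults. (On a core-profile context you would enumerate extensions with glGetStringi instead, since the aggregate GL_EXTENSIONS string was removed from core.)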

    Originally posted by emblemparade View Post
    You're right -- the difference is in degree, not in kind. These problems seem to be an order of magnitude more painful in OpenGL, with many more versions and extensions. BTW, I totally skipped over the whole issue of GLSL versions, the most baffling aspect of OpenGL advances. Why break the shader language with almost every release? My current project has four versions of each shader.
    I'm not sure I'm following you; what do you mean by "breaking"? Do you mean the backwards-compatibility issues of shader source, like "varying" vs "out"? That's the only major break I'm aware of, from when they created the core profile. As far as I know, you can always request a specific GLSL version as long as the driver supports at least that version or higher. So even on a 3.3 driver, you can compile GLSL 110 shaders just fine. Why do you need four versions if you can always just compile the lowest-version one?



    • #52
      Thanks, you made some good points.

      Originally posted by Ancurio View Post
      I'm not sure I'm following you; what do you mean by "breaking"? Do you mean the backwards-compatibility issues of shader source, like "varying" vs "out"? That's the only major break I'm aware of, from when they created the core profile. As far as I know, you can always request a specific GLSL version as long as the driver supports at least that version or higher. So even on a 3.3 driver, you can compile GLSL 110 shaders just fine. Why do you need four versions if you can always just compile the lowest-version one?
      That break is major enough for me. And then you have things like texture3D being removed. Sure, you can always use the lowest common denominator, but it plain sucks that you would have to. I decided for myself that I want my engine to be forward-looking and use newer versions of GLSL (they are indeed nicer), so it was worth it for me to also maintain older versions. And by the way, OpenGL ES is also different enough that I needed yet more versions for that.
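      To make the break concrete, here is the same trivial fragment shader written twice; this is a hypothetical example of mine, not a shader from the engine discussed above. The two sources would live in separate files, since a GLSL source can only carry one #version directive.

      ```glsl
      // GLSL 110 (compatibility-era syntax)
      #version 110
      varying vec3 texCoord;
      uniform sampler3D volume;

      void main()
      {
          gl_FragColor = texture3D(volume, texCoord);
      }
      ```

      ```glsl
      // The same shader in GLSL 330 core: "varying" becomes "in",
      // texture3D() is folded into the overloaded texture(),
      // and gl_FragColor gives way to a declared output.
      #version 330 core
      in vec3 texCoord;
      uniform sampler3D volume;
      out vec4 fragColor;

      void main()
      {
          fragColor = texture(volume, texCoord);
      }
      ```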

      Well, as I keep reiterating: none of these problems are the end of the world. And it's not like there aren't other challenges in game programming that might be greater. (Want to talk about cross-platform build systems?) But compared to the D3D experience, OpenGL is more painful.



      • #53
        Originally posted by Nille View Post
        But Khronos doesn't force each vendor to implement it correctly. There is nothing from Khronos that tests the implementations (e.g. with piglit, but with more tests) to check that they work as documented.
        Exactly. Implementation matters. On Linux, OpenGL is implemented through Mesa; on Windows, through a few .dlls.

        I mean, sheesh, the GPUs understand the function calls, so that leaves you just needing to implement the OS-side handling of all the data. It's not that hard, and it's more or less inexcusable that the full OpenGL 4.x feature set isn't supported yet. The fact that it isn't tells me there are more basic concerns within Linux that need re-evaluation.



        • #54
          Originally posted by gamerk2 View Post
          I mean, sheesh, the GPUs understand the function calls, so that leaves you just needing to implement the OS-side handling of all the data. It's not that hard, and it's more or less inexcusable that the full OpenGL 4.x feature set isn't supported yet. The fact that it isn't tells me there are more basic concerns within Linux that need re-evaluation.
          Are you new to Phoronix? Michael has been covering Mesa and Gallium3D progress for years. The proprietary drivers don't use Mesa (well, AMD will use it in the future); instead they implement OpenGL on their own, at the latest version with all the goodies. The free drivers still have a way to go. But you're such a genius, so maybe you can pitch in and solve all the issues.

          It is not trivial to support the newer OpenGL features. OpenGL does not map to the way GPUs are designed, and never has (it was created before GPUs really existed). There's a lot of work going on between Mesa and the Gallium3D state-tracker system to get these features running with good performance.

          This is part of the impetus for Mantle and whatever will come after OpenGL: instead of having programmers fight the API, just give them more direct access to the GPU (in a portable way).



          • #55
            Originally posted by emblemparade View Post
            The proprietary drivers don't use Mesa (well, AMD will use it in the future)
            No, AMD is currently developing an open-source kernel driver. AMD's proprietary OpenGL driver will still be fglrx (and closed source).



            • #56
              Originally posted by benmoran View Post
              I'm not directing this solely at you, but:
              Just wanted to add that I'm "lucky" enough to have an Nvidia card in my work machine, and the proprietary driver crashed X about twice a week, like clockwork. It annoyed me to the point that I switched back to the integrated Intel graphics for a time, but Nouveau has thankfully improved enough since then to be a pleasant, stable experience.

              Kudos to the Nouveau team for making my card usable. The Nvidia binary drivers tend to be good, but when they don't work, you're pretty much screwed if not for Nouveau. My laptop and personal desktop both have APUs, both use the Radeon driver, and both run Civilization 5 perfectly. (I'm not holding out much hope for my laptop running this new one, though.)
              Not that it invalidates your anecdotal example, but in the past 12 years I have had literally hundreds of nVidia cards on dozens of machines, ranging from low-end ION Atom netbooks, Bumblebee notebooks, and self-built desktops up to high-end Xeon workstations with Quadro GPUs, all running either Fedora or RHEL. A quick count puts me currently at around 30 active machines.

              ... And they are all *rock* solid.

              Either you are very unlucky or I am extremely lucky. You choose.
              (Or Fedora / RPMFusion may be very good at packaging nVidia, no idea)

              - Gilboa
              DEV: Intel S2600C0, 2xE5-2658V2, 32GB, 6x2TB, GTX1080, F30/x86_64, Dell UP3216Q 4K.
              SRV: Intel S5520SC, 2xX5680, 36GB, 6x2TB, GTX550, F30/x86_64, Dell U2711.
              WIN: Gigabyte B85M-HD3, E3-1245V3, 32GB, 5x1TB, GTX980, Win10Pro.
              LAP: ASUS Strix GL502V, i7-6700HQ, 32GB, 1TB+256GB, 1070M, F30/x86_64.



              • #57
                Originally posted by gilboa View Post
                Not that it invalidates your anecdotal example, but in the past 12 years I have had literally hundreds of nVidia cards on dozens of machines, ranging from low-end ION Atom netbooks, Bumblebee notebooks, and self-built desktops up to high-end Xeon workstations with Quadro GPUs, all running either Fedora or RHEL. A quick count puts me currently at around 30 active machines.

                ... And they are all *rock* solid.

                Either you are very unlucky or I am extremely lucky. You choose.
                (Or Fedora / RPMFusion may be very good at packaging nVidia, no idea)

                - Gilboa

                I have been using an AMD Radeon 7950 for several years, and it has been rock solid. I have just been getting more and more performance with each new driver release. But since I felt it was time for an upgrade (especially now that there are so many games coming to Linux), I listened to people on the net (this forum and others) and bought myself an Nvidia GeForce GTX 770.

                The result was that X would not start for me. I uninstalled fglrx first and then installed the Nvidia drivers, and I could not get it to work. After a lot of fiddling I decided to reinstall Ubuntu completely, and I still didn't get it to work. At last I thought the card was broken, so I took it back to the store; they tried it and said there was nothing wrong with it. So I went to a friend and installed it in his Windows 8.1 machine, and it worked perfectly.
                So I sold the card to him at a small discount, reinstalled my 7950, and now my computer is working perfectly again.

                I tell you this to show that Nvidia does not work perfectly all the time, and I am not saying that AMD does either. But I have learned one thing from this: do not listen too much to people on forums, since there are a lot of fanboys out there.



                • #58
                  Originally posted by vein View Post
                  I have been using an AMD Radeon 7950 for several years, and it has been rock solid. I have just been getting more and more performance with each new driver release. But since I felt it was time for an upgrade (especially now that there are so many games coming to Linux), I listened to people on the net (this forum and others) and bought myself an Nvidia GeForce GTX 770.

                  The result was that X would not start for me. I uninstalled fglrx first and then installed the Nvidia drivers, and I could not get it to work. After a lot of fiddling I decided to reinstall Ubuntu completely, and I still didn't get it to work. At last I thought the card was broken, so I took it back to the store; they tried it and said there was nothing wrong with it. So I went to a friend and installed it in his Windows 8.1 machine, and it worked perfectly.
                  So I sold the card to him at a small discount, reinstalled my 7950, and now my computer is working perfectly again.

                  I tell you this to show that Nvidia does not work perfectly all the time, and I am not saying that AMD does either. But I have learned one thing from this: do not listen too much to people on forums, since there are a lot of fanboys out there.
                  Calling 99% (a made-up number) of nVidia users fanboys just because you were unlucky enough to be part of the "other" 1% is somewhat overstretching your (unlucky) personal experience.
                  But in general, I always remind people, and I cannot stress this enough, that hardware and software configuration matters *a lot*.
                  While I can give you at least 30 different nVidia-based configurations that I know for a fact are rock solid, for the life of me I cannot promise it will even boot if you stray one inch from any of those configurations.
                  The same, BTW, is true for *any* hardware installation on any type of system. This is why big companies pay big bucks for pre-assembled and pre-tested HP/Dell/etc. servers and workstations and stick to the tried-and-true configuration.

                  - Gilboa



                  • #59
                    Originally posted by gilboa View Post
                    Calling 99% (a made-up number) of nVidia users fanboys just because you were unlucky enough to be part of the "other" 1% is somewhat overstretching your (unlucky) personal experience.

                    I think he was just generalizing about fanboys, not nVidia fanboys exclusively.

                    I'm the only Android fanboy in my family; everybody else, including extended family, has Apple products. I'm not out to destroy them; all of us separate "brand" fanboys should just get together and love each other.



                    • #60
                      nVidia issues aside, how's CivV stability? At least in my case, it blows up every 3-4 minutes.
                      (In Aspyr's defense, I've opened a support request and, at least for now, they seem very helpful.)

                      - Gilboa

