Vulkan 1.0.5 API Specification Released


  • #31
    Originally posted by hansg View Post
    I find the notion that the standard is already being extended with alternate (but vendor-specific) ways to do things that are already possible (in the core spec) a very, very bad sign. I had great hopes for Vulkan, that it might not become the mess that is OpenGL (where a beginner is faced with hundreds of extensions that may or may not be supported on specific drivers and hardware), but before there are even proper drivers or applications, the game starts again.

    I suppose NVidia will frantically port every OpenGL extension they support to Vulkan now, leaving us with (yet again) an absolute jungle of unclear extensions a few years down the road. This is just incredibly disappointing.
    Of course, that is what vendors have done and will always do; they try to tempt developers into making non-portable assumptions. It's called "lock-in".
    They want programs that don't say "runs on a GPU" but "requires a brand X GPU". They see developers going lower level, closer to the hardware, as a marketing and competition opportunity. Furthermore, they want managers to harbour the DOUBT that moving to a competitor will bring large knock-on costs; they defend their incumbent market position with this strategy, and it's Nvidia who have the most to lose. Vulkan is a disruptive technology in the OpenGL marketplace, giving new opportunities to firms like AMD and Intel.

    Developers need to be the ones who make toolkits, conformance tests and such for the core parts of the API, so vendor extensions can wither and die, used only by those with legacy requirements (porting a code base already written against vendor-specific extensions).

    Application programmers generally tend to want to code to a platform/engine which takes care of all the nasty details in an efficient enough way (possibly using vendor extensions); but games are NOT like that. A lot of effort may be spent in a good game to wring the last few percent out of the hardware's potential, hence techniques like data-oriented programming rather than objects, to reduce locking contention and so on (a generic sketch follows).
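
    To illustrate the data-oriented point, a generic sketch (the struct names and sizes are made up, not from any real engine): hot fields live in parallel arrays instead of interleaved objects, so the update loop streams through contiguous memory and worker threads can take disjoint index ranges without sharing locks.

    ```c
    #include <stddef.h>

    #define MAX_ENTITIES 4096

    /* Object-style layout: hot and cold data interleaved per entity. */
    struct Entity {
        float px, py, pz;   /* position (hot) */
        float vx, vy, vz;   /* velocity (hot) */
        char  name[64];     /* cold data dragged through the cache */
    };

    /* Data-oriented layout: one contiguous array per hot field. */
    struct EntityArrays {
        float  px[MAX_ENTITIES], py[MAX_ENTITIES], pz[MAX_ENTITIES];
        float  vx[MAX_ENTITIES], vy[MAX_ENTITIES], vz[MAX_ENTITIES];
        size_t count;
    };

    /* Integration touches only the floats it needs; worker threads can
     * each take a disjoint [begin, end) slice with no locking. */
    void integrate(struct EntityArrays *e, size_t begin, size_t end, float dt)
    {
        for (size_t i = begin; i < end && i < e->count; i++) {
            e->px[i] += e->vx[i] * dt;
            e->py[i] += e->vy[i] * dt;
            e->pz[i] += e->vz[i] * dt;
        }
    }
    ```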

    Comment


    • #32
      Okay, I don't understand why NVIDIA's VK_NV_glsl_shader extension was even added. To me it feels like Nvidia is saying "Hey, we are adding things to Vulkan because we know how it works", even though it feels like the opposite.
      In any case, let's see whether the Nvidia-specific extension gets used and whether it leads to problems or success.
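
      For context, a rough sketch of what using the extension amounts to (the helper name is made up; the extension itself, per its registry entry, lets vkCreateShaderModule accept raw GLSL text in pCode instead of SPIR-V words):

      ```c
      #include <string.h>
      #include <vulkan/vulkan.h>

      /* Only valid on a VkDevice created with "VK_NV_glsl_shader" listed
       * in ppEnabledExtensionNames; on any other device this is invalid
       * usage. */
      VkShaderModule create_glsl_module(VkDevice device, const char *glsl_source)
      {
          VkShaderModuleCreateInfo info = {
              .sType    = VK_STRUCTURE_TYPE_SHADER_MODULE_CREATE_INFO,
              .codeSize = strlen(glsl_source),           /* bytes of GLSL text */
              .pCode    = (const uint32_t *)glsl_source, /* normally SPIR-V words */
          };

          VkShaderModule module = VK_NULL_HANDLE;
          vkCreateShaderModule(device, &info, NULL, &module);
          return module;
      }
      ```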

      Comment


      • #33
        I think some of you are reading way too much into this. The extensions are meant as helpers to aid porting and to get you started as quickly as possible. It helped me when I was messing with Vulkan on launch day, and like everyone else I replaced the GLSL extension with the proper way once I got things working, which is about 10 lines of extra code plus some SPIR-V binaries.
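
        For reference, the proper way looks roughly like this, a minimal sketch with illustrative file handling (the helper name is mine, not from any SDK):

        ```c
        #include <stdio.h>
        #include <stdlib.h>
        #include <vulkan/vulkan.h>

        /* Load a pre-compiled SPIR-V binary and wrap it in a shader module. */
        VkShaderModule load_spirv_shader(VkDevice device, const char *path)
        {
            FILE *f = fopen(path, "rb");
            if (!f) return VK_NULL_HANDLE;

            fseek(f, 0, SEEK_END);
            long size = ftell(f);
            fseek(f, 0, SEEK_SET);

            /* SPIR-V is a stream of 32-bit words; malloc's alignment suffices. */
            uint32_t *code = malloc((size_t)size);
            fread(code, 1, (size_t)size, f);
            fclose(f);

            VkShaderModuleCreateInfo info = {
                .sType    = VK_STRUCTURE_TYPE_SHADER_MODULE_CREATE_INFO,
                .codeSize = (size_t)size, /* size in bytes, a multiple of 4 */
                .pCode    = code,
            };

            VkShaderModule module = VK_NULL_HANDLE;
            vkCreateShaderModule(device, &info, NULL, &module);
            free(code);
            return module;
        }
        ```

        The .spv binaries are produced offline, e.g. with glslangValidator -V shader.vert -o vert.spv.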

        As for the Vulkan-to-GL extension: GLFW (the OpenGL windowing library) was pretty quick to get Vulkan working, but if you have a huge OpenGL codebase it's useful for comparing OpenGL output against Vulkan output.

        The idea behind these extensions is that you remove them when you're done. Who would want their application to only work on Nvidia?

        Comment


        • #34
          Originally posted by TheBuzzSaw View Post
          Also, NVIDIA's crimes against humanity had nothing to do with extensions; it was the fact that NVIDIA would hard-code against specific games.

          When I would read NVIDIA driver release notes, I would see that game X runs 12% faster, game Y runs 22% faster, and game Z runs 15% faster. I never really dug far into it. I assumed that NVIDIA was running the game, studying API call patterns, and merely optimizing its driver against those patterns. That is totally fair game as far as I am concerned. If real games commonly use D3D/GL in a particular way, the driver can streamline that path.

          But that's not what was happening. NVIDIA had a bunch of hard-coded checks against whole applications/shaders and was substituting the code with its own re-implementations! In other words, rather than go to Gearbox and say "hey, your Borderlands shaders could be better written this way", NVIDIA just quietly swaps the entire shader out for a better one it wrote. So, on the outside, it appears to the world that Borderlands just "runs better on NVIDIA", but the truth is that AMD (or whatever other GPU) is literally running an outdated and suboptimal shader in the first place! "Oh, AMD sucks. They can't do anything right."

          In short, all vendors create extensions for graphics APIs. That shouldn't upset anyone or be seen as "the first steps to anti-competitive behavior". NVIDIA simply needs to be called out on its attempts to cherry-pick bugs in other programs and create this illusion that NVIDIA's actual driver is "better at everything" when it actually isn't.
          Indeed, people think that games are written for Nvidia and that Nvidia is not the one deviating from the standard. The sad truth is that the Nvidia driver replaces half the game with its own version, without outputting errors for devs to correct. Someone should sue them for this invasive meddling in game code, unless they all agree to make it standard.

          Comment


          • #35
            Originally posted by CapsAdmin View Post
            Who would want their application to only work on Nvidia?
            Feral Interactive.

            Sorry, I couldn't resist the bait. Not commenting on the technical side... But I understand why the Linux community has become paranoid about this: what has happened in the last few years is the opposite of what "PC gaming" is supposed to represent... a place where hardware and software vendors compete within a framework of industry standards.

            Comment


            • #36
              I think there is another side to VK_NV_glsl_shader.
              Supposedly Nvidia has a well-tuned GLSL compiler targeting Nvidia's internal IR, which it uses for OpenGL.
              With Vulkan, Nvidia needs a new SPIR-V compiler targeting that same internal IR.

              So does using VK_NV_glsl_shader generate faster shaders than the standard way?

              If it does, then we know Nvidia is being greedy as usual.

              Comment


              • #37
                Originally posted by rob11311 View Post
                A compiler for a shader doesn't even need to be loaded if unused
                Loading the compiler on demand will take some time, so your FPS will drop.

                Originally posted by triangle View Post
                So does using VK_NV_glsl_shader generate faster shaders than the standard way?
                Maybe in the short run, but the SPIR-V compiler should catch up fast and in the long run be faster (for example: "This allows higher performance lowering to target devices" - source: https://en.wikipedia.org/wiki/Standa...ntation#SPIR-V ).

                Comment


                • #38
                  Originally posted by V10lator View Post
                  Loading the compiler on demand will need some time, so your FPS will drop
                  Stuff like that is done at start-up, as part of initialisation, and is not required per frame, so it ought not to affect FPS at all. You already have big delays when starting a new level/room/area; shader compiling fits naturally into that phase, where data is loaded from disk.
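
                  A sketch of that pattern (the names are illustrative; load_spirv_shader is the kind of helper sketched earlier in the thread): all compilation happens during the loading phase, and the per-frame path only binds what was already built.

                  ```c
                  #include <vulkan/vulkan.h>

                  /* Helper as sketched earlier in the thread: loads a .spv
                   * file into a VkShaderModule. */
                  VkShaderModule load_spirv_shader(VkDevice device, const char *path);

                  typedef struct {
                      VkShaderModule vert, frag;
                      VkPipeline     pipeline;
                  } LevelShaders;

                  /* Runs during the loading screen, alongside the disk I/O. */
                  void load_level_shaders(VkDevice device, VkPipelineCache cache,
                                          LevelShaders *out)
                  {
                      out->vert = load_spirv_shader(device, "shaders/level.vert.spv");
                      out->frag = load_spirv_shader(device, "shaders/level.frag.spv");

                      /* The expensive compile happens here; a VkPipelineCache
                       * lets the driver reuse results from earlier runs.
                       * Pipeline state setup omitted for brevity:
                       *   vkCreateGraphicsPipelines(device, cache, 1,
                       *       &createInfo, NULL, &out->pipeline);
                       */
                  }

                  /* Per frame: just bind and draw; the compiler is never touched. */
                  void record_frame(VkCommandBuffer cmd, const LevelShaders *ls)
                  {
                      vkCmdBindPipeline(cmd, VK_PIPELINE_BIND_POINT_GRAPHICS,
                                        ls->pipeline);
                      /* ... vkCmdDraw* calls ... */
                  }
                  ```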

                  Comment
