Vulkan 1.0.5 API Specification Released


  • #11
    Originally posted by Daktyl198 View Post

    No, but that's the thing: Even if it's not a part of "core" Vulkan, some form of it is in the official, released documentation of Vulkan. It's an extension that, I admit, is going to help a lot of early ports to Vulkan (though with minimal benefits) and while before it was strictly NVidia only, now that it's on the official documentation of the API (definition only or no), people are going to see that as "So everybody ships it, we're okay to use it"... and so everybody will have to ship it. AMD/Intel are basically forced to implement this now.
    I agree with you that extensions often have a vendor lock-in effect and that this move may make some think that this one is official. However, it doesn't seem very dangerous (at least in comparison with other extensions from NVIDIA), because the GLSL shaders that it accepts must anyway adhere to the Vulkan version of GLSL:
    Originally posted by VK_NV_glsl_shader
    Implementations that expose this function allow GLSL shaders […] as an alternative to SPIR-V shaders. […] the GLSL shaders must be authored to the GL_KHR_vulkan_glsl extension specification.
    AFAIU, this means that moving off this extension later would mostly require changes to the build system.

    That said, I don't see the point in this extension. Porting to Vulkan already requires bigger changes (also in the shaders themselves) than merely compiling the shaders in a separate step. It seems like skimping on the window panes when building an expensive house (understandable if one does the same with many other small things, but still questionable).
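    To illustrate, here's a minimal sketch of that "separate step" (the file names and the loadSpirvModule helper are made up): the GLSL is compiled offline, e.g. with glslangValidator -V, and the application just loads the resulting SPIR-V blob into vkCreateShaderModule.
    Code:
    // Offline, as part of the build (not in the application):
    //   glslangValidator -V shader.vert -o shader.vert.spv
    #include <vulkan/vulkan.h>
    #include <fstream>
    #include <vector>

    // Load a precompiled SPIR-V blob and wrap it in a VkShaderModule.
    // 'device' is assumed to be a valid VkDevice owned by the application.
    VkShaderModule loadSpirvModule(VkDevice device, const char* path)
    {
        std::ifstream file(path, std::ios::binary | std::ios::ate);
        std::vector<char> blob(static_cast<size_t>(file.tellg()));
        file.seekg(0);
        file.read(blob.data(), blob.size());

        VkShaderModuleCreateInfo info = {};
        info.sType    = VK_STRUCTURE_TYPE_SHADER_MODULE_CREATE_INFO;
        info.codeSize = blob.size();                                    // size in bytes
        info.pCode    = reinterpret_cast<const uint32_t*>(blob.data()); // SPIR-V words

        VkShaderModule module = VK_NULL_HANDLE;
        vkCreateShaderModule(device, &info, nullptr, &module);
        return module;
    }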
    Last edited by kalrish; 03-05-2016, 05:00 PM.



    • #12
      And there we go, adding vendor-specific stuff; that's the first step toward Vulkan becoming what OpenGL is.
      This
      The whole point of Vulkan was to design a new specification from the ground up without any legacy limitations.
      Introducing GLSL ruins the lean-driver part of that clean design.
      Originally posted by Daktyl198 View Post
      No, but that's the thing: Even if it's not a part of "core" Vulkan, some form of it is in the official released documentation of Vulkan. It's an extension that, I admit, is going to help a lot of early ports to Vulkan (though with minimal benefits) and while before it was strictly NVidia only, now that it's on the official documentation of the API (definition only or no), people are going to see that as "So everybody ships it, we're okay to use it"... and so everybody will have to ship it. AMD/Intel are basically forced to implement this now. Despite what somebody said earlier, extensions ARE part of why OpenGL was so fucked up. 90% of OpenGL problems were from vendor-specific implementations... guess what's harder to implement to spec than core OpenGL? That's right, extensions, which, afaik, were never bound to the (already not-very-strict) definition/implementation rules of core OGL. When there's a lot of them, you're going to have problems with things not rendering exactly the same as the extension creator's implementation.

      Extensions are fine, but I really feel they should be sent to Khronos as a whole for review, then if accepted (changed or not) renamed to fit Vulkan instead of having a ton of "NV_" or "AMD_" etc. extensions in the spec. It really gets annoying after a while. It could be like the browser -webkit- or -moz- prefixes, where if you're using "VK_NV_glsl_shader", you're using NVidia's specific implementation, but if you're using "VK_VE_glsl_shader", you're using the official, spec-compliant version. AMD/Intel wouldn't be forced to implement the _NV_ (nor NVidia the _AMD_/_INTEL_) versions, so there's less complaining from everybody, and we get stricter extension definitions (less breakage overall). Win-win.


      Extensions are optional. We see implementations differ in extension support all the time... For instance, Mesa doesn't support any binary formats or ARB_compatibility, and AMD and Intel don't support NV_command_list. In addition, implementations are not "forced" to implement another vendor's extensions; as a matter of fact, it's very rarely done. Instead, each vendor often creates its own extension specification, then something in the middle is drafted and formatted into an EXT extension, and eventually into an ARB extension if it gets enough support and Khronos sees fit to approve it.

      We won't see an official extension for GLSL; we already have the appropriate and correct method of using GLSL via a SPIR-V compiler. You can also compile GLSL into SPIR-V yourself at runtime. NVIDIA is just making it easier by providing the interface in their driver. It will be slower than SPIR-V -> NVIDIA's intermediate binary format, and it will come with its own attached problems.
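      For contrast, a rough sketch of the extension path as I understand VK_NV_glsl_shader (the createGlslModuleNV helper name is made up, and the details are assumptions drawn from the extension text): the GLSL source itself is passed through VkShaderModuleCreateInfo and the driver compiles it for you at module/pipeline creation time instead of you compiling it offline.
      Code:
      #include <vulkan/vulkan.h>
      #include <cstring>

      // Sketch only: with VK_NV_glsl_shader enabled, GLSL text is handed straight to
      // vkCreateShaderModule and the NVIDIA driver compiles it behind the scenes.
      VkShaderModule createGlslModuleNV(VkDevice device, const char* glslSource)
      {
          VkShaderModuleCreateInfo info = {};
          info.sType    = VK_STRUCTURE_TYPE_SHADER_MODULE_CREATE_INFO;
          info.codeSize = std::strlen(glslSource);                       // bytes of GLSL text, not SPIR-V words
          info.pCode    = reinterpret_cast<const uint32_t*>(glslSource);

          VkShaderModule module = VK_NULL_HANDLE;
          // Only succeeds on drivers that expose VK_NV_glsl_shader; everyone else
          // rejects the module because pCode isn't valid SPIR-V.
          vkCreateShaderModule(device, &info, nullptr, &module);
          return module;
      }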

      Extensions aren't a reason why OpenGL was fucked up. OpenGL was fucked up because of an abstract specification that let multiple ways of doing the same thing differ in performance across vendors. Its specification also worked against efficient multithreaded use. It isn't ideal when you need control over the GPU rather than control over the driver that controls the GPU, which differs across implementations and GPUs. I've never seen anyone complain about OpenGL extensions (aside from maybe loading them, which the LunarG SDK now abstracts for us). As a matter of fact, I'd say they're the only reason why OpenGL is still viable at all.





      • #13
        Originally posted by eydee View Post
        Why isn't it still in alpha, if it has to change every week, and isn't feature complete either?
        They can't win regardless of what they do. You wouldn't be happy if they had waited another 5 or 6 months to make the API more stable, and those changes perhaps wouldn't have been made at all if the API hadn't seen public use to begin with.



        • #14
          Originally posted by kalrish View Post
          I agree with you that extensions often have a vendor lock-in effect and that this move may make some think that this one is official. However, it doesn't seem very dangerous (at least in comparison with other extensions from NVIDIA), because the GLSL shaders that it accepts must anyway adhere to the Vulkan version of GLSL
          Doesn't this still mean that you, as someone writing let's say an open source Vulkan driver, have to compile the shaders at run time, in the driver, and try to match Nvidia's proprietary implementation? (Good luck with that)



          • #15
            Also, NVIDIA's crimes against humanity had nothing to do with extensions; it was the fact that NVIDIA would hard-code its driver against specific games.

            When I would read NVIDIA driver release notes, I would see that game X runs 12% faster, game Y runs 22% faster, and game Z runs 15% faster. I never really dug far into it. I assumed that NVIDIA was running the game, studying API call patterns, and merely optimizing its driver against those patterns. That is totally fair game as far as I am concerned. If real games are commonly using D3D/GL a particular way, the driver can streamline that path.

            But that's not what was happening. NVIDIA had a bunch of hard-coded checks against whole applications/shaders and was substituting the code with its own re-implementation! In other words, rather than go to Gearbox and say "hey, your Borderlands shaders could be better written this way", NVIDIA just quietly swaps the entire shader out for a better one it wrote. So, on the outside, it appears to the world that Borderlands just "runs better on NVIDIA", but the truth is that AMD (or whatever other GPU) is literally running an outdated and suboptimal shader in the first place! "Oh, AMD sucks. They can't do anything right."

            In short, all vendors create extensions for graphics APIs. That shouldn't upset anyone or be seen as "the first steps to anti-competitive behavior". NVIDIA simply needs to be called out on its attempts to cherry-pick bugs in other programs and create this illusion that NVIDIA's actual driver is "better at everything" when it actually isn't.



            • #16
              Originally posted by M1kkko View Post

              Doesn't this still mean that you, as someone writing let's say an open source Vulkan driver, have to compile the shaders at run time, in the driver, and try to match Nvidia's proprietary implementation? (Good luck with that)

              ... Are you joking?



              • #17
                Originally posted by computerquip View Post

                ... Are you joking?
                No I am not, but I'm not an expert either. Can you explain why you think what I said was silly?



                • #18
                  Originally posted by M1kkko View Post

                  No I am not, but I'm not an expert either. Can you explain why you think what I said was silly?
                  Because there's no "proprietary implementation" to match. There's a specification.
                  EDIT: And really, the implementation is mostly done for all drivers; they've been doing it for over a decade in their OpenGL drivers. Alternatively, they can just use glslang to go from GLSL to SPIR-V to their own intermediary.
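                  A rough sketch of that glslang route, compiling Vulkan-flavoured GLSL to SPIR-V at run time in the application instead of in the driver (the compileToSpirv name is made up, and header paths, the resource-limits table, and error handling vary between glslang versions, so treat the details as assumptions):
                  Code:
                  #include <glslang/Public/ShaderLang.h>   // glslang front end
                  #include <SPIRV/GlslangToSpv.h>          // SPIR-V back end (install path varies)
                  #include <vector>

                  // Compile a Vulkan-flavoured GLSL fragment shader to SPIR-V words.
                  std::vector<unsigned int> compileToSpirv(const char* glslSource)
                  {
                      glslang::InitializeProcess();             // once per process in real code

                      glslang::TShader shader(EShLangFragment);
                      shader.setStrings(&glslSource, 1);

                      // Ask for Vulkan GLSL + SPIR-V semantics rather than plain GL rules.
                      const EShMessages rules =
                          static_cast<EShMessages>(EShMsgSpvRules | EShMsgVulkanRules);

                      TBuiltInResource resources = {};          // real code should use glslang's default limits table
                      shader.parse(&resources, 450, false, rules);

                      glslang::TProgram program;
                      program.addShader(&shader);
                      program.link(rules);

                      std::vector<unsigned int> spirv;
                      glslang::GlslangToSpv(*program.getIntermediate(EShLangFragment), spirv);

                      glslang::FinalizeProcess();
                      return spirv;
                  }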
                  Last edited by computerquip; 03-05-2016, 06:11 PM.



                  • #19
                    Originally posted by computerquip View Post

                    Because there's no "proprietary implementation" to match. There's a specification.
                    There is a specification AND there is NVIDIA's proprietary driver, which happens to be the implementation that developers will actually be testing their software against. And the specification seems to be too high-level to accurately tell how NVIDIA compiles the GLSL shaders in its driver. There's so much newly-introduced complexity that even in the best case scenario, at least some difference in performance across implementations is to be expected.

                    Originally posted by computerquip View Post
                    EDIT: And really, the implementation is mostly done for all drivers; they've been doing it for over a decade in their OpenGL drivers. Alternatively, they can just use glslang to go from GLSL to SPIR-V to their own intermediary.
                    Yeah, and the performance of the open source drivers has been in such great shape for all these years.



                    • #20
                      Honestly I think VK_NV_glsl_shader is a pretty dumb extension, but if you're porting a legacy GL application I can see it being helpful. To be clear for all those not paying full attention: they are merely documenting the extension, it is not in core Vulkan, and I doubt other vendors are interested in implementing it.

                      If you want to transition to Vulkan but retain your GLSL shaders, I think you could fairly simply introduce glslang; though honestly, glslang is in many ways not as faithful an implementation of GLSL as a mature OpenGL driver is, especially when it comes to generating SPIR-V code.

                      I hope NVIDIA deprecates this, because the mere presence of this extension has the potential to degrade the quality of Vulkan driver implementations, leading to the same sorts of issues which prompted the spec to begin with.
