Valve Developed An Intel Linux Vulkan GPU Driver


  • #91
    Originally posted by Kano View Post
    You forget the 10,000 people at GlobalFoundries and many more at TSMC; AMD was split up, and Nvidia never had its own fabs, so you are comparing apples with oranges. Comparing AMD to Nvidia might be better. I don't know whether it is funny or not, as AMD and Nvidia both compete to be the first TSMC client to get die-shrunk chips... TSMC will move to 16 nm soon, GF seems to be stuck at 28 nm, and Intel currently uses 22 and 14 nm.
    Honestly, I haven't cared about that company called Nvidia for many years now; it effectively does not exist for me. I never even think about Nvidia, as I don't want to be stuck with their blobs, etc...

    That's how it is: I don't think about them or their products. Of course I occasionally hear something about them, but for me they don't exist. I might buy something from AMD or Intel, but Nvidia, no... even if Tegra really does sound interesting to me.
    Last edited by dungeon; 06 March 2015, 08:28 AM.

    Comment


    • #92
      Intel builds new fabs on a regular basis or upgrades existing ones; that's how it works, you cannot keep a fab unchanged for long. If AMD were to ask Intel to produce their chips, it would be very interesting, especially for GF...

      Comment


      • #93
        Originally posted by Kemosabe View Post
        And hopefully wine devs will replace their OGL wrapper with Vulkan
        Not likely, as Vulkan is not supported on older hardware.

        Comment


        • #94
          Originally posted by newwen View Post
          Not likely, as Vulkan is not supported by older hardware
          The Wine devs just need to pull their heads out of the sand and allow support for native APIs when they are available. Period.

          Comment


          • #95
            Originally posted by bakgwailo View Post
            That is some revisionist history right there. How did Intel sidestep Gallium? Intel already had a massive investment in the classic Mesa driver model before Gallium came out, looked at Gallium, and decided that it would be way too much work with little benefit to drop all of their (very well working) old driver code and port it to Gallium. Seems fair to me; it's not like they don't contribute a ton to Mesa/X anyway.
            Is "sidestepped" such a loaded word? I never intended to criticize Intel for their decisions, nor did I imply Gallium was mature when Intel started. It wasn't just the work involved though, many Intel folks thought the Gallium architecture hadn't proved itself yet at the time.

            Comment


            • #96
              Originally posted by Ancurio View Post
              Is "sidestepped" such a loaded word? I never intended to criticize Intel for their decisions, nor did I imply Gallium was mature when Intel started. It wasn't just the work involved though, many Intel folks thought the Gallium architecture hadn't proved itself yet at the time.
              That was also almost 8 years ago. Even if their reasoning made sense at the time, it makes no sense at all today. They've had plenty of time to do something about it.

              Comment


              • #97
                Originally posted by -MacNuke- View Post
                Any information on how Gallium3D is useful for Vulkan?

                As far as I understand it, Gallium is a framework for "classic" graphics APIs. To build Vulkan on top of Gallium, it would have to read the SPIR-V code and transform it into its own IR so the driver can execute it...

                Sounds rather stupid to me. It would be more effective to put Vulkan underneath Gallium3D.
                First, an analogy.

                OpenGL is to Vulkan what Gallium3D is to SPIR-V.

                So Gallium3D is not directly useful to Vulkan per se. SPIR-V, however, is more useful than Gallium3D in one respect: since the spec requires every implementation to support it, you won't run into Intel choosing not to support it, and optimizing the SPIR-V compilers really will raise performance across the board.

                Indirectly, Gallium3D has a large number of GPU optimizations that could be reused if someone wrote a SPIR-V backend for it. That would have the benefit of compiling existing GLSL, but I don't know whether the transformation would lose information that SPIR-V took pains to keep for debugging purposes.

                Gallium is a framework for OpenGL APIs, not classic APIs. OpenGL will still be developed and used in the years to come.

                Comment


                • #98
                  My understanding is that Gallium (i.e. TGSI) knows nothing at all about SPIR-V. SPIR-V will be implemented via LLVM, and all the hardware drivers will then be able to use it. In that case a Vulkan state tracker can sit on top of Gallium, with SPIR-V support being driver-dependent.

                  It just doesn't make any sense to build a different driver platform for every API; Gallium3D was created to solve exactly that problem.
                  Last edited by duby229; 06 March 2015, 10:28 AM.

                  Comment


                  • #99
                    Originally posted by blackout23 View Post
                    So there probably won't be a /usr/lib/libvk.so; instead, every game will ship its own, like on Windows with d3d9.dll. Which is actually cool, because it allows things like the ENB Series for Skyrim and co. I'm not sure you could do the same with current OpenGL right now.



                    Or OpenGL 4.3, of which OpenGL ES 3.1 is basically a subset; both introduced compute shaders. Ivy Bridge hardware does compute under Direct3D 11, but Intel never released OpenGL 4.3 drivers for it, so in theory Vulkan should work on Ivy Bridge and newer.

                    A fragment of the talk is online here:


                    John McDonald answered a question about hardware support towards the end.
                    Mentioned in that Ustream channel is an answer to the question of how to choose the correct GPU when writing your application. They said they are working on a loader library that will take care of some of the details, so there will be some version or incarnation of /usr/lib/libvk.so. You don't have to use it, but they stressed that they recommend it: an application may inadvertently run on a GPU combination its developer did not think of, since it will be possible for Intel, Nvidia, AMD and others to all be present in the same system, and the loader will take care of all of that.

                    Comment


                    • A correction.

                      Originally posted by Kano View Post
                      @SSX

                      Intel mobile SoCs are pretty expensive; what you think is a lot is actually less than 4% market share. The new Atom x3/x5 do not use Intel HD graphics and are not produced by Intel itself, but they are cheaper; let's wait for this year's results. Btw, in the past Intel often used PowerVR graphics cores for low-power Atom chips.
                      You are correct about the Atom X3; however, both the X5 and the X7 do use Intel graphics.

                      Source

                      What many in this discussion are missing is the real reason Intel will support this heavily. The area Intel has traditionally struggled with is power consumption, and OpenGL's single-core model, driver overhead, and mandatory error checking place a significant load on any SoC. Getting rid of that overhead means that in most use cases the SoC can stay at a lower clock, drawing less power and providing longer battery life. Yes, it will enable fancy games and all of that, but there is more than one side to the story.

                      Now, Intel is already a big fan of Wayland, so if a Wayland backend is created for Vulkan, and/or Android adopts it as the underlying graphics stack, then the sky is the limit for Intel.

                      Comment
