Kazan Continues Making Progress As A CPU-Based Vulkan Implementation



    Phoronix: Kazan Continues Making Progress As A CPU-Based Vulkan Implementation

    While Google Summer of Code ended one month ago, Jacob Lifshay has continued working on his "Vulkan-CPU" project now known as Kazan. He's certainly making progress on this CPU-based Vulkan implementation...

    http://www.phoronix.com/scan.php?pag...September-2017

  • #2
    Wow that's some quick progress. Seeing as Vulkan is supposed to reduce CPU overhead, I'll be interested to see how much faster it is compared to OpenGL.

    Seeing how quickly Lifshay and Arlie have advanced their own Vulkan implementations, it makes me wonder how hard it would be to get a Vulkan implementation working on Nouveau. Not that I personally care; I don't have a compatible Nvidia platform, but I'm led to believe it's easier than most other work done for the project.

    Also, here before people start talking about other meanings of Kazan.
    Last edited by schmidtbag; 09-25-2017, 12:32 PM.

    • #3
      Originally posted by schmidtbag View Post
      Wow that's some quick progress. Seeing as Vulkan is supposed to reduce CPU overhead, I'll be interested to see how much faster it is compared to OpenGL.
      You mean OpenGL running on LLVMPipe or what?

      • #4
        Impressive work! I consider Vulkan important not only for its performance optimizations but also for the long-term simplification of the graphics stack.

        • #5
          Originally posted by M1kkko View Post
          You mean OpenGL running on LLVMPipe or what?
          Yes - I'm interested to see how this compares to software-rendered OpenGL; it'd be more of an apples-to-apples comparison.

          • #6
            What's the use case for software rendered OpenGL when even the on-board graphics of budget peecee's can manage it in hardware? It's not like a 3D accelerator is a luxury add-on any more. Even low-power embedded solutions have 3D hardware these days.

            • #7
              Originally posted by torsionbar28 View Post
              What's the use case for software rendered OpenGL when even the on-board graphics of budget peecee's can manage it in hardware? It's not like a 3D accelerator is a luxury add-on any more. Even low-power embedded solutions have 3D hardware these days.
              I can think of a few reasons:
              1. Many of the embedded solutions you speak of don't support OpenGL, but rather GLES. Their drivers are often closed-source and limited to specific kernels, and I think in some cases to specific X11 versions.
              2. As a failsafe. I'm sure you've encountered broken GPU drivers at one point, and both GNOME and KDE depend heavily on GL. In the event your DE does not depend on GL, you may still need to research a solution, in which case a software renderer might make the experience less miserable.
              3. Testing. Depending on how compliant this gets with Vulkan, it could be a way to determine whether artifacting is caused by a faulty GPU or just by the code.

              But for most people, yeah, it's kinda useless, especially while there are pretty much no applications anyone depends on that use Vulkan. However, I'm sure that will change in the coming years, in which case it's nice to have a head start on this.
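
              The failsafe idea in point 2 can be sketched as a couple of environment settings: `LIBGL_ALWAYS_SOFTWARE` is Mesa's real switch for forcing the llvmpipe software rasterizer, and `VK_ICD_FILENAMES` is the Vulkan loader's override for picking a specific driver manifest. The Kazan manifest path shown here is purely hypothetical, since the project doesn't ship an installable ICD yet:

              ```shell
              # Fall back to Mesa's software OpenGL rasterizer (llvmpipe) instead of
              # a misbehaving hardware driver.
              export LIBGL_ALWAYS_SOFTWARE=1

              # Point the Vulkan loader at one specific ICD manifest, bypassing the
              # GPU driver entirely; this path is a hypothetical install location
              # for a Kazan-style software implementation.
              export VK_ICD_FILENAMES=/usr/share/vulkan/icd.d/kazan_icd.json
              ```

              With those exported, anything launched from the same session (the DE itself, or a diagnostic tool) renders in software, which is exactly the "keep limping along while you research the real fix" scenario above.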
              Last edited by schmidtbag; 09-25-2017, 02:58 PM.

              • #8
                Originally posted by torsionbar28 View Post
                What's the use case for software rendered OpenGL when even the on-board graphics of budget peecee's can manage it in hardware? It's not like a 3D accelerator is a luxury add-on any more. Even low-power embedded solutions have 3D hardware these days.
                You forgot about desktop virtualization (VDI).

                • #9
                  Another use case is shader debugging. I can see this being used to step through shaders, something DirectX has had forever.

                  • #10
                    There are CPUs with complex vector units that could take advantage of this. I saw an ICube CPU a while back but I don't think it ever caught on... http://icubecorp.com/
