
Why Linux's Direct Rendering Manager Won't Add A Generic 2D Acceleration API


  • #11
    Originally posted by msotirov View Post
    I don't get the need for a dedicated 2D API. OpenGL and Vulkan are fine for it – you just set up an orthographic view from the "top" and render flat polygons. Isn't that essentially what Qt Quick does with the scene graph? I assume that's what GTK / Cairo does with the OpenGL backend.
    Quote from the linked blog post:

    "an awful lot of of GPUs having more or less featureful blitter units. And many systems need them for a lot of use-cases, because the 3D engine is a bit too slow or too power hungry for just rendering desktops."

    2D APIs are needed because they can be implemented with far less HW than generic 3D shaders. Yes, OpenGL and Vulkan are capable of 2D rendering, but they are overkill: a lot of HW has to be implemented that goes mostly unused when the GPU only ever does 2D.
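
    For reference, the "orthographic view from the top" approach in the quoted post looks roughly like the sketch below. This is minimal legacy OpenGL, chosen for brevity; the function name and the rectangle coordinates are made up for illustration:

        #include <GL/gl.h>

        /* Map GL coordinates 1:1 onto a width x height pixel surface, then
         * draw one flat, untextured rectangle: essentially the kind of
         * geometry a 2D scene graph batches for every visible item. */
        void draw_2d_rect(int width, int height)
        {
            glMatrixMode(GL_PROJECTION);
            glLoadIdentity();
            glOrtho(0.0, width, height, 0.0, -1.0, 1.0); /* y-down, as in 2D APIs */

            glMatrixMode(GL_MODELVIEW);
            glLoadIdentity();

            glColor3f(0.2f, 0.4f, 0.8f);
            glBegin(GL_QUADS);
            glVertex2f(10.0f, 10.0f);
            glVertex2f(110.0f, 10.0f);
            glVertex2f(110.0f, 60.0f);
            glVertex2f(10.0f, 60.0f);
            glEnd();
        }

    Every pixel of that rectangle still runs through the full 3D pipeline, which is exactly the overhead a dedicated blitter unit avoids.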

    Comment


    • #12
      Why not "Vulkamor"? Couldn't Glamor benefit from a lower level api?
      ## VGA ##
      AMD: X1950XTX, HD3870, HD5870
      Intel: GMA45, HD3000 (Core i5 2500K)

      Comment


      • #13
        Could GLAMOR be rewritten to use Vulkan instead of OpenGL?
        Would that provide benefits?

        Comment


        • #14
          What about Skia?

          Comment


          • #15
            Reading the article, it seems to be saying that 2D should not be in the kernel. In my opinion, video does not have to be in the kernel at all, and it never was with X: X had all its video drivers in userspace. KMS took modesetting away from the X server, which is questionable; it might have been better to put KMS in its own userland daemon rather than keep moving more stuff into the kernel. The basic idea of moving modesetting out of X can be okay: if X crashes (pretty rare these days), the modesetter can reset the hardware. Previously, when you shut down X or switched to a virtual terminal, X would set the mode back to what the terminal driver expected; if X crashed, the display was left corrupted and you had to reboot.
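
            For context, this is roughly what that split looks like today: the policy (probing connectors, choosing a mode) lives in userspace, and only the final modeset crosses into the kernel. A minimal sketch using libdrm, with error handling omitted; opening "card0" and taking the first CRTC and first connected connector are simplifying assumptions:

                #include <fcntl.h>
                #include <stdint.h>
                #include <xf86drm.h>
                #include <xf86drmMode.h>

                /* Set the first connected connector's first mode on the
                 * first CRTC, scanning out the framebuffer fb_id. */
                int set_first_mode(uint32_t fb_id)
                {
                    int fd = open("/dev/dri/card0", O_RDWR | O_CLOEXEC);
                    drmModeRes *res = drmModeGetResources(fd);

                    for (int i = 0; i < res->count_connectors; i++) {
                        drmModeConnector *conn =
                            drmModeGetConnector(fd, res->connectors[i]);
                        if (conn->connection == DRM_MODE_CONNECTED &&
                            conn->count_modes > 0) {
                            /* The modeset itself is a single kernel ioctl. */
                            drmModeSetCrtc(fd, res->crtcs[0], fb_id, 0, 0,
                                           &conn->connector_id, 1,
                                           &conn->modes[0]);
                            drmModeFreeConnector(conn);
                            break;
                        }
                        drmModeFreeConnector(conn);
                    }
                    drmModeFreeResources(res);
                    return fd;
                }

            A userland modesetting daemon of the kind suggested above would own a loop like this on behalf of all display servers.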

            Comment


            • #16
              Originally posted by uid313 View Post
              Could GLAMOR be rewritten to use Vulkan instead of OpenGL?
              Would that provide benefits?
              It might help a bit, but it still doesn't solve the fundamental problem that 3D hardware tends to be far, far more power-hungry and less efficient at 2D operations than dedicated 2D hardware:

              Originally posted by Meteorhead View Post
              2D APIs are needed because they can be implemented with far less HW than generic 3D shaders. Yes, OpenGL and Vulkan are capable of 2D rendering, but they are overkill: a lot of HW has to be implemented that goes mostly unused when the GPU only ever does 2D.

              For example, on my setup I get about 10% less battery life when I switch from SNA to GLAMOR, and the performance in 2D apps is noticeably worse to boot:

              https://www.phoronix.com/scan.php?pa...ing-2017&num=3

              Comment


              • #17
                Originally posted by willmore View Post
                Intel doesn't want ARM chips to benefit from such an interface, so their people do all they can to block it. That's the missing summary of the story.
                This is bullshit. All ARM devices with monitor/screen connections have a 3D GPU of some kind.

                Comment


                • #18
                  Originally posted by starshipeleven View Post
                  This is bullshit. All ARM devices with monitor/screen connections have a 3D GPU of some kind.
                  It's what I've heard from many different ARM SoC developers for a long time. These chips have separate 2D engines, mostly for historical reasons, but they are functional: we've had code to drive them for years. Yet there has been a specific resistance to adding a generic 2D API to the DRM, and that resistance has come from Intel.

                  The people who keep saying "just use the 3D engine" are missing the point. The 3D engines are always poorly documented and their support is marginal at best. The 2D engine support is mature; it only lacks a common API to be put to better use. The 2D support we're looking at isn't anything fancy: simple blits for scrolling the screen, drawing text boxes, drawing glyphs with color space expansion, etc.
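
                  To make "isn't anything fancy" concrete, below is a purely hypothetical sketch of the argument struct such a generic blit ioctl might take. No such DRM interface exists; every name and field here is invented for illustration:

                      #include <stdint.h>

                      /* Hypothetical: one rectangle copy between two GEM
                       * buffer objects, the core operation of a blitter. */
                      struct drm_2d_blit {
                          uint32_t src_handle;    /* GEM handle of source buffer */
                          uint32_t dst_handle;    /* GEM handle of destination */
                          uint32_t src_x, src_y;  /* source rectangle origin */
                          uint32_t dst_x, dst_y;  /* destination origin */
                          uint32_t width, height; /* rectangle size in pixels */
                          uint32_t src_format;    /* DRM fourcc; engines that do
                                                     color space expansion would
                                                     convert on the fly */
                          uint32_t dst_format;
                      };

                  Scrolling a terminal, for instance, would be one such blit per frame to shift the existing contents, plus glyph draws into the freed strip.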

                  Comment


                  • #19
                    Originally posted by willmore View Post
                    It's what I've heard from many different ARM SoC developers for a long time. These chips have separate 2D engines, mostly for historical reasons, but they are functional: we've had code to drive them for years. Yet there has been a specific resistance to adding a generic 2D API to the DRM, and that resistance has come from Intel.

                    The people who keep saying "just use the 3D engine" are missing the point. The 3D engines are always poorly documented and their support is marginal at best. The 2D engine support is mature; it only lacks a common API to be put to better use. The 2D support we're looking at isn't anything fancy: simple blits for scrolling the screen, drawing text boxes, drawing glyphs with color space expansion, etc.
                    I understand what you are saying, but I still think the best way forward is to have better support for the 3D engines instead (which would have benefits beyond just running 2D applications, and is sorely needed), and to drop the 2D hardware altogether.

                    If you let a 2D API slip in, there is even less incentive to produce decent 3D drivers and documentation, and that would stifle advancement in the field.

                    Comment


                    • #20
                      I suppose the gist of all of this is that some cards have specialized 2D engines which provide better 2D performance than doing 2D on a 3D engine? Again, if you want to expose the 2D engine separately from the 3D engine at the video driver level, then, as the article says, a kernel interface is not where it belongs; it belongs in a DRI interface, maybe using some standardized subset of OpenGL or OpenVG. X servers and DRI apps could then use that backend.
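
                      For a sense of what clients of such an interface would write, here is a minimal sketch against OpenVG, which the post names as one candidate standard. It assumes an OpenVG context is already current; the function name and the coordinates are made up:

                          #include <VG/openvg.h>
                          #include <VG/vgu.h>

                          /* Fill one rectangle with a solid color. */
                          void draw_filled_rect(void)
                          {
                              /* Build a path holding a single rectangle. */
                              VGPath path = vgCreatePath(VG_PATH_FORMAT_STANDARD,
                                                         VG_PATH_DATATYPE_F,
                                                         1.0f, 0.0f, 0, 0,
                                                         VG_PATH_CAPABILITY_ALL);
                              vguRect(path, 10.0f, 10.0f, 100.0f, 50.0f);

                              /* Solid-color fill paint. */
                              VGfloat color[4] = { 0.2f, 0.4f, 0.8f, 1.0f };
                              VGPaint paint = vgCreatePaint();
                              vgSetParameteri(paint, VG_PAINT_TYPE, VG_PAINT_TYPE_COLOR);
                              vgSetParameterfv(paint, VG_PAINT_COLOR, 4, color);
                              vgSetPaint(paint, VG_FILL_PATH);

                              vgDrawPath(path, VG_FILL_PATH);

                              vgDestroyPaint(paint);
                              vgDestroyPath(path);
                          }

                      Behind an interface like this, a driver could route the fill to a 2D engine where one exists and fall back to the 3D engine otherwise.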

                      Comment
