
John Carmack Is Interested In Wayland On Ubuntu


  • #46
    Outlining the problems

    Wayland is developed as a side project by a Red Hat developer. The problem with accepting this as a solution is that Red Hat would need to invest huge amounts of resources into the project. Definitely more people need to be active in its development.

    As a professional software developer, my counterparts and I have to look ahead two years. I'm at times appalled that distributions devote so little to maintenance of their releases. This mess of leaving upstream to take care of the problems leaves a potential customer looking ahead to new releases.

    Debian Etch dropped support on 2010-02-15. So if your machine won't run Squeeze, then you are potentially stuck on Etch with no support.

    Slackware still provides patches all the way back to version 8.1.

    BIND was updated back in June.

    That's support folks.


    • #47
      I am wondering, why does no one make use of the Linux Framebuffer?
      Because it is too Linux specific?
      It is such a great idea, you just write to a file and can draw things on your screen. A simple draw library, a simple window manager utilizing this, and voila!
      And since it is a file, using it remotely is far simpler than any of these complicated protocols.

      Seriously, I really hoped the Framebuffer would eventually take off.


      • #48
        Originally posted by Bill Cosby
        I am wondering, why does no one make use of the Linux Framebuffer?
        Because it is too Linux specific?
        Because it's absolutely horrific for performance, even on simplistic hardware produced these days. Modern GPUs are not framebuffers. They emulate dumb framebuffers for compatibility reasons, but do so in the simplest way possible and are extremely inefficient when doing so.

        If you want to so much as paint a grey rectangle on the screen, the speed of pushing those pixels manually is a tiny fraction of the speed of sending the commands to render two grey triangles via the GPU itself. Especially when you take into account how the GPU framebuffer's internal tiling works, which its triangle rasterizer is naturally optimized for while your low-level GPU-agnostic blitting code is not.

        And yes, this performance matters, even for simple applications: it's the difference between your CPU and GPU and bus being in low-power states practically all the time versus merely most of the time, which is pretty important to people on battery-powered devices.

        I can't overstate how irritating and utterly pathetic it is when a clueless or lazy developer writes a simple non-animated 2D app that just sits there waiting for user input yet inexplicably drains the battery 3x as fast as a real-time 3D action-adventure game with fancy shaders and graphics. It's just pure incompetence, nothing else.

        Use the GPU's shaders and rendering pipeline, or don't use the GPU at all. This isn't 1990, and the hardware is an entirely different beast than what it used to be. Think of how CPUs have changed, with our 2-12 core desktop CPUs, SIMD units, and memory speeds that can't keep up, and how optimizing CPU code has changed so much over the last 10 years; now multiply that change by 100x, and that's how different graphics programming is today than it was a decade or two ago.

        That is why a framebuffer is a horrible idea for a graphics API. It's in part why Xorg's internals are becoming an increasingly horrible architecture for a desktop, and why the X11 protocol is increasingly irrelevant for modern applications. It's why OpenGL and its 20-year-old API is getting to be more and more of a pain in the butt compared to DirectX, despite remaining more or less feature-comparable.

        The mixed CPU+GPU processors of the near future will shift things again. Framebuffers might actually become a bit more useful again (for *very* simple needs) as the latency of the bus sitting between the CPU and GPU will literally disappear; on the other hand, the importance of shaders/SPUs for almost every task will increase greatly.