John Carmack Is Interested In Wayland On Ubuntu

  • #41
    Originally posted by bridgman View Post
    One thing nobody seems to have mentioned is that there is another protocol being used today that lets you run an application on one box and have user interaction on another box. It's called HTTP. Many systems are moving to a model of using web apps for configuration/administration... why not adopt a model like that for on Linux as well (web-based UI for configuration / admin that can run locally or remote) ?

    There seem to be relatively few cases where you need to be able to run "all" apps remotely other than tech support / troubleshooting, and a screen capture model seems best for that anyways.

    Just a thought...
    I think I read something on the mailing list about HTML5 and such (from Kristian himself) when someone asked about network transparency.

    Nothing is final, and the network stuff is lower on the list of priorities, as far as I understand it.



    • #42
      Originally posted by rohcQaH View Post
      Some of that could be fixed with a new remote protocol and enough plumbing, but I'm not seeing anyone doing it. It's just "Hey, let's jump to Wayland, screw features!".
      Also: how am I going to run apps on my server, which lacks a GPU? Software rendering plus forwarding image data can't be better than remote X, not when the only remaining rendering protocol shall be OpenGL.
      Sure, I can still run X on top of Wayland, but that somewhat defeats the purpose of replacing X with something more lightweight.
      To quote Kristian:

      > > Based on the architectural overview on the new Wayland website, it
      > > seems that VNC/NX/SPICE could be made to work with Wayland. Are there
      > > technical reasons why this would be difficult?
      >
      > No, you're right, that's an option, just not a priority right now.
      >
      > Kristian

      Also, nobody is forcing you to make the switch right now.



      • #43
        Originally posted by jbrown96 View Post
        Games on Wayland are going to happen about as much/fast as games on X: not at all. Seriously, Linux is not a gaming platform.
        OEM gaming platforms using the Linux kernel have never used X11. OEM computers with Linux/X11 have cheap integrated GPU's. MeeGo and Ubuntu will use Wayland when it has matured. Google's Android and Chrome OS use a custom display server. How many times have X11 updates broken proprietary drivers? Et cetera, etc., etc...

        Well, it makes sense to me!



        • #44
          Originally posted by cbh2000 View Post
          OEM gaming platforms using the Linux kernel have never used X11. OEM computers with Linux/X11 have cheap integrated GPU's. MeeGo and Ubuntu will use Wayland when it has matured. Google's Android and Chrome OS use a custom display server. How many times have X11 updates broken proprietary drivers? Et cetera, etc., etc...

          Well, it makes sense to me!
          LibC plus Xlib to texture-wrapper. Big fscking deal...

          Want to know why Windows versions its DLLs? Bingo. By definition, Windows sucks more for gaming than Linux...



          • #45
            So. I just knew that too many people complaining about client-side decorations meant I misunderstood something. I stand corrected. Wayland IS going to use client-side decorations. Oh the stupidity...

            Kristian:
            I'm waiting for two things to happen in GTK+:
            for GdkDrawable to be merged into GtkWindow, and for the client-side-decoration branch to land.



            • #46
              Outlining the problems

              Wayland is developed as a side project by a Red Hat developer. The problem with accepting this as the solution is that Red Hat needs to invest huge amounts of resources into the project. More people definitely need to be active on its development.

              As a professional software developer, my counterparts and I have to look two years ahead. I'm at times appalled that distributions devote so little to maintenance of their releases. This mess of leaving upstream to take care of the problems leaves a potential customer looking only to new releases.


              Debian dropped support for Etch on 2010-02-15. So if your machine won't run Squeeze, you are potentially stuck on Etch with no support.

              Slackware still provides patches all the way back to version 8.1.


              BIND was updated back in June:
              ftp://ftp.slackware.com/pub/slackwar...ches/packages/

              That's support, folks.



              • #47
                I am wondering, why does no one make use of the Linux Framebuffer?
                Because it is too Linux specific?
                It is such a great idea: you just write to a file and can draw things on your screen. A simple drawing library, a simple window manager built on top of it, and voila!
                And since it is a file, using it remotely would be far easier than any of these complicated protocols.

                Seriously, I really hoped the Framebuffer would eventually take off.
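                To illustrate the "just write to a file" model the poster describes, here is a minimal sketch in Python. It fills a rectangle in an in-memory buffer laid out like a linear 32bpp framebuffer; on Linux you would mmap /dev/fb0 instead of allocating a bytearray, and the resolution and XRGB8888 pixel format here are assumptions, not queried from the hardware.

```python
import struct

# Assumed framebuffer geometry: 640x480, 32 bits per pixel, XRGB8888,
# linear row-major layout -- the kind of surface /dev/fb0 commonly exposes.
WIDTH, HEIGHT, BPP = 640, 480, 4
fb = bytearray(WIDTH * HEIGHT * BPP)  # stand-in for mmap'ing /dev/fb0

def fill_rect(buf, x, y, w, h, rgb):
    """Fill a w x h rectangle at (x, y) by writing pixel bytes directly."""
    pixel = struct.pack("<I", rgb)       # one little-endian XRGB8888 pixel
    row = pixel * w                      # precompose one scanline's worth
    for yy in range(y, y + h):
        off = (yy * WIDTH + x) * BPP     # byte offset of this row segment
        buf[off:off + w * BPP] = row

fill_rect(fb, 10, 20, 100, 50, 0x00808080)  # a grey rectangle
```

                Note that every pixel crosses the CPU and the memory bus individually, which is the inefficiency the next post complains about.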



                • #48
                  Originally posted by Bill Cosby View Post
                  I am wondering, why does no one make use of the Linux Framebuffer?
                  Because it is too Linux specific?
                  Because it's absolutely horrific for performance, even on simplistic hardware produced these days. Modern GPUs are not framebuffers. They emulate dumb framebuffers for compatibility reasons, but do so in the simplest way possible and are extremely inefficient when doing so.

                  If you want to so much as paint a grey rectangle on the screen, pushing those pixels manually runs at a tiny fraction of the speed of sending the GPU the commands to render two grey triangles itself. Especially when you take into account the GPU framebuffer's internal tiling, which the triangle rasterizer is naturally optimized for, while your low-level, GPU-agnostic blitting code is not.
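                  A rough back-of-the-envelope comparison makes the grey-rectangle point concrete. The figures below are illustrative assumptions, not measurements: filling a full-screen rectangle by hand moves every pixel across the bus, while the GPU path submits only a few bytes of command data plus the vertices of two triangles.

```python
# Illustrative numbers only: a 1920x1080 screen at 32 bits per pixel.
width, height, bytes_per_pixel = 1920, 1080, 4

# CPU path: write every pixel of the rectangle yourself.
cpu_bytes = width * height * bytes_per_pixel  # ~7.9 MiB per frame

# GPU path: submit a tiny command -- say, 2 triangles x 3 vertices,
# each vertex as four 4-byte floats (x, y, z, w), plus a small fixed
# command header. (The 64-byte header is a made-up stand-in for real
# command-stream overhead.)
vertex_bytes = 2 * 3 * 4 * 4
gpu_bytes = vertex_bytes + 64                 # well under 200 bytes

print(cpu_bytes, gpu_bytes, cpu_bytes // gpu_bytes)  # 8294400 160 51840
```

                  At 60 frames per second that is roughly half a gigabyte per second of pixel traffic versus a few kilobytes of commands, and the GPU rasterizer additionally writes in its native tiled layout.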

                  And yes, this performance matters, even for simple applications: it's the difference between your CPU and GPU and bus being in low-power states practically all the time versus merely most of the time, which is pretty important to people on battery-powered devices.

                  I can't overstate how irritating and utterly pathetic it is when a clueless or lazy developer writes a simple non-animated 2D app that just sits there waiting for user input yet inexplicably drains the battery 3x as fast as a real-time 3D action-adventure game with fancy shaders and graphics. It's just pure incompetence, nothing else.

                  Use the GPU's shaders and rendering pipeline, or don't use the GPU at all. This isn't 1990, and the hardware is an entirely different beast than what it used to be. Think of how CPUs have changed, with our 2-12 core desktop CPUs, SIMD units, and memory speeds that can't keep up, and how optimizing CPU code has changed so much over the last 10 years; now multiply that change by 100x, and that's how different graphics programming is today than it was a decade or two ago.

                  That is why a framebuffer is a horrible idea for a graphics API. It's in part why Xorg's internals are becoming an increasingly horrible architecture for a desktop, and why the X11 protocol is increasingly irrelevant for modern applications. It's why OpenGL and its 20-year-old API is becoming more and more of a pain in the butt compared to DirectX, despite remaining more or less feature-comparable.

                  The mixed CPU+GPU processors of the near future will shift things again. Framebuffers might actually become a bit more useful (for *very* simple needs) as the latency of the bus sitting between the CPU and GPU effectively disappears; on the other hand, the importance of shaders/SPUs for almost every task will increase greatly.

