Unity 2D To Go Away In Ubuntu 12.10


  • Unity 2D To Go Away In Ubuntu 12.10

    Phoronix: Unity 2D To Go Away In Ubuntu 12.10

    It appears that Unity 2D -- the Qt-based, non-accelerated version of the Ubuntu Unity desktop -- will be abandoned by Canonical. There are also going to be some GNOME 3.6 packages appearing in Ubuntu 12.10...

  • #2
    Next version of Ubuntu to require 'multi-core hardware' and 'x86' or OpenGL.


    • #3

      What does this portend for devices like the PandaBoard?


      • #4
        Well, the main Unity 2D developer (Aurelien Gateau) left Canonical a while ago to work at Blue Systems on KDE software (he is originally a KDE developer, best known for Gwenview). Even though Unity 2D had the better technological foundation, the move was to be expected. The devs of regular Unity have no clue at all about Qt and QML, and training them would be far more difficult than simply letting Red Hat once again do the hard work on LLVMpipe, getting software acceleration for the Compiz-based regular Unity for free.


        • #5
          Don't most ARM devices have OpenGL acceleration?


          • #6
            Originally posted by johnc View Post
            Don't most ARM devices have OpenGL acceleration?
            Yes, but only if you have their proprietary blob driver installed (which is a PITA).


            • #7
              LLVMpipe is bad for gaming, but good enough for a composited desktop experience.

              We know you're capable of words with greater depth and less ambiguity than "good" and "bad". You might want to invest a larger portion of your time in writing your articles. Here's an example of a five-minute investment in that line:

              While the performance of LLVMPIPE is insufficient to adequately run most 3D games, this CPU based GPU driver meets the requirements of many OpenGL composited desktops.
              Not only did I avoid the words "good" and "bad", I also supplied the "is" that was missing from the second half of your sentence. I still fail, because I didn't take the time to remove the first instance of "to be/is". After reading my version, the reader is reminded of what LLVMpipe is, that only 3D game performance is insufficient, and that the composited desktops need to be OpenGL-accelerated for LLVMpipe to come into play. Yes, writing is hard. It gets easier as you do more of it, and it comes out better when you spend a couple of minutes figuring out what you're trying to say.

              Speaking from my own experience: have you considered an editor?



              • #8
                Originally posted by russofris View Post
                Asking from my own experience, have you considered an editor?
                Not that I disagree with most of how you rewrote Michael's sentence, but I think you should also consider an editor: "CPU-based GPU driver" is a nonsense term. The terms "GPU" and "llvmpipe" should never be used in the same sentence, unless you are talking about the performance difference between a driver that is hardware-accelerated by the GPU and llvmpipe (which is not).

                Taking apart "CPU based GPU driver":

                "CPU-based" = adjective
                "GPU driver" = compound noun

                Thus, dropping the adjective, you're saying that llvmpipe is a GPU driver. This is plain wrong. It is not.

                llvmpipe is a Gallium3d driver, which simply provides software facilities that implement the same user-visible APIs as you would expect from 3d hardware (GPUs). The software facilities it provides are only run on the CPU. The GPU is a useless brick while llvmpipe is running. Your graphics card's display output chipset will be used, and depending on the status of your drivers, you might also be getting kernel mode setting or 2D hardware acceleration from your Xorg DDX and kernel bits.

                In other words, it should theoretically be possible to use a kernel driver and Xorg DDX for everything except OpenGL / GLX / the rest of the G3D state trackers, and to use llvmpipe for the things in that "except" list. You might see better performance / features / display compatibility with this compared to using VESA compatibility mode, where you use compatibility interfaces to the PCI bus for display output that have worked on every graphics card since the 90s. But display output is completely separate from using the GPU on modern graphics hardware.

                OK, I'm done preaching. Just wanted to make that clear.
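
                To make the llvmpipe discussion above concrete: Mesa lets you select the software rasterizer through environment variables, so you can try a composited desktop on llvmpipe without touching your installed drivers. This is a minimal sketch using Mesa's documented `LIBGL_ALWAYS_SOFTWARE` and `GALLIUM_DRIVER` variables; it assumes a Gallium-based Mesa build with llvmpipe compiled in.

                ```shell
                # Tell Mesa to ignore the hardware driver and render on the CPU.
                export LIBGL_ALWAYS_SOFTWARE=1

                # On Gallium builds, select llvmpipe explicitly (rather than the
                # older, unoptimized softpipe fallback).
                export GALLIUM_DRIVER=llvmpipe

                # Any GL client started from this shell now renders in software;
                # the renderer string should report llvmpipe instead of the GPU.
                glxinfo | grep "OpenGL renderer"
                ```

                Display output, kernel mode setting, and any 2D acceleration still come from the normal kernel and DDX drivers, as #8 describes.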


                • #9
                  Would it be a real loss if somebody could not run Unity? I guess not; there are so many other possibilities. If you think of llvmpipe as a temporary fallback until you install binary drivers, then this isn't so bad, and on old systems a different DE is most likely the better choice anyway. I saw a presentation of Unity's ideas a few years ago (before you could test it) and thought it might be something for small form factors and touch devices. Well, somehow it is now for every form factor. A similar approach is Windows 8: one UI for every form factor. The good side is that everything is controlled the same way; the other side is that you have to give up years-old behavior patterns. Nothing is really bad, but it is nice to have alternatives.


                  • #10
                    This makes me sad. I'm actually quite liking 12.04 on my wife's desktop after some minor tweaks, one of which was switching to unity-2d. Unity 3D has some pretty annoying bugs with certain OpenGL applications (Wine in some instances, for one).