RadeonSI Gallium3D Is Improving, But Still Long Shot From Catalyst


  • #31
    Originally posted by ua=42 View Post
    Yeah. I was hoping to see the performance of the drivers with the Unigine benchmarks.
    I heard that there were problems with LibreOffice (and other programs) with GLAMOR, but with the new upstream patches most of the problems have disappeared.
    The most visible issue with LibreOffice was that when you opened a menu and hovered over an option, the highlight color was almost the same as the text color, so the hovered text was pretty much invisible; certain cell drawing operations also weren't very fast. In the system tray, the icons put there by GTK2 apps were transparent (they took up space and could be interacted with if you knew where to click). GtkPerf crapped out at line drawing, drawing lines excruciatingly slowly.

    Now there were two patches in the last few days that solved these issues. The GTK line drawing still needs some work, as it's still much slower than with EXA, but it has improved a lot.
    Anyway, at the user level GLAMOR is pretty much the same as EXA on r600g; it's even more stable.
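
    For anyone who wants to compare the two paths on their own r600g card: with the xf86-video-ati (radeon) DDX the acceleration method can be selected per device in xorg.conf. This is only a minimal sketch; the Identifier below is a placeholder, so adjust it to match your own config:

        Section "Device"
            Identifier "Radeon"                  # placeholder device name
            Driver     "radeon"                  # xf86-video-ati DDX
            Option     "AccelMethod" "glamor"    # set to "exa" to compare the older 2D acceleration path
        EndSection

    The X server has to be restarted for the change to take effect.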



    • #32
      Originally posted by Kivada View Post
      GCN-based GPUs don't have 2D hardware; they require 2D to be handled by the 3D engine. AMD did this to get more 3D performance in the same chip size, and with Catalyst it has proven to be pretty fast and power efficient.
      The 2D hardware was already dropped with r600. It hasn't existed in years.

      Also, the 3D performance argument is bullshit; it's pure penny-pinching. Bruno showed that their 2D core takes 26k transistors, including the VGA BIOS, which is still included on AMD cards. Current manufacturing puts around 9M transistors per mm^2.

      So a 2D core would take the __huge__ amount of space of about 0.0029 mm^2, or roughly 2,900 µm^2. That's a rounding error on a chip of 440 mm^2.
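
      For anyone who wants to check the arithmetic, taking the 26k-transistor figure and the ~9M transistors per mm^2 at face value:

      \[
      \frac{26\,000\ \text{transistors}}{9\times 10^{6}\ \text{transistors/mm}^2} \approx 0.0029\ \text{mm}^2 \approx 2\,900\ \mu\text{m}^2,
      \qquad
      \frac{0.0029\ \text{mm}^2}{440\ \text{mm}^2} \approx 0.00066\,\%
      \]

      i.e. well under a hundredth of a percent of a 440 mm^2 die.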



      • #33
        Originally posted by curaga View Post
        The 2D hardware was already dropped with r600. It hasn't existed in years.

        Also, the 3D performance argument is bullshit; it's pure penny-pinching. Bruno showed that their 2D core takes 26k transistors, including the VGA BIOS, which is still included on AMD cards. Current manufacturing puts around 9M transistors per mm^2.

        So a 2D core would take the __huge__ amount of space of about 0.0029 mm^2, or roughly 2,900 µm^2. That's a rounding error on a chip of 440 mm^2.
        The problem is that a legacy 2D engine is not particularly useful. They are pretty much limited to blits and fills. Most modern toolkits use alpha blending and transforms, which require the 3D engine anyway.



        • #34
          Compositing toolkits must be the cause of recent distros slowing down on old hardware

          Originally posted by agd5f View Post
          The problem is that a legacy 2D engine is not particularly useful. They are pretty much limited to blits and fills. Most modern toolkits use alpha blending and transforms, which require the 3D engine anyway.
          That must be why recent distros perform so poorly on Pentium 3-class machines, and reportedly on tablets as well. It wasn't so long ago that Linux was routinely used to speed up obsolete old computers that were otherwise on their way to becoming e-waste. Compositing toolkits are a hell of a lot of extra work on old machines that have very limited GPUs and slow CPUs. No wonder I've had so many "Vista moments" playing with new distros on old machines!

          A couple of days ago I hooked a hard drive with Ubuntu Lucid to a 500 MHz AMD K6 motherboard. It booted fine, but then it was unable to start Firefox at all and was incredibly slow to open windows of most programs, even after removing update checking and apt-xapian-index, the two items I find tend to hog the CPU after booting an old clunker. I then hooked up a CD drive and put in an Ubuntu Warty (4.10) disc, and the exact same board was fast and responsive. I was able to get Mint 16 MATE to run on a 933 MHz Pentium III after removing those two programs and PulseAudio, plus dropping in the i810 driver binary from an older version of Mesa to stop the use of LLVMpipe. That setup was able to play 360p H.264 video in mpv while holding the browser open. It could even run Critter/Criticalmass at an almost playable framerate. Were the toolkits not so heavy, it would not have been necessary to set the window manager to use wireframes for easy window dragging, just as it was not necessary in Ubuntu Warty on a far slower board.
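
          As a side note for anyone repeating this on similar hardware: whether rendering has fallen back to LLVMpipe can be checked from a terminal, assuming the mesa-utils package (which provides glxinfo) is installed:

              glxinfo | grep -i "opengl renderer"

          If the renderer string mentions llvmpipe, OpenGL is being done in software; with a working hardware Mesa driver it names the GPU instead.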

          It seems to me that with the growth of CPU-limited tablets and phones, some attention ought to be given to weight loss by both GTK and Qt. I recall seeing a story on Phoronix somewhere that stated that one ARM core running at nearly 1 GHz had processing power comparable to a Pentium II running at something like 266 MHz, slower than anything I have tested this week. Unless they've got a graphics driver so good that the CPU load for compositing is zero, they too would benefit from lightweight toolkits or an option to turn compositing on or off, as the Metacity/Marco window managers can. Even an Intel Atom benefits from turning off as much compositing as possible.
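
          For reference, the Marco compositing mentioned above can also be toggled from a terminal; a minimal sketch, assuming the org.mate.Marco.general GSettings schema that MATE ships:

              gsettings set org.mate.Marco.general compositing-manager false    # turn compositing off
              gsettings set org.mate.Marco.general compositing-manager true     # and back on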

          Making toolkits pretty should not come at the expense of locking them out of the huge tablet and phone market, nor at the price of forcing people with older machines to trash them. If the main developers don't want to address this, then someone, maybe someone with phone-industry funding, could write GTK-compatible and Qt-compatible toolkits that are kept light and fast. With all the attention Qt 5 is getting from the phone folks, I wonder how it will compare to Qt 4, GTK2, and GTK3 for CPU and GPU resource use? If we are serious about being able to run on limited-resource hardware (which includes all those phones and tablets), then Page's Law is going to have to be dealt with once and for all.



          • #35
            Originally posted by agd5f View Post
            The problem is that a legacy 2D engine is not particularly useful. They are pretty much limited to blits and fills. Most modern toolkits use alpha blending and transforms, which require the 3D engine anyway.
            Many 2D engines do support blending. Transforms require 3D, yes, but people do live without wobbly windows.

            There was a recent post here by libv on why exactly 2D engines are useful. I fully agree with him.



            • #36
              Originally posted by Luke View Post
              That must be why recent distros perform so poorly on Pentium 3-class machines, and reportedly on tablets as well. [...] With all the attention Qt 5 is getting from the phone folks, I wonder how it will compare to Qt 4, GTK2, and GTK3 for CPU and GPU resource use?
              Qt 5 on mobile is entirely OpenGL ES accelerated. The Qt media player element will use platform-native video acceleration, but it doesn't go after, say, a 2D engine; it relies on the OS to provide it through a convenience API.

              So in practice QML is only using 3D, just as agd5f said. 2D proper is only relevant in older toolkit versions, which I concede a lot of desktop apps still use.



              • #37
                As long as there is silicon for it, I'd also rather like to see dedicated 2D silicon used. A lot of programs would benefit from it. I guess a 2D block might be much more power efficient than using the 3D engine for the same work, at least for now; I don't know how power efficient the 3D path might become in the future.
                Stop TCPA, stupid software patents and corrupt politicians!



                • #38
                  I notice increased performance in furmark_benchmark_fullscreen_1920x1080, from 20 to 62 fps (Mesa 10.0.2 -> mesa.git), and in Dota 2 (OpenGL 2.1) I estimate the same increase; it is very playable now. Metro: Last Light has some bugs in the shaders; I think the Mesa devs make no workarounds for it [1].

                  [1] http://steamcommunity.com/app/43160/...6998172578/#p5



                  Tom

