
GLAMOR'ized Radeon Driver Shows Hope Over EXA


  • #11
    Originally posted by curaga View Post
    It doesn't matter how much faster it is as long as it tears. Also quite sad it's that buggy.
    At least under Xfce with r600 it doesn't tear and it's not particularly buggy (except for the transparent icons and the LibreOffice menus); in fact it seems very stable. The problem is that it isn't faster than EXA at all.



    • #12
      I don't understand how we're still talking about (insufficient) 2D performance 19 years after the ATI Mach64 was released. This shouldn't be a problem anymore. Why is it still so difficult to provide smooth 2D rendering today?



      • #13
        devius: because most of the code is written by hobbyists who are either reverse engineering or only have access to minimal documentation and hardware.



        • #14
          Originally posted by Pallidus View Post
          devius: because most of the code is written by hobbyists who are either reverse engineering or only have access to minimal documentation and hardware.
          No, it isn't.
          ## VGA ##
          AMD: X1950XTX, HD3870, HD5870
          Intel: GMA45, HD3000 (Core i5 2500K)



          • #15
            Originally posted by Pallidus View Post
            devius: because most of the code is written by hobbyists who are either reverse engineering or only have access to minimal documentation and hardware.
            Still, the worst-performing driver (fglrx) is developed by the professionals, who have full access to the hardware documentation. 2D is simply not seen as important, and x86 desktop hardware is so seriously overpowered that it provides a somewhat acceptable user experience even with exceptionally bad drivers. Everyone is just participating in the 3D race.

            PS. And if somebody ever starts asking questions about poor driver quality and inadequate performance, it is always possible to conveniently blame the X11 architecture.



            • #16
              "No, it isn't. "

              http://www.freedesktop.org/wiki/GettingInvolved/

              show me ANYONE working for ATI/AMD who submitted code to glamor


              stfu...

              "Still the worst performing driver (fglrx) is developed by the professionals"

              nah, the only professionally developed graphics driver for Linux is NVIDIA's proprietary one.

              if you have a Radeon card you are better off removing it and using integrated Intel graphics... even if Intel only makes a half-assed effort, they at least make sure it works
              Last edited by Pallidus; 10-03-2013, 09:37 AM.



              • #17
                The problem is that X's 2D semantics don't map well to hardware. E.g., you can't just accelerate X lines with GL lines; you need to do a bunch of complex munging in the shaders to make the semantics match. Adding back a 2D engine doesn't really help: most modern apps use RENDER features, which require a 3D engine but also have semantics that don't match most 3D hardware. So you still have to do munging in the shaders to match semantics.

                So you basically have two choices:
                1. Implement a native 2D acceleration architecture for your chips. This ends up being as complex as or more complex than the 3D driver (Intel's DDX is bigger than its 3D driver, IIRC).
                2. Implement 2D in terms of a 3D API. Something like glamor or XA.
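                To make option 2 concrete, here is a minimal sketch (the function and values are illustrative, not taken from any driver) of the RENDER extension's OVER operator on premultiplied-alpha pixels. OVER is the easy case: it maps directly onto GL fixed-function blending as `glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA)`; the shader "munging" comes from the RENDER operators and modes (component alpha, the conjoint/disjoint operators) that have no such direct blend-state equivalent.

                ```python
                def over(src, dst):
                    """Porter-Duff OVER on premultiplied-alpha (r, g, b, a) pixels.

                    RENDER stores channels premultiplied by alpha, so the operator
                    is simply: result = src + (1 - src_alpha) * dst, per channel.
                    In GL terms: glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA).
                    """
                    sr, sg, sb, sa = src
                    dr, dg, db, da = dst
                    k = 1.0 - sa
                    return (sr + k * dr, sg + k * dg, sb + k * db, sa + k * da)

                # A 50%-alpha red source over an opaque blue destination:
                src = (0.5, 0.0, 0.0, 0.5)   # premultiplied red at 50% alpha
                dst = (0.0, 0.0, 1.0, 1.0)   # opaque blue
                print(over(src, dst))        # (0.5, 0.0, 0.5, 1.0)
                ```

                A glamor-style driver draws a textured quad with this blend state for OVER, and falls back to shader arithmetic for the operators that don't fit.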



                • #19
                  Originally posted by darkbasic View Post
                  Faster!? Maybe with GNOME; with KDE it's simply unusable: https://bugs.freedesktop.org/show_bug.cgi?id=69341
                  At least it isn't glamor's fault, because on Intel HD 4000 I don't have this issue.
                  ArchLinux/gentoo

                  phenom II 965 / FX6100
                  Radeon HD rv710 4350 / HD7700 cape verde
                  Kde 4.11.2
                  Glamor + XV forced
                  kernel 3.11.3/3.12-rc3

                  Kwin settings

                  render Raster
                  Opengl Version 3.1 / 2.0

                  i don't have any issues even with this crappy gpu or with RadeonSI, so i assume it must be a gpu-specific or distro-specific issue, because i cannot reproduce it on any of my radeon machines



                  • #20
                    Also, modern desktop GPUs have had the 2D engine removed.
                    Maybe Intel chips still have a 2D engine, which may make sense for integrated solutions and low-resolution graphics, but not for the consumer market.

                    My understanding is that GLAMOR is simply a must for AMD GCN, because there is no such thing as a 2D engine in that hardware anymore.
                    So one can't really compare SNA with GLAMOR; it's like comparing a two-wheel bike with a four-wheel car: same result, but a completely different configuration with different dynamics, requiring a completely different driving approach.

                    Also, it would be interesting to see how this compares on Wayland. Since Wayland just hands out a buffer and welcomes the application to draw into it using OpenGL or toolkits, each calling into the driver directly, it should be easy to get FPS numbers when benchmarking real-life generic tasks.

                    Xorg has a lot of overhead, a lot of extensions, and a weird window system. For example, check the youtube-swfdec result, which is just copying a bunch of pixels to the screen. Both fglrx and EXA perform worse than software rendering there, meaning that under Xorg it is more efficient to copy buffers via the CPU than to invoke the 3D engine and do the swap via the GPU. A modern desktop is not drawing lines and text (which would be easily wrappable via X); it is operating on lots of pixels, using compositing and OpenGL features (shading etc.). With X wrapping everything, the advantage of outsourcing to hardware acceleration is pretty much killed by the slow postman in the middle. Giving each application a buffer and letting toolkits expand into OpenGL calls (or using them directly) results in zero packaging overhead.

                    Xorg was designed for a CPU-only, networked, line-and-text UI. That use case hardly exists anymore (outside of remote server management), so it's obvious what to prioritize for the future. The big question: going forward, do we game on our machines, or do we remotely administer them using GTK1-like toolkits?
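                    A back-of-the-envelope cost model (the numbers below are made-up illustrations, not measurements) shows why a plain CPU copy can beat the GPU path for small blits: the GPU path pays a fixed per-operation command-submission and validation overhead, so its cheaper per-pixel cost only wins once the blit is large enough to amortize that overhead.

                    ```python
                    def cpu_blit_cost(pixels, per_pixel_ns=1.0):
                        """CPU memcpy-style blit: pure per-pixel cost, no setup."""
                        return pixels * per_pixel_ns

                    def gpu_blit_cost(pixels, per_pixel_ns=0.1, submit_overhead_ns=50_000):
                        """GPU blit: much cheaper per pixel, but a fixed submission
                        and state-validation overhead is paid on every operation."""
                        return submit_overhead_ns + pixels * per_pixel_ns

                    # With these illustrative constants the crossover sits around
                    # 56k pixels: below it the CPU wins, above it the GPU wins.
                    for pixels in (1_000, 10_000, 100_000, 1_000_000):
                        winner = "CPU" if cpu_blit_cost(pixels) < gpu_blit_cost(pixels) else "GPU"
                        print(f"{pixels:>9} px -> {winner}")
                    ```

                    Batching many small operations into one submission is exactly how drivers attack this: it spreads the fixed overhead across all of them.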

                    The only thing that surprises me is Intel itself, still pushing 2D-based acceleration while being the authors of (3D-based) Wayland.

                    "The professional driver" from NVIDIA doesn't even support Wayland in the first place, and it overrides half of Xorg. That proves two points: how incapable Xorg is of addressing the current use cases of a professional environment, and how uncooperative NVIDIA is in designing any next-generation technology outside of its own property. The more you support NVIDIA, the more you are doomed to have it as the only possible choice. It's like Windows... but green?

                    Originally posted by jrch2k8 View Post
                    ArchLinux/gentoo

                    phenom II 965 / FX6100
                    Radeon HD rv710 4350 / HD7700 cape verde
                    Kde 4.11.2
                    Glamor + XV forced
                    kernel 3.11.3/3.12-rc3
                    Why? No UVD?

