Open-Source Radeon HD 6000 Series Still Borked


  • #61
    Originally posted by Plombo View Post
    Thanks for reporting it! I've pushed a fix to the glsl-to-tgsi branch on fd.o and GitHub which increases the maximum number of temps from 256 to 4096.
    Thanks for fixing it.

    Updated packages for the PPA are building right now.
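
    For the curious, my understanding is that the change is essentially a limit bump in the GLSL-to-TGSI translator: shaders draw their temporaries from a fixed-size pool, and Heaven's shaders blew past the old cap of 256. The sketch below only illustrates that kind of change; the names are placeholders, not the actual identifiers from the branch.

        /* Hedged sketch, not the actual Mesa patch: a translator hands out
         * temporary registers from a fixed pool and fails once the pool is
         * exhausted; the fix raises that cap. */
        #include <stdbool.h>
        #include <stdio.h>

        #define MAX_TEMPS 4096   /* raised from 256 by the fix */

        static bool allocate_temp(unsigned *next_temp, unsigned *index_out)
        {
            if (*next_temp >= MAX_TEMPS) {
                fprintf(stderr, "shader needs more than %d temps\n", MAX_TEMPS);
                return false;   /* with the old cap, Heaven's shaders hit this path */
            }
            *index_out = (*next_temp)++;
            return true;
        }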



    • #62
      Unigine Heaven runs now; I got 3 fps at 1024x768 in windowed mode.

      loading "/home/ass/.Unigine Heaven/heaven_2.1.cfg"...
      Engine::init(): clear video settings for "Gallium 0.4 on AMD RV740 2.1 Mesa 7.12-devel (git-6bde225 natty-oibaf-ppa+glsl-to-tgsi)"
      Loading "libGL.so.1"...
      Loading "libopenal.so.1"...
      ALWrapper::init(): can't load "libopenal.so.1" library
      libopenal.so.1: cannot open shared object file: No such file or directory
      Can't initialize OpenAL wrapper
      Install latest OpenAL
      Set 1024x768 windowed video mode
      Set 1.00 gamma value
      Unigine engine http://unigine.com/
      Binary: Linux 64bit GCC 4.3.2 Release May 20 2010
      App path: /home/ass/Downloads/Unigine_Heaven/bin/
      Data path: /home/ass/Downloads/Unigine_Heaven/data/
      Save path: /home/ass/.Unigine Heaven/

      ---- System ----
      System: Linux 3.0.0-0300rc2-generic x86_64
      CPU: AMD Phenom(tm) II X2 550 Processor 2614MHz MMX+ 3DNow!+ SSE SSE2 SSE3 SSE4A HTT
      GPU: Gallium 0.4 on AMD RV740 2.1 Mesa 7.12-devel (git-6bde225 natty-oibaf-ppa+glsl-to-tgsi)
      System memory: 3958 Mb
      Video memory: 256 Mb

      ---- Render ----
      GLRender::GLRender(): Unknown GPU
      OpenGL vendor: X.Org
      OpenGL renderer: Gallium 0.4 on AMD RV740
      OpenGL version: 2.1 Mesa 7.12-devel (git-6bde225 natty-oibaf-ppa+glsl-to-tgsi)
      Found required GL_ARB_map_buffer_range
      Found required GL_ARB_vertex_array_object
      Found required GL_ARB_vertex_buffer_object
      Found required GL_ARB_half_float_vertex
      Found required GL_ARB_half_float_pixel
      Found required GL_ARB_occlusion_query
      Found required GL_EXT_texture3D
      Found required GL_EXT_texture_cube_map
      Found required GL_EXT_texture_sRGB
      Found required GL_EXT_texture_swizzle
      Found required GL_ARB_shader_object
      Found required GL_ARB_vertex_shader
      Found required GL_ARB_fragment_shader
      Found required GL_ARB_draw_buffers
      Found required GL_ARB_framebuffer_object
      Found required GL_EXT_framebuffer_blit
      Found required GL_EXT_framebuffer_multisample
      Found optional GL_ARB_draw_elements_base_vertex
      Found optional GL_ARB_texture_rg
      Found optional GL_ARB_texture_compression
      Found optional GL_ARB_texture_compression_rgtc
      Found optional GL_ARB_seamless_cube_map
      Shading language: 1.20
      Maximum texture size: 8192
      Maximum texture units: 32
      Maximum draw buffers: 8

      ---- Physics ----
      Physics: Multi-threaded

      Unigine~# gl_render_use_arb_tessellation_shader 0 && render_restart
      ---- Interpreter ----
      Version: 2.31

      [...]
      Unigine~# render_hdr 2 && render_srgb 1 && render_restart
      Unigine~# render_dof 0 && render_restart
      Mesa warning: glDraw[Range]Elements(start 7331, end 8087, count 2373, type 0x1403, indices=0xabc0)
      end is out of bounds (max=8086) Element Buffer 21 (size 48714)
      This should probably be fixed in the application.
      Mesa warning: glDraw[Range]Elements(start 11022, end 11057, count 123, type 0x1403, indices=0x11232)
      end is out of bounds (max=11056) Element Buffer 11 (size 70440)
      This should probably be fixed in the application.
      [...]
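
      For reference, the two Mesa warnings above mean that the end value Heaven passes to glDrawRangeElements (e.g. 8087) is one past the largest index the bound vertex arrays can supply (8086); Mesa draws anyway but flags it as an application bug. A minimal sketch of the contract the warning refers to, using a made-up helper rather than Unigine's actual code:

          /* Hedged sketch of the glDrawRangeElements contract; draw_submesh()
           * is a made-up helper, not Unigine code. start/end must bound every
           * index referenced by the call, and end must stay within the number
           * of vertices actually bound. */
          #define GL_GLEXT_PROTOTYPES 1
          #include <GL/gl.h>
          #include <GL/glext.h>

          static void draw_submesh(GLuint index_buffer, GLsizei index_count,
                                   GLsizeiptr byte_offset,
                                   GLuint min_index, GLuint max_index)
          {
              glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, index_buffer);
              /* type 0x1403 in the warnings above is GL_UNSIGNED_SHORT */
              glDrawRangeElements(GL_TRIANGLES, min_index, max_index, index_count,
                                  GL_UNSIGNED_SHORT, (const void *)byte_offset);
          }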

      Last edited by Qaridarium; 07-06-2011, 10:59 AM.



      • #63
        That's very cool.

        So, this was the last Linux app that did not work with r600g?



        • #64
          No, it has worked for some months now; it just didn't work with the glsl-to-tgsi branch, but the fix was easy.
          ## VGA ##
          AMD: X1950XTX, HD3870, HD5870
          Intel: GMA45, HD3000 (Core i5 2500K)



          • #65
            "Works" doesn't come close to meaning "usable". It is not usable. But this would also be the last thing preventing its use as a driver, at least for 3D apps.



            • #66
              Originally posted by crazycheese View Post
              "Works" doesn't come close to meaning "usable".
              Yeah, that's obvious. We still have a long way to go before Unigine is usable.
              ## VGA ##
              AMD: X1950XTX, HD3870, HD5870
              Intel: GMA45, HD3000 (Core i5 2500K)



              • #67
                It's been working almost as long as it has on r300g; performance seems slightly better on r300g, though.



                • #68
                  Originally posted by whizse View Post
                  It's been working almost as long as it has on r300g; performance seems slightly better on r300g, though.
                  AMD should rebuild the X1950XTX on a 32nm process as an open-source Linux edition ;-)



                  • #69
                    The X1950XTX isn't that fast anymore, really. An HD3870 is already faster in the majority of scenarios. Unigine may be an exception, though; I didn't test it.
                    ## VGA ##
                    AMD: X1950XTX, HD3870, HD5870
                    Intel: GMA45, HD3000 (Core i5 2500K)



                    • #70
                      Originally posted by darkbasic View Post
                      The X1950XTX isn't that fast anymore, really. An HD3870 is already faster in the majority of scenarios. Unigine may be an exception, though; I didn't test it.
                      You don't get my point. My point is a card for the r300 driver: just an X1950XTX on a 32nm process.



                      • #71
                        OK, the HD5870 finally arrived. First impression (with a four-week-old graphics stack and an Athlon64 3800+ X2): slow, SLOW, SLOW.
                        Far too slow for such a monster; my Intel HD3000 is faster in Nexuiz.
                        Obviously the HD3000 runs on a Core i5 2500K, but the Athlon64 3800+ X2 was overclocked to 2.7 GHz.
                        It doesn't seem faster than the HD3870; next week I will do some real benchmarks.
                        ## VGA ##
                        AMD: X1950XTX, HD3870, HD5870
                        Intel: GMA45, HD3000 (Core i5 2500K)



                        • #72
                          Originally posted by darkbasic View Post
                          OK, the HD5870 finally arrived. First impression (with a four-week-old graphics stack and an Athlon64 3800+ X2): slow, SLOW, SLOW.
                          Far too slow for such a monster; my Intel HD3000 is faster in Nexuiz.
                          Obviously the HD3000 runs on a Core i5 2500K, but the Athlon64 3800+ X2 was overclocked to 2.7 GHz.
                          It doesn't seem faster than the HD3870; next week I will do some real benchmarks.
                          How unfortunate. I will buy a new PC soon, and I am still undecided: Llano or SB? No discrete GPU, Linux only.



                          • #73
                            Sandy Bridge rocks, it's so damn fast:
                            http://www.linuxsystems.it/3-linux-3...dge-benchmarks
                            I will benchmark the HD5870 on the Core i5 2500K; I'm pretty sure it's CPU-limited.
                            Last edited by darkbasic; 07-07-2011, 01:42 PM.
                            ## VGA ##
                            AMD: X1950XTX, HD3870, HD5870
                            Intel: GMA45, HD3000 (Core i5 2500K)



                            • #74
                              Originally posted by darkbasic View Post
                              OK, the HD5870 finally arrived. First impression (with a four-week-old graphics stack and an Athlon64 3800+ X2): slow, SLOW, SLOW.
                              Far too slow for such a monster; my Intel HD3000 is faster in Nexuiz.
                              Obviously the HD3000 runs on a Core i5 2500K, but the Athlon64 3800+ X2 was overclocked to 2.7 GHz.
                              It doesn't seem faster than the HD3870; next week I will do some real benchmarks.
                              Thanks for your replies!
                              I understand your situation. I have a GTX 260 (216 SP) card with 1792 MB of GDDR3. Of course it is a LARGE card, but very powerful (less so in the 460/560+ era), and still it's usable only with the blob, and Fermi support is even more damaged. So you have this card, paid for it, and you expect it to perform. A 5870 with GDDR5, wee, yes: a MONSTER.
                              You insert the card and heh... Yes.
                              So you stand there facing an unfree choice: open source and borked, or the blob and a working investment.
                              And if even your blob driver is flawed (like it was in the 4670 era, several years ago), you want to throw the card in the trash...

                              Still, radeon has the best chance of becoming a somewhat usable, open AND performing driver. It's these 5 developers vs. 2,000 that ultimately decide. Official AMD keeps telling me: the boat with the least interest gets the fewest purchases, and I keep telling them: the boat with the most developer attention gets the most interest and the most purchases.

                              I think their whole strategy is to attract new programmers and somehow get a share of the NVIDIA-only performance-GPU cake on Linux through the dream of an open driver. Well, at least they fixed the closed driver so you can use the card to some degree and keep it; and some talented people (students) appreciate the open driver and work on it... doing their job, fixing their own boat.

                              I think the root of the problem lies in secret partnerships, i.e. they are not interested in selling their cards for use with Linux. But there isn't much competition here; NVIDIA is actually becoming crappier and crappier at Linux support.



                              • #75
                                Originally posted by darkbasic View Post
                                Sandy Bridge rocks, it's so damn fast:
                                http://www.linuxsystems.it/3-linux-3...dge-benchmarks
                                I will benchmark the HD5870 on the Core i5 2500K; I'm pretty sure it's CPU-limited.
                                Haha, Intel used Linux information to further push the Winblows driver. Nice one.

