New Ryan Gordon Game Port Goes Into Beta

  • #11
    Dammit... Why am I feeling a need to try something with a bit more oomph than my Intel graphics laptop?

    I will give this a try when I have time to set up a USB disk with something a tad more up to date than RHEL.



    • #12
      Runs at slide show speeds on my HD4850 using radeon. Runs better on my 200M, but the laptop is just too slow; I bet it would run fine on 9600pro level hardware though, even with the slow drivers.



      • #13
        Originally posted by monraaf View Post
        I have 2 builds of mesa, one with GLSL and one without. Both give the same result here.
        Hmm, that's odd. We generally have the same level of functionality on 6xx and 7xx.

        Thanks... even though I don't like your answer.

        Originally posted by Melcar View Post
        Runs at slide show speeds on my HD4850 using radeon. Runs better on my 200M, but the laptop is just too slow; I bet it would run fine on 9600pro level hardware though, even with the slow drivers.
        Any error messages that might give a clue? If it runs OK on a 3200 it should run *really* OK on a 4850...
        Last edited by bridgman; 12-14-2009, 11:10 PM.



        • #14
          Hey, that is strange. If I run the game with LIBGL_ALWAYS_INDIRECT=1, all the stuff that was missing becomes visible, though the game runs slower (but not slide-show slow) and there's heavy tearing.



          • #15
            OK, that's a good clue. I don't remember what it means, but it's still good.



            • #16
              It means to always use indirect rendering, i.e. use GLX to send all the OpenGL commands to the X server and let it render them, or something like that.
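              In practice you set it per-run from the shell, and glxinfo reports which path is in effect. A quick sketch, assuming a usual Mesa/X setup (the `./aquaria` binary name is a placeholder, not confirmed by the thread):

              ```shell
              # Force indirect rendering for one run: OpenGL calls are sent as
              # GLX protocol to the X server, which does the actual rendering.
              LIBGL_ALWAYS_INDIRECT=1 ./aquaria    # "./aquaria" is a placeholder name

              # Check which path is active; glxinfo prints
              # "direct rendering: Yes" by default, "No" when forced indirect:
              LIBGL_ALWAYS_INDIRECT=1 glxinfo | grep "direct rendering"
              ```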



              • #17
                Originally posted by bridgman View Post
                Any error messages that might give a clue? If it runs OK on a 3200 it should run *really* OK on a 4850...

                Code:
                *********************************************************
                *********************************************************
                *********************************************************
                *********************************************************
                *********************************************************
                     Warning: This is a beta version of AQUARIA.
                *********************************************************
                *********************************************************
                *********************************************************
                *********************************************************
                *********************************************************
                
                
                
                WARNING: no AL_EXT_vorbis support. We'll use more RAM.

                That's what the game itself spits out.
                I'm using the xorg-edgers packages, which haven't had a good upgrade in a while, so I wouldn't be surprised if out-of-date code is causing the problem.



                • #18
                  Well, I found the culprit for my problem. I ran the game without KMS and everything was visible with direct rendering, so I went back to KMS and disabled the GL_EXT_framebuffer_object extension in r600_context.c, and bingo: now everything is visible with direct rendering under KMS too.

                  So there's something broken in the fbo code.
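                  For anyone wanting to reproduce the comparison, the radeon kernel module of this era takes a modeset parameter to toggle KMS, and glxinfo shows whether the driver still advertises the extension once it's compiled out. A sketch only (needs root; exact module handling varies by distro):

                  ```shell
                  # Reload the radeon module with KMS off (UMS) or on:
                  modprobe -r radeon
                  modprobe radeon modeset=0   # without KMS: rendering was correct
                  modprobe -r radeon
                  modprobe radeon modeset=1   # with KMS: missing geometry via direct rendering

                  # After rebuilding Mesa with the extension removed from
                  # r600_context.c, the driver should no longer advertise it:
                  glxinfo | grep -c GL_EXT_framebuffer_object   # expect 0 once disabled
                  ```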



                  • #19
                    Originally posted by bridgman View Post
                    Nightmorph, monraaf, did either or both of you have the experimental GLSL flag enabled in your mesa build? Wondering if that explains the difference in your experiences.
                    Uh, possibly not. Here are some snippets from the compile log:

                    Code:
                    ./configure --prefix=/usr --build=x86_64-pc-linux-gnu --host=x86_64-pc-linux-gnu 
                    --mandir=/usr/share/man --infodir=/usr/share/info --datadir=/usr/share 
                    --sysconfdir=/etc --localstatedir=/var/lib --libdir=/usr/lib64 --with-driver=dri 
                    --disable-glut --without-demos --disable-debug --disable-glw --disable-motif 
                    --enable-glx-tls --enable-xcb --with-dri-drivers=,swrast,radeon,r200,r300,r600 
                    --disable-gallium --enable-asm
                    and:

                    Code:
                            prefix:          /usr
                            exec_prefix:     ${prefix}
                            libdir:          /usr/lib64
                            includedir:      ${prefix}/include
                    
                            Driver:          dri
                            OSMesa:          no
                            DRI drivers:     swrast radeon r200 r300 r600
                            DRI driver dir:  ${libdir}/dri
                            Use XCB:         yes
                    
                            Gallium:         no
                    
                            Shared libs:     yes
                            Static libs:     no
                            EGL:             yes
                            GLU:             yes
                            GLw:             no (Motif: no)
                            glut:            no
                            Demos:           no
                    This is after I disabled support for Gallium, just to see if it would make a difference. Previously, I compiled a 12-14 git checkout with Gallium, as per usual. After the dismal results, I wondered if any experimental Gallium code was affecting it. Since removing Gallium had no effect, the problem is likely elsewhere.

                    Would enabling GLSL give better performance, or worse? How can I check for GLSL support in the configure options? I don't see anything like it in the configure log, either enabled or disabled.



                    • #20
                      I think the GLSL support is still so experimental that the only way to enable it is to go into the code and uncomment the relevant #define. So if you aren't sure whether it's on or not, then it's definitely not.
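                      In other words, there's nothing to look for in the configure log; the switch lives in the driver source itself. A sketch of hunting it down and flipping it (the define name `R600_ENABLE_GLSL_TEST` below is a guess for illustration, not confirmed by the thread):

                      ```shell
                      # Find the experimental flag in the r600 driver sources:
                      grep -rn "GLSL" src/mesa/drivers/dri/r600/

                      # Uncomment the define by hand, e.g. turn
                      #   /* #define R600_ENABLE_GLSL_TEST 1 */
                      # into
                      #   #define R600_ENABLE_GLSL_TEST 1
                      # then rebuild and reinstall Mesa:
                      make && sudo make install
                      ```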

