GLESX enableFlags ? (or DRI with HD2600XT = X FP error?)

  • #1

    So, I got an HD2600XT to replace an X800GTO that was happy till its fan flaked out. But with the new card, I can't get X to start with DRI enabled: X always gets a floating point error (signal 8) in glesx.so (esutSetDestSurf called from glesxMakeTrans) while trying to map the root window. I notice that with the old card (with all the same configs), /var/log/Xorg.0.log reported:

    (II) fglrx(0): GLESX enableFlags = 14

    and without DRI, with either card, one gets:

    (II) fglrx(0): GLESX enableFlags = 10

    but the new card gives the exotic:

    (II) fglrx(0): GLESX enableFlags = 58

    Anyone know what the flag bits are, or what sets them, or seen anything like 58 in their log?

    Or any other hints about what might be causing the FP error?

    (Glad to post the xorg.conf and/or Xorg.0.log, but maybe it's something obvious, at least to somebody else. I've tried every version of the driver that supports this card, both direct from the ATI site and emerged from Gentoo, plus a couple of kernels and Xorg versions. In all cases, the old card works fine till the fan stops and it dies of heat exhaustion, but the new card gives the same error. I even tried a superstitious card swap, but of course H/W is never the problem.)
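
    For what it's worth, here's how those three values break down into bits (a quick Python sketch; I have no idea what any individual bit means, this just makes the differences visible):
    Code:
    	# Enumerate which bits are set in each observed GLESX enableFlags value.
    	cases = {"X800, DRI": 14, "no DRI (either card)": 10, "HD2600XT, DRI": 58}
    	for name, flags in cases.items():
    	    bits = [1 << i for i in range(flags.bit_length()) if flags & (1 << i)]
    	    print(f"{name:22s} {flags:2d} = {' + '.join(map(str, bits))}")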

  • #2
    I noticed that playing with the Textured2D, TexturedXrender and TexturedVideo options changes the GLESX enableFlags.

    I have an AGP Radeon 9550SE; normally glesx isn't loaded when none of the Textured options are set.

    But with Textured2D, TexturedXrender and TexturedVideo enabled, GLESX enableFlags = 30.

    So maybe you could try enabling or disabling TexturedVideo?
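
    For reference, those are Option lines in the Device section of xorg.conf, along these lines:
    Code:
    	Option "Textured2D"      "on"
    	Option "TexturedXrender" "on"
    	Option "TexturedVideo"   "on"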

    • #3
      Thanks. No joy yet, but some interesting results to ponder. With all three "Textured" options set, I get GLESX enableFlags = 62 instead of your 30 (so the 32 bit is of interest). And I played around with the others to conclude:

      the 4 bit: set by TexturedXrender
      the 16 bit: set by TexturedVideo

      In my situation, clearing Textured2D causes X to segfault in fglrx_drv.so:atiddxOverlayWindowExposures, regardless of the other Textured settings. So, that might be a lead, if I had any idea where to go with it.

      And somehow the 32 bit is always set for me (unless I specify no_dri, which leads to just the 8 and 2 bits being set).

      This is a PCI-E card, but I've no idea if that has anything to do with anything.
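
      To keep my own bookkeeping straight, here's the decoding I'm assuming so far, as a Python sketch (pure guesswork from the experiments above; the GUESSES table and decode() helper are just mine, nothing official):
      Code:
      	# Guessed bit meanings, inferred only from this thread's experiments.
      	GUESSES = {
      	    2:  "always set here (even with no_dri)",
      	    4:  "TexturedXrender (on my card, at least)",
      	    8:  "always set here (even with no_dri)",
      	    16: "TexturedVideo",
      	    32: "unknown; set on my HD2600XT whenever DRI is on",
      	}
      	def decode(flags):
      	    return [note for bit, note in GUESSES.items() if flags & bit]
      	print(decode(62))  # all three Textured options on this HD2600XT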

      • #4
        Did you try a clean Xorg.conf made with aticonfig --initial?

        With my HD2600XT I get the following with only
        Code:
        	Option  "Textured2D" "on"                                               
        	Option  "TexturedXrender" "on"
        set in the Xorg.conf
        Code:
        (II) Loading extension GLESX
        (II) fglrx(0): GLESX enableFlags = 30
        (II) fglrx(0): Using XFree86 Acceleration Architecture (XAA)
                Screen to screen bit blits
                Solid filled rectangles
                Solid Horizontal and Vertical Lines
                Driver provided ScreenToScreenBitBlt replacement
                Driver provided FillSolidRects replacement
        (II) fglrx(0): GLESX is enabled
        (II) fglrx(0): X context handle = 0x1
        (II) fglrx(0): [DRI] installation complete
        (II) fglrx(0): Direct rendering enabled

        • #5
          Thanks. At least I know somebody has this card working (as I'd supposed), but no luck here: after an aticonfig --initial, I'm still seeing the 32 bit set:
          Code:
          (II) Loading extension GLESX
          (II) fglrx(0): GLESX enableFlags = 62
          (II) fglrx(0): Using XFree86 Acceleration Architecture (XAA)
          	Screen to screen bit blits
          	Solid filled rectangles
          	Solid Horizontal and Vertical Lines
          	Driver provided ScreenToScreenBitBlt replacement
          	Driver provided FillSolidRects replacement
          (II) fglrx(0): GLESX is enabled
          (WW) fglrx(0): Option "VendorName" is not used
          (WW) fglrx(0): Option "ModelName" is not used
          (II) fglrx(0): X context handle = 0x1
          (II) fglrx(0): [DRI] installation complete
          (II) fglrx(0): Direct rendering enabled
          and ultimately:
          Code:
          Backtrace:
          0: X(xf86SigHandler+0x81) [0x80d3d76]
          1: [0xffffe420]
          2: /usr/lib/xorg/modules//glesx.so [0xb54ac442]
          3: /usr/lib/xorg/modules//glesx.so [0xb543a26c]
          4: /usr/lib/xorg/modules//glesx.so [0xb543a07f]
          5: /usr/lib/xorg/modules//glesx.so [0xb5436406]
          6: /usr/lib/xorg/modules//glesx.so [0xb5436119]
          7: /usr/lib/xorg/modules//glesx.so [0xb5433072]
          8: /usr/lib/xorg/modules//glesx.so [0xb53c4599]
          9: /usr/lib/xorg/modules//glesx.so [0xb53d40bc]
          10: /usr/lib/xorg/modules//glesx.so [0xb53bd932]
          11: /usr/lib/xorg/modules//glesx.so [0xb535da35]
          12: /usr/lib/xorg/modules//glesx.so [0xb53631e8]
          13: /usr/lib/xorg/modules//glesx.so [0xb535bd3f]
          14: /usr/lib/xorg/modules//glesx.so(esutSetDestSurf+0x29) [0xb535a5a9]
          15: /usr/lib/xorg/modules//glesx.so [0xb5359e38]
          16: /usr/lib/xorg/modules//glesx.so(glesxMakeTrans+0x85) [0xb5359fb5]
          17: /usr/lib/xorg/modules/drivers//fglrx_drv.so [0xb7975c5e]
          18: /usr/lib/xorg/modules/drivers//fglrx_drv.so [0xb7972954]
          19: X(MapWindow+0x2dc) [0x8076a11]
          20: X(InitRootWindow+0x10a) [0x8076cd1]
          21: X(main+0x46c) [0x807054b]
          22: /lib/libc.so.6(__libc_start_main+0xe6) [0xb7d22826]
          23: X(FontFileCompleteXLFD+0x8d) [0x806fb11]
          
          Fatal server error:
          Caught signal 8.  Server aborting
          So... well, I've put the following files out for inspection:
          /etc/X11/xorg.conf at http://docs.google.com/Doc?id=dgvzn665_2dksq6m
          /var/log/Xorg.0.log at http://docs.google.com/Doc?id=dgvzn665_0d6c2mq
          lspci -vvv output at http://docs.google.com/Doc?id=dgvzn665_1fwx5sp

          Of course, I don't really know that the strange value of GLESX enableFlags is part of the problem. It just looks suspicious.

          • #6
            I removed the two Textured options because Textured2D seems to be enabled by default and TexturedXrender causes glitches when watching videos.

            My xorg.conf looks like http://pastebin.com/m15eb2172 at the moment.

            • #7
              Thanks. Yeah, with those settings, I get the original enableFlags = 58 results.

              Hmmm... and despite removing all references to Video or TexturedVideo from xorg.conf, I still get TexturedVideo enabled. But if I explicitly disable TexturedVideo, I get 42 (with no TexturedXrender, or 46 with TexturedXrender).

              So, it would be interesting to know the value of enableFlags for others without TexturedXrender. Others got 30 before with TexturedXrender (and Textured2D), so maybe it's 26. (But I think 14 is another possibility. I'm now thinking that the 4-valued bit doesn't correspond to TexturedXrender for anybody but me, and that getting TexturedVideo by default is possibly another artifact of the problem. But my little brain is getting all twisty now.)

              Also... we know that Textured2D is on by default, but I wonder whether, with it explicitly disabled, everybody else gets a segfault in X, or if I'm "special" that way, too.
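
              For anyone following along, here's each enableFlags value reported (or predicted) so far, split into its power-of-two bits with a quick Python sketch:
              Code:
              	# Break each enableFlags value seen in this thread into bits.
              	for v in (10, 14, 26, 30, 42, 46, 58, 62):
              	    bits = [1 << i for i in range(v.bit_length()) if v & (1 << i)]
              	    print(f"{v:2d} = {' + '.join(map(str, bits))}")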

              • #8
                Originally posted by OnceBitten
                I'm now thinking that the 4-valued bit doesn't correspond to TexturedXrender for anybody but me, and that getting TexturedVideo by default is possibly another artifact of the problem.
                No, it's 26, and for me TexturedVideo is also enabled by default. So you're stuck with the question of why the 32-value bit is set and what it means.

                • #9
                  Thanks again. Wow, so... I'm beginning to doubt my sanity as I grasp at straws for what can be causing this. (I can't be the only one trying to run this card with a 32-bit kernel still? Nah, that's... insane.)

                  Part of the puzzle that keeps me up nights is that the configuration works just fine with that X800 card, yet fails so miserably with this thing. And the interpretation of GLESX enableFlags is complicated by the 14 it gets with that X800, despite not having TexturedXrender set, which suggests that the 4-valued bit is somehow overloaded or card-specific.
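
                  Concretely (using the same guessed bit values as before, which this X800 result seems to contradict):
                  Code:
                  	# X800 with DRI reports 14, with TexturedXrender never configured:
                  	flags = 14
                  	print([1 << i for i in range(flags.bit_length()) if flags & (1 << i)])
                  	# -> [2, 4, 8]: the 4 bit is set anyway, so either that bit isn't
                  	#    TexturedXrender on the X800, or it's overloaded/card-specific.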

                  • #10
                    Originally posted by OnceBitten
                    I can't be the only one trying to run this card with a 32-bit kernel still?
                    I had it running with fglrx 8.42.3 and earlier on a 32-bit kernel. Never tried 7.11.
