
Updated and Optimized Ubuntu Free Graphics Drivers


  • Originally posted by necro-lover
    I switched from an HD 4770 to an HD 3850, and now KDE boots to a black screen with only a moving mouse cursor (the Ctrl+Alt+F1 console still works).

    Only KDE's failsafe mode runs. Maybe KWin's desktop 3D effects are broken on the HD 3850? With the HD 4770 everything worked.
    I have now uninstalled the oibaf PPA, and KWin works in normal mode again.

    It looks like I found a bug in Mesa 9 with the HD 3850 + KWin combination.

    Comment


    • I like that it defaults to LLVMpipe instead of swrast.

      Last edited by Leon55ia; 09 October 2012, 10:06 PM.

      Comment


      • Originally posted by necro-lover
        I have now uninstalled the oibaf PPA, and KWin works in normal mode again.

        It looks like I found a bug in Mesa 9 with the HD 3850 + KWin combination.
        It's probably LLVM; try running with R600_LLVM=0.

        Maybe it's also fixed in current git but I cannot upload updated packages until next week.
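        A minimal sketch of the suggestion above: prefix the command with the R600_LLVM variable (assuming the Mesa r600g driver reads it to toggle its LLVM shader backend, as the post suggests). `env` is used here only so the effect is visible without an X session:

```shell
# Set R600_LLVM=0 for a single child process; the variable disables the
# r600g LLVM shader compiler in affected Mesa builds (assumption from the
# post above, not verified against every Mesa version).
R600_LLVM=0 env | grep '^R600_LLVM='
# To test KWin itself, the same prefix would apply, e.g.:
#   R600_LLVM=0 kwin --replace
```

        Setting the variable per-command keeps the rest of the desktop on the default backend, which makes it easy to compare behavior.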

        Comment


        • Originally posted by Edwa55rd
          I see the brightness of the gears constantly alternating between dark and bright



          Also see my previous post about disabling LLVM.

          Comment


          • Output capped to 10 fps, as if vertical refresh is 10 Hz

            I'm using xserver-xorg-video-ati 1:6.14.99+git1206080959.588837~gd~o in Mythbuntu 12.04 with a VisionTek Radeon HD 2600 Pro AGP, a Soltek SL-65KVB motherboard and a 1 GHz Pentium 3. OpenGL and VDPAU output are capped to 10 fps, as if the vertical refresh rate were 10 Hz. However, it is actually 60 Hz, and xrandr reports that correctly.

            glxgears runs at 10 fps unless I set a vblank_mode=0 environment variable or set the same option via ~/.drirc. (driconf does not change anything unless the ~/.drirc it creates is edited to change the driver at the top to "dri2".) With that change, glxgears runs at close to 200 fps. MPlayer's OpenGL output also runs at 10 fps unless I use "-vo gl:swapinterval=0". Interestingly, I don't see any tearing after changing those options.
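            For reference, a minimal sketch of the edited ~/.drirc the post describes (assuming the driver attribute has to read "dri2", per the post; written to a temporary path here so it can be inspected without touching a real config):

```shell
# Write a minimal drirc that turns vblank syncing off for all applications
# (assumption: Mesa's DRI2 loader reads this per-user XML config).
cat > /tmp/drirc.example <<'EOF'
<driconf>
  <device driver="dri2">
    <application name="Default">
      <option name="vblank_mode" value="0"/>
    </application>
  </device>
</driconf>
EOF
grep -c 'vblank_mode' /tmp/drirc.example   # prints 1
# Equivalent one-shot test without the config file:
#   vblank_mode=0 glxgears
```

            The environment-variable form is handy for A/B testing a single program; the drirc form makes the setting persistent.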

            VDPAU is useless because it is always capped to 10 fps and I don't know how to change that.

            When MythTV is outputting video via XVideo at 30 fps, there is severe judder. Smooth pans have a very regular pattern of slowing down and speeding up. This problem goes away if I select one of the 2x deinterlacers and output at 60 fps. This is not exactly a 10 fps cap, but it may have something to do with this issue.

            None of these problems are present in the driver that comes with 12.04.

            Comment


            • I just wonder which -ati you are using, since 6.14.99+git1206080959.588837~gd~o is for 11.10 and also failed to build in the PPA, so you can't really be using that.

              Comment


              • Originally posted by oibaf
                I just wonder which -ati you are using, since 6.14.99+git1206080959.588837~gd~o is for 11.10 and also failed to build in the PPA, so you can't really be using that.
                I was using 1:6.99.99+git1209261624.e8cb0b~gd~p, the right driver, from your PPA. I just copied the wrong string into my post. Sorry.

                Comment


                • Originally posted by dreamlayers
                  I was using 1:6.99.99+git1209261624.e8cb0b~gd~p, the right driver, from your PPA. I just copied the wrong string into my post. Sorry.
                  No idea what's happening there. You may want to search for and possibly report a bug for that issue (and post the link to the bug here) if you want to see it fixed; see the debugging section on the main page of the PPA.

                  Comment


                  • 10 FPS issue seems to be due to radeon driver in kernel

                    The 10 FPS issue isn't linked to using drivers from the Oibaf PPA. Exactly the same thing can happen with 1:6.14.99~git20111219.aacbd629-0ubuntu2. It does not happen every time, so it can be confusing.

                    What happens is that unhandled interrupts cause the kernel to disable the radeon IRQ. This is the IRQ used to wait for vertical retrace, and I guess that wait times out after 0.1 s, which would explain the 10 fps cap. When things work properly, I can see the radeon interrupt count increasing in /proc/interrupts while running glxgears and for a few moments afterwards. When things don't work, that count is stuck at 200000 and this appears in dmesg:

                    Code:
                    [   33.806034] irq 5: nobody cared (try booting with the "irqpoll" option)
                    followed by a long backtrace and

                    Code:
                    [   33.806980] handlers:
                    [   33.807173] [<e8a57cc0>] radeon_driver_irq_handler_kms
                    [   33.807180] Disabling IRQ #5
                    This happened about 5 seconds after all the drm initialization messages. As you can see, the radeon irq handler was there. It just acted as if the interrupt was not from the video card. Earlier on there is:

                    Code:
                    [   24.047694] ACPI: PCI Interrupt Link [LNKA] enabled at IRQ 5
                    [   24.047707] PCI: setting IRQ 5 as level-triggered
                    [   24.047723] radeon 0000:01:00.0: PCI INT A -> Link[LNKA] -> GSI 5 (level, low) -> IRQ 5
                    Since the interrupt is level triggered, I don't think it would be reasonable to use the noirqdebug kernel option to prevent the interrupt from being disabled. It would just keep interrupting and make everything slow or even unresponsive. I'll do some more investigation and report a bug eventually if it seems like there's a bug and not a hardware issue.
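                    A quick way to check for this condition is to grep the kernel log for the symptom and read the radeon counter out of /proc/interrupts. This is a sketch: the /proc/interrupts field layout below assumes a single-CPU system like the Pentium 3 above, and the sample line is illustrative, not captured output.

```shell
# Look for the disabled-IRQ symptom in the kernel log (may print nothing
# on a healthy system; grep's non-zero exit is ignored).
dmesg | grep -E 'nobody cared|Disabling IRQ' || true

# Parsing sketch on an illustrative /proc/interrupts line. On a single-CPU
# system, field 1 is the IRQ number, field 2 the interrupt count, and the
# last field the handler name:
line='  5:     200000   XT-PIC-XT        radeon'
echo "$line" | awk '{print $2}'   # prints the interrupt count: 200000
```

                    If the counter stays frozen while a GL application runs, the IRQ has been disabled and the vblank waits will be timing out.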

                    Comment


                    • I see the brightness of the gears constantly alternating between dark and bright

                      Comment
