Open-Source Radeon Driver Reworks Audio Code, Adds DisplayPort Audio


  • #21
    Originally posted by SXX View Post
    Maybe you can give more details on your hardware and the distribution used?

    Recently I helped a guy who had issues with his HP laptop and Ubuntu 14.04, because the kernel used there has a "runpm" bug that caused the discrete graphics card to disappear. The fix was to disable runtime power management (radeon.runpm=0) or upgrade the kernel. So at the moment LTS distributions may still have issues with PRIME, but it should be possible to solve them.

    PS: Though PRIME still causes X crashes in some cases, which is sad.
    Anyway, it's still better than what the Windows AMD drivers provide, because those don't even have support for laptops with HD42XX+6XXXM. On Linux there is at least a way to configure those devices to work, so I think the AMD FOSS team did a great job here.
    Thanks for helping me, SXX.

    I have an ASUS X550D laptop with an AMD A8-5550M APU (Radeon 8550G iGPU) + Radeon 8670M dGPU.
    OS: Ubuntu 14.04 with kernel 3.18 and the latest Mesa components from the oibaf PPA.

    When using xrandr --listproviders I get:

    Code:
    Providers: number : 1
    Provider 0: id: 0x56 cap: 0x9, Source Output, Sink Offload crtcs: 4 outputs: 3 associated providers: 0 name:radeon

    Comment


    • #22
      Originally posted by bridgman View Post
      AFAIK with recent kernels the open source kernel driver tries to turn off the dGPU by default. Don't think there is a way to turn off the iGPU since it's needed for display.
      Is that why I can't boot with the dGPU set as primary in the firmware, HDMI plugged into dGPU, with iGPU still enabled? Works fine if I disable either the iGPU or dGPU in firmware, and have HDMI plugged into the one that's enabled. I think it also works if the iGPU is set as primary with HDMI plugged into it, but I'd have to check again to be sure.

      Comment


      • #23
        Originally posted by Adriannho View Post
        Thanks for helping me, SXX.

        I have an ASUS X550D laptop with an AMD A8-5550M APU (Radeon 8550G iGPU) + Radeon 8670M dGPU.
        OS: Ubuntu 14.04 with kernel 3.18 and the latest Mesa components from the oibaf PPA.
        First of all, make sure you don't have an X server config file, as it will break the XRandR configuration that works in Ubuntu by default.

        If that's not the problem, I'd recommend you start by checking the "lspci" output to find out whether the discrete graphics card is missing there; it should be visible. E.g. "lspci | grep VGA" should return two devices.

        Then check the "dmesg" output with something like "dmesg | grep radeon" and see whether two devices show up in the log. If you see two different device IDs there, try adding the "radeon.runpm=0" kernel boot option and check whether the device appears after a reboot. There was already a bug in "runpm" that broke it on some laptops (#79701 on the kernel bugzilla), and there may still be other problems affecting you.
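
        For reference, one common way to add a kernel boot option on Ubuntu is via GRUB (just a sketch; the exact file and default options may differ on your system):
        Code:
        sudo nano /etc/default/grub
        # edit the line, e.g.: GRUB_CMDLINE_LINUX_DEFAULT="quiet splash radeon.runpm=0"
        sudo update-grub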

        Comment


        • #24
          Originally posted by Adriannho View Post
          Thanks for helping me, SXX.

          I have an ASUS X550D laptop with an AMD A8-5550M APU (Radeon 8550G iGPU) + Radeon 8670M dGPU.
          OS: Ubuntu 14.04 with kernel 3.18 and the latest Mesa components from the oibaf PPA.
          First of all, make sure you don't have an X server config file, as it will break the XRandR configuration that works in Ubuntu by default.

          If that's not the problem, I'd recommend you start by checking the "lspci" output to find out whether the discrete graphics card is missing there; it should be visible. E.g. "lspci | grep VGA" should return two devices.

          Then check the "dmesg" output with something like "dmesg | grep radeon" and see whether two devices show up in the log. If you see two different PCI IDs there, try adding the "radeon.runpm=0" kernel boot option and check whether the device appears after a reboot. There was already a bug in "runpm" that broke it on some laptops (#79701 on the kernel bugzilla), and there may still be other problems affecting you.

          PS: I have no idea how long it will take to get my post with the link approved, so I just sent another one without the link. Sorry.
          Last edited by SXX; 14 January 2015, 08:09 PM.

          Comment


          • #25
            Originally posted by Nobu View Post
            Is that why I can't boot with the dGPU set as primary in the firmware, HDMI plugged into dGPU, with iGPU still enabled? Works fine if I disable either the iGPU or dGPU in firmware, and have HDMI plugged into the one that's enabled. I think it also works if the iGPU is set as primary with HDMI plugged into it, but I'd have to check again to be sure.
            There are rarely any display connectors wired to the dGPU on PX laptops. The dGPU is purely for offscreen rendering. See this page for an overview of how to configure these sorts of laptops:

            The information is the same regardless of the chips used (AMD, Intel, Nvidia).
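
            Roughly, the steps on the open-source stack look like this (just a sketch; provider numbers, and whether the offload sink is set up automatically, depend on your kernel/X versions):
            Code:
            xrandr --listproviders                # both GPUs should be listed
            xrandr --setprovideroffloadsink 1 0   # use provider 1 (dGPU) as a render offload source for provider 0 (iGPU); often automatic on newer stacks
            DRI_PRIME=1 glxgears                  # run an application on the dGPU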

            Comment


            • #26
              Originally posted by SXX View Post
              First of all, make sure you don't have an X server config file, as it will break the XRandR configuration that works in Ubuntu by default.

              If that's not the problem, I'd recommend you start by checking the "lspci" output to find out whether the discrete graphics card is missing there; it should be visible. E.g. "lspci | grep VGA" should return two devices.

              Then check the "dmesg" output with something like "dmesg | grep radeon" and see whether two devices show up in the log. If you see two different PCI IDs there, try adding the "radeon.runpm=0" kernel boot option and check whether the device appears after a reboot. There was already a bug in "runpm" that broke it on some laptops (#79701 on the kernel bugzilla), and there may still be other problems affecting you.
              Well now, this is surprising.

              lspci definitely identifies both cards, one as VGA (iGPU) and one as a 'Display controller' (dGPU):
              Code:
              00:01.0 VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] Richland [Radeon HD 8550G]
              01:00.0 Display controller: Advanced Micro Devices, Inc. [AMD/ATI] Sun XT [Radeon HD 8670A/8670M/8690M]
              When booting with radeon.runpm=0, xrandr returns 3 providers (it looks like the last two lines are identical though):
              Code:
              Provider 0: id: 0x76 cap: 0x9, Source Output, Sink Offload crtcs: 4 outputs: 3 associated providers: 2 name:radeon
              Provider 1: id: 0x3f cap: 0x6, Sink Output, Source Offload crtcs: 0 outputs: 0 associated providers: 2 name:radeon
              Provider 2: id: 0x3f cap: 0x6, Sink Output, Source Offload crtcs: 0 outputs: 0 associated providers: 2 name:radeon
              What exactly is runpm supposed to do?

              One last note: power management seems to be disabled now, as temps went up considerably. lm-sensors now reports the temperature for the second card, whereas before it was showing N/A in its place. I guess runpm disables the card at boot, but might this be a bug or not?
              Last edited by Adriannho; 15 January 2015, 07:07 AM.

              Comment


              • #27
                Originally posted by Adriannho View Post
                what exactly is runpm supposed to do?
                RunPM is "runtime power management"; it should automatically power off the discrete GPU when it's not in use. At the same time, the powered-off GPU has to remain visible to userspace just as if it were on, e.g. it has to be visible via "--listproviders".

                Originally posted by Adriannho View Post
                One last note. Power management seems to be disabled now as temps went up considerably. lm-sensors now reports the temp for the second card as before it was showing a N/A in its place.
                Depending on your laptop model, an enabled discrete GPU may increase the temperature, but there shouldn't be a huge difference, as "dpm" (Dynamic Power Management) should automatically adjust frequencies when the GPU is not actively used.

                I'm really not sure about the 3.18 kernel, but in Ubuntu's default 3.13 kernel "radeon.dpm" is disabled for some reason.
                So another thing worth checking is:
                Code:
                cat /sys/class/drm/card0/device/power_dpm_state
                cat /sys/class/drm/card1/device/power_dpm_state
                If you see an error instead of a power state, you need to add "radeon.dpm=1" to the kernel options. This will decrease the amount of heat dramatically.
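
                If dpm is active, you can also peek at the current clocks via debugfs to confirm they drop at idle (a rough check; it assumes debugfs is mounted and your kernel exposes this file):
                Code:
                sudo cat /sys/kernel/debug/dri/0/radeon_pm_info
                sudo cat /sys/kernel/debug/dri/1/radeon_pm_info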

                And yeah, one other important note: I don't track firmware changes, but make sure that if you're using a new kernel you also have the appropriate new firmware, because incorrect firmware will cause issues.
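
                A quick way to see whether the firmware loaded is to look at the boot log; the exact messages vary by kernel version, but something like this should show it:
                Code:
                dmesg | grep -i -E "firmware|microcode"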

                Originally posted by Adriannho View Post
                I guess runpm disables the card at boot, but might this be a bug or not?
                Basically the only thing "runpm" has to do is disable the discrete GPU, but this functionality may work differently depending on ACPI specifics, so I suppose on some laptops it works better and on some it may not work. I would recommend you dig through the kernel bugzilla and check whether someone with the same laptop manufacturer as you has reported something like that.

                So runpm has to disable the GPU and this is normal, but it should remain visible to userspace.
                Hope you'll solve your problem.

                Comment


                • #28
                  So, I managed to make it work.

                  When booting with runpm off I can set DRI_PRIME to 1 and it reports my second graphics card:

                  Code:
                  OpenGL renderer string: Gallium 0.4 on AMD HAINAN
                  while the default DRI_PRIME=0 reports:

                  Code:
                  OpenGL renderer string: Gallium 0.4 on AMD ARUBA
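
                  For anyone who wants to reproduce this, the renderer string can be checked with something like the following (assuming glxinfo from mesa-utils is installed):
                  Code:
                  glxinfo | grep "OpenGL renderer"              # default: the iGPU (ARUBA)
                  DRI_PRIME=1 glxinfo | grep "OpenGL renderer"  # offload to the dGPU (HAINAN)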
                  I made a short Warsow benchmark comparing the two, but got quite similar results; maybe that's because I'm using an old version of LLVM (3.4). I'll update it and redo the tests.

                  SXX, you were right, dpm was not broken. I got slightly higher temps because the second GPU was powered on.

                  Thank you very much for your help.

                  Off topic:
                  Could someone tell me whether it would still be possible to put the two cards in a CrossFire setup (if and when that ever gets implemented), considering that they already work in Enduro mode?

                  Comment


                  • #29
                    Originally posted by Adriannho View Post
                    Off topic:
                    Could someone tell me whether it would still be possible to put the two cards in a CrossFire setup (if and when that ever gets implemented), considering that they already work in Enduro mode?
                    There is no support for crossfire in the open source driver at the moment. If someone wanted to work on it, there is no special hardware required. It's just lots of work in mesa to split up rendering between multiple GPUs.

                    Comment


                    • #30
                      Originally posted by agd5f View Post
                      There is no support for crossfire in the open source driver at the moment. If someone wanted to work on it, there is no special hardware required. It's just lots of work in mesa to split up rendering between multiple GPUs.
                      Thanks Alex.

                      Comment
