Linux 4.18-rc3 Kernel Released

  • Linux 4.18-rc3 Kernel Released

    Phoronix: Linux 4.18-rc3 Kernel Released

    Linus Torvalds is back to his regular release timing for new Linux 4.18 kernel release candidates...


  • #2
    I feel strongly compelled to point out that this kernel will not allow you or your friends to run 4K at 60 Hz over an HDMI connection on RX 4xx cards, because this kernel now supports 10-bit and that just doesn't work for some reason. And it doesn't fall back to 4K 60Hz 8-bit when 4K 60Hz 10-bit fails.

    This total scandal is documented in freedesktop bug 106959 and is caused by linux.git commit e03fd3f300f6184c1264186a4c815e93bf658abb.

    It is possible to force 8-bit colors and get 4K 60Hz by adding bpc = 8; before the switch (bps) statement in drivers/gpu/drm/amd/display/amdgpu_dm/amdgpu_dm.c, but if you don't do this you're totally screwed and stuck with 30 Hz on your 4K display like it's the 1970s. Using DisplayPort is an alternative if you've only got one 4K monitor, though. But I'm running a 3x4K setup and my card's only got two DP outputs, so that's a non-solution.
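
    For reference, the hack amounts to roughly this (paraphrased, so the exact function and variable names in your copy of amdgpu_dm.c may differ):

        uint32_t bpc = connector->display_info.bpc;  /* what the display advertises (10 on these panels) */

        /* Hack: clamp to 8 bits per color so the driver keeps offering 4K@60Hz
         * over HDMI instead of a 10-bit mode that only fits at 30 Hz. */
        bpc = 8;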

    AMD seems very quiet about this total failure to properly support modern resolutions. Nobody seems willing to admit anything. It seems like they are happy with kernel 4.18 shipping with this major bug that will prevent you from running a 10-bit capable 4K display at 60 Hz over HDMI.
    Last edited by xiando; 01 July 2018, 09:34 PM.

    Comment


    • #3
      Originally posted by xiando View Post
      This total scandal
      L O L

      Comment


      • #4
        Originally posted by xiando View Post
        I feel strongly compelled to point out that this kernel will not allow you or your friends to run 4K at 60 Hz over an HDMI connection on RX 4xx cards, because this kernel now supports 10-bit and that just doesn't work for some reason. And it doesn't fall back to 4K 60Hz 8-bit when 4K 60Hz 10-bit fails.

        This total scandal is documented in freedesktop bug 106959 and is caused by linux.git commit e03fd3f300f6184c1264186a4c815e93bf658abb.

        It is possible to force 8-bit colors and get 4K 60Hz by adding bpc = 8; before the switch (bps) statement in drivers/gpu/drm/amd/display/amdgpu_dm/amdgpu_dm.c, but if you don't do this you're totally screwed and stuck with 30 Hz on your 4K display like it's the 1970s. Using DisplayPort is an alternative if you've only got one 4K monitor, though. But I'm running a 3x4K setup and my card's only got two DP outputs, so that's a non-solution.

        AMD seems very quiet about this total failure to properly support modern resolutions. Nobody seems willing to admit anything. It seems like they are happy with kernel 4.18 shipping with this major bug that will prevent you from running a 10-bit capable 4K display at 60 Hz over HDMI.
        HDMI 2.0 does not support 10-bit RGB at 4K@60Hz; there is not enough bandwidth. It's possible if you use YUV encoding rather than RGB and the monitor supports it, but at the moment the X server does not support YUV modes. So we can go back to clamping HDMI to 8 bpc, but then you won't get the full range if your monitor supports 10+ bpc. Ideally the X server would be fixed, but I'm not sure how feasible that is.
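
        Rough numbers, for those wondering why it doesn't fit: the standard 4K@60Hz timing is 4400 x 2250 total pixels at 60 Hz, i.e. a 594 MHz pixel clock, and at 8 bpc RGB that already sits right at HDMI 2.0's 600 MHz TMDS limit. Deep color scales the TMDS clock by bpc/8, so 10 bpc RGB would need about 594 x 1.25 = 742.5 MHz, well over the limit. That's why 10/12 bpc at 4K@60Hz is only possible with YCbCr 4:2:2 or 4:2:0, not RGB.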

        Also, if you want to use DP, you can use an MST hub to connect multiple DP monitors to a single DP port.
        Last edited by agd5f; 01 July 2018, 11:51 PM. Reason: typo

        Comment


        • #5
          Originally posted by agd5f View Post
          HDMI 2.0 does not support 10-bit RGB at 4K@60Hz; there is not enough bandwidth. It's possible if you use YUV encoding rather than RGB and the monitor supports it, but at the moment the X server does not support YUV modes. So we can go back to clamping HDMI to 8 bpc, but then you won't get the full range if your monitor supports 10+ bpc. Ideally the X server would be fixed, but I'm not sure how feasible that is.

          Also, if you want to use DP, you can use an MST hub to connect multiple DP monitors to a single DP port.
          All the MST hubs I've seen are for connecting four 1080p monitors to one DP port. I have seen zero that support two 4K monitors. I doubt that's a viable solution.

          I ordered an RX 570 8 GB with three DP ports (and one HDMI and one DVI) half an hour ago so I can use DP for all three of my 4K monitors.

          As for clamping HDMI to 8 bpc: it seems totally obvious that this should be done in the special case of 4K monitors over HDMI, but only in that case. HDMI 2.0 can probably do 1440p 60Hz with 10-bit just fine, and there's no reason to restrict that. But cases where 10-bit color reduces the refresh rate to 30Hz should, in my humble opinion, be handled so that 8-bit 60Hz is preferred over 10-bit 30Hz. That's just my opinion, and here's why I have it: I don't even notice whether the panel is set to 8-bit or 10-bit, I see no difference. I did, however, immediately notice that something was very wrong after upgrading to a kernel which forced one monitor to 30 Hz. Just moving the mouse cursor on it was jerky, and the whole experience becomes poor, sad, depressing and unacceptable.

          Comment


          • #6
            Originally posted by xiando View Post
            I feel strongly compelled to point out that this kernel will not allow you or your friends to run 4K at 60 Hz over an HDMI connection on RX 4xx cards, because this kernel now supports 10-bit and that just doesn't work for some reason. And it doesn't fall back to 4K 60Hz 8-bit when 4K 60Hz 10-bit fails.

            This total scandal is documented in freedesktop bug 106959 and is caused by linux.git commit e03fd3f300f6184c1264186a4c815e93bf658abb.

            It is possible to force 8-bit colors and get 4K 60Hz by adding bpc = 8; before the switch (bps) statement in drivers/gpu/drm/amd/display/amdgpu_dm/amdgpu_dm.c, but if you don't do this you're totally screwed and stuck with 30 Hz on your 4K display like it's the 1970s. Using DisplayPort is an alternative if you've only got one 4K monitor, though. But I'm running a 3x4K setup and my card's only got two DP outputs, so that's a non-solution.

            AMD seems very quiet about this total failure to properly support modern resolutions. Nobody seems willing to admit anything. It seems like they are happy with kernel 4.18 shipping with this major bug that will prevent you from running a 10-bit capable 4K display at 60 Hz over HDMI.
            I recently tried out some version of 4.18 from Fedora's Rawhide nodebug repo and noticed I only had 30Hz at 4K (I had 60Hz with 4.17); I thought something was just temporarily broken with AMDGPU, but that commit does explain the problem.

            I just got a 4K display and finally worked out the kinks to get 4K@60Hz, and ended up purchasing an RX 560. This won't be a problem for me for a while (4.18 has to release first, then get distro support), but I can't deal with 30Hz. It's bad enough dealing with a slightly washed-out, enforced full RGB pixel format (there's no way to switch to YCbCr with AMDGPU).

            I'm hoping an option shows up later in the 4.18 cycle, but in the meantime, would a DisplayPort-to-HDMI adapter work? My display only has HDMI ports. The RX 560 I have has a DVI, a DisplayPort, and an HDMI port (only one of each). I'd prefer not to have to compile and maintain a custom kernel just to alter amdgpu_dm.c.

            For reference, with the 18.6.1 AMD drivers on Windows, I can only choose 8 bpc at 4K with 4:4:4 YCbCr and RGB (full and limited). YCbCr 4:2:2 and 4:2:0 let me do 10 and 12 bpc at 4K@60Hz. Out of the box, 4K@60Hz 4:4:4 YCbCr is used (full RGB looks a little washed out; my display seems to accept YCbCr better).

            Originally posted by ihatemichael

            Oh noes! Have you tried asking for a refund? /s

            Maybe try joining #radeon on irc.freenode.net and make a "scandal" over there.
            Scandal is probably a bad choice of wording, but this definitely isn't an encouraging change in my situation.
            Last edited by Guest; 02 July 2018, 12:43 AM.

            Comment


            • #7
              Originally posted by Espionage724 View Post
              I just got a 4K display and finally worked out the kinks to get 4K@60Hz, and ended up purchasing an RX 560. This won't be a problem for me for a while (4.18 has to release first, then get distro support), but I can't deal with 30Hz.

              I'm hoping an option shows up later in the 4.18 cycle, but in the meantime, would a DisplayPort-to-HDMI adapter work? My display only has HDMI ports. The RX 560 I have has a DVI, a DisplayPort, and an HDMI port (only one of each). I'd prefer not to have to compile and maintain a custom kernel just to alter amdgpu_dm.c.

              For reference, with the 18.6.1 AMD drivers on Windows, I can only choose 8 bpc at 4K with 4:4:4 YCbCr and RGB (full and limited). YCbCr 4:2:2 and 4:2:0 let me do 10 and 12 bpc at 4K@60Hz.
              If what agd5f states above is true, and it most likely is, then RGB is the only choice right now due to Xorg limitations (it can't do YCbCr 4:2:2 or 4:2:0). This would give us the choice of 8-bit 60Hz or 10-bit 30Hz.

              I share your inability to deal with 30Hz. I find it horrible and disgusting. I'd much rather have 60Hz 8-bit than suffer with unbearable 30Hz. It is not a good default. It would be much better to default to 60Hz 8-bit in all cases where increasing the bpc reduces the frame rate below 60Hz. Some configuration option, either a kernel amdgpu.whatever option or an Xorg configuration option, could let those who want stuttering 30Hz and 10-bit have it their way. I don't know who in their right mind would want that, though.

              I stand by my "defaulting to 30Hz is a total scandal" complaint.

              Btw, I have an RX 470 8 GB for sale. It has DVI, 2x DP and 2x HDMI. Works fine. Except for this bug. For now. It's in Europe.

              Comment


              • #8
                Originally posted by xiando View Post

                If what agd5f states above is true, and it most likely is, then RGB is the only choice right now due to Xorg limitations (it can't do YCbCr 4:2:2 or 4:2:0). This would give us the choice of 8-bit 60Hz or 10-bit 30Hz.
                According to https://bugs.freedesktop.org/show_bug.cgi?id=83226 it looks like the radeon driver supports colorspace switching between RGB and YUV. I imagine that if it works on that driver, Xorg has the ability to do it, but I'm not 100% sure why it isn't implemented in amdgpu.

                If I recall correctly, someone had luck with switching something (either RGB to YUV or RGB full to limited?) with a quick EDID edit and forcing that EDID to be used.
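
                If anyone wants to experiment with that without patching anything: as far as I know, a kernel built with CONFIG_DRM_LOAD_EDID_FIRMWARE can be pointed at a modified EDID blob with something like drm.edid_firmware=HDMI-A-1:edid/modified.bin on the kernel command line (the connector name and file name here are just examples), with the blob placed where the firmware loader can find it, e.g. under /lib/firmware/edid/.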

                Comment


                • #9
                  Originally posted by xiando View Post
                  This total scandal is documented in freedesktop bug 106959 and is caused by linux.git commit e03fd3f300f6184c1264186a4c815e93bf658abb
                  ...
                  AMD seems very quiet about this total failure to properly support modern resolutions.
                  You already had 5 responses from AMD developers between rc1 and rc2. If that is "very quiet", I would hate to see "chatty".

                  Is the "total scandal" you are referring to just having a problem show up in a development tree (not a release) as a side-effect of implementing a new feature ? If so, there are probably 500+ total scandals in every kernel cycle.

                  Originally posted by xiando View Post
                  Nobody seems willing to admit anything. It seems like they are happy with kernel 4.18 shipping with this major bug that will prevent you from running a 10-bit capable 4K display at 60 Hz over HDMI.
                  I haven't seen any indication that this is a bug at all - as agd5f said, not being able to run 4K/60 10-bit RGB is an HDMI 2.0 + X server limitation, not a driver bug. What exactly are you expecting us to "admit"?

                  AFAICS the issue here is just that the driver / X server combination is choosing 4K/30/10 over 4K/60/8 when both are available, while choosing 4K/60/8 might be a better default. Not sure if that would need to be changed in the X server or in the driver (probably X server) but it seems like a reasonable solution if it is possible.

                  In the meantime I imagine the best way to override would be via xorg.conf or xrandr, but I don't have access to a 4K/10-bit system at home to try it over the long weekend.
                  Last edited by bridgman; 02 July 2018, 01:25 AM.

                  Comment


                  • #10
                    Originally posted by bridgman View Post
                    You already had 5 responses from AMD developers by the time you posted this. If that is "very quiet" I would hate to see "chatty".
                    That is a valid point. I admit that AMD was all over this within hours of the bug being filed.

                    Originally posted by bridgman View Post
                    Is the "total scandal" you are referring to just having a problem show up in a development tree (not a release) ?
                    4.17 works fine; it's just a problem with 4.18 that showed up some commits before 4.18-rc1. It isn't in a release yet. Perhaps it won't be if we all scream total scandal and/or bloody murder before 4.18 ships? I know the story about the boy who cried wolf and all, but screaming total scandal still works (for now).

                    Originally posted by bridgman View Post
                    I haven't seen any indication that this is a bug at all - as agd5f said not being able to run 4K/60 RGB is an HDMI 2.0 + X server limitation, not a driver bug. What exactly are you expecting us to "admit" ?
                    Of course it's a bug. Boot kernel 4.17 and you have a smooth, usable 60Hz display; boot 4.18-rc1 and it's a horrible, unusable 30Hz. That's it, that's the scandal.

                    Admitting anything isn't important, submitting some code to git is.

                    Originally posted by bridgman View Post
                    AFAICS the "scandal" here is just that the driver / X server combination is choosing 4K/30/10 over 4K/60/8 when both are available, while choosing 4K/60/8 would probably be preferable. I don't know if that is possible (IIRC a lot of that logic is outside the driver) but it seems like a reasonable solution if it is.
                    It should be quite possible to add some logic to amdgpu_dm.c that simply checks whether the connector is HDMI and the resolution is 4K, and sets bpc = 8 if both are true, in less than six lines. That would be a quick fix if the expected resolution and connector type are already decided and sitting in variables at that point. If they're not and it's more complex, then perhaps a kernel argument could be used to decide whether 8-bit should be forced.
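
                    Something like this is what I have in mind (purely a sketch; the actual decision point in amdgpu_dm.c and the variables in scope there may differ, so treat the names as illustrative):

                        /* Prefer 8 bpc whenever a 4K mode goes out over an HDMI connector,
                         * so the driver ends up at 4K/60/8 instead of 4K/30/10.
                         * DRM_MODE_CONNECTOR_HDMIA and the drm_connector/drm_display_mode
                         * fields are standard DRM; whether bpc and mode are available at
                         * this exact spot is an assumption. */
                        if (connector->connector_type == DRM_MODE_CONNECTOR_HDMIA &&
                            mode->hdisplay >= 3840)
                                bpc = 8;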

                    I suspect that support for YCbCr 4:2:2 and 4:2:0 would require a whole lot more.
                    Last edited by xiando; 02 July 2018, 01:36 AM. Reason: add more whining

                    Comment
