Linux 4.18 AMDGPU Tests: Vega Taking A Hit


  • #21
    Ahh, OK... so you're still thinking about "no image / black screen" as a subset of "the wrong image" (with display, camera, etc.) but easier to implement and maintain. Makes sense.

    The manual exposure control would be critical, as you say; otherwise black, white, grey and my desktop background would all look pretty much the same.
    Last edited by bridgman; 10 July 2018, 09:55 PM.


    • #22
      Originally posted by bridgman View Post
      Ahh, OK... so you're still thinking about "no image / black screen" as being a subset of "the wrong image" (with display, camera, etc.) but easier to implement and maintain. Makes sense.

      The manual exposure control would be critical, as you say; otherwise black, white and grey all look pretty much the same.
      It's only fair to point out that I just jumped into the conversation and dwagner might have something else in mind.

      That said, I do think "no image / black screen" would be a useful class of bugs to have automated tests for, even if more comprehensive testing isn't automated, since it would enforce that minimum degree of signal well-formedness encoded in the test monitor's internal sanity checks.

      Going further, a more expensive but more versatile option would be to find an HDMI capture device that either has sync limitations similar to a real monitor or can output metadata that can be checked in software. That way, if the output passes the "no image / black screen" check, the capture stream could be used to set up CI testing for classes of errors that aren't confusable with compression artifacts, such as tearing, pixel-format conversion bugs, or incorrect damage rectangles if you want to test things at the window-system level.

      (For example, using a percentage image-similarity algorithm intended for finding duplicates in libraries of JPEG images to do an "at least X% similarity" check between the frames from the capture stream and clean reference frames. Heck, given the original use for those types of algorithms, such an approach could even compensate for unknown or variable latency and potential dropped frames by checking that each output frame is at least X% similar to one of the frames that was supposed to come out under perfect circumstances, without knowing which one.)

      It'd also have the benefit that, if a test fails, you've got a capture of the output right there to attach to the report.
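
      For what it's worth, here's a rough Python sketch of that "at least X% similar to one of the reference frames" check, using the Pillow and imagehash libraries. The directory names and the 90% threshold are just placeholders for illustration, not something a real CI setup would have to use:

          import glob
          import imagehash           # perceptual hashing, as used by duplicate finders
          from PIL import Image

          CAPTURED = sorted(glob.glob("captured_frames/*.png"))     # frames from the capture device
          REFERENCE = sorted(glob.glob("reference_frames/*.png"))   # known-good frames
          THRESHOLD = 0.90                                          # "at least 90% similar"

          # Hash the reference frames once; phash() produces a 64-bit perceptual hash.
          ref_hashes = [imagehash.phash(Image.open(p)) for p in REFERENCE]

          def similarity(h1, h2):
              # Hamming distance between the hashes, normalised to 0.0 .. 1.0
              return 1.0 - (h1 - h2) / float(h1.hash.size)

          failures = []
          for path in CAPTURED:
              h = imagehash.phash(Image.open(path))
              # Accept the frame if it resembles *any* reference frame, so unknown
              # latency or an occasional dropped frame doesn't cause a false failure.
              if max(similarity(h, r) for r in ref_hashes) < THRESHOLD:
                  failures.append(path)

          if failures:
              raise SystemExit("Frames below similarity threshold: %s" % failures)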



      • #23
        Originally posted by agd5f View Post
        If you are having issues with DP, please try this patch:
        https://lists.freedesktop.org/archiv...ly/023920.html
        Mm, no luck for me. No change in behavior: once amdgpu initializes and filches the display away from UEFI, the DisplayPort signal is interpreted by my monitor as "no signal".

        The patch applied cleanly against 4.18.0-rc4.



        • #24
          phoronix Is there a reason you don't include Vega IGPs, like the Ryzen APUs, in your comparative benchmarks? I open these articles enthusiastically only to find that there's no Ryzen in the comparison.



          • #25
            txtsd Seconded. I'd like to see Vega IGP tests as well.



            • #26
              Michael
              Please note that you need to flush the Mesa shader cache before testing each configuration.
              Part of the apparent regressions are simply due to all shaders already being cached for the first configuration, while the second configuration had to compile and cache them first. This regularly happens when a change in one of the components (LLVM, kernel, Mesa) makes shader recompilation necessary.
              Alternatively, configure PTS to exclude the first run from the result.
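
              (A minimal sketch of what I mean, in Python; the cache path is the usual per-user default and may differ depending on the Mesa version and XDG_CACHE_HOME, and the benchmark command is only an example:)

                  import os
                  import shutil
                  import subprocess

                  # Default per-user location of Mesa's on-disk shader cache.
                  cache_dir = os.path.expanduser("~/.cache/mesa_shader_cache")

                  def run_with_cold_and_warm_cache(benchmark_cmd):
                      # Flush the shader cache so the first run is a true cold-cache run...
                      shutil.rmtree(cache_dir, ignore_errors=True)
                      subprocess.run(benchmark_cmd, check=True)   # cold cache: discard or report separately
                      # ...then repeat with the cache warm; only this result is comparable
                      # across kernel/LLVM/Mesa configurations.
                      subprocess.run(benchmark_cmd, check=True)

                  run_with_cold_and_warm_cache(["phoronix-test-suite", "benchmark", "pts/dota2"])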

              Figure 3 on page 2 (Dota 2) is a very good example: you can see that all cards appear to regress. Kernel 4.17 shows no spread for some reason, while 4.18 shows an extreme one. Why? Because 4.18 is a new configuration whose shaders had to be cached first, and the first uncached run of Dota 2 is usually very choppy.

              Figure 5 on page 2: at the higher resolutions there is actually no spread. The shaders were already cached, so all is good.

              For comparison, my own Dota 2 results with an HD 7970:

              Cold cache:

              pts/dota2-1.2.4 [Resolution: 1920 x 1080 - Renderer: OpenGL]
              Resolution: 1920 x 1080 - Renderer: OpenGL:
              110.2
              123
              119.8
              123.5
              122.6
              121.8

              Average: 120.15 Frames Per Second
              Deviation: 4.20%

              pts/dota2-1.2.4 [Resolution: 1920 x 1080 - Renderer: Vulkan]
              Resolution: 1920 x 1080 - Renderer: Vulkan:
              102.6
              120.3
              121
              117.9
              118.2
              120.7

              Average: 116.78 Frames Per Second
              Deviation: 6.05%

              Warm cache:

              pts/dota2-1.2.4 [Resolution: 1920 x 1080 - Renderer: OpenGL]
              Resolution: 1920 x 1080 - Renderer: OpenGL:
              121.8
              122.9
              123.3

              Average: 122.67 Frames Per Second
              Deviation: 0.63%

              pts/dota2-1.2.4 [Resolution: 1920 x 1080 - Renderer: Vulkan]
              Resolution: 1920 x 1080 - Renderer: Vulkan:
              120.5
              121
              120.1

              Average: 120.53 Frames Per Second
              Deviation: 0.37%
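
              (The "Deviation" figures are, as far as I can tell, simply the sample standard deviation expressed as a percentage of the average; a quick Python check reproduces the 4.20% for the cold-cache OpenGL run:)

                  from statistics import mean, stdev

                  opengl_cold = [110.2, 123, 119.8, 123.5, 122.6, 121.8]
                  avg = mean(opengl_cold)                # 120.15
                  dev = stdev(opengl_cold) / avg * 100   # ~4.20, dominated by the first uncached run
                  print("Average: %.2f FPS, Deviation: %.2f%%" % (avg, dev))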



              • #27
                Originally posted by AngelKnight View Post

                Mm, no luck for me. No change in behavior: once amdgpu initializes and filches the display away from UEFI, the DisplayPort signal is interpreted by my monitor as "no signal".

                The patch applied cleanly against 4.18.0-rc4.
                Please file a bug (https://bugs.freedesktop.org) and attach your xorg log and dmesg output.



                • #28
                  bridgman ssokolow There is an even simpler solution: buy a cheap HDMI splitter, e.g. this one:
                  http://ftp.assmann.com/pub/DS-/DS-46...h_20150923.pdf
                  which costs about 30 EUR. It has link LEDs that tell you whether the splitter picks up a valid input signal.

                  For 37 EUR, you can even buy a similar device with audio outputs -

                  which you could feed back into a computer to test whether you receive the same audio signal that you send from the test machine.

                  For 67 EUR you can buy a splitter that also supports HDMI 2.0b / 4K@60Hz / 600MHz (the "Marmitek Split 612 UHD 2.0"), which likewise has an input-indicator LED.
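
                  (To sketch how that audio loop-back could be checked automatically: play a test tone on the HDMI output, record the splitter's audio output on a line-in, and verify the dominant frequency. A rough Python example using the sounddevice and numpy libraries; the tone frequency, duration and tolerance are arbitrary, and the default playback/record devices have to be pointed at the right HDMI output and capture input first:)

                      import numpy as np
                      import sounddevice as sd   # list devices with sd.query_devices()

                      FS = 48000        # sample rate in Hz
                      FREQ = 1000.0     # test tone frequency in Hz
                      DURATION = 2.0    # seconds

                      t = np.arange(int(FS * DURATION)) / FS
                      tone = (0.5 * np.sin(2 * np.pi * FREQ * t)).astype(np.float32)

                      # Play the tone on the default output while recording one channel
                      # from the default input (the line-in fed by the splitter).
                      recorded = sd.playrec(tone[:, None], samplerate=FS, channels=1)
                      sd.wait()

                      # The dominant frequency of the recording should be the test tone.
                      spectrum = np.abs(np.fft.rfft(recorded[:, 0]))
                      peak_hz = np.fft.rfftfreq(len(recorded), 1.0 / FS)[np.argmax(spectrum)]
                      if abs(peak_hz - FREQ) > 50:
                          raise SystemExit("Audio loop-back failed: peak at %.0f Hz" % peak_hz)
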
                  Last edited by dwagner; 11 July 2018, 07:07 PM.



                  • #29
                    Originally posted by dwagner View Post
                    bridgman ssokolow There is an even simpler solution: buy a cheap HDMI splitter, e.g. this one:

                    which costs about 30 EUR. It has link LEDs that tell you whether the splitter picks up a valid input signal.

                    For 37 EUR, you can even buy a similar device with audio outputs -

                    which you could feed back into a computer to test whether you receive the same audio signal that you send from the test machine.
                    The problem with the link-LEDs approach is that it won't catch black screens where the card is sending a valid stream of nothing to the monitor when it should be sending an image.
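
                    With an HDMI capture device, on the other hand, even that case is easy to detect: grab a frame and check that it isn't essentially black. A quick sketch of what I mean, using OpenCV (the capture index and the luminance threshold are just placeholders):

                        import cv2           # OpenCV, for reading frames from a capture device
                        import numpy as np

                        cap = cv2.VideoCapture(0)      # index of the HDMI capture device
                        ok, frame = cap.read()
                        cap.release()

                        if not ok:
                            raise SystemExit("Capture device delivered no frame at all")

                        # Mean luminance over the whole frame; a desktop that is supposed to be
                        # showing something will sit well above a near-black threshold.
                        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
                        if float(np.mean(gray)) < 5.0:
                            raise SystemExit("Frame is (almost) entirely black")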



                    • #30
                      Originally posted by ssokolow View Post
                      The problem with the link-LEDs approach is that it won't catch black screens where the card is sending a valid stream of nothing to the monitor when it should be sending an image.
                      I haven't had that symptom so far; when my GPU crashed, the TV always told me "no signal". If you also verify a working audio loop, then the chances of missing a crash are IMHO slim.
