
Wrong results from iperf-1.0.2


  • Wrong results from iperf-1.0.2

    Some test options of iperf-1.0.2 are giving wrong results, and it seems to be caused by a problematic result parser. For the affected options, the Phoronix Test Suite takes the total speed and one of the single-stream speeds and calculates the average of the two, which is lower than the actual total speed. The issue is reproducible on Clear Linux 28430 with PTS 8.6.1.

    Some options with issues:
    Test=TCP and Parallel=10
    Test=UDP 100Mbit and Parallel=5, 10 and 20
    Test=UDP 1000Mbit and Parallel=5, 10 and 20
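
    For reference, the affected cases correspond to iperf3 invocations roughly like the following (a sketch based on the option names; the exact flags PTS passes are an assumption):

    iperf3 -c iperf-server -p 5201 -t 10 -P 10            # TCP, Parallel: 10
    iperf3 -c iperf-server -p 5201 -t 10 -u -b 100M -P 5  # UDP 100Mbit, Parallel: 5

    With -P greater than 1, iperf3 prints one result line per stream plus a [SUM] line with the combined throughput.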

    One example test output:
    iPerf 3.1.3:
    pts/iperf-1.0.2 [Server Address: iperf-server - Server Port: 5201 - Duration: 10 Seconds - Test: TCP - Parallel: 10]
    Test 1 of 1
    Estimated Trial Run Count: 1
    Estimated Time To Completion: 6 Minutes [06:49 CDT]
    Started Run 1 @ 06:44:07

    Server Address: iperf-server - Server Port: 5201 - Duration: 10 Seconds - Test: TCP - Parallel: 10:
    95.4
    941

    Average: 518.20 Mbits/sec
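
    The two values above are one of the single-stream results (95.4) and the total (941); the reported average is simply their mean:

    (95.4 + 941) / 2 = 518.2 Mbits/sec

    whereas the correct total result is 941 Mbits/sec.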

  • #2
    We have seen this as well; specifically, the issue is with the "-P 10" iperf parameter.

    If you look at the default definition of results-definition.xml (see https://openbenchmarking.org/innhold...8bb521097e72bc), it attempts to distinguish between the "-P 1" cases and the "-P >1" cases.

    For P=1, it uses this directive to help:
    <MatchToTestArguments>-P 1</MatchToTestArguments>

    However, this also matches "-P 10"! So, in this case, two results are matched: the last of the 10 thread results and the [SUM] result.

    A fix is easy -- add a space " " after the "1" and before the "<" in the above line.
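
    To illustrate, a sketch of the before/after for that results-definition.xml line (surrounding elements omitted):

    <!-- Before: "-P 1" substring-matches both "-P 1" and "-P 10" -->
    <MatchToTestArguments>-P 1</MatchToTestArguments>

    <!-- After: the trailing space still matches the "-P 1 " argument string but not "-P 10" -->
    <MatchToTestArguments>-P 1 </MatchToTestArguments>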



    • #3
      Originally posted by JesseG View Post
      We have seen this as well; specifically, the issue is with the "-P 10" iperf parameter. [...]
      Thanks for reporting this issue; I will get that updated upstream.
      Michael Larabel
      https://www.michaellarabel.com/



      • #4
        Originally posted by JesseG View Post
        We have seen this as well; specifically, the issue is with the "-P 10" iperf parameter. [...]
        pts/iperf-1.0.3 is now available with this change; does it work for you now?
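
        For reference, re-running with the updated profile should be as simple as (assuming the usual versioned-profile invocation):

        phoronix-test-suite benchmark pts/iperf-1.0.3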
        Michael Larabel
        https://www.michaellarabel.com/

