iperf sometimes does not populate the RawData element in composite.xml

  • iperf sometimes does not populate the RawData element in composite.xml

    We are using PTS/iperf and having it execute a set of tests. We have observed that, in some cases, not all of the RawString entries are populated. All of the Value entries are present (and correct, as we've verified using debug-run). Here is a snippet of a recent composite.xml file; additional details can be provided on request.

    There is no RawString entry in the first case (TCP -P 4), yet it is present for the UDP -P 4, TCP -P 16, and UDP -P 16 cases. The JSON entry shows that three trials were run for each of these experiments. Note that these cases use a slightly modified test-definition.xml, which adds them so they can be named individually (rather than using the default parallel parameter).

    Any thoughts or suggestions? Thanks in advance.

    <Result>
    <Identifier>pts/iperf-1.0.2</Identifier>
    <Title>iPerf</Title>
    <AppVersion>3.1.3</AppVersion>
    <Arguments>-w 256k -P 4 -t 10 -c 100.100.243.217 -p 5201</Arguments>
    <Description>Test: TCP-4 - Duration: 10</Description>
    <Scale>Mbits/sec</Scale>
    <Proportion>HIB</Proportion>
    <DisplayFormat>BAR_GRAPH</DisplayFormat>
    <Data>
    <Entry>
    <Identifier>Red Hat Virtio device</Identifier>
    <Value>1042</Value>
    <RawString></RawString>
    <JSON>{"compiler-options":{"compiler-type":"CC","compiler":"gcc","compiler-options":"-O3 -march=native -lm"},"test-run-times":"10.19:10.18:10.18"}</JSON>
    </Entry>
    </Data>
    </Result>
    <Result>
    <Identifier>pts/iperf-1.0.2</Identifier>
    <Title>iPerf</Title>
    <AppVersion>3.1.3</AppVersion>
    <Arguments>-u -P 4 -t 10 -c 100.100.243.217 -p 5201</Arguments>
    <Description>Test: UDP-4 - Duration: 10</Description>
    <Scale>Mbits/sec</Scale>
    <Proportion>HIB</Proportion>
    <DisplayFormat>BAR_GRAPH</DisplayFormat>
    <Data>
    <Entry>
    <Identifier>Red Hat Virtio device</Identifier>
    <Value>4.19</Value>
    <RawString>4.19:4.19:4.19</RawString>
    <JSON>{"compiler-options":{"compiler-type":"CC","compiler":"gcc","compiler-options":"-O3 -march=native -lm"},"test-run-times":"10.17:10.17:10.17"}</JSON>
    </Entry>
    </Data>
    </Result>
    <Result>
    <Identifier>pts/iperf-1.0.2</Identifier>
    <Title>iPerf</Title>
    <AppVersion>3.1.3</AppVersion>
    <Arguments>-w 256k -P 16 -t 10 -c 100.100.243.217 -p 5201</Arguments>
    <Description>Test: TCP-16 - Duration: 10</Description>
    <Scale>Mbits/sec</Scale>
    <Proportion>HIB</Proportion>
    <DisplayFormat>BAR_GRAPH</DisplayFormat>
    <Data>
    <Entry>
    <Identifier>Red Hat Virtio device</Identifier>
    <Value>1043</Value>
    <RawString>1044:1043:1043</RawString>
    <JSON>{"compiler-options":{"compiler-type":"CC","compiler":"gcc","compiler-options":"-O3 -march=native -lm"},"test-run-times":"10.37:10.22:10.22"}</JSON>
    </Entry>
    </Data>
    </Result>
    <Result>
    <Identifier>pts/iperf-1.0.2</Identifier>
    <Title>iPerf</Title>
    <AppVersion>3.1.3</AppVersion>
    <Arguments>-u -P 16 -t 10 -c 100.100.243.217 -p 5201</Arguments>
    <Description>Test: UDP-16 - Duration: 10</Description>
    <Scale>Mbits/sec</Scale>
    <Proportion>HIB</Proportion>
    <DisplayFormat>BAR_GRAPH</DisplayFormat>
    <Data>
    <Entry>
    <Identifier>Red Hat Virtio device</Identifier>
    <Value>16.80</Value>
    <RawString>16.8:16.8:16.8</RawString>
    <JSON>{"compiler-options":{"compiler-type":"CC","compiler":"gcc","compiler-options":"-O3 -march=native -lm"},"test-run-times":"10.17:10.17:10.17"}</JSON>
    </Entry>
    </Data>
    </Result>
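For anyone hitting the same thing, a quick way to pull out exactly which results have the empty element is to walk the XML. A minimal sketch using only the Python standard library; the inline snippet below is a trimmed stand-in for a real composite.xml, not our actual file:

```python
import xml.etree.ElementTree as ET

# Trimmed stand-in for a composite.xml: one entry with an empty RawString
# (as in the TCP-4 case above) and one populated entry.
SNIPPET = """<PhoronixTestSuite>
<Result>
  <Description>Test: TCP-4 - Duration: 10</Description>
  <Data><Entry><Value>1042</Value><RawString></RawString></Entry></Data>
</Result>
<Result>
  <Description>Test: UDP-4 - Duration: 10</Description>
  <Data><Entry><Value>4.19</Value><RawString>4.19:4.19:4.19</RawString></Entry></Data>
</Result>
</PhoronixTestSuite>"""

def empty_rawstring_results(xml_text):
    """Return Descriptions of Results whose Entry has a Value but an empty RawString."""
    missing = []
    for result in ET.fromstring(xml_text).iter("Result"):
        desc = result.findtext("Description", default="?")
        for entry in result.iter("Entry"):
            value = entry.findtext("Value")
            raw = entry.findtext("RawString")  # "" for <RawString></RawString>
            if value and not raw:
                missing.append(desc)
    return missing

print(empty_rawstring_results(SNIPPET))  # → ['Test: TCP-4 - Duration: 10']
```

Running it against a full composite.xml (via `ET.parse(path).getroot()`) should list every affected test in one pass.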

  • #2
    There should always be a RawString if the test ran multiple times. Can you check your output to verify whether, for the runs where RawString is absent, the test successfully ran multiple times?
    Michael Larabel
    http://www.michaellarabel.com/



    • #3
      Originally posted by Michael
      There should always be a RawString if the test ran multiple times. Can you check your output to verify whether, for the runs where RawString is absent, the test successfully ran multiple times?
      Agreed. Which output should I look at to see that?

      I know there were three trials (see JSON timings) and I know the Value result is in line since this is but one example of many similar runs using this network connection.



      • #4
        Originally posted by JesseG

        Agreed. Which output should I look at to see that?

        I know there were three trials (see JSON timings) and I know the Value result is in line since this is but one example of many similar runs using this network connection.
        If you have the terminal output, it should show whether there were multiple successful runs. At the moment I don't recall whether runs that failed to produce a result are still included in the JSON timings array by default.
        Michael Larabel
        http://www.michaellarabel.com/



        • #5
          Originally posted by Michael

          If you have the terminal output, it should show whether there were multiple successful runs. At the moment I don't recall whether runs that failed to produce a result are still included in the JSON timings array by default.
          This is a puzzler! I am also trying to find a pattern in the output we are generating, without success so far. Is there any reason why parsing might be interrupted, causing the RawString not to be generated even though the Value is set?

          Here's a bit more detailed output from the run I used above:

          # grep RawString composite.xml
          <RawString></RawString>
          <RawString>1.05:1.05:1.05</RawString>
          <RawString></RawString>
          <RawString>4.19:4.19:4.19</RawString>
          <RawString>1044:1043:1043</RawString>
          <RawString>16.8:16.8:16.8</RawString>
          # grep Value composite.xml
          <Value>1042</Value>
          <Value>1.05</Value>
          <Value>1042</Value>
          <Value>4.19</Value>
          <Value>1043</Value>
          <Value>16.80</Value>

          This is the corresponding terminal output (note: we are using an older version of PTS, 8.0.1, but I tried this by hand on the latest version and saw the same problem):

          iPerf 3.1.3:
          pts/iperf-1.0.2
          Network Test Configuration
          Using Pre-Set Run Option: TCP,TCP-4,TCP-16,UDP,UDP-4,UDP-16
          Using Pre-Set Run Option: 10
          Using Pre-Set Run Option: 100.100.243.217
          Using Pre-Set Run Option: 5201

          System Information

          <NOT SHOWN>

          iPerf 3.1.3:
          pts/iperf-1.0.2 [Test: TCP - Duration: 10 - Server Address: 100.100.243.217 - Server Port: 5201]
          Test 1 of 6
          Estimated Trial Run Count: 3
          Estimated Test Run-Time: 6 Minutes
          Estimated Time To Completion: 32 Minutes [19:43 UTC]
          Started Run 1 @ 19:11:55
          Started Run 2 @ 19:12:07
          Started Run 3 @ 19:12:18

          Test: TCP - Duration: 10 - Server Address: 100.100.243.217 - Server Port: 5201:
          1042
          1042
          1042

          Average: 1042 Mbits/sec
          Deviation: 0.00%

          iPerf 3.1.3:
          pts/iperf-1.0.2 [Test: UDP - Duration: 10 - Server Address: 100.100.243.217 - Server Port: 5201]
          Test 2 of 6
          Estimated Trial Run Count: 3
          Estimated Test Run-Time: 6 Minutes
          Estimated Time To Completion: 27 Minutes [19:38 UTC]
          Started Run 1 @ 19:12:37
          Started Run 2 @ 19:12:48
          Started Run 3 @ 19:12:59

          Test: UDP - Duration: 10 - Server Address: 100.100.243.217 - Server Port: 5201:
          1.05
          1.05
          1.05

          Average: 1.05 Mbits/sec
          Deviation: 0.00%

          iPerf 3.1.3:
          pts/iperf-1.0.2 [Test: TCP-4 - Duration: 10 - Server Address: 100.100.243.217 - Server Port: 5201]
          Test 3 of 6
          Estimated Trial Run Count: 3
          Estimated Test Run-Time: 6 Minutes
          Estimated Time To Completion: 21 Minutes [19:34 UTC]
          Started Run 1 @ 19:13:17
          Started Run 2 @ 19:13:28
          Started Run 3 @ 19:13:39

          Test: TCP-4 - Duration: 10 - Server Address: 100.100.243.217 - Server Port: 5201:
          1042
          1042
          1042

          Average: 1042 Mbits/sec
          Deviation: 0.00%

          iPerf 3.1.3:
          pts/iperf-1.0.2 [Test: UDP-4 - Duration: 10 - Server Address: 100.100.243.217 - Server Port: 5201]
          Test 4 of 6
          Estimated Trial Run Count: 3
          Estimated Test Run-Time: 6 Minutes
          Estimated Time To Completion: 16 Minutes [19:29 UTC]
          Started Run 1 @ 19:13:58
          Started Run 2 @ 19:14:09
          Started Run 3 @ 19:14:20

          Test: UDP-4 - Duration: 10 - Server Address: 100.100.243.217 - Server Port: 5201:
          4.19
          4.19
          4.19

          Average: 4.19 Mbits/sec
          Deviation: 0.00%

          iPerf 3.1.3:
          pts/iperf-1.0.2 [Test: TCP-16 - Duration: 10 - Server Address: 100.100.243.217 - Server Port: 5201]
          Test 5 of 6
          Estimated Trial Run Count: 3
          Estimated Test Run-Time: 6 Minutes
          Estimated Time To Completion: 11 Minutes [19:25 UTC]
          Started Run 1 @ 19:14:38
          Started Run 2 @ 19:14:50
          Started Run 3 @ 19:15:01

          Test: TCP-16 - Duration: 10 - Server Address: 100.100.243.217 - Server Port: 5201:
          1044
          1043
          1043

          Average: 1043 Mbits/sec
          Deviation: 0.06%

          iPerf 3.1.3:
          pts/iperf-1.0.2 [Test: UDP-16 - Duration: 10 - Server Address: 100.100.243.217 - Server Port: 5201]
          Test 6 of 6
          Estimated Trial Run Count: 3
          Estimated Time To Completion: 6 Minutes [19:20 UTC]
          Started Run 1 @ 19:15:19
          Started Run 2 @ 19:15:30
          Started Run 3 @ 19:15:41

          Test: UDP-16 - Duration: 10 - Server Address: 100.100.243.217 - Server Port: 5201:
          16.8
          16.8
          16.8

          Average: 16.80 Mbits/sec
          Deviation: 0.00%
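Since the JSON timings question keeps coming up, here is a rough cross-check I'd try (plain Python; element names are taken from the composite.xml above). Whether failed runs appear in test-run-times is exactly the open question, so a mismatch here is a flag to investigate, not proof of the cause:

```python
import json
import xml.etree.ElementTree as ET

# Compare the number of timings in the JSON test-run-times field against the
# number of values in RawString. Three timings with an empty RawString means
# the trials ran but the raw result string went missing when the result was
# recorded. Trimmed stand-in data, mirroring the TCP-4 entry above.
SNIPPET = """<Data>
<Entry>
<Value>1042</Value>
<RawString></RawString>
<JSON>{"test-run-times":"10.19:10.18:10.18"}</JSON>
</Entry>
</Data>"""

def trial_count_mismatches(xml_text):
    """Return (timed, recorded) pairs for entries where the counts differ."""
    flagged = []
    for entry in ET.fromstring(xml_text).iter("Entry"):
        meta = json.loads(entry.findtext("JSON") or "{}")
        times = meta.get("test-run-times", "")
        n_timed = len(times.split(":")) if times else 0
        raw = entry.findtext("RawString") or ""
        n_raw = len(raw.split(":")) if raw else 0
        if n_timed != n_raw:
            flagged.append((n_timed, n_raw))
    return flagged

print(trial_count_mismatches(SNIPPET))  # → [(3, 0)]
```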



          • #6
            Thanks, trying to reproduce now. Have you seen it just with iPerf or other tests too?
            Michael Larabel
            http://www.michaellarabel.com/



            • #7
              Originally posted by Michael
              Thanks, trying to reproduce now. Have you seen it just with iPerf or other tests too?
              Just iperf as far as I know. Will check with others on the team and update this response if I hear back otherwise.

              Note: we have run ~20 different PTS tests.



              • #8
                Originally posted by JesseG

                Just iperf as far as I know. Will check with others on the team and update this response if I hear back otherwise.

                Note: we have run ~20 different PTS tests.
                Still going through the code, but so far I'm not seeing anywhere this odd behavior of a missing RawString could happen. Have you noticed any pattern, such as it always being the first test run, the last test in the result file, or the first test after changing the set of tests being run?
                Michael Larabel
                http://www.michaellarabel.com/



                • #9
                  Originally posted by Michael

                  Still going through the code, but so far I'm not seeing anywhere this odd behavior of a missing RawString could happen. Have you noticed any pattern, such as it always being the first test run, the last test in the result file, or the first test after changing the set of tests being run?
                  I went back over our PTS results and found empty RawStrings in several tests: OpenSSL, PyBench, FIO, RAMSpeed, John the Ripper, iPerf, and Netperf. There were only a few instances for OpenSSL and Netperf, and a few hundred for iPerf, JTR, and PyBench. FIO and RAMSpeed had many more occurrences; in fact, about 3/4 of our RAMSpeed runs had no RawString entries, as did roughly half of the FIO runs. I have not looked at the underlying terminal output to see if there are any patterns. And, yes, the number of affected tests and the number of missing RawString entries surprised me. I checked, and each test did produce a non-null Value field.

                  I wish I had an "easy" way to reproduce this behavior, but I do not. Thanks for your help in looking into this.

                  Comment
