Out-Of-The-Box 10GbE Network Benchmarks On Nine Linux Distributions Plus FreeBSD 12

Written by Michael Larabel in Operating Systems on 22 January 2019 at 08:00 AM EST. Page 1 of 3.

Last week I started running some fresh 10GbE Linux networking performance benchmarks across a few different Linux distributions. That testing has now been extended to cover nine Linux distributions plus FreeBSD 12.0 to compare the out-of-the-box networking performance.

Tested this round alongside FreeBSD 12.0 were Antergos 19.1, CentOS 7, Clear Linux, Debian 9.6, Fedora Server 29, openSUSE Leap 15.0, openSUSE Tumbleweed, Ubuntu 18.04.1 LTS, and Ubuntu 18.10.

All of the tests were done with a Tyan S7106 1U server featuring two Intel Xeon Gold 6138 CPUs, 96GB of DDR4 system memory, and a Samsung 970 EVO SSD. The 10GbE connectivity on this server came from an add-in HP NC523SFP PCIe adapter providing two 10Gb SFP+ ports via a QLogic 8214 controller.

Originally the plan was to also include Windows Server 2016/2019. Unfortunately, the QLogic driver download site has been malfunctioning since Cavium's acquisition of the company, the other Windows Server 2016 driver options did not pan out, and there was no Windows Server 2019 option at all. With that Windows testing thwarted, I have since started over with a Mellanox ConnectX-2 10GbE NIC, which is well supported on Windows Server. That testing is ongoing for the next article comparing Windows vs. Linux 10 Gigabit network performance along with some "tuned" Linux networking results.

10GbE Linux vs. FreeBSD Networking Performance Tests

For this round of testing, all of the operating systems were benchmarked "out of the box" to show the default networking performance of each OS. While the network stacks can be tuned, and those results will be featured in the next article, the focus here was on the OOTB performance to show the behavior as configured by each OS vendor. The same hardware was obviously used throughout all of the testing.
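To give a sense of what those out-of-the-box differences look like in practice, below is a small, hypothetical Python helper that dumps a handful of default network tunables from /proc/sys on a Linux install. It is not part of the test harness used for these benchmarks, the list of keys is only illustrative of settings vendors commonly adjust, and FreeBSD would need sysctl(8) instead since it has no /proc/sys.

```python
#!/usr/bin/env python3
# Illustrative helper (not the article's test harness): print a few of the
# default Linux network tunables that can differ between distributions.
from pathlib import Path

# Keys chosen purely as examples of commonly vendor-tweaked settings.
TUNABLES = [
    "net/core/rmem_max",
    "net/core/wmem_max",
    "net/core/default_qdisc",
    "net/ipv4/tcp_rmem",
    "net/ipv4/tcp_wmem",
    "net/ipv4/tcp_congestion_control",
]

def read_sysctl(key: str) -> str:
    """Read a sysctl value from /proc/sys, returning 'n/a' if unavailable."""
    try:
        return (Path("/proc/sys") / key).read_text().strip()
    except OSError:
        return "n/a"

if __name__ == "__main__":
    for key in TUNABLES:
        print(f"{key.replace('/', '.'):35s} {read_sysctl(key)}")
```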

This "client" Tyan dual Xeon server was connecting to an AMD Ryzen Threadripper box running the server benchmark processes. These two systems were connected using an Ubiquiti Networks US-16-XG-US 10G switch with 10Gtek SFP-H10GB-CU2M cabling. More details on the setup can be found from the earlier article.

