Windows Server 2019 vs. Linux vs. FreeBSD Gigabit & 10GbE Networking Performance
FreeBSD 12.0, Windows Server 2019, and five Linux distributions were benchmarked to compare their Gigabit and 10GbE networking performance as part of our latest round of tests. Additionally, we looked at how the Mellanox 10GbE adapter performed when using the company's Linux tuning script compared to the out-of-the-box performance on the enterprise Linux distribution releases.
Recently I posted benchmarks of 9 Linux distributions against FreeBSD 12.0 looking at the 10GbE networking performance. Comparing to Windows Server 2016/2019 was thwarted there because the HP NC523SFP PCIe adapter with its QLogic 8214 controller had poor/unavailable driver support on Windows. So I restarted the testing from scratch, this time using a Mellanox MT26448 ConnectX-2 PCIe adapter. These Mellanox adapters have a single 10GbE SFP+ port and are half-height cards. They are among the lowest-cost 10GbE SFP+ network adapters, available from major Internet retailers for around $20~30 USD, a reasonable price for SOHO environments.
The Mellanox MT26448 was well supported on Microsoft Windows Server, worked on FreeBSD 12.0 when manually loading the Mellanox network driver, and also worked on the Linux distributions tested with the exception of Clear Linux (its kernel does not ship with the Mellanox network driver). Across Ubuntu 18.04 LTS, Ubuntu 18.10, Debian 9.6, and Scientific Linux 7 (RHEL7), it worked out-of-the-box. Mellanox also provides official driver packages, which include the mlnx_tune script for easily and automatically tuning the network adapter for optimal performance. Given that easy and reproducible script, on Scientific Linux 7.6 and Ubuntu I did runs not only with the out-of-the-box OS configuration but also after running mlnx_tune with its high throughput profile.
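For those wanting to reproduce that setup, the commands below are a minimal sketch. The FreeBSD module name comes from the stock mlx4en(4) driver for ConnectX-2/3 hardware, while the mlnx_tune profile name is assumed from current Mellanox OFED releases and should be verified against the script's help output on your installation.

    # FreeBSD 12.0: manually load the ConnectX-2 Ethernet driver for the current boot
    kldload mlx4en
    # ...and have it load automatically on future boots
    echo 'mlx4en_load="YES"' >> /boot/loader.conf

    # Linux with the Mellanox OFED packages installed: apply the high
    # throughput tuning profile (profile name assumed; list profiles via mlnx_tune -h)
    mlnx_tune -p HIGH_THROUGHPUT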
Besides the 10GbE tests, I also ran Gigabit network benchmarks on the operating systems under test for reference. Those Gigabit tests used an Intel I210 Gigabit adapter found on the Tyan 1U Xeon server being used for the benchmarking. All of the tests were done with a Tyan S7106 1U server featuring two Intel Xeon Gold 6138 CPUs, 96GB of DDR4 system memory, and a Samsung 970 EVO SSD. The same hardware was obviously used throughout all of the benchmarking; any reported differences in the system table just come down to what/how the information is exposed by each operating system. Each OS had its available software updates applied as of testing and, except where otherwise noted, was tested in its out-of-the-box configuration for reproducibility and to reflect the default experience for users of the given OS.
This "client" Tyan dual Xeon server was connecting to an AMD Ryzen Threadripper 2920X box running the server benchmark processes. That server box was running Ubuntu 18.10 with an HP NC523SFP SPF+ 10 Gigabit network adapter. Between these two systems was an Ubiquiti US-16-XG-US 10 Gigabit switch.
All of these Linux/BSD/Windows network benchmarks were carried out using the open-source Phoronix Test Suite software for test automation.
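Reproducing these runs yourself only requires installing the Phoronix Test Suite and invoking the relevant network test profiles; the profile name below is an assumed example to check against the list of available tests.

    # List the test profiles currently available to the Phoronix Test Suite
    phoronix-test-suite list-available-tests

    # Install and run a given network test profile (pts/iperf shown as an assumed example)
    phoronix-test-suite benchmark pts/iperf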