Windows Server 2019 vs. Linux vs. FreeBSD Gigabit & 10GbE Networking Performance

  • #11
    Originally posted by edwaleni View Post
    I have to assume the frame size is at default, 1518 bytes.

    [...]
    One of the screenshots shows an MTU of 1500.
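    For reference, those two numbers are consistent: an IP MTU of 1500 plus the 14-byte Ethernet header and 4-byte FCS gives the 1518-byte default maximum frame. A minimal sketch of the arithmetic (the helper name is mine):

    ```python
    # Relate IP MTU to the on-wire untagged Ethernet frame size.
    ETH_HEADER = 14  # dst MAC (6) + src MAC (6) + EtherType (2)
    ETH_FCS = 4      # frame check sequence (CRC32)

    def frame_size(mtu: int) -> int:
        """Maximum untagged Ethernet frame size for a given IP MTU."""
        return mtu + ETH_HEADER + ETH_FCS

    print(frame_size(1500))  # 1518, the default frame size mentioned above
    print(frame_size(9000))  # 9018 with jumbo frames
    ```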

    Comment


    • #12
      These Mellanox adapters have a single 10GbE SPF+ port and are half-height cards. These are among the low-cost 10GbE SPF+ network adapters
      It's an SFP+ (Small Form-factor Pluggable) port, not an enhanced sunscreen.

      Comment


      • #13
        Was Clear Linux not playing nice with the 10GBase-T machine?

        Comment


        • #14
          Originally posted by microcode View Post
          Was Clear Linux not playing nice with the 10GBase-T machine?
          As written in the article, Clear Linux didn't ship the Mellanox driver. It did play fine, though, with the QLogic as well as the 10GbE controller in the 2P EPYC server.
          Michael Larabel
          http://www.michaellarabel.com/

          Comment


          • #15
            Hi Michael,

            Just to let you know, perhaps the cheapest consumer 10GbE card you can buy brand new with out-of-the-box Linux support (no need to download drivers or compile anything, unlike Mellanox) is the Asus XG-C100C. It costs less than $100 USD and uses RJ45 copper...
            https://www.asus.com/au/Networking/XG-C100C/

            I don't work for Asus and am not affiliated with them. I just stumbled across them at an online store and bought a couple for direct cross-over networking (server to backup server).

            Would be great if you could ping Asus for a sample! Then run your magic!
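            For a quick sanity check on a back-to-back link like that, iperf3 is the usual tool (the hostname here is a placeholder for the backup server):

            ```shell
            # On the backup server, start the iperf3 receiver:
            iperf3 -s

            # On the main server, run 4 parallel TCP streams for 30 seconds
            # against it and report aggregate throughput:
            iperf3 -c backup-server -P 4 -t 30
            ```

            Multiple parallel streams (-P) help show whether a single TCP flow or the link itself is the bottleneck.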

            Comment


            • #16
              Originally posted by edwaleni View Post

              I only brought up brand names because cards from those NIC chipset suppliers tend to run higher in price, especially when using a fixed physical connection type.

              Ultimately I think a good test would be to connect one server, using the same NIC, against a Xena 40Gbps test platform, then boot each OS and run it through its paces.

              https://xenanetworks.com
              Why do you need a Xena? Duplicate a Netflix OCA, minus the hard drives and with less memory, and make sure you give it decent Ethernet; the Chelsio dual 10Gb, quad 10Gb, or 100Gb cards work quite well, and you can pick up used Chelsio dual 10Gb cards for about $40. Put two of those in and boom, you've got 40Gb to play with. Netflix tells you what needs to be tuned to push traffic.

              I've got no doubt I could build something to give a network a very good workout for anywhere from $300 to $1000, depending on how much I want to stress the net. Unless you really need the massive number of ports the Xena Networks platforms support, why even think about spending that kind of money?

              This is all assuming that you're testing a single box, or maybe two. If you need more than 40 to 60Gb/s, I would get a Chelsio T6 (runs around $600) and you'll get 100Gb line speed.
              https://www.chelsio.com/wp-content/uploads/resources/T5-40Gb-FreeBSD-Netmap.pdf
              https://www.chelsio.com/wp-content/u...DP-FreeBSD.pdf
              https://www.chelsio.com/wp-content/u...d-toe-epyc.pdf
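              For what it's worth, the usual FreeBSD starting points for pushing 10Gb+ are the socket-buffer sysctls. The values below are illustrative examples only, not recommendations; the right numbers depend on link speed, RTT, and workload:

              ```shell
              # Raise the hard cap on socket buffer size (bytes) so TCP
              # auto-tuning has room to grow windows on fast links:
              sysctl kern.ipc.maxsockbuf=16777216

              # Let TCP auto-tuning grow send/receive buffers up to that cap:
              sysctl net.inet.tcp.sendbuf_max=16777216
              sysctl net.inet.tcp.recvbuf_max=16777216
              ```

              Persist them in /etc/sysctl.conf once you've settled on values that suit the box.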

              Comment
