Samsung 960 EVO NVMe SSD Benchmarks On Linux


  • #21
    Originally posted by bug77 View Post
Well, yeah. Products released before Samsung started making V-NAND are planar.
    Samsung 750 with planar was released after 850 EVO with V-NAND.



    • #22
      Originally posted by Sonadow View Post
      I'm questioning whether Michael is really using native NVMe or if it is falling back to AHCI mode.
      There is no fallback to AHCI mode.
      ... NVMe drives running in AHCI mode on Linux.
      NVMe drives can only run in NVMe mode and not in AHCI mode.
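For what it's worth, a quick way to confirm this on a running system is to look at how the kernel exposes each block device in sysfs. The sketch below is only a rough heuristic under assumed, typical sysfs paths (it is not from the article's test setup): native NVMe devices resolve under an nvme controller, while SATA/AHCI devices sit behind an ATA port.

import os

# Rough heuristic: classify block devices by their resolved sysfs path.
# Native NVMe namespaces live under .../nvme/nvmeX/, SATA/AHCI devices
# under an ataN port. Paths assumed from a typical Linux sysfs layout.
for dev in sorted(os.listdir("/sys/block")):
    path = os.path.realpath(os.path.join("/sys/block", dev))
    if "/nvme/" in path:
        kind = "NVMe (native nvme driver)"
    elif "/ata" in path:
        kind = "SATA/AHCI (libata)"
    else:
        kind = "other/virtual"
    print(f"{dev:12s} {kind}")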



      • #23
        Originally posted by drSeehas View Post
        Samsung 750 with planar was released after 850 EVO with V-NAND.
        My mistake, you're right.



        • #24
          Originally posted by bug77 View Post

          Don't worry, sequential speeds are mostly irrelevant at home. It's the fast access time (and thus IOPS) that makes the difference between SSDs and HDDs.
Thanks for the perspective, I did not consider this aspect (I take it for granted these days). I can see how IOPS can help when you are multitasking, though for me I require both. If you completely disregard IOPS, my HDD (WD4001FFSX) reads ~120 MB/s where my SSD reads ~550 MB/s. I have three games that I play weekly, all of which require a high amount of sequential read/write speed.
• Arma 2 - The game unloads all resources if you die and reloads everything from disk when you respawn. If you take too long to respawn, the conflicts are usually over and all your loot has been taken.
• Arma 3 - Checks signatures before loading resources; the game is ~25GB and my mods are over 75GB. I typically load more than half my mods.
• Path of Exile - The delta patches that are applied to a game-package-file require insane write speed; on my HDD it used to take 20 minutes to apply a 5 MB patch. It was so frustrating that I created a RAM drive for this game before I obtained my SSD. This game also shows a clear difference in loading screens: you can easily spot the party members who use HDDs, as they fall behind in speed runs.
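To make the sequential-versus-IOPS distinction above concrete, here is a rough, hypothetical sketch (the test file path, block size, and read count are placeholders, and the page cache will inflate the numbers unless the file is much larger than RAM): it times 4 KiB reads done back to back versus at random offsets, i.e. a streaming load versus the scattered small accesses where SSDs pull far ahead of HDDs.

import os, random, time

PATH = "/tmp/testfile.bin"   # hypothetical test file, ideally several GiB
BLOCK = 4096                 # 4 KiB per read
COUNT = 20000                # reads per pass

def bench(random_offsets):
    # Sequential pass reads consecutive 4 KiB blocks; the random pass jumps
    # to arbitrary offsets, which is where HDD access time dominates.
    size = os.path.getsize(PATH)
    fd = os.open(PATH, os.O_RDONLY)
    start = time.monotonic()
    for i in range(COUNT):
        off = random.randrange(0, size - BLOCK) if random_offsets else i * BLOCK
        os.pread(fd, BLOCK, off)
    elapsed = time.monotonic() - start
    os.close(fd)
    label = "random" if random_offsets else "sequential"
    print(f"{label:10s} {COUNT / elapsed:8.0f} IOPS  {COUNT * BLOCK / elapsed / 1e6:7.1f} MB/s")

bench(random_offsets=False)
bench(random_offsets=True)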



          • #25
            Originally posted by Jabberwocky View Post

Thanks for the perspective, I did not consider this aspect (I take it for granted these days). I can see how IOPS can help when you are multitasking, though for me I require both. If you completely disregard IOPS, my HDD (WD4001FFSX) reads ~120 MB/s where my SSD reads ~550 MB/s. I have three games that I play weekly, all of which require a high amount of sequential read/write speed.
• Arma 2 - The game unloads all resources if you die and reloads everything from disk when you respawn. If you take too long to respawn, the conflicts are usually over and all your loot has been taken.
• Arma 3 - Checks signatures before loading resources; the game is ~25GB and my mods are over 75GB. I typically load more than half my mods.
• Path of Exile - The delta patches that are applied to a game-package-file require insane write speed; on my HDD it used to take 20 minutes to apply a 5 MB patch. It was so frustrating that I created a RAM drive for this game before I obtained my SSD. This game also shows a clear difference in loading screens: you can easily spot the party members who use HDDs, as they fall behind in speed runs.
#1 IOPS does not only help when multitasking. It also helps when Windows decides it's time to run a defrag or a virus scan (it tries to do this when the system is not under load, but I'm not sure how accurate that is). With an SSD, you can pretty much forget about all of these and let the OS run amok. I guess this is still multitasking, just not multitasking that you initiate.
#2 PoE seemed suspiciously slow to load even on an SSD. I gave up on the game sometime this year (too much grind for absolutely no return for someone who plays solo), so that fixed load times for good.
            #3 For games that constantly stream stuff, you'd think Windows' Prefetch would step in... I know launching PoE a second time is visibly faster. Then again, if the game loads more than the prefetch buffer can hold, you still end up reading from the disk.



            • #26
              Originally posted by bug77 View Post
#2 PoE seemed suspiciously slow to load even on an SSD. I gave up on the game sometime this year (too much grind for absolutely no return for someone who plays solo), so that fixed load times for good.
              ROFL 😂



              • #27
                Originally posted by mlau View Post

I read somewhere that Samsung's Windows drivers don't use forced unit access (FUA) like the vanilla Windows NVMe and AHCI drivers do. FUA does kill performance because it forces the disk to write data back to the medium (instead of keeping it in a volatile cache) before it can report the request as completed, but it also increases reliability.
On Linux the equivalent is the write barrier, and it can be disabled by an option in fstab. I tested a few SSDs and only the Intel one had good performance with write barriers on. The others showed a huge performance difference, for example: http://openbenchmarking.org/result/1...PL-1404261PL16
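For reference, the fstab option being described is presumably the filesystem's barrier mount option (ext4 accepted barrier=0/nobarrier at the time). The entry below is purely illustrative, with a placeholder device and mount point rather than anything taken from the linked result:

# illustrative /etc/fstab entry with write barriers disabled on ext4
/dev/nvme0n1p2  /mnt/test  ext4  defaults,barrier=0  0  2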



                • #28
                  Originally posted by riklaunim View Post

On Linux the equivalent is the write barrier, and it can be disabled by an option in fstab. I tested a few SSDs and only the Intel one had good performance with write barriers on. The others showed a huge performance difference, for example: http://openbenchmarking.org/result/1...PL-1404261PL16
I doubt the drive makes any difference, as the write barrier mechanism is in the kernel. The drive knows nothing about write barriers.



                  • #29
                    Originally posted by bug77 View Post

I doubt the drive makes any difference, as the write barrier mechanism is in the kernel. The drive knows nothing about write barriers.
He was referring to Windows, not Linux. There the vendor driver may actually do something.

Also, some USB drive enclosures may not implement write barriers, so in general you get the same effect as when they are disabled.
                    Last edited by riklaunim; 19 December 2016, 10:42 AM.



                    • #30
Has anyone experienced worse performance with SSDs on Linux compared to Windows? I tried running iozone on fresh installs of Debian stretch and Windows 10 respectively, and I get significantly worse performance on Linux: about 2200 MB/s reads on Linux versus 3800 MB/s on Windows. See attached image.
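One thing worth ruling out before blaming the kernel is that the two iozone runs used the same record size, file size, and caching behaviour. A rough sanity check, sketched below under stated assumptions (root access and an existing multi-GiB file at a placeholder path on the SSD), is to drop the page cache and time a plain large-block sequential read, then compare that against both iozone numbers:

import os, time

PATH = "/mnt/nvme/big_testfile"   # hypothetical multi-GiB file on the SSD
BLOCK = 1024 * 1024               # 1 MiB reads

# Drop the page cache so the read actually hits the drive (requires root).
with open("/proc/sys/vm/drop_caches", "w") as f:
    f.write("3\n")

size = os.path.getsize(PATH)
fd = os.open(PATH, os.O_RDONLY)
start = time.monotonic()
done = 0
while done < size:
    chunk = os.read(fd, BLOCK)
    if not chunk:
        break
    done += len(chunk)
elapsed = time.monotonic() - start
os.close(fd)
print(f"{done / elapsed / 1e6:.0f} MB/s over {done / 1e9:.2f} GB")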

