Benchmarking M.2 PCIe Adapters for Lab Workstations

Just about a week ago I bought a used HP Z8 G4 Workstation on eBay to use as a ConfigMgr / Intune lab machine. The machine I found only included a SATA SSD, and of course I wanted more performance than that can bring, so I ordered a #shiny 4 TB NVMe SSD from Western Digital for my VMs: the WD_BLACK SN850X, which according to its spec sheet can do up to 7,300 MB/s read and up to 1,200,000 IOPS. In theory, at least 🙂

Note: If you want to learn more about building the perfect lab environment for MDT, ConfigMgr or Intune, check out my free community course here: https://academy.viamonstra.com/courses/mini-course-building-the-perfect-lab-for-configmgr-mdt-and-intune

The WD_BLACK SN850X from Western Digital

Real-World Workloads

For lab machines used to run a virtual Microsoft infrastructure – like Domain Controllers, ConfigMgr Servers, SQL Servers, MDT servers, Windows 10 and Windows 11 clients etc. – I have found two benchmarking tests to be particularly useful:

  • Test 1: A DiskSpd test optimized for a ConfigMgr / SQL VM type workload
  • Test 2: Extracting a 200 GB+ ZPAQ archive on the same volume.

From my experience, if the results from these tests are good, the lab machine will run a Microsoft infrastructure just fine.

Download: If you want to run the same workload in your lab, here is the script I used: https://github.com/DeploymentResearch/DRFiles/blob/master/Scripts/DiskSpd/Test-ConfigMgrWorkload.ps1
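If you just want a feel for what a ConfigMgr / SQL VM type workload looks like in DiskSpd terms, here is a minimal sketch of the kind of invocation such a test might use. Note that this is an illustrative example only – the exact flags, file sizes, and read/write mix in the linked script may differ, and the file path is a placeholder for your own test volume.

```shell
# Hypothetical DiskSpd run approximating a ConfigMgr / SQL VM workload:
# 64 KB blocks, random I/O, 70% read / 30% write, 8 threads,
# 32 outstanding I/Os per thread, caching disabled, 2-minute run.
# -c50G creates a 50 GB test file; -L collects latency statistics.
diskspd.exe -c50G -b64K -d120 -t8 -o32 -r -w30 -Sh -L D:\diskspd-test.dat
```

The key idea is the combination of random access (-r), a mixed read/write ratio (-w30), and disabled caching (-Sh), which together stress the disk the way a busy SQL-backed VM does rather than the way a straight sequential file copy does.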

The M.2 Adapters

The two M.2 adapters I compare are the StarTech.com x4 PCIe 3.0 NVMe adapter (PEX4M2E1) and the HP x8 PCIe 3.0 Turbo Drive Dual M.2 SSD adapter (933576-001). The StarTech.com adapter was $22 on Amazon, and the HP adapter was $130 on eBay. While the HP adapter can hold two NVMe SSDs, I ran the test with only one, both for fairness and because I only had one disk to test with 🙂

Note: If you have $2,800 to spare you can also buy the X21 card from Apex Storage. It's a full-length PCIe x16 card with 21 (!) slots for M.2 SSDs.

The two adapters

The Result

The DiskSpd result for 100% read with a 64 KB block size was:

  • StarTech.com: 53,859 IOPS and 3,366 MB/s
  • HP: 54,293 IOPS and 3,393 MB/s

The DiskSpd result for 100% write with a 64 KB block size was:

  • StarTech.com: 52,319 IOPS and 3,269 MB/s
  • HP: 52,321 IOPS and 3,270 MB/s
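As a sanity check, the IOPS and throughput figures above are internally consistent: at a 64 KB block size, throughput in MB/s (DiskSpd reports binary MiB) is simply IOPS × 64 / 1024. A quick calculation reproduces all four throughput numbers from the IOPS numbers:

```shell
# Throughput (MiB/s) = IOPS * 64 KiB per I/O / 1024 KiB per MiB
for iops in 53859 54293 52319 52321; do
  echo "$iops IOPS -> $(( iops * 64 / 1024 )) MB/s"
done
```

This gives 3366, 3393, 3269, and 3270 MB/s respectively, matching the DiskSpd output above.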

The 200 GB ZPAQ archive extraction took 10 minutes on both adapters.

For the Fun of It – SATA SSD Comparison

While I was running the test, I could not stop myself from running the same test on the SATA SSD that came with the machine. For that test I got 8,500 IOPS and 530 MB/s read, and 4,850 IOPS and 303 MB/s write. Extracting the ZPAQ archive took 37 minutes. All in all, roughly 6x slower on reads and more than 10x slower on writes than the NVMe disk, looking at the raw DiskSpd data 🙂
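To put exact numbers on the slowdown, here is a quick ratio calculation from the raw figures above (the NVMe numbers are the StarTech.com results):

```shell
# NVMe vs SATA speedup ratios from the DiskSpd and ZPAQ numbers above
awk 'BEGIN {
  printf "Read IOPS:  %.1fx faster\n", 53859 / 8500   # NVMe vs SATA reads
  printf "Write IOPS: %.1fx faster\n", 52319 / 4850   # NVMe vs SATA writes
  printf "ZPAQ time:  %.1fx faster\n", 37 / 10        # extraction wall time
}'
```

Note that the ZPAQ extraction only sped up about 3.7x despite the much faster disk, which suggests the extraction is partly CPU-bound on decompression rather than purely disk-bound.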

Summary

For a single-disk workload, the $22 adapter performed about as well as the $130 adapter, with a minor difference in favor of the HP adapter, but not much. The results also suggest that the bottleneck is the PCIe 3.0 x4 link each M.2 slot provides (roughly 3.5 GB/s in practice) rather than the card itself; the SN850X is a PCIe 4.0 drive, so its rated 7,300 MB/s is out of reach on any Gen3 slot anyway.
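Both adapters give each drive a PCIe 3.0 x4 link, and the measured throughput sits right at that link's ceiling, which explains why the two cards score the same. A quick back-of-the-envelope check (8 GT/s per lane, 4 lanes, 128b/130b encoding, before protocol overhead):

```shell
# PCIe 3.0 x4 raw link bandwidth vs the measured throughput
awk 'BEGIN {
  link = 8 * 4 * (128/130) / 8        # GT/s * lanes * encoding / bits-per-byte
  measured = 3366 * 1048576 / 1e9     # 3366 MiB/s reported -> decimal GB/s
  printf "Link limit: %.2f GB/s, measured: %.2f GB/s\n", link, measured
}'
```

That puts the measured ~3.53 GB/s within about 10% of the ~3.94 GB/s raw link bandwidth, which is typical once packet and protocol overhead are accounted for.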

That said, if you have an HP workstation there are two dedicated PCIe x8 slots for this type of board (assuming you have two CPUs), leaving the other slots available for other peripherals.

One of the two special slots for M.2 Adapters populated in the HP Z8 G4 Workstation.
About the author

Johan Arwidmark

4 Comments
Isaac, 1 month ago

If you have full-height, full-length x16 slot(s) available, the SSD7140A or SSD7540 from HighPoint will take up to eight M.2 drives for around $700, and they can be paired to support 16 drives in a RAID array.

Matt, 2 months ago

Hi Johan,
Thanks for your post, amazing as always. A few questions: first, did you use the Amazon adapter or the HP one in the end? In the last picture I don't see the Amazon adapter. Second, this monster HP Z8 has 12 slots for RAM; what is the maximum memory you can add to it? Third, what is the maximum number of CPUs it supports?

Thanks,
Matt
