Our testing methods
Before dipping into our benchmark results, let's take a quick look at the mix of rivals we've put together to face the m4, and the methods we use to test storage devices here at TR. We include these details to help you better understand and replicate our results, but if you're already familiar with our approach to storage testing, feel free to skip ahead to the benchmarks. We won't be offended.
Today, the m4 will face off against a collection of solid-state drives based on a handful of different controllers. Note that only half of the SSDs have 6Gbps SATA interfaces. We're using a Sandy Bridge motherboard with 6Gbps SATA connectivity, so those drives have a distinct advantage over the others. The Agility 2 also has something of an edge thanks to its 28% overprovisioning, roughly four times what's typical for consumer-grade SSDs. We've found that SandForce-based SSDs tend to run slower when they set aside a more traditional 7-8% of their flash capacity as spare area.
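For reference, that overprovisioning figure is just spare flash expressed as a percentage of the drive's user-visible capacity. A quick sketch of the math (the 128GB raw-flash figure for the Agility 2 is our assumption, inferred from its 100GB usable capacity and 28% spare area):

```python
def overprovisioning_pct(raw_gb, user_gb):
    """Spare flash as a percentage of user-visible capacity."""
    return 100 * (raw_gb - user_gb) / user_gb

# OCZ Agility 2: roughly 128GB of raw flash exposed as a 100GB drive
print(overprovisioning_pct(128, 100))  # 28.0
```

By the same math, a consumer drive with a more typical spare area exposes all but 7-8% of its flash to the user.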
|Drive||Flash controller||Interface speed||Cache size||Total capacity|
|Corsair Nova V128||Indilinx Barefoot ECO||3Gbps||64MB||128GB|
|Crucial RealSSD C300||Marvell 88SS9174-BJP2||6Gbps||256MB||256GB|
|Crucial m4||Marvell 88SS9174-BLD2||6Gbps||256MB||256GB|
|Intel X25-M G2||Intel PC29AS21BA0||3Gbps||32MB||160GB|
|Intel 510 Series||Marvell 88SS9174-BKK2||6Gbps||128MB||250GB|
|OCZ Agility 2||SandForce SF-1200||3Gbps||NA||100GB|
|OCZ Vertex 3||SandForce SF-2281||6Gbps||NA||240GB|
|Samsung Spinpoint F3||NA||3Gbps||32MB||1TB|
We've updated all the drives to their latest and greatest firmware revisions with the exception of the Nova. This Indilinx-based drive debuted well into the controller's life, so the initial release should have all of the kinks ironed out. Corsair tells us there are no firmware updates for the Nova.
You'll notice that we've also included a traditional hard drive this time around. The Spinpoint F3 1TB is our favorite 7,200-RPM desktop drive at the moment, and it'll give us a sense of how the m4 and other SSDs compare to the performance of contemporary mechanical storage.
We're in the midst of overhauling our storage test systems here at TR, a plan that was stalled briefly by Intel's Sandy Bridge chipset bug. The new suite of tests is coming soon, and it should be worth the wait. In the interim, we've whipped up an abbreviated version with a handful of new and old tests that cover the basics.
The block-rewrite penalty inherent to flash memory, the TRIM command designed to offset it, and the last workload an SSD tackled can all affect drive performance, so we'll provide a little more detail on exactly how we test SSDs. Before testing, each drive is returned to a factory-fresh state with a secure erase. Next, we fire up HD Tune and run a series of read and write tests covering transfer rates and random access times. HD Tune is designed to run on unpartitioned drives, so TRIM, which requires a file system to be in place, won't be a factor.
After HD Tune, we partition the drives and fire up a series of IOMeter workloads using the latest version of that app. When running on a partitioned drive, IOMeter first fills it with a single file, firmly putting SSDs into a used state in which all of their flash pages have been occupied. We delete that file before moving on to our used-state file copy tests, after which we tackle disk-intensive multitasking. Our multitasking benchmark requires an unpartitioned drive, so like HD Tune, it shouldn't be affected by TRIM.
With our multitasking tests completed, we secure-erase the drives once more and launch a final instance of our scripted file copy test. This procedure should ensure that each SSD is tested on an even playing field—and in best- and worst-case performance scenarios.
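The drive-conditioning sequence above boils down to an ordered script. Here's a rough sketch of it as data; the step labels are our own invention, standing in for HD Tune, IOMeter, and our FileBench copy script:

```python
# Order matters: each step leaves the drive in a known state for the next.
TEST_SEQUENCE = [
    ("secure_erase",       "return the drive to a factory-fresh state"),
    ("hd_tune",            "transfer rates and access times (unpartitioned, so no TRIM)"),
    ("partition_and_fill", "IOMeter fills the partition, forcing a used state"),
    ("delete_fill_file",   "free up space for the used-state copy tests"),
    ("file_copy_used",     "used-state file copy runs"),
    ("multitasking",       "disk-intensive multitasking (unpartitioned again, no TRIM)"),
    ("secure_erase_again", "back to a fresh state"),
    ("file_copy_fresh",    "best-case file copy runs"),
]

for step, purpose in TEST_SEQUENCE:
    print(f"{step}: {purpose}")
```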
We run all our tests at least three times and report the median of the results. We've found that IOMeter performance can fall off after the first couple of runs, so we use five in total and throw out the first two. Each drive's performance over the last three runs has been pretty consistent thus far. We've also seen remarkable consistency with our new FileBench copy test, which we're currently running five times while we tune the scripting. We used the following system configuration for testing:
|Processor||Intel Core i5-2500K 3.3GHz|
|Motherboard||Asus P8P67 PRO|
|Platform hub||Intel P67 Express|
|Platform drivers||INF update 184.108.40.2065|
|Memory size||8GB (2 DIMMs)|
|Memory type||Corsair Vengeance DDR3 SDRAM at 1333MHz|
|Audio||Realtek ALC892 with 2.58 drivers|
|Graphics||Gigabyte Radeon HD 4850 1GB with Catalyst 11.2 drivers|
|Drives||Corsair Nova V128 128GB with 1.0 firmware; Intel X25-M G2 160GB with 02M3 firmware; Intel 510 Series 250GB with PWG2 firmware; OCZ Agility 2 100GB with 1.29 firmware; Crucial RealSSD C300 256GB with 0006 firmware; OCZ Vertex 3 240GB with 1.11 firmware; Samsung Spinpoint F3 1TB; Crucial m4 256GB with 0001 firmware|
|Power supply||OCZ Z-Series 550W|
|OS||Windows 7 Ultimate x64|
Thanks to Asus for providing the system's motherboard, Gigabyte for the graphics card, Intel for the CPU, Corsair for the memory, OCZ for the PSU, and Western Digital for the Caviar Black 1TB system drive.
We used the following versions of our test applications:
The test systems' Windows desktop was set at 1280x1024 in 32-bit color at a 75Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests.
Most of the tests and methods we employed are publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.