


Two years ago our test laboratory acquired a new benchmarking tool called FC-Test (see our article X-bit labs Presents: FC-Test for Hard Disk Drives). The need to create our own benchmark for hard disk drives had long been felt. As you know, the WinBench 99 suite hasn’t been updated for several years, and the file sets it uses no longer reflect today’s realities. Intel IOMeter can be made to simulate nearly any workload on the disk subsystem, but it is too abstruse and esoteric for the end user. After browsing through numerous benchmarks, we found we had to write one ourselves…

After long and painful deliberation about the type of benchmark to build (synthetic, or tests in real applications), we decided to create a test that would measure the speed of reading, writing and copying files under the Windows operating system. Our choice was due to two facts: these operations are readily understandable to any PC user, and this OS family is the most widespread today.
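FC-Test’s actual implementation is not shown here, but the idea behind measuring file operation speed is simple: time each operation and divide the amount of data by the elapsed time. Below is a minimal Python sketch, purely our own illustration (the function names, file names and the fsync-on-write choice are assumptions, not FC-Test’s real code), that measures write, read and copy throughput on a single file:

```python
import os
import shutil
import time

def measure_mb_per_s(func, nbytes):
    """Time one operation and return its throughput in MB/s."""
    start = time.perf_counter()
    func()
    elapsed = time.perf_counter() - start
    return nbytes / elapsed / (1024 * 1024)

def run(workdir, size_mb=64):
    """Measure write, read and copy speed for one file of size_mb megabytes."""
    data = os.urandom(size_mb * 1024 * 1024)
    src = os.path.join(workdir, "pattern.bin")
    dst = os.path.join(workdir, "copy.bin")

    # Write: create the file and force it out to disk.
    def write():
        with open(src, "wb") as f:
            f.write(data)
            f.flush()
            os.fsync(f.fileno())

    # Read: read the file back in full.
    def read():
        with open(src, "rb") as f:
            f.read()

    # Copy: duplicate the file within the same directory.
    def copy():
        shutil.copyfile(src, dst)

    results = {
        "write": measure_mb_per_s(write, len(data)),
        "read": measure_mb_per_s(read, len(data)),
        "copy": measure_mb_per_s(copy, len(data)),
    }
    os.remove(src)
    os.remove(dst)
    return results
```

A real file benchmark would, of course, use whole sets of files of varying sizes and defeat the OS file cache; this sketch only shows the core timing loop.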

So we wrote the test, and our first attempt at using it to examine the performance of hard disk drives confirmed that FC-Test deserved a place in our list of benchmarks (we first used FC-Test in our Western Digital External Hard Disk Drive Review).

Since then, we have run FC-Test countless times, and the stack of reports grew too thick to fit into the desk’s drawer. What reports do I mean? The test simply couldn’t save its results into a log file – you had to put the numbers down on paper manually and then type the data into Excel spreadsheets. Of course, custom-made forms in the same Excel made the process somewhat easier (and saved paper!), but we live in an age of technology after all!

So after the drawer couldn’t hold all the reports, we revolted! Another reason, besides the above-mentioned lack of a log file, was that you had to sit at the testbed computer, write down the results every 3-5 minutes and then resume the test. As you understand, this was no way to work at maximum efficiency. It’s just impossible to focus on the review itself when the benchmark calls for your attention every few minutes. It was even harder when a testing session ran simultaneously on several computers: aching legs, giddiness and so on…

Our revolution was a success, and our two complaints were taken into consideration:

  • FC-Test couldn’t save its results into a log file, in a format easy for further processing;
  • FC-Test couldn’t run automatically, from a script.

These two problems were addressed in the development of the new version of FC-Test.
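A log format that is “easy for further processing” can be as simple as one CSV row per measurement, which spreadsheets import directly. The sketch below is our own hypothetical illustration (the column names and the `append_result` helper are assumptions, not FC-Test’s actual log format):

```python
import csv
import os

def append_result(logfile, drive, pattern, operation, mb_per_s):
    """Append one benchmark result as a CSV row, writing a header row first
    if the log file does not exist yet."""
    new_file = not os.path.exists(logfile)
    with open(logfile, "a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(["drive", "pattern", "operation", "MB/s"])
        writer.writerow([drive, pattern, operation, f"{mb_per_s:.2f}"])
```

With results accumulating in such a file, an unattended script can run pattern after pattern and the tester only returns at the end to open the log in Excel.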



Comments currently: 9
Discussion started: 08/16/07 08:25:38 AM
Latest comment: 12/21/15 11:51:56 AM
