I have a Dell PowerEdge 4400 server. It has a two-channel internal RAID controller (Dell-branded, but it's Adaptec) that supports RAID 0, 1, 5, and 10 and has 128 MB of cache.
Channel 1 currently has 2x 15K RPM Cheetahs in a RAID 1 mirror; it's partitioned into C: (FAT, NT Server 4) and D: (NTFS, programs).
Channel 2 currently has 6x 15K RPM Cheetahs in a RAID 10 array (3x RAID 1), partitioned into E: (NTFS). It has no files on it yet, but I bought this server to run MS SQL Server 2000.
My dilemma: The RAID 10 array was set up at the factory, but I think it was done incorrectly, because the server hard-locks (keyboard/mouse freeze, nothing in the event log, no blue screen, etc.) under heavy load on those drives. I suspect the stripe sizes between the arrays are wrong and that I'll have to re-create the RAID 10 array (E:).
BTW: Dell tech support is clueless.
I ran Dell diagnostics, mem-test, etc. for 72 hours straight without a single error; I reseated all components and connectors, and so on. I've done everything I (and Dell) can think of, and this is what I conclude.
My question: What is the correct stripe size for each RAID 1 mirror, and once that's set, what chunk size should I use for the RAID 0 across the RAID 1 pairs? (Remember, this box is only going to run SQL Server with a few large DBs.)
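To make the question concrete, here's a small sketch of how the RAID 0 layer maps byte offsets onto the three RAID 1 pairs in a RAID 10 like this one. The 64 KB chunk size is purely an illustrative assumption (not a verified recommendation for this controller); the point is that a chunk-aligned 64 KB SQL Server extent lands entirely on one mirror pair, while a misaligned one spans two.

```python
# Sketch: RAID 0 striping across three RAID 1 mirror pairs (RAID 10).
# CHUNK_SIZE is an assumed value for illustration only.

CHUNK_SIZE = 64 * 1024   # bytes per stripe chunk (assumption)
NUM_MIRRORS = 3          # three RAID 1 pairs striped by RAID 0

def locate(offset):
    """Return (mirror_pair_index, byte_offset_within_that_pair)
    for a logical byte offset into the RAID 10 volume."""
    chunk = offset // CHUNK_SIZE          # which chunk the offset falls in
    pair = chunk % NUM_MIRRORS            # round-robin across mirror pairs
    pair_chunk = chunk // NUM_MIRRORS     # chunk index within that pair
    return pair, pair_chunk * CHUNK_SIZE + offset % CHUNK_SIZE

print(locate(0))            # start of volume -> first mirror pair
print(locate(64 * 1024))    # next chunk -> second mirror pair
```

With a 64 KB chunk, one SQL Server extent (eight 8 KB pages) fits in a single chunk, so an aligned extent read hits only one mirror pair; a smaller chunk would split every extent across pairs.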
Any help or advice is sincerely appreciated.
Thanks!
[ This Message was edited by: Symfornix on 2002-01-04 10:28 ]