I became the beneficiary of a Core 2 Q9550 motherboard + RAM that he retired after building an Ivy Bridge box. As it happens, the office needed a new server for VPN and email archival purposes, so...
The Hardware for a Fun "New" Server
What I got for free:
1. Motherboard (Foxconn P45... nothing fancy, but it works)
2. RAM (8 GB DDR2-800)
3. CPU: Q9550
4. Power Supply: a Seasonic 400 W fanless unit that I used for several years before a recent GPU upgrade required a beefier PSU.
5. GPU: an old ATI Radeon 4350 (NO FAN!). This is a headless server, so any GPU is overkill for what I need, but the install process required a direct terminal, and the 4350 performed its job admirably.
So that's most of the guts, but I needed a case, and more importantly a decent chunk of storage:
1. Antec 300 case ($48).
2. Intel G2 80GB SSD ($60, but only $40 after a $20 MIR, i.e. $0.50/GB). The G2s are relatively reliable, even though they are not speed demons.
3. 2x WD Red WD20EFRX 2TB drives ($130 each, put into a RAID 1 mirror).
And here are some fun pictures:
Motherboard & disks mounted before dropping in the PSU and wiring everything up:
So I have a 2.5" SSD and no mounting adapters and a case that doesn't have 2.5" bays? No problem: I screwed in the SSD to one side of the 5.25" bays near the base. The SSD is elevated from the base of the bay by several millimeters, so I shoved in pieces of static-safe packing foam under the SSD to provide support and prevent vibration:
And after dropping in the PSU and hooking up the connectors (more on that below), here's the almost-finished product (I did clean up the cabling a bit afterward):
Configuration & Software:
As you might have guessed, the SSD is the boot & system software drive, while the big data gets stored on the 2 TB hard drives. Fortunately, the Intel SSD came with the most up-to-date firmware revision. (Linux users: you can check the revision with the command below.)
- Code: Select all
sudo hdparm -i /dev/sdX
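As for the two 2TB Reds, they went into a RAID 1 mirror. If you're doing the same with Linux software RAID (mdadm), the setup is roughly the sketch below; note that the device names and mount point here are assumptions, so adjust them for your own system:
- Code: Select all
# Build a RAID 1 mirror from the two 2TB drives (assumed here to be /dev/sdb and /dev/sdc)
mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sdb /dev/sdc
# Put a filesystem on the array and mount it (mount point is an assumption)
mkfs.ext4 /dev/md0
mount /dev/md0 /mnt/storage
# Record the array so it assembles automatically at boot
mdadm --detail --scan >> /etc/mdadm.conf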
I knew this would be a Linux box, and I've been using Arch Linux for several years, so I thought "I'll just slap Arch on there!" Well, I've been *using* Arch for a long time, but I haven't actually done a fresh Arch install since 2009 (the wonders of a rolling release + disk imaging, where I just copied a working install instead of reinstalling). *Wow* is the install process different from what I remembered! In some ways it is easier, since the "pacstrap" and "arch-chroot" utilities made installing packages on the SSD easier, but partitioning the SSD and getting GRUB 2 set up was definitely more complex, including the requirement for a 2MB dummy partition (the BIOS boot partition) at the beginning of the drive that GRUB 2 uses for dumping its core boot image. Fortunately, Arch has excellent installation documentation, so I was able to get it set up with a minimum of grief.
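For anyone curious, the skeleton of the process looked something like the sketch below. This is reconstructed from memory, and the device name (/dev/sda) and partition sizes are assumptions, so treat it as a rough outline rather than a recipe:
- Code: Select all
# Partition the SSD with GPT, reserving a ~2MB BIOS boot partition for GRUB 2
parted /dev/sda mklabel gpt
parted /dev/sda mkpart grub 1MiB 3MiB
parted /dev/sda set 1 bios_grub on
parted /dev/sda mkpart root ext4 3MiB 100%
mkfs.ext4 /dev/sda2
# Install the base system onto the mounted SSD
mount /dev/sda2 /mnt
pacstrap /mnt base grub
# Chroot in; GRUB 2 embeds its core image in the bios_grub partition
arch-chroot /mnt grub-install /dev/sda
arch-chroot /mnt grub-mkconfig -o /boot/grub/grub.cfg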
I think the biggest change with Arch is that my old install (which is quickly being deprecated into oblivion) relies heavily on a single config file: the /etc/rc.conf file that any Arch user is well aware of. In the new system, all of the functionality that used to be condensed into that rc.conf file is spread out over multiple configuration files and systems. The big facilitator of killing the rc.conf file appears to be systemd, which now handles all the starting and stopping of services and is pretty deeply hooked into other services like networking as well. The deprecation of the rc.conf file is so complete that I do not even have one on the new server install! The good news is that things appear to be very well supported, and systemd is actually pretty nice once you get the hang of it. It also makes the boot time ridiculously fast, though that's less important for a 24/7 server.
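To give a flavor of the change: where the old rc.conf had a single DAEMONS array listing everything that started at boot, each service is now managed individually through systemctl. Using sshd purely as an illustration:
- Code: Select all
# Enable a service at boot, start it now, and check on it
systemctl enable sshd.service
systemctl start sshd.service
systemctl status sshd.service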
Right now I am running an OpenVPN server on the box, and I am using the "getmail" utility to archive emails on the big storage array. More services could be installed as needed in the future too.
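For the email archival piece, getmail is driven by a simple rc file. Mine looks something along these lines, though the server, account, and paths shown here are placeholders rather than my real settings:
- Code: Select all
# ~/.getmail/getmailrc (server, account, and paths are placeholders)
[retriever]
type = SimpleIMAPSSLRetriever
server = imap.example.com
username = archive@example.com
password = notmyrealpassword

[destination]
type = Maildir
path = /mnt/storage/mail-archive/

[options]
# Pull everything, but leave the originals on the server
read_all = true
delete = false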
Fun Hardware Quirks, or: Never Trust HP
OK, so the hardware setup went... mostly OK. The backplate for the CPU HSF was not adhesive, so I had to awkwardly hold the backplate with one hand while holding the HSF and screwdriver with the other during the install. It worked out OK though, and the CPU cores idle near 35C, so the (not so huge) HSF is doing its job.
The real issue boiled down to the SATA cables. I needed three, and I thought I had three when I gave my supplies a casual glance. Unfortunately, one of my cables was actually an eSATA cable that doesn't fit the internal connectors. So I needed another SATA cable, and I was stuck at my office, which is not exactly a Newegg warehouse. Then I thought: hey, I almost never use the DVD drive in my regular work PC; I'll just pull its SATA cable and replace it when I bring in a spare cable later. Oh man, bad idea. The issue is not with using the SATA cable, but with the fact that my office PC is an HP with some of the worst @#%@%# cables I've ever seen. One end of the cable in the HP machine would not release from the motherboard header and... you can see this coming... I managed to rip the plastic socket right out of the motherboard while trying to pull the cable.
I had to take a flathead screwdriver to the connector to finally pop the motherboard header off and get access to the cable. HP uses such sub-standard parts that this SATA plug is oversized, and it barely fit the header on my server motherboard too. Fortunately, the server's motherboard is built much better than the HP's and accepted the low-grade plug with a minimum of fuss (and this is a Foxconn motherboard, not exactly a high-end board by any stretch of the imagination).
Good news: it looks like there is another SATA header on my crappy HP motherboard, so when I get a real SATA cable I'll be able to reconnect the DVD drive. The rest of the PC still runs fine, BTW.
So that's it for this build. This is actually the first from-parts build I've put together since 2010. I'm making sure I still have enough mad-skillz to put together a functioning Haswell box next year, and so far I haven't caught anything on fire, so I'll call it a success.