
Just because
The Temjin handled our standard test system well considering its constrained dimensions, but we wondered just how much hardware could be shoehorned into the case. So we proceeded to stuff it full of more recent parts, including a Sandy Bridge CPU, dual GeForce GTX 560 Ti graphics cards, and four 3.5" hard drives. Thanks to Asus and Noctua for supplying the extra gear for this build. Here's a full run-down of the parts we used:

Processor: Intel Core i7-2600K
CPU heatsink: Noctua NH-U12P with a single fan in a pull configuration
Fans: 1x 180 mm (factory), 1x 120 mm Noctua NF-P12
Motherboard: Asus ROG Maximus IV Gene-Z
Memory: 4x 4GB AData DDR3-1333
Audio: Integrated Realtek ALC889 with default Windows drivers
Graphics: 2x Asus GeForce GTX 560 Ti DirectCU II 1GB in SLI
Hard drives: 4x assorted 3.5" SATA drives
Optical drives: 2x Lite-On iHAS124-04-OEM
Media card reader: 3.5" external bay-mounted card reader
Power supply: Thermaltake TR2 600W

In the end, the only usable space left vacant was the SSD mount below the external 3.5" cage. That spot went unfilled simply because we ran out of available SATA ports on the motherboard.

The fact that the case gobbled up our excessive pile of gear is impressive. However, band-aids became a hot commodity during this particular build; even the removable motherboard tray couldn't save the skin on my knuckles with this much hardware installed. The case's removable parts alleviated most of the installation pains, but the old car tuner's adage, "there's no replacement for displacement," seems keenly relevant.

In spite of all the hardware, the TJ08-E still had enough displacement left over to easily conceal our excess cables. I've worked in cases twice the size of the Temjin that had far worse cable management systems. That's a solid accomplishment for SilverStone.

None of these parts align with our standard testbed, so we decided to forgo a full testing session and just go for the kill. We loaded the system's plate with four helpings of Prime95 and a heaping portion of the Unigine Heaven benchmark with all the fixins. The CPU topped out at a comfortable 55°C under full load, while the GPUs returned some slightly more interesting data.

Even though the cards were running in SLI with roughly equivalent GPU usage reported by GPU-Z, there was a 22°C temperature differential between them. The GPU on the card nearest the CPU socket ran at a toasty 88°C, while its counterpart sat at a less balmy 66°C. As you can see in the picture above, the card closest to the socket has its cooler's intake blocked by the second graphics card. The motherboard's slot layout doesn't provide any options for putting more space between the two, so this issue would affect larger cases as well.

This system was perfectly stable, but I'm not sure I'd be comfortable running this particular pair of graphics cards flat-out for extended periods in a hotter ambient environment. A different cooler design or even a water-cooling kit would surely help keep the hotter card's GPU temperature in check. Still, hats off to SilverStone for designing a mini-tower that gives users the option of playing with multi-GPU configurations.