The making of Damagebox 2015

Sponsored by…

We spend a ton of our time around here looking at individual PC components, but at heart, that work is meant to serve a larger goal: putting together the best possible PC. Unfortunately, it sometimes seems like I don’t get to spend enough time actually building complete systems. Yeah, I’m constantly working with test rigs, but that’s not the same as hand-building a PC for your own personal use.

Heck, since I started this gig, my own personal PC has largely been cobbled together from whatever leftover parts I could find in Damage Labs. Intentionally setting out to build something coherent from hand-picked components simply wasn’t compatible with that approach. I haven’t even upgraded all that often, since so much of my work relies on keeping my main system intact. I learned long ago that, as a hardware reviewer, you’ve gotta be careful about tinkering with your own system. Nothing destroys your productivity more comprehensively than, say, deciding to do a little overclocking and frying your Windows installation in the process.

Follow that line of thinking for a while, and you wind up with my recent dilemma. Although surrounded by beefy test rigs, I was still doing my everyday work on the system known as Damagebox 2011, a PC whose motherboard I never particularly liked and whose CPU was, shockingly, of pre-Sandy Bridge vintage. The daily experience of using the system wasn’t terrible, but I was keenly aware that I could do better.

This time around, I actually decided to do something about it. I resolved to build a new Damagebox from all-new, hand-selected parts that would form a coherent whole. This system would reflect my own personal tastes while relying heavily on what we’ve learned in our reviews of recent PC components. It would be good looking, with a distinct sense of style. And, if possible, it would enhance my daily workflow by improving on my last system in tangible ways.

With that plan in mind, I went to several of our top sponsors, firms that make components that fit with my aspirations, and asked them to help out. Happily, the folks at Cooler Master, Gigabyte, and OCZ agreed to support this silly project and to contribute a bunch of hardware. I then went shopping on their websites and picked out a selection of components to fit my needs. Next, I hit up the folks at Intel and WD to fill in the gaps, and the full specs for the new Damagebox came together.

Long story short, I got the parts, I built it, and it is most excellent.

What follows is a tour of the parts I chose and a look at how the build came together. I should warn you that Damagebox 2015 isn’t really a high-concept build, not like some of them. It’s not a budget box with components selected for perfect value, nor is it a no-holds-barred masterpiece with a custom water-cooling loop and a special paint job. Nope, it’s just a really nice, modern system meant to fit my needs. It’s a clean build, it’s exceptionally quiet, and it was in no way difficult to put together. I don’t know whether this exact mix of components would be a perfect fit for anyone else, but it is precisely what I like, for reasons I’ll explain as we go.

The core components
CPU: Intel Core i7-4790K — What, no Haswell-E? Nope. I have nothing against the eight-core monstrosities that Intel sells as high-end desktop parts these days. They are excellent by almost any standard, with few drawbacks beyond the added cost. But my favorite desktop processor right now is the Core i7-4790K, a desktop quad-core with eight threads and a modest 88W power envelope. The 4790K has a base clock of 4GHz and a Turbo peak of 4.4GHz, the highest clock speeds of any Haswell-derived processor on the market. As a K-series part, it’s unlocked for easy overclocking, and we’ve taken ours to 4.7 and 4.8GHz without much trouble.

I can explain my affection for the 4790K with words, but a few graphs might be more effective. The two most difficult things I’ll ask my PC’s processor to do are gaming and video encoding. Have a look at these results, taken from my Core i7-5960X review.

Thanks to its high clock speeds, the 4790K is the fastest CPU you can buy for PC gaming. The Haswell-E-based Core i7-5960X isn’t bad, but don’t pony up the extra dollars for it expecting bragging rights. In real games, the 4790K outperforms it.

As for video encoding, the 5960X’s eight cores and 16 threads can sometimes give it an advantage, as in the Handbrake test above, but not always. At other times, the 4790K’s combination of eight hardware threads and higher clock speeds wins even in video encoding. The 4790K is more power efficient, too, with less power draw at peak and idle than the 5960X.

Don’t get me wrong. I like the shock-and-awe aspect of a processor showing sixteen threads in Task Manager and commanding a price of just under a grand. But objectively speaking, based on cold, hard data, someone with my needs is better off with a 4790K.

Heck, I even thought the 4790K might be better for video editing work thanks to its built-in QuickSync hardware encoding block. Having now lived that dream a bit with QuickSync in common software, though, I’d probably edit that part out of the rationale. I quickly decided that CPU encoding is preferable for most of what I do.

Anyhow, that’s the story of why I selected a quad-core processor for the new Damagebox. If years of testing desktop CPUs in practical scenarios has taught me anything, it’s that one should never ignore Amdahl’s Law. Per-thread performance matters more than you might think, and the Core i7-4790K is the current champ in that department. Choosing it for my build was only natural.
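Amdahl’s Law is worth a quick illustration. Here’s a minimal Python sketch, using a purely hypothetical 60%-parallel workload (not a figure measured from any benchmark), showing why doubling core counts pays off less than per-thread speed once the serial fraction bites:

```python
# Amdahl's Law: speedup on n cores is limited by the serial fraction
# of the workload. The 60%-parallel figure below is a made-up
# illustration, not a measurement.
def amdahl_speedup(parallel_fraction, n_cores):
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cores)

for cores in (2, 4, 8):
    print(f"{cores} cores: {amdahl_speedup(0.6, cores):.2f}x")
# Going from 4 to 8 cores adds only about 16% here, while a 10%
# clock-speed bump speeds up everything, serial parts included.
```

With a 40% serial fraction, even infinitely many cores top out at a 2.5x speedup, which is why the 4790K’s clock-speed advantage counts for so much in lightly threaded work.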

The motherboard: Gigabyte’s Z97X-Gaming G1 — Having decided on a Haswell K-series processor for my system, I needed a Z97-based motherboard to go with it. Gigabyte’s Z97X-Gaming G1 may be the swankiest Z97 board possible, stuffed with slots, ports, and features until there’s no room for more. This board’s red-and-black color scheme also matches perfectly with the rest of the components I’ll be using, as you’ll soon see.

Truth be told, we don’t often recommend boards this high-end in our system guides, simply because you can get away with paying less if you don’t need, say, this board’s dual GigE ports and quad PCIe x16 slots. Gigabyte offers a whole line of Z97-based products with similar firmware and such, and any one of them is likely to serve you well. I decided to indulge here since, well, Gigabyte was offering its best, and I like a few things about this board.

One of those things is the integrated audio, which may finally let me dispense with a discrete sound card. Gigabyte has paid close attention to analog signal quality. They’ve created a short, direct route from the onboard Creative Core3D audio chip through the onboard op-amp to the output ports. The traces on that path are isolated from one another, with the left and right audio channels on different layers of the circuit board to prevent crosstalk and interference. The trace paths are lined with high-quality capacitors, visible as a row of green-coated caps in the picture above. The op-amp is socketed and can be replaced with different types, if desired.

So far, the G1 Gaming’s audio has been good enough that I’ve not been tempted to drop in a discrete sound card. If it becomes an issue, I may try out a few different op-amp options to see how that changes things.

This board has a ton of other extra features, including a Killer NIC backed by a second Intel GigE port, a slot config capable of hosting up to four graphics cards simultaneously, and a metric ton of SATA and USB ports. One of the conveniences of a high-end mobo like this one is an ample selection of practically everything. My favorite little touch on the G1 Gaming is the CPU_OPT fan header meant to power the pumps in liquid coolers. This header can be set to ignore the mobo’s fan-speed control routines, so the voltage to the pump won’t vary. Little touches like that one can remove some of the drama from the build process. You don’t have to spend the extra cash to get a board like this one, but doing so will often make life a bit easier.

 

The core components — continued
Video cards: Gigabyte GeForce GTX 970 G1 Gaming — The selection of a GeForce GTX 970 for this system should be uncontroversial. The GTX 970 offers one of the best price-performance combinations of any graphics card on the market right now. The choice of this particular model shouldn’t come as any surprise, either. I gave this card’s GTX 980 sibling a TR Recommended award not long ago, and this is the same basic setup. Gigabyte’s triple-fan cooler is whisper quiet and deadly effective, yet it’s relatively compact. And Gigabyte’s GM204 cards are distinctive in offering triple DisplayPort outputs, which is what you want for future-proofing.

The intrigue comes from the fact that I’ve chosen to install two of these things in the Damagebox at once, despite my usual warnings that multi-GPU systems aren’t always a smart choice. Hear me out here.

Our usual advice is to resort to multi-GPU only once you’ve exhausted the single-card options for getting higher performance. I think that’s generally a sound approach, since multi-GPU performance scaling can present all sorts of problems. In this case, though, single GTX 970 cards like these are selling for around $350 with clock speeds high enough to put their performance close to a stock GTX 980. Yet doubling up on these things doesn’t cost much more than a single GTX 980, and it’s fully 300 bucks cheaper than a Titan X. I’d say the value proposition is there.

Beyond that, many of the multi-GPU issues we’ve encountered in recent times have been associated with AMD’s CrossFire tech, not Nvidia’s SLI. My sense is that Nvidia invests substantially in driver development to make sure SLI offers folks a good experience. I wanted to live with an SLI setup in my own rig for a while to see what it looks like.

I’m hopeful that the advent of new APIs like DirectX 12 and Vulkan will force the GPU makers to dispense with alternate-frame rendering and use a more honest approach to multi-GPU load-balancing, one that truly improves the user experience rather than inflating FPS averages like AFR does. I wouldn’t recommend making a move to multi-GPU configs yet on that basis, but I do think better days may be ahead. Having two cards in my system will allow me to track any progress on this front.
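To see how AFR can inflate FPS averages without improving the experience, consider a toy comparison of two frame-time traces (the millisecond values are hypothetical, chosen only to make the point):

```python
# Two hypothetical frame-time traces with the same average frame
# rate: one steady, one alternating fast/slow, as AFR micro-stutter
# can produce.
smooth = [20.0] * 100          # a steady 20 ms per frame
stutter = [10.0, 30.0] * 50    # same total time, uneven delivery

def avg_fps(frame_times_ms):
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

print(avg_fps(smooth), avg_fps(stutter))  # both average 50 FPS
# But half the "stutter" frames take 30 ms, an instantaneous rate
# of only ~33 FPS, so the FPS average hides the worse experience.
```

This is exactly why frame-time data tells you more than FPS averages about how a multi-GPU setup actually feels.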

Memory: G.Skill Trident X DDR3 2400 dual 8GB DIMMs — For the past year or so, G.Skill DIMMs have been a prominent part of our system guide recs because they’ve had some of the best prices on Newegg for memory modules from an established brand. I figured I might as well build my own system around the same products we recommend, and G.Skill was happy to oblige by sending a pair of Trident X modules.

Their red-and-black heatspreaders are a perfect match for the Z97X motherboard’s color scheme. Going with a pair of 8GB DIMMs gives me the chance to upgrade to 32GB at some point in the future, if I want. 

These modules are rated for an operating speed of 2400 MT/s at 1.65V. I wouldn’t pay a ton more for that extra frequency, since I don’t think Haswell is typically constrained by memory bandwidth (unless you’re using integrated graphics). But G.Skill doesn’t charge a huge premium for its 2400 MT/s modules, so why not?
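For the curious, the bandwidth math is simple. Here’s a quick back-of-the-envelope sketch of peak dual-channel DDR3 throughput (theoretical ceilings, which real workloads won’t reach):

```python
# Peak bandwidth for dual-channel DDR3: transfers/s x 8 bytes per
# 64-bit channel x number of channels. Theoretical ceiling only.
def peak_bw_gbs(mt_per_s, channels=2, bytes_per_transfer=8):
    return mt_per_s * 1e6 * channels * bytes_per_transfer / 1e9

print(peak_bw_gbs(1600))  # 25.6 GB/s at DDR3-1600
print(peak_bw_gbs(2400))  # 38.4 GB/s at DDR3-2400
```

That’s a 50% jump in peak bandwidth over common DDR3-1600 kits, even if a Haswell quad-core rarely needs it.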

The storage subsystem
SSDs: Dual OCZ Vector 150 480GB 2.5″ SATA drives — Of course you’re gonna find an SSD doing service as the boot drive in pretty much any modern build, but this time out, I’m taking advantage of the constant price drops in flash-based storage by adding nearly a terabyte of SSD space, all told. 

I’m using one of the fastest SATA SSDs we’ve ever tested, OCZ’s Vector 150. (For what it’s worth, OCZ has since introduced the Vector 180, which has newer flash but very similar performance.) Going with a fast drive at a large capacity has a number of advantages, including a lower cost per gigabyte, oftentimes higher write performance, and potentially more longevity. 

I could be really hard-core and go for a PCIe SSD. Now that a few native PCIe drives for consumers have arrived, they do tend to light up the performance benchmarks. However, our tests have shown that they’re not appreciably faster in desktop workloads than the best SATA drives. PC software may have to be rearchitected to take advantage of faster storage before we see the true benefits of a faster disk interface. For now, SATA storage of this class still looks like a safe bet.

My plan: to boot off one of the SSDs and store my OS and programs there. The other SSD is dedicated to Steam games.

Just pause a moment to take in the glory of that.

Yes, 480GB of fast flash storage dedicated to games. I can run games quickly locally, and I can copy them across the GigE network to my test rigs at near-wire speeds. Spoiler alert: the rollout of Project Cars in Damage Labs was a wondrous thing to behold.
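For a rough sense of what "near-wire speeds" buys, here’s a quick estimate, assuming ~118 MB/s of usable gigabit throughput and a hypothetical 40 GB game install (both figures are assumptions for illustration, not measurements from my network):

```python
# Copy-time estimate over gigabit Ethernet. 118 MB/s is an assumed
# real-world ceiling after protocol overhead; 40 GB is a made-up
# game-install size.
def copy_minutes(size_gb, rate_mb_s=118.0):
    return size_gb * 1000.0 / rate_mb_s / 60.0

print(f"{copy_minutes(40):.1f} minutes")  # well under six minutes
```

Compare that to shuffling the same install over USB drives or a pokey mechanical disk, and the appeal of an all-SSD Steam library on GigE is obvious.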

Hard drives: Dual WD Red 6TB drives — The storage subsystem in the old Damagebox was the one thing I spent the most time waiting on, by far, and it was also becoming disconcertingly overcrowded. My two WD Green 2TB drives in a RAID 1 mirror were getting kind of old and were full to the brim. I wanted to go for a higher-capacity setup this time out based on newer, faster drives, so naturally, I turned to WD again.

My plan was to grab a pair of WD Green 4TB drives to put into a RAID 1 mirror. The folks at WD were happy to oblige, but they massively exceeded expectations by sending out a couple of Red drives with a staggering 6TB of capacity.

Expletive deleted, man. That’s all I can say about that.

WD recommends using its Red drives for NAS devices and RAID arrays, since the firmware in them has some special provisions to prevent array synchronization problems. WD is coy about the exact rotational speed of the platters in these drives, listing them only as “IntelliPower,” but we’re pretty sure they spin at about 5,400 RPM. That’s fine for my purposes, since I mostly want to store data files on these drives, many of them videos and pictures.

I can tell you that these Red drives are clearly faster than my old Green 2TB drives, and from inside the Damagebox, they’re completely inaudible, even with two of them doing the same things in concert as a mirrored volume.

 

The case and cooling
Enclosure: Cooler Master Silencio 652S ATX case — I hate it when a build looks cobbled together from a bunch of different sources with no common sense of style. I also happen to prefer stately, understated industrial design with an emphasis on acoustics and functionality. Those preferences led me inexorably to choose the Silencio 652S from among Cooler Master’s lineup. Just look at this thing.

The Silencio 652S is a mid-tower ATX enclosure with an emphasis on silent operation, and it fits my preferred style perfectly. You can see in the shot above how the Silencio’s motherboard tray is lined with grommets for cable routing and how the area beneath the CPU socket offers a generously sized cutout for cooler backplate access. Also visible are the case’s five removable 2.5″ drive bays, along with three removable 3.5″ bays.

The case’s three 120-mm Silencio fans and multiple dust covers can’t all be seen in the shot above, but they’re there. The side cover is coated in a noise-dampening foam, too. The top panel of the case is covered by a removable lid, to dampen noise coming from inside, and the hinged front door serves a similar purpose. Cooler Master hasn’t skimped on much of anything here, yet this case sells for only 120 bucks. I’m cramming much more expensive hardware into this thing, but everything feels completely at home there.

CPU cooler: Cooler Master Nepton 120XL — I didn’t have the best of luck with the closed-loop liquid cooler in Damagebox 2011, but I can’t quite get over how these things transport heat directly to the edges of the case and expel it. Just seems like the right approach, I guess. 

The Nepton 120XL is more than a match for my Core i7-4790K, yet it’s compact enough to fit into the Silencio with no issues. Not only does the Nepton look like it belongs inside of the 652S case, but Cooler Master also uses the same Silencio fans in the Nepton as they do in the 652S, so their acoustic profiles should match. 

Power supply: Cooler Master V750 PSU — The V750 is a semi-modular power supply. That simply means that the parts you’ll probably always need, like the ATX and PCIe power leads, are permanently connected to the box, while the optional cables for powering storage devices and peripherals can be connected at will. Consider that making the ATX and PCIe power leads detachable adds cost to a PSU for no real benefit, and you may decide that products like the V750 make a lot of sense.

My build has dual Gigabyte GTX 970 graphics cards, remember, so I need four PCIe aux power leads and enough wattage to drive them all. The V750 more than delivers, with a single-rail design and enough excess capacity that I’ll never have to worry about it.
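My own back-of-the-envelope check looked something like this. The per-component wattages are rough published figures and guesses, not measurements from this build:

```python
# Ballpark worst-case draw for this parts list. TDP and board-power
# numbers are approximate; the catch-all line is a deliberate
# overestimate.
loads_w = {
    "Core i7-4790K": 88,            # rated TDP
    "GTX 970, times two": 2 * 145,  # ~145 W board power apiece
    "drives, fans, mobo, RAM": 75,  # generous catch-all guess
}
total = sum(loads_w.values())
print(f"~{total} W estimated vs. 750 W capacity")
```

Even with pessimistic numbers, the estimate lands around 450 W, leaving hundreds of watts of headroom for overclocking or future upgrades.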

And this thing is quiet. Unlike some PSUs, the V750 doesn’t spin down its fan when lightly loaded—but I had to look up this fact online, since the V750 is impossible to hear most of the time. 

Peripherals
The keyboard: CM Storm QuickFire XT with Cherry MX green switches — Since we’re working with Cooler Master, I had the opportunity to get my hands on some of their latest input devices, and I couldn’t resist. Perhaps my favorite component of the new Damagebox is the QuickFire XT keyboard with Cherry MX green switches. 

QuickFire XT keyboards have a pretty straightforward appeal. They feature your choice of the main varieties of Cherry’s MX mechanical switches, and their full-sized, no-frills layout is exactly what I want. Couple those elements with a rock-solid enclosure and with stellar build quality, and you have one of our favorite keyboards anywhere.

The cherry on top of this particular sundae is the inclusion of MX green switches, which are almost mythical due to their limited availability. You probably won’t find a QuickFire XT with green switches on Newegg or Amazon, but one might occasionally pop up on Cooler Master’s own online store.

The thing is, the green switches are magical sources of incredible positive feedback. Each keypress releases a little finger-gasm of pleasure. Like MX blue switches, they offer a tactile bump and an audible click when actuated, but the greens are much more stiffly sprung. If you’re an old geezer, they might remind you of the feel of some vaunted early PC keyboards. The IBM Model M is the one usually praised in song and legend, but the green switches more closely remind me of the feel of my old favorite, the Northgate OmniKey. Some folks may not want a keyboard that requires this much effort on each keystroke, but for me, this thing is perfection.

The mouse: CM Storm Mizar — I’ve had a bad run with mice lately. Multiple products from, ahem, another brand have failed on me in different ways in recent weeks. Geoff has already noted the Mizar’s similarity in shape to ye olde MS Intellimouse Explorer in his review, and he talked about the 8200-DPI sensor. Those are fine attributes, but what sets the Mizar apart for me is its build quality. The Teflon feet on this mouse are properly placed so that the mouse glides across the desk much more easily than some of its competitors, and Cooler Master has used real rubber grips on its sides to enhance its, umm, grippability.

So far, the Mizar has tracked flawlessly for me on my dark, worn desktop surface, a feat not all modern mice seem able to manage.

The headset: CM Storm Ceres 500 — I’ve been using the same old Plantronics headset for something like 174 episodes of the TR podcast, and it was time for a change.

The Ceres 500 offers larger cans than my old headset. They encompass the entire ear, rather than resting on top of it, so they provide better isolation and a more comfortable fit. The mic is on a flexible boom, and it pops out when not needed. And the Ceres plugs directly into a USB port, with its own built-in USB sound card, to ensure decent audio quality. 

I see that this headset also has a switch for going into something called a “console mode,” but I don’t understand it entirely. Something about the peasantry, I think.

 

The build process

I put this system together completely in one shot, and—wonder of wonders—it booted on the first attempt. That is not my usual outcome, but that’s what happened. I’ll take a little credit, I guess. I believe the whole process took about 90 minutes, start to finish, but I’ll admit that I kind of lost track of time once I got into it. I do know that I took some extra time to get the cable routing right.

One of the benefits of building a PC from all-new, high-quality components is that you have a good chance of everything fitting together well. This build was unusually straightforward on that front. Every device that needed to be plugged in had a corresponding header on the motherboard or lead coming from the PSU, and I didn’t need a single plug adapter or part that wasn’t included in the box with one of the components. Heck, I only needed to snap a single modular lead into the PSU to power the system’s four storage devices. That was it.

Here’s a look at the guts of the assembled system.

The Silencio swallowed up the extra-long Gigabyte G1 Gaming graphics cards without me having to move any of the drive cages around. Everything else fit nicely, too. There’s a lot going on in there, but there was still plenty of room to work as long as I didn’t install both graphics cards until everything else was connected. I think the final result looks pretty tight, with nary a stray lead or extra wire in sight.

Most of the cabling spaghetti instead hides behind the motherboard tray and on the other side of the drive cages, which is right where it’s meant to be. Perhaps you can tell that I used black twist-ties to secure some of the cables. Cooler Master includes some nice, black zip ties with the case, but to my mind, they’re too permanent and hard to remove. Twist-ties get the job done while allowing for quick modifications if you decide to change something.

I knew from our review that one of the drawbacks of the Silencio 652S case is the amount of space behind the motherboard tray. Since I was aware of the issue, I took a few extra minutes at the tail end of the assembly process to re-route some of the major cable bundles so they took the most direct route possible and didn’t cross any other big cables. That work paid off when it came time to install the side panel; it went on immediately without issue, no extra pressure or drama required.

For what it’s worth, I was in no way diligent about the order of the installation. I didn’t even think about installing the rear bracket for the CPU cooler, visible above, until after the motherboard was in the case. Happily, Cooler Master left a big enough opening in the mobo tray to allow installation later. Phew.

As you may be gathering, though, little episodes like that one were kind of the nature of this build. I’ve learned to be calm and meticulous when building PCs over the years, and in return, component makers have built in a host of little provisions to prevent the worst “gotchas” from making the process frustrating. Frankly, I think the worst part of building a new PC now is installing Windows and applications. The hardware guys have gone to great lengths to facilitate do-it-yourself assembly.

The finished product

The biggest revelatory moment of this whole project came when I fired up the new Damagebox for the first time. After getting over the initial giddiness about the system booting and finding all of the devices on the first attempt, I was hit with a fantastic realization: this thing is quiet.

In fact, my first worry was that the fan leads were somehow not connected properly, because I literally could not hear the system’s fan noise over the ambient noise level in the room. A quick inspection confirmed that the fans were moving air just fine. The truth was that, even without an operating system installed, the new Damagebox was just ridiculously quiet during normal idle operation. I thought my old Corsair 600T was pretty good on this front, but it’s not at all in the same league as the Silencio. I still can’t believe this is a $120 case. Credit all of the other components in the new Damagebox for doing their part, too. 

I’m writing this article on the new Damagebox, and I’ve now been using it for over a week as my main PC. The reality of this thing’s quiet operation has since sunk in, but I’m no less pleased with it when I stop and think about it. A large majority of the time while I’m using it, this system is effectively silent. The fans do spin up under heavy loads, but they’re pretty subdued even at peak loudness.

I’m also pleased to report that I’ve avoided any of the nasty teething problems that one sometimes has with a new build. I did some initial stability and thermal testing with combined CPU and graphics workloads, and the system was rock-solid stable from the start.

I believe the only time it locked up on me was when I first tried to turn up the memory clock via the XMP profiles. The first profile wasn’t quite stable, but switching to the second one solved that problem. These G.Skill Trident X DIMMs are now running at 2400 MT/s with 10-12-12 timings at 1.65V. As I’ve said, I don’t think Haswell quad-cores are particularly memory-bandwidth-limited in most desktop apps, but wringing out the extra speed was easy.

As you may know, on most enthusiast motherboards these days, messing with the memory timings unlocks a “feature” of the firmware that effectively overclocks the CPU. True story. Gigabyte does it, and so does everybody else. As a result, instead of my Core i7-4790K topping out at 4.4GHz with one core active and dropping off to slower speeds with more cores engaged, the CPU runs at 4.4GHz with all four cores fully loaded.

You can generally turn off this behavior by disabling “multicore enhancement” in the firmware, but I’m not going to do that. I’m happy to have the extra performance. Gigabyte’s hardware monitor utility tells me the CPU uses 118W when running Prime95 with all cores at 4.4GHz, and the Nepton cooler has no trouble dealing with that. I know from past experience overclocking the 4790K that clock speeds of about 4.8GHz should be possible with some overvolting. I might try to squeeze out higher clocks at some point, but right now, I’m just not feeling the need.

One of the perks of these swanky new SSDs is actual software to use with them. The SSDs in my old system were pretty much black boxes, but OCZ provides a utility called SSD Guru that offers status info and control over the Vector 150. As you can see above, the utility confirms that the drive is attached to a 6Gbps SATA port set properly to AHCI mode, offers a health report, and checks to see whether the drive’s firmware is up-to-date. Another tab lets you manually issue a TRIM command to reclaim empty flash blocks (although manual intervention shouldn’t really be necessary). You can even increase the amount of over-provisioned space on the drive in order to make its performance more robust and improve its longevity. This level of control is a darn sight better than the total bupkis I got from older drives.

All in all, I’m very much pleased with how the new system came together, and it does offer some tangible improvements in my everyday workflow over the older box. The new storage subsystem is huge. Copying Steam games to my test rigs from this 480GB SSD at near-wire-speed is dramatically faster. I’ve already saved a ton of time doing that. I’m also happy to have moved from a Gulftown CPU with a 3.6GHz peak Turbo speed to a Haswell with a 4.4GHz peak, which brings a considerable increase in per-thread performance. Some things that weren’t snappy before are now.

That said, it is amazing how Windows software developers can manage to make even the fastest systems pause for 20 seconds here and there for no apparent reason. (Thanks, Corel.) Fast hardware isn’t always a cure for slow software.

The jury is still out on some aspects of Damagebox 2015. I’ve not yet spent enough time playing games on the GTX 970 SLI setup to have a sense of what I think. That’s more of a long-term project, and I may need to upgrade to a 4K monitor in order to really push these cards. (Aw, shucks.) I also retain a glimmer of hope that one of this thing’s hardware H.264 encoders, either QuickSync or NVENC, will prove worthwhile when I’m preparing videos for upload to YouTube. I’m slowly discovering, however, that CPU-based encoding seems to be the best option overall in terms of software support and quality at a given bitrate.

The biggest lesson I’ve learned in this process, though, is just a confirmation of something I already suspected: it’s way nicer to build a PC out of all-new parts than to piece one together out of leftovers. I’ve built a number of PCs in recent years around an eclectic mix of cast-off parts from Damage Labs. They’re fine systems that do their jobs for their owners, but they were harder to assemble and have some persistent quirks that this box just doesn’t have.

The other big take-away for me is a simple thing. If you haven’t built a PC for yourself in a while, well, things are really good right now. We’ve watched giants of the PC industry like Microsoft and the big PC makers struggle in recent years due to business challenges, but the DIY PC space has never been stronger. The components I used came from multiple vendors, but they all work together seamlessly.

Similarly, if you’re new to building PCs, have no fear. Putting together your own system is easier than ever. You’ll learn a lot in the process, and if you pick your parts well, you’re almost certain to be pleased with the final results.

Comments closed
    • HERETIC
    • 5 years ago

    Well deserved,
    Graphics just a little bit overkill-well until you get a new monitor…………………
    Where i was SHOCKED, and i even went back and reread, and am still SHOCKED
    that Corsair didn’t get in on the free advertising…………………………………….

    • not@home
    • 5 years ago

    I am in the market for a headset. I wonder how the CM Ceres 500 compares to a Tritton headset (what my bro recommends) or to any other headset anyone here recommends. Speaking of which, does anyone have some good recommendations? I would like one with big ear cups that do not press on my big ears, has decent ambient noise mitigation/cancelling, and great sound quality. It would be great if they do not clamp so hard on my head too. Nothing too expensive either; I'm looking for the best bang for the buck, not money solves all.

      • JustAnEngineer
      • 5 years ago

      Have you asked the Gerbils in the [url=https://techreport.com/forums/viewforum.php?f=28]Echo Vale[/url]? [url]https://techreport.com/forums/viewtopic.php?f=28&t=106553[/url]

    • paternal_techie
    • 5 years ago

    This is very close to my build as well:

    Core i7-4790k
    Gigabyte Z97X Gaming 7
    G.skill Sniper 16 GB DDR3
    CM Evo 212
    Sandisk Ultra II 240 GB SSD (boot)
    Toshiba 512 GB SSD (steam & VM)
    2 WD 2 TB green drives
    MSI GTX 980
    Creative X-Fi Xtreme Gamer
    Samsung DVD writer
    LG Bluray burner
    Corsair HX650
    Fractal Design R4

    I too came from a corsair graphite 600t and got tired of the lights and noise….plus the clamps broke on the side panel and I could not see me putting money into that case. A lot of my parts were brought over from previous builds….like the sound card and hard drives and burners.

    I’ve been using G.skill memory for every build since 2005 and their memory has always been rock solid.

    Now I’m just waiting to grab a deal on 6TB Black or Red drives cuz I’m running out of storage.

    • bodom81
    • 5 years ago

    I could pick up the same CPU + Mobo for $350. It’s pretty damn tempting but….. Skylake, and 100 series mobos!

      • JustAnEngineer
      • 5 years ago

      It shouldn’t be too much longer before SATAe/PCIe NVMe becomes the default interface for SSDs. Look for that on your new Z170 motherboard.

        • Krogoth
        • 5 years ago

        I don’t think that is going to happen anytime soon.

        PCIe SSD cards are going to remain prosumer-tier products for the foreseeable future. M.2 or SATAe will be the successor to the current SATA-III interface, and it will be mostly driven by capacity and reliability rather than performance.

          • JustAnEngineer
          • 5 years ago

          My understanding was that the M.2 form factor and the [url=http://en.wikipedia.org/wiki/SATA_Express]SATAe[/url] standard allow either the current ho-hum SATA III interface with AHCI or the more-efficient and faster PCIe interface with NVMe. I am expecting that we will see the latter become the default in the next few years.

          P.S.: The [url=http://www.newegg.com/Product/Product.aspx?Item=9SIA24G2U48520]Kingston HyperX Predator[/url] is an example of the sort of SSD that I am expecting to see become standard.

    • Yumi
    • 5 years ago

    1.

    Congrats on the new damage box.

    2.

    I hope that I’m not the only one who would like to see the following, and it can’t hurt to ask.

    The sound card setup gets a lot of hype on the product page for the motherboard, but you seem to have little faith in it since you picked a USB headset. Will we get another blind listening test article?
    Would be nice to know if the old budget ASUS DGX and DSX cards still outperform on board audio.

    Will you have the time to test the following?
    The 16 PCIe lanes from the CPU are run through a PCIe switch. Is it possible to choke this switch, and does it affect the SLI setup under “normal” usage?

    • ImSpartacus
    • 5 years ago

    I’m REALLY interested in seeing how the SLI setup works.

    I’m scared of stuttering and driver issues.

    I want something pretty rock solid and it feels like a single gpu solution is the way to go for that.

      • geekl33tgamer
      • 5 years ago

      SLI, in my experience (with up to two cards only), is way ahead of AMD's CrossFire from a reliability standpoint. The difference with Nvidia is that the drivers are updated often enough for games to include pre-defined profiles, and if a profile doesn’t exist you can either:

      – Download one from GeForce Experience more often than not
      – Make one yourself, or
      – Leave it, and the cards will initialize AFR as best they can

      AMD’s cards disable the 2nd card entirely if there’s no crossfire profile, and adding in custom profiles is only possible with a 3rd party tool. It’s just less streamlined overall, and requires more manual intervention to get it working much of the time.

        • Medallish
        • 5 years ago

        So is no one going to call out this complete lie?

        – On the same screen where you enable CrossFire, you can tell CCC to attempt CrossFire on titles without a profile.
        – You make your own profiles using Catalyst Control Center, NOT a third-party application.
        – You get several CrossFire options when making your own gaming profile, one of which is AFR.

        In my experience multi GPU is never the most smooth experience, but it’s nice to have when it’s supported.

          • geekl33tgamer
          • 5 years ago

          It doesn’t work like that with the 290X’s I had – go make a profile in CCC and watch the drivers / games totally ignore it and just render to the one card.

      • AdamDZ
      • 5 years ago

      I have a very similar build with a Gigabyte Gaming series board, two 980s, and a G-Sync monitor. Smooth like butter. I have been using SLI for many years and never had any issues. G-Sync is just icing on the cake.

      I also have another build that uses the Silencio case, it’s a really nice case.

    • squeeb
    • 5 years ago

    Man, I love me some Gigabyte!

    • bfar
    • 5 years ago

    Really nice build. You’ve got to check out Zowie mice, they’re outstanding.

    Looking forward to hearing your thoughts on SLI. My last experience was hit and miss, but that was back in 2011… hopefully things have gotten better?

    • K-L-Waster
    • 5 years ago

    There is a huge problem with this article: it makes me want to build another system 🙂

    Bad for the budget, especially when I have a 15-month-old desktop and a 3-month-old HTPC….

      • anotherengineer
      • 5 years ago

      lol

      My last/current build is from August 2009, albeit with a few upgrades along the way. The last one was probably the SSD about 2.5 years ago… er, wait, the monitor about 1.5 years ago.

      I enjoy building and optimizing new systems too; however, rarely ever seeing CPU usage peg out on me pushes it from the need into the want category.

      Maybe next year. We will see what AMD and Intel bring to the table, and I'll start saving, just in case.

        • K-L-Waster
        • 5 years ago

        [quote]I enjoy building and optimizing new systems too, however rarely seeing the CPU usage ever peg out on me, pushes it from the need into the want category.[/quote]

        Oh definitely -- heck, even my desktop upgrade last year was more because I got the itch than because of any deficiencies in the i7 920 system I had at the time.

        • Ninjitsu
        • 5 years ago

        That’s almost exactly my situation – September 2009! Though the only thing that’s the same from the original build is the CPU.

    • CrazyElf
    • 5 years ago

    I would strongly recommend against the AIO coolers in favor of a dual-tower cooler like the Noctua NH-D15 or the Cryorig R1 Ultimate.

    The reasons are the risk of leaks, fluid loss through evaporation, and pump noise. The other problem is that AIOs actually do not perform well in terms of noise versus temperature compared to dual towers. Only the Swiftech kits and the high-end, expandable Cooler Master unit offer superior performance from a temperature-versus-noise perspective.

    The only reason these AIOs seem to do better in benchmarks is their higher-speed fans, which come at the expense of noise.

    You can remove the fins on the TridentX if needed, or simply adjust the fans upwards, or even put the fan on the rear of the CPU cooler.

    • Ninjitsu
    • 5 years ago

    Hey, I have two questions:

    A. 1.65V memory for Haswell: is it safe? I thought the memory controller didn't like more than 1.5V?

    B. CM Ceres 500: how is it, for both gaming and music? Seems like something I could upgrade to; my cheap-but-good Philips headphones ([s]1/7th[/s] 1/3rd* of the price of the Ceres) are developing some issues now. I'm basically looking for something that has inline volume and mute control, a mic, and sounds decent.

    P.S.: Console mode might be that you can plug it into your Xbox controller to get separate audio channels for VoIP and game audio. My friend, for example, used to send game audio to the speakers but use headphones for VoIP. Pretty cool stuff, if you ask me. (Yes, I think you can do that with normal Realtek drivers as well; not sure how different programs will need to be configured, though.)

    EDIT: *Prices have changed since! 😀 This is what I have: [url]http://www.flipkart.com/philips-shm7410u-97-wired-headset/p/itmdnymvzanaxfmq?pid=ACCCZ553FGVUMMH7[/url]

    EDIT2: Yes, I checked, console mode is exactly that.

      • Yumi
      • 5 years ago

      1.65V memory is fine, just at the high end of the voltage scale for Haswell.

      With a bit of luck, the CPU will just run that memory at JEDEC specs and keep it at 1.5V and less than 2400 MHz.

        • Ninjitsu
        • 5 years ago

        Ah, okay, thanks!

          • JustAnEngineer
          • 5 years ago

          Broadwell tops out at 1.35 volts.
          [url]http://www.newegg.com/Product/Product.aspx?Item=N82E16820148657[/url]
          [url]http://www.newegg.com/Product/Product.aspx?Item=N82E16820226333[/url]

            • Ninjitsu
            • 5 years ago

            I think Skylake will be the same, considering the DDR3L/DDR4 spec…

    • Ninjitsu
    • 5 years ago

    [quote]simply because you can get away with paying less if you don't need, say, this board's dual GigE ports and quad PCIe x16 slots.[/quote]

    And I stare at the 5930K recommendation, puzzled. :p

      • Krogoth
      • 5 years ago

      Because it is the cheapest LGA2011-v3 chip that can fully utilize the X99 platform.

      The 5820K only makes sense if you need six cores but don't care about or need the expansion room provided by the X99 platform.

        • Ninjitsu
        • 5 years ago

        Yeah but people who fit that category don’t get to see that option as a recommendation.

        • Andrew Lauritzen
        • 5 years ago

        > 5820k only makes sense if you need six-cores, but don’t care or need expansion room provided by X99 platform.

        Honestly I’m pretty sure that most people care more about 6 cores and quad channel DDR4 than they do about the extra PCI-E lanes and such.

      • Yumi
      • 5 years ago

      Still only 16 lanes from the CPU to the GPU setup.
      The 16 lanes from the CPU are passed through a PEX8747 chip to fake additional lanes.

    • dashbarron
    • 5 years ago

    For what it’s worth, you gave the 980 a Recommended not the hallowed Editor’s Choice?

      • Damage
      • 5 years ago

      Doh. Fixed.

        • dashbarron
        • 5 years ago

        If you read this I wanted to ask you about the dual 970’s.

        Do you have a purpose for some high-resolution gaming, or is this more personal curiosity to go SLI? When's the last time you used it in your own personal system? I'm getting the feeling you're just satisfying a personal curiosity and chalking it up to experience.

        It amuses me, because I have a GTX 570 (same tier, older generation), and I went ahead and splurged and bought two too, figured why not. I ended up selling one of them because at 1920×1080 I didn't feel like I was really utilizing the benefits of two cards, and the loss didn't make a difference.

          • Damage
          • 5 years ago

          This is more about personal curiosity and just living with this solution for a while. I do plan to switch to a 4K monitor soon, and that will be the real test. I’m kinda skeptical it will work well enough in 4K to keep me from wanting to drop to a lower resolution. We shall see.

            • JJAP
            • 5 years ago

            HardOCP did extensive benchmarks at 4K with Titans, 980s, and 970s. (From memory) the 970s will go great with a 144Hz 1440p monitor, but 4K will need two Titans to just barely give 60 fps with modern AAA titles.

    • sonofsanta
    • 5 years ago

    Ok, question, because I’m (hopefully) building a quiet PC towards the end of the year: why did you choose the Coolermaster Silencio 652S over the Fractal Design R5? What are the key differences between them?

    I’d pretty much settled on getting the latter, but the CM is actually cheaper and if there are good reasons to plump for that over the R5, I’d love to know. Cheers!

      • MDBT
      • 5 years ago

      Because one of those cases was free from a sponsor and the other was not.

    • Anovoca
    • 5 years ago

    Scott – question for you.

    What was your reasoning for going with a 120mm rad over a 240? I know the 120 can breathe easier out of the rear exhaust than the fans on the top mounts can, but I don't know if that would really contribute to more noise or less cooling than a 240.

      • Damage
      • 5 years ago

      I didn’t think I needed a 240 given that this CPU is rated for 88W and isn’t likely to be massively overvolted. I was trying to decide between the 120 and 140 and talked to Cooler Master about it, and they specifically recommended the 120XL since it has Silencio fans that match the case’s fans and operate at lower decibel levels. That turned out to be a good choice for this build, although a 240 might be nice for near-silent operation while gaming or something.

        • Anovoca
        • 5 years ago

        Yeah, I just took note of how impressed you were with the silence of the case, and my immediate thought was: just imagine if you had a low-RPM 240 in there. I didn't consider the fans, though. Typically rear exhaust fans are AF where optimal rad fans are SP, and with the side ventilation on the top mount you would definitely need to use different fans and go SP to push through the rad and the narrow ventilation.

        Edit: Actually, the side exhaust venting is along the front panel, not the top panel. Had the image wrong in my head. Either way, the logic remains the same.

    • DarkUltra
    • 5 years ago

    How do you refill that closed-loop cpu watercooler? Didn’t that break your previous system?

    I have to refill my custom watercooling system about every month.

      • Freon
      • 5 years ago

      They’re completely closed and you cannot add water.

      • f0d
      • 5 years ago

      holy crap where is your water going?

      I have built LOADS of custom water systems and they all keep their water at the same level for years. I just rebuilt one watercooling system that had the same water for about 3 years, and the water level didn't change the whole time.

      you must be leaking

      • ColeLT1
      • 5 years ago

      That does not sound good. I have topped off my Swiftech micro-res in my custom loop about once a year; it takes about 1 oz of fluid to fill it back to the top.

    • Anovoca
    • 5 years ago

    Noooo Scott Noooo,

    Hwhy did you have to pronounce it “Hwhat”!

    [insert Stewie Griffin GIF]

    • tootercomputer
    • 5 years ago

    Sorry to be bah humbug, but I am puzzled by this article. Why is Damage assembling a new system newsworthy? It was not clear to me why the parts were chosen other than they were freely available from suppliers. I really prefer articles that take a critical and comparative view of PC hardware so to help us consumers make informed choices. IMHO, this article does not do that.

      • lycium
      • 5 years ago

      > It was not clear to me why the parts were chosen other than they were freely available from suppliers.

      Maybe you should watch the video, e.g. the bit in the beginning where he explains it.

        • tootercomputer
        • 5 years ago

        I did. He said “I went to some of our top sponsors who agreed to support us” or words pretty close to that.

        I fully expect to get totally slammed for my comments, but this has been my favorite site for years, and I just want to again see good comparative hardware reviews.

          • Freon
          • 5 years ago

          I’m with you. It seems this selection was based on what he could score from suppliers. I disagree with quite a few of the selections. I threw up a little when I saw the OCZ SSD.

          And you’re also right about the voting. It needs to be removed as it’s not useful for identifying good posts, but more of a passive aggressive outlet.

            • Chrispy_
            • 5 years ago

            What exactly is so bad about the OCZ SSD?

            Toshiba’s QC and support is top-notch. Given the outrageously bad Samsung 840-series fiasco I’d be far more inclined to “throw up a little” if Scott recommended any Samsung drives at the moment. Given their refusal to even acknowledge the issues with some product lines, and their utter incompetence regarding a fix for the EVO line, it’s hard to put Samsung anywhere other than bottom of the pile right now.

            The company “OCZ” you’re thinking of ceased to exist long enough ago that you’d need to be living under a rock to have missed it. Toshiba bought the IP and now makes world-class SSD’s under the brand. Not only are they reliable and validated, they’re also good value for money because Toshiba owns/makes both the controller and the NAND, something only Samsung, Hynix and (for some products) Intel can claim. One would hope that this means better communication between the engineers making the NAND and the engineers making the controllers.

            • DPete27
            • 5 years ago

            The last 4 builds I’ve done used OCZ Arc 100’s. 240GB for $80 after MIR. Even the bargain basement (performance) V300 and Optima can’t beat that.

          • flip-mode
          • 5 years ago

          You make it sound like comparative hardware reviews don’t happen on TR anymore. Obviously that’s total rubbish. If you don’t like this particular article just skip it and wait for the next review instead of getting whiny.

            • lycium
            • 5 years ago

            The best part is, Scott actually quite clearly stated “I’m getting this instead of Haswell-E because it’s what we recommend…”, “because I wanted to try SLI day-to-day…”, etc.

            ADHD generation just can’t sit down and take in information for longer than a minute…

      • Anovoca
      • 5 years ago

      [quote]Why is Damage assembling a new system newsworthy?[/quote]

      Hmmm, a system-builder site that published an article on system building. HERESY!!!!!

        • ClickClick5
        • 5 years ago

        lol, not only this but Damage owns the site….so…

      • cobalt
      • 5 years ago

      That’s what the System Guide is for, though, right? Just updated for May:
      [url]https://techreport.com/review/28198/the-tech-report-system-guide-may-2015-edition[/url]

        • Anovoca
        • 5 years ago

        Yep. The true value of this report is seeing the site owner stand behind his own product recommendations and run his livelihood on a machine built with those parts.

      • K-L-Waster
      • 5 years ago

      The site has numerous articles that are not strict reviews. Some are introductions to new technologies, some are guides for doing xyz, and some are industry news.

      A description of a knowledgeable user picking components for a specific purpose and building them into a system doesn’t seem in any way out of place. Many readers would find it useful. If you don’t, as others have mentioned, you aren’t obligated to read all of it.

      • ronch
      • 5 years ago

      TR always comes up with articles that do comparative analysis, probably better than many other tech review sites out there. What this article does is simply share what Scott did. As a community of PC and computer [s]geeks[/s] enthusiasts, sharing one's system specs is always welcome. Maybe it's kinda lame for some folks, but it's totally cool with us. 🙂

      • Shambles
      • 5 years ago

      Yeah this reads like native advertising. Here’s my new PC! (That was given to me by sponsors).

      • A_Pickle
      • 5 years ago

      Because it’s his website, and his computer?

      Gosh, what bourgeoisie scum! /s

    • TwoEars
    • 5 years ago

    Very good build, similar to my own system actually. I've got a 4790K, an ASUS ROG Ranger mobo, 2400MHz memory, GTX 670s in SLI, and SSDs in RAID 0.

    I also went with the 4790k over the Haswell-E for the reasons you mentioned, among others.

    However, I would like to point out that none of your sample systems in the TR System Guide use an i7 CPU. Perhaps something to consider? 🙂

    • MadManOriginal
    • 5 years ago

    [quote]The selection of a GeForce GTX 970 for this system should be uncontroversial[/quote]

    I lol'd.

    • tsk
    • 5 years ago

    Hello Scott, very nice build you got there. The 6TB Red drives have had some issues, according to user reviews. Also, use the Intel GigE port; it has lower latency. 🙂

      • NeelyCam
      • 5 years ago

      Hmm… I’ve been thinking of upgrading my 2TB WD Green to something bigger. I should start looking at reviews carefully.

        • anotherengineer
        • 5 years ago

        Here you go Neely 😉

        [url]https://techreport.com/news/28235/fixstars-crams-6tb-into-a-2-5-ssd[/url]

    • Chrispy_
    • 5 years ago

    Enjoy your new PC, Scott; I’ve been remarkably happy with two years of Silencio ownership and the mirrored WD Reds haven’t skipped a beat either.

    Our staff are proficient at breaking the front panels off their Silencios, but that's because they're clumsy buffoons! They're easy enough to snap back into place if the connectors survive, but I've taken to screwing them in place now (an FDD/ODD/2.5″ drive screw tightens well into the 6 provided screw posts CM put on the fascia in addition to the snap-in clips).

    My Das is getting creaky too, I’ve had it since forever – might give the QuickfireXT a shot….

    • lycium
    • 5 years ago

    That was a really cool video 🙂

    • JustAnEngineer
    • 5 years ago

    One notes that the motherboard that Damage selected has four PCIe 3.0 x16 slots (two of which work at x8) alternating with three PCIe x1 slots (which are likely to be covered by the dual-slot coolers of the graphics cards). There are no obsolete PCI slots.

      • Krogoth
      • 5 years ago

      There have been PCI-free motherboards for a while. It depends on the platform in question and how many PCIe lanes it provides. You saw it more on higher-end boards starting with X58/LGA1366, which had enough PCIe lanes to go around.

        • chuckula
        • 5 years ago

        My Z87 board from 2013 is PCI free, and it was by no means the first of its kind.

    • anotherengineer
    • 5 years ago

    Just out of curiosity, what did you actually pay out of pocket for that nice system?

      • tootercomputer
      • 5 years ago

      I’m curious as well, and also about where you made your purchases. My main home office system is similarly pre-Sandy-Bridge, and I’ve been looking to upgrade; I was thinking of the same CPU that you used here. I really like the Lynnfield i7 that I’ve been using for five years now, but it, too, is getting a bit long in the tooth.

      • JJAP
      • 5 years ago

      It’s up to you to distinguish between “I’m using this because it’s the best part for this situation” and “It’s overkill but free for me.”

    • Sam125
    • 5 years ago

    Nice build. The motherboard’s onboard audio struck me as particularly cool, and the four x16 PCIe slots made not going SLI seem foolish. 😉

    All in all, Damagebox 2k15 sounds like a very high-end build. One thing that stuck out to me while reading through the article was the mention of multi-GPU rendering finally being part of the DirectX standard. That has long been overdue, and I think the flexibility of mixing and matching GPUs in one’s system should’ve been a no-brainer for Microsoft to implement long ago, but they were too lazy to do so without Mantle being a reality. Also, with two identical SSDs, I personally would’ve considered placing them in RAID 0.

    I also noticed there’s no mention of a speaker setup. IMO, that should be a high priority for a new system that’s quiet enough to not interfere with your music/gaming listening pleasure! 🙂

      • dodozoid
      • 5 years ago

      I was about to ask the raid0 question as well. Why not Zoidberg…errrrr….raid?

        • Sam125
        • 5 years ago

        Well, I don’t really want to speak for Damage, but perhaps placing two drives into a RAID 0 array was too risky.

        If it were me, though, I’d RAID 0 those puppies up and just do bi-weekly backups, because that performance upgrade is pretty juicy. 😉

          • Chrispy_
          • 5 years ago

          There’s just no real-world benefit to RAID 0 with SSDs.

          Sure, you get bigger sequential numbers in synthetic benchmarks, but I’ve found that decent SATA III SSDs are faster than both the soft-RAID built into Intel’s PCH chips and Windows’ dynamic-disk striping.

          Unless you genuinely headbutt the SATA III throughput limit with your daily workload, RAID 0 is both a reliability and a performance downgrade for modern SSDs.

            • Sam125
            • 5 years ago

            So are you saying that two SSDs in RAID 0 will have no benefit for application loading times? That's the juicy performance upgrade I'd be wanting in a gaming PC, which is what I was referring to. I hate not being the first to load a map in a multiplayer match. 😉

            • morphine
            • 5 years ago

            Yep, zero impact these days. In fact, if you go read Geoff’s PCIe SSD review, you’ll see that even drives pushing over 1GB/s don’t load games any faster.

            At this point, it’s an operating system / protocol thing.

            • Sam125
            • 5 years ago

            Thanks, I actually missed that article. I’ll read it more thoroughly tomorrow.

            • Chrispy_
            • 5 years ago

            It varies a lot on the application, but the majority of users will only encounter the SATA3 bottleneck in any meaningful way when looking at synthetic benchmarks like ATTO or CrystalDiskMark.

            Take level loads, for example. BF4 is a game with long level loads, and each DLC adds about 4 maps for 5GB. Sure, there will be some shared art assets, but maps are in the 1-2GB range. An SSD could sequentially stream that into RAM in three seconds or less, but the level load takes forever because there's decompression of the assets, loading of various libraries, database-like operations, and god-knows-what-else going on that only DICE could fully answer. Despite SSDs being 4x faster than mechanical drives at sequential data, level load times aren't that much faster on SSDs.

            If you want faster load times, it’s often most effective to turn down graphics settings and reduce texture detail. All that eye-candy takes time to load 😉

            The sort of thing you WOULD notice RAID 0 on would be instances where you're frequently saving huge files to disk (Photoshop or 3ds Max autosaves, for example). It can take 30 seconds to dump 15GB of data to disk, but with RAID 0 that would be more like 15 seconds. Most programs that deal with huge files have background save operations, so it doesn't matter, but some types of save leave you waiting.
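            The 30-vs-15-second figure above is simple bandwidth arithmetic; here's a quick sketch of it (the ~500MB/s per-drive throughput is an assumed round number for a SATA III SSD, not a measurement):

```python
# Rough save-time estimate for a large file dump, as in the autosave example
# above. The 500 MB/s per-drive figure is an assumed ballpark, not measured.
def save_time_seconds(file_gb, drives=1, per_drive_mb_s=500.0):
    """Seconds to write `file_gb` GB striped evenly across `drives` SSDs."""
    return file_gb * 1000.0 / (drives * per_drive_mb_s)

single = save_time_seconds(15, drives=1)   # one SSD
striped = save_time_seconds(15, drives=2)  # RAID 0 pair, ideal scaling
print(f"single: {single:.0f}s, RAID 0: {striped:.0f}s")  # single: 30s, RAID 0: 15s
```

            This assumes ideal RAID 0 scaling, which real arrays only approach for large sequential writes.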

            • Sam125
            • 5 years ago

            Interesting, thanks for some perspective on the matter.

            I typically ignore synthetic benchmarks, especially for drives since great synthetic results never seem to amount to anything in the real world. I don’t play BF4 but if it’s like every other FPS, it’ll see 30-100%+ improvements with an SSD over a mechanical drive. Sure, it’ll be seconds that’re shaved off, but those seconds are typically the difference between entering first or last. 😉

            I might have to try a future build with one NVMe drive and two drives in RAID 0. Just for kicks, though, and mostly to satisfy my own curiosity.

            Bringing up huge files does bring up an interesting point for consumer use though. Titanfall and at least one other recent AAA title have been installing textures uncompressed and if that starts to become any kind of a trend then storage solutions would be worth reconsidering for desktop gaming systems. 🙂

    • geekl33tgamer
    • 5 years ago

    On a side note to this great article, I am really liking the addition of more video content on the site lately, within the articles themselves.

    Would like to see more. Perhaps a Damage Labs tour or something?

    • geekl33tgamer
    • 5 years ago

    An uncannily similar build to [url=https://techreport.com/forums/viewtopic.php?f=33&t=114983]Mega Beast[/url]. If it's good enough for the TR chief, then it's definitely good enough for me! 😉

      • Damage
      • 5 years ago

      Whoa, yeah. Nice build!

        • geekl33tgamer
        • 5 years ago

        Thanks Scott. I don’t suppose you fancy coming over and sorting the cable management in mine do you? 😉

        That’s where my system is a world away from how clean yours is inside (and it looks awesome for it).

          • Damage
          • 5 years ago

          Heh, no thanks, but really, it’s not that hard. I’ve kind of learned to think about cable routing as the main problem to be solved while building. Everything else is easy and should just work, but you want to pay attention to how things will connect. As you can see in my build, mostly what I did was just make sure everything took the most direct path possible from A to B. I’m not awesome or especially OCD about cabling, but I’ve found that approaching it as a high priority makes everything else easy.

            • geekl33tgamer
            • 5 years ago

            I added the cables in only after I placed the hardware, and didn't really give it a second thought (I mean, as long as it reaches the connector, right?).

            Doesn’t look great through the window, and you’ve encouraged me to give it a bit of TLC this evening :-). My pet hate will always be GPU power connectors though. I’ve always felt they would look neater if they were on the end of the card, rather than on that top edge.

            Makes routing multiple ones of those a pain.

    • frontera
    • 5 years ago

    Scott –

    You may be interested to know that Handbrake has added QuickSync support. It was a beta-only feature for a while. Actually, Intel worked with the HandBrake folks to get it enabled.

    Before that, developers had to pay Intel for the devkit to use QuickSync.

    I’ve used it quite a bit for video with little fuss. You should check it out and let us know how it works for you.

    [url]http://i59.tinypic.com/rvc007.png[/url]

      • Damage
      • 5 years ago

      I did, and it just didn’t cut the mustard. 😛

      The quality at a given bit rate was just not there. I encoded the live stream with Kanter and got:

      ~700MB with QuickSync encoding - looks horrible
      ~533MB with CPU encoding - totally acceptable; it's the version I uploaded to YouTube

      QuickSync will let you set a few knobs for quality, and it is FAST, but so far, I’ve not found it to be useful. Understand what I’m generally doing is further compressing something that comes out of Corel VideoStudio at higher bitrate. The default QS settings in Handbrake produced a larger file than Corel did. Ugh.

      I’ll mess with it more, I’m sure, but I’m afraid QuickSync isn’t meant to do high quality at low bitrates, and I’m uploading hour-plus-long videos to YouTube over a 5Mbps upstream. So yeah.
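      For context, those file sizes work out to roughly 1Mbps average video bitrate; a quick sketch of the arithmetic (the 75-minute runtime is an assumed figure, since the exact length of the stream isn't given):

```python
# Average bitrate implied by a file size over a runtime. The 75-minute
# duration below is an assumption for an "hour-plus-long" video.
def avg_bitrate_mbps(size_mb, minutes):
    """Average bitrate in Mbps for `size_mb` megabytes over `minutes` minutes."""
    return size_mb * 8.0 / (minutes * 60.0)

qs = avg_bitrate_mbps(700, 75)    # the QuickSync encode
x264 = avg_bitrate_mbps(533, 75)  # the CPU (x264) encode
print(f"QuickSync: {qs:.2f} Mbps, x264: {x264:.2f} Mbps")
```

      Both land near 1Mbps, which is squarely the low-bitrate territory where the quality gap shows up.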

        • ryan345
        • 5 years ago

        QuickSync should be close to x264 as far as quality per bit (except there appears to be an issue at higher bitrate 1080p content, but I don’t think that’s what you are running into). It may be that Handbrake’s default settings aren’t appropriate for this type of video (mostly static) or that Handbrake doesn’t expose enough of QuickSync’s settings.

        q264 is a Windows command-line encoder that exposes (I believe) all of the possible QuickSync settings, so you could determine if the issue is QuickSync itself or just the settings. You can do relatively quick experiments because q264 has an option to limit the number of frames encoded (-nFrames), and it will automatically mux to mp4 if your output file name has the “.mp4” extension.

          • lycium
          • 5 years ago

          Does QuickSync accelerate two-pass encoding? It seems a little unlikely to me, something of a different usage model – fast realtime encoding for streaming, rather than best-possible offline compression (which pays dividends when hosting very large video files).

            • ryan345
            • 5 years ago

            No; however, there is a relatively new QuickSync bitrate-control feature called look-ahead mode that evidently does something similar, but only over the next N frames (up to 100). So instead of leveling the quality (by spending more bits on the complex frames) across the whole video, it tries to do that over the next N frames. That can’t be as beneficial as leveling across the whole video, but it evidently does provide an efficiency increase.

            Edit: When I wrote “can’t be as beneficial,” I should have qualified that. I was thinking of the average movie, where most scenes are longer than 100 frames and the complexity of the scenes varies a lot. I think the more uniform the complexity of the scenes, the closer look-ahead and two-pass become.
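A toy sketch of the difference described above, assuming only the idea of complexity-proportional bit allocation (this is not QuickSync’s actual rate-control algorithm): give each frame a share of a fixed budget in proportion to its complexity, either across the whole clip (two-pass-style) or only within a window of N frames (look-ahead-style):

```python
def allocate(complexities, bits_per_frame, window=None):
    """Distribute a bit budget proportionally to frame complexity.

    window=None mimics two-pass (whole-clip) leveling; a finite window
    mimics a look-ahead that only sees the next N frames.
    """
    n = len(complexities)
    window = window or n
    out = []
    for start in range(0, n, window):
        chunk = complexities[start:start + window]
        budget = bits_per_frame * len(chunk)  # budget for this window only
        total = sum(chunk)
        out.extend(budget * c / total for c in chunk)
    return out

# A simple clip: an easy 4-frame scene followed by a complex 4-frame scene.
scene = [1, 1, 1, 1, 4, 4, 4, 4]
whole = allocate(scene, bits_per_frame=100)            # two-pass-style
local = allocate(scene, bits_per_frame=100, window=4)  # look-ahead-style
# whole → [40, 40, 40, 40, 160, 160, 160, 160]
# local → [100, 100, 100, 100, 100, 100, 100, 100]
```

With the whole-clip view, the complex frames get 160 bits each at the easy frames’ expense (40 each); the four-frame window can only level within each scene, so every frame ends up at 100. That illustrates why a window shorter than the scene length can’t shift bits between scenes the way whole-video leveling can.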

            • lycium
            • 5 years ago

            Cool, hadn’t heard of that, and yeah 100 frames is a rather short window if you take it at 30fps, but it wouldn’t surprise me if they can make it arbitrarily long.

            • Deanjo
            • 5 years ago

            Encoders that are worried about quality [b]do NOT[/b] use two-pass encoding. They use constant-quality-based encoding.

        • digitalnut
        • 5 years ago

        Hi Scott, doesn’t your video editing software let you use the GPU to render? I have been using Movie Studio 12 Platinum with GPU acceleration turned on. This feature really sped up the rendering for 720p movies. This is paired with a cheap $100 AMD video card.

        • stdRaichu
        • 5 years ago

        Interesting – I do a lot of video encoding, and back in the SB/IVB days I experimented with QuickSync and likewise found that the quality was a bit pants compared to x264. [url=http://www.missingremote.com/review/intel-quick-sync-examining-haswell-performance]This article[/url] suggested that HW performance and quality were much improved on Haswell, though (although judging just by SSIM isn't a very good gauge IMHO).

        Incidentally, I've found that a great many hardware decoders simply don't give very good quality output, especially if they have weird post-processing turned on by default; I switched to software-only decode a long time back. It's entirely possible that if your decoder output isn't tip-top, there won't be any appreciable difference between what I think is a so-so QS encode and a better-to-my-eyes x264 encode. Colour me surprised when I found out that, on my Linux HTPC (IVB and XBMC) at least, software decode even used less power: 24W vs. 27W.

        Worried I'm beginning to sound like a videophile now, so I feel like I should throw something into the ring about gold-plated ground-loop-proofed optical cables, £500 wooden brightness/contrast knobs, and how vinyl videos give a much warmer feel and more expansive tonal vibrance to the picture when decoded through tubes.

        • cmrcmk
        • 5 years ago

        [quote]and I'm uploading hour-plus-long videos to YouTube over a 5Mbps upstream. So yeah.[/quote] Google Fiber can't get to you fast enough, eh?

    • Captain Ned
    • 5 years ago

    Re: The silence:

    I built my Haswell in the same P182 which held the Conroe and the noise simply vanished. I haven’t hit it with a Kill-a-Watt yet, but the monitoring software on my UPS tells me idle load has been cut in half (and the Haswell box is the only change).
