Gigabyte’s G1.Sniper 5 motherboard reviewed

Like many other high-end products, pricey PC gear has a tricky value proposition. The best bang for your buck usually lies near the middle of the product range. Diminishing returns set in after that: higher prices typically deliver smaller performance gains and added features that matter less.

This trend is especially apparent in motherboards. The increased integration of modern CPUs has largely removed performance from the equation. For most applications and games, high-end motherboards are no faster than their budget peers. They may offer more ports and slots, but the underlying interfaces run at the same speeds, and the additional expansion capacity is another example of diminishing returns. These days, even low-end boards have enough connectivity to satisfy the needs of most enthusiasts.

Yet here we are with Gigabyte’s uber-expensive G1.Sniper 5.

This Z87-based Haswell board sells for $400—more than double the price of the Gigabyte Z87X-UD3H we reviewed last year. The Sniper doesn’t deliver better benchmark scores or smoother gaming frame rates, though, and its mid-range sibling is already sufficiently equipped to host a potent PC. Why spend our time on something that costs so much more?

Because the G1.Sniper 5’s integrated audio combines a Creative processor with fancy capacitors, isolated circuitry, and a swappable amplifier chip. That’s worth a listen. Also, the board has a smorgasbord of networking options, including a Killer NIC that can prioritize gaming packets. That’s a good excuse to spend some time playing Battlefield 4.

And then there’s the fact that the G1.Sniper 5 is Gigabyte’s flagship desktop board. It’s a premium product with all the bells and whistles the company can muster. We want to see what happens when Gigabyte pulls out all the stops. Don’t you?

First impressions

The G1.Sniper 5 is more motherboard in pretty much every sense. It even has a bigger footprint than standard ATX fare. The 12″ x 10.4″ board is 0.8″ wider than that form factor allows, pushing it into XL-ATX territory.

Due to its formidable size, the Sniper may not fit into smaller ATX cases. Even those that accommodate it could still crowd the SATA connectors lining its lower right edge. Folks with larger enclosures should be fine, though. We didn't encounter any issues installing the board in a spacious Corsair Obsidian Series 750D full-tower.

Anyone considering the G1.Sniper 5 probably doesn’t have a tiny case. After all, this monster is built to host up to four graphics cards and 10 storage devices—far more hardware than smaller enclosures can take.

The extra SATA ports are nothing special. They stem from a Marvell chip and conform to the same 6Gbps standard as the six ports tied to the Intel chipset. Despite the shared spec, the Z87's native ports are faster in practice, and they have more extensive RAID support. We've yet to see a third-party SATA controller trump the native implementation in the Z87 Express.

The support for quad CrossFire and SLI configs is more interesting. Most Z87 boards are limited to two-card setups that split the CPU’s 16 PCIe Gen3 lanes evenly between a pair of x16 slots. This dual-x8 mode is powered entirely by the processor, and through the miracle of product segmentation, it’s available only on the Z87 platform. Motherboard makers sometimes add a third PCIe slot fed by up to eight Gen2 PCIe lanes in the Z87 chip, but this config is only approved for CrossFire. Nvidia doesn’t endorse it for three-way SLI.

On the G1.Sniper 5, all four PCIe x16 slots connect to a PCI Express switch chip from PLX. The chip splits 32 lanes of Gen3 connectivity between the slots. The first and third slots get 16 lanes each for dual-card configs, and they also share that bandwidth with the second and fourth slots to enable x16/x8/x8 and x8/x8/x8/x8 setups.

Despite the extra lanes it provides to the x16 slots, the switch is still bottlenecked by its 16-lane Gen3 link with the CPU. However, the switch can pass peer-to-peer traffic between the PCIe slots without burdening the processor. Getting any more PCIe connectivity to the CPU requires stepping up to Ivy Bridge-E, which has 40 Gen3 lanes built in.
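
To put the switch's bandwidth math in perspective, here's a rough sketch of the lane arithmetic. It assumes roughly 985 MB/s of usable bandwidth per Gen3 lane per direction, which is an approximation; the point is simply that the slots' aggregate bandwidth can exceed the x16 uplink to the CPU, while peer-to-peer transfers between cards stay inside the switch.

# Rough sketch of the PCIe bandwidth arithmetic behind the PLX-switched slot layout.
# Assumes ~985 MB/s of usable bandwidth per Gen3 lane, per direction (8 GT/s with
# 128b/130b encoding); real-world throughput is somewhat lower.

GEN3_LANE_MBPS = 985  # approximate usable MB/s per lane, per direction

def slot_bandwidth(lanes_per_slot):
    """Return per-slot and aggregate bandwidth in GB/s for a given lane split."""
    per_slot = [lanes * GEN3_LANE_MBPS / 1000 for lanes in lanes_per_slot]
    return per_slot, sum(per_slot)

uplink_gbps = 16 * GEN3_LANE_MBPS / 1000  # the switch's x16 Gen3 link to the CPU

for config in ([16, 16], [16, 8, 8], [8, 8, 8, 8]):
    per_slot, total = slot_bandwidth(config)
    print(f"{config}: {total:.1f} GB/s across the slots "
          f"vs {uplink_gbps:.1f} GB/s on the CPU uplink")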

In addition to having enough slots for extreme multi-GPU configs, the G1.Sniper 5 has enough space between them to fit beefy coolers. There’s enough room for two triple-wide cards and four double-wide ones. Three PCIe x1 slots are included, as well.

High-end motherboards are usually loaded with exotic electrical components, and the G1.Sniper 5 is no exception. The board is populated with Chemi-Con capacitors that have solid-state cores and 10,000-hour lifetime ratings. There are fancy chokes, of course, and PowerIRstage MOSFETs from International Rectifier.

The Sniper uses the same MOSFETs as Gigabyte’s other enthusiast-oriented Haswell boards, but it has more of them overall: 16 for the CPU. Like the premium electrical components, the extra phases are meant to smooth power delivery to the CPU and to potentially improve overclocking headroom. The thing is, Intel moved voltage regulation onto the CPU die in Haswell. Fancy power circuitry probably helps less now than it did with previous generations of CPUs, which relied more heavily on the motherboard.

Beefy heatsinks sit on top of the Sniper’s power circuitry. There’s a tiny fan, too, and it’s very quiet… for now. I always wonder how long smaller spinners will maintain a low acoustic profile. At least the fan should come in handy with liquid-cooled setups that generate little airflow around the socket. The VRM coolers have a hollow channel within and barbs on either end, so they can be looped into a liquid cooling system, as well.

All this extra heatsink hardware crowds the CPU socket on three sides. Compatibility with some coolers may be compromised as a result, and we can’t test every combination of parts. We can, however, measure the clearances between the socket and other important landmarks.

As on most Haswell boards, the closest source of potential conflict is the DIMM slot next to the socket. Standard-height memory shouldn’t be a problem, but taller modules can interfere with wide CPU coolers.

These relatively short VRM heatsinks are unlikely to compromise cooler compatibility, but they can make installation a little awkward, especially if you have short, stubby fingers like mine. The heatsinks leave little room around the screw holes for cooler retention brackets.

Otherwise, the G1.Sniper 5 is free of annoying clearance issues. All of the onboard ports and slots are easily accessible, and so are critical elements like the internal headers, onboard battery, and CMOS-related switches.

There are actually three CMOS switches. Two control access to the board’s backup firmware chip, while the third resets the settings for the primary one. The backup firmware is a nice touch that’s been available on Gigabyte motherboards for a while. I also like the fact that the CMOS reset switch is an actual button rather than an old-school jumper. That said, I wish the reset button were located in the rear port cluster, where it would be accessible without cracking open the case.

As it stands, the rear cluster has lots of pretty much everything else: one DisplayPort out, six USB 3.0 ports, and pairs of HDMI, S/PDIF audio, Gigabit Ethernet, and USB 2.0 ports. There’s even a combo PS/2 jack for keyboard aficionados who refuse to give up the IBM Model M. Too bad the gold-plated audio jacks are devoid of color coding; one must consult the manual to identify which one is which.

Internal headers expand the Sniper’s USB payload, but there’s a caveat attached. Only the two USB 3.0 ports tied to the primary internal header are connected directly to the Z87 chipset. The remaining two internal ports and all six external ones are routed through a pair of Renesas hubs, each of which splits a single USB 3.0 connection between four ports. That much bandwidth sharing could compromise performance on certain ports if multiple high-speed devices are used concurrently.
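
For a sense of what that sharing means in the worst case, here's a back-of-envelope sketch. The ~450 MB/s figure for a single USB 3.0 link's practical throughput is an assumption rather than a measurement, and real hubs don't split bandwidth perfectly evenly; the point is that several busy devices behind one hub each get only a fraction of a full port's worth of bandwidth.

# Back-of-envelope look at how a four-port USB 3.0 hub divides one upstream link.
# The ~450 MB/s practical throughput of a single 5Gbps USB 3.0 connection is an
# assumed round number, and the even split below is a simplification.

UPSTREAM_MBPS = 450  # assumed usable throughput of one USB 3.0 link

def per_device_share(active_devices):
    """Naive even split of the hub's single upstream link."""
    return UPSTREAM_MBPS / active_devices

for devices in range(1, 5):
    print(f"{devices} active device(s) behind the hub: "
          f"~{per_device_share(devices):.0f} MB/s each")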

We’ve seen similarly funky USB 3.0 configurations on Gigabyte’s other Haswell boards. The G1.Sniper 5’s onboard audio is a little more unusual, as we’ll demonstrate on the next page.

Integrated audio turned up to 11

Most motherboards combine the chipset’s integrated audio controller with a separate codec chip that handles analog-to-digital and digital-to-analog conversions for the onboard ports. This approach is very economical, but the sound quality with analog speakers and headphones tends to be fairly marginal, which is why we typically recommend discrete sound cards to folks with decent speakers or headphones.

The G1.Sniper 5’s audio implementation is sort of like an integrated discrete card:

Instead of ye olde Realtek codec, the Sniper sports a Creative Sound Core3D chip that handles controller and codec duties. The associated circuitry is isolated from other onboard components to reduce interference, and it’s laced with audio-specific Nichicon capacitors that purportedly improve sound quality. There are two amplifier chips onboard, too. One is tied to the front-panel headphone out, while the second drives the same sort of jack in the rear cluster.

We’ve seen headphone amps on motherboards before. However, the Texas Instruments OPA2134PA chip attached to the Sniper’s rear headphone out is socketed, allowing users to swap in different OP-amps. Gigabyte includes a second OP-amp—Analog Devices’ AD827JN—plus the oversized tweezers required to yank the chips out of the socket. The mobo maker also sells a separate OP-amp kit with three additional chips: Linear Technology’s LT1358, National Semiconductor’s LM4562NA, and Texas Instruments’ OPA2111KP. Each amplifier has a slightly different acoustic profile, and we’ll explore their impact on audio quality in a moment. First, we should address Creative’s contribution.

The Sound Core3D identifies itself as a Recon3Di audio device. It comes with Creative’s Sound Blaster Pro Studio software, which serves up surround-sound virtualization for stereo devices, a “scout” mode that makes footsteps easier to hear in games, and echo cancellation to improve voice input. The software also offers other ways to mess with the audio signal, but surprisingly, there’s no real-time encoding for multi-channel digital output. Folks who want surround sound in games will have to use the analog outputs.

We weren’t able to conduct blind listening tests with the Sniper, but I did spend a fair amount of time listening to music with various configurations. These more casual tests were conducted with mid-range Sennheiser HD 555 headphones and a selection of tracks from Neil Young, Radiohead, and The Heavy.

First, I compared the unamplified front-channel output to the amplified headphone jack with the stock OP-amp installed. These two were easy to tell apart; the unamplified out sounded dull and muddled, as if it were working with a limited frequency range. With the OP-amp lending a hand, there was more separation between the various elements in each track, and the sound quality improved noticeably overall. Radiohead was more poignant, The Heavy was more soulful, and Neil Young was more engaging. The amplifier added a crisp liveliness and improved clarity, seemingly without any drawbacks.

Encouraged by these initial results, I grabbed Asus’ Xonar DSX sound card off the shelf. This card was the favorite in our last round of blind listening tests, and it sells for only $60. It’s also our recommended upgrade for folks seeking superior sound quality to typical integrated audio.

The Xonar and the Sniper’s amplified out sounded more closely matched than the two onboard ports from the first comparison. That said, the low end of the spectrum definitely kicked harder and deeper on the Xonar. This bassy grunt added a balanced fullness that was absent on the Sniper. Higher up the spectrum, the clarity I noticed on the Sniper in the first round of tests felt artificial and overly sharpened next to the Xonar’s more natural output.

For an encore, I started swapping OP-amps to see if I could notice their impact on the Sniper’s sound quality. This task required shutting down the system to switch the chips. Perhaps because of the extra time and effort involved, it wasn’t as easy to pinpoint minute differences between the various options. Most of them sounded very similar, but the Linear Technology OP-amp stood out; it seemed to have brighter vocals and more muted bass than the others. None of the OP-amps made the Sniper sound as balanced and natural as the Xonar.

We also tested analog output quality objectively with RightMark Audio Analyzer. All the configs from our listening tests were plugged into a separate Xonar Phoebus sound card, which captured their output while playing a 24-bit, 96kHz test track. RMAA grades analog signal quality on a scale between “very poor” and “excellent.” We’ve translated those verbal grades to a numerical scale that starts at a low of one and peaks at six. Higher values are better.
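
That translation is nothing fancy. Here's a minimal sketch of the mapping, assuming RMAA's standard six-step verbal scale; the example grades at the end are hypothetical, not our measured results.

# Minimal sketch of the grade-to-number translation used for the table below.
# Assumes RMAA's standard six-step verbal scale, mapped so higher numbers are better.

GRADE_SCALE = {
    "Very poor": 1,
    "Poor": 2,
    "Average": 3,
    "Good": 4,
    "Very good": 5,
    "Excellent": 6,
}

def score_config(rmaa_grades):
    """Convert a dict of {metric: verbal grade} into numeric scores."""
    return {metric: GRADE_SCALE[grade] for metric, grade in rmaa_grades.items()}

# Hypothetical input, not actual measured grades:
print(score_config({"Noise level": "Very good", "Dynamic range": "Excellent"}))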

             Freq.  Noise  Dyn.   THD  THD+   IMD+   Stereo  IMD at  Overall
             resp.  level  range       noise  noise  xtalk   10kHz   score
No amp       5      4      4      5    3      5      5       5       5
AD827JN      4      5      5      5    4      6      5       6       5
OPA2134PA    4      6      6      5    4      5      5       6       5
OPA2111KP*   4      5      5      5    4      6      5       6       5
LT1358*      3      5      5      5    4      5      5       5       4
LM4562NA*    4      6      6      5    4      6      5       6       5
Xonar DSX    5      6      6      6    4      6      6       6       5

* OP-amps sold separately in Gigabyte's upgrade kit

The Xonar scores the best overall. The unamplified line out doesn’t look too bad according to these coarse measurements, but there are more obvious differences between it and the others if we look at the accompanying graphs. Click on the buttons below each one to switch between the line out, which is displayed by default, and the alternatives.

[Graphs: stereo crosstalk, frequency response, dynamic range, total harmonic distortion, intermodulation distortion, and noise level]

These graphs are a little indulgent, but they nicely highlight the differences between the Xonar DSX and the rest of the configs. The discrete card has lower noise levels, less distortion, and a broader frequency response than the Sniper’s onboard audio.

For the most part, the OP-amps look like an improvement over the unamplified line out. However, they have shallower frequency responses at the low end of the spectrum, which explains the punch that was missing in our listening tests. The frequency response of the Linear Technology LT1358 amplifier falls off at the higher end of the spectrum, too.

I didn’t notice any issues with the integrated audio in our listening or signal quality tests, which were conducted with the system idling at the Windows 8.1 desktop. Firing up a graphics load produced a noticeable buzzing sound, though. This buzzing was audible at normal volume levels with not only the Sennheiser headphones, but also cheap earbuds.

Battlefield 4 and the Unigine Nature benchmark reliably produce the buzzing noise. It’s only apparent on the Sniper’s amplified output, and it’s not just my imagination. RMAA captures the behavior nicely. The following graphs come from “loopback” tests that route the motherboard’s audio through the onboard line input. (They aren’t directly comparable to the graphs above as a result.) The Nature benchmark provided the graphics load.


The amplified output has more distortion and higher noise levels with the graphics load running. The audible buzzing is apparent with hot-clocked GeForce GTX 680 graphics cards and also with a low-end Radeon R7 250. Further investigation with the motherboard installed inside a Corsair Obsidian Series 650D enclosure yielded similar results, this time with the addition of fainter feedback during web browsing and even when moving the mouse rapidly over icons. I’ve reproduced the buzzing with different system components and with the rig connected to a separate wall socket.

This behavior is the opposite of what we’d expect from onboard audio that’s supposed to be isolated from interference. Gigabyte has been working with us to pinpoint the issue, but we haven’t narrowed it down yet. We actually sent our test system to the company after it was unable to reproduce the problem in its labs. Gigabyte tells us it has replicated the buzzing with our hardware, but only with a GeForce GTX 680 installed. We’ve also noticed that a Newegg user review mentions similar sound interference. The investigation continues, and we’ll update this article as we learn more.

Fancy networking and extra goodies

You didn’t think the G1.Sniper 5’s excess stopped at the integrated audio, did you? The board is also loaded with networking options, including dual Gigabit Ethernet connectors. One of the GigE jacks is fed by an Intel controller, while the other is backed by a Qualcomm Killer E2201 NIC.

The Qualcomm controller is supposed to improve online gaming performance by prioritizing related networking packets. The system is managed via software, allowing users to set preferences on a per-application basis. Apps can also be blocked entirely if you don’t want them accessing the network at all.
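
Conceptually, the prioritization works like a priority queue sitting in front of the network interface: latency-sensitive traffic jumps ahead of bulk transfers. The sketch below is a toy illustration of that idea, not Qualcomm's actual implementation or API, and the application names and priority classes are made up.

# Conceptual sketch of per-application packet prioritization -- a toy priority
# queue, not Qualcomm's actual Killer implementation or API.
import heapq

# Assumed priority classes: lower number = sent first.
APP_PRIORITY = {"bf4.exe": 0, "voice_chat.exe": 1, "browser.exe": 2, "bittorrent.exe": 3}

class Scheduler:
    def __init__(self):
        self._queue = []
        self._seq = 0  # tie-breaker so equal-priority packets stay FIFO

    def enqueue(self, app, packet):
        prio = APP_PRIORITY.get(app, 2)  # unknown apps get a middling priority
        heapq.heappush(self._queue, (prio, self._seq, app, packet))
        self._seq += 1

    def dequeue(self):
        _, _, app, packet = heapq.heappop(self._queue)
        return app, packet

sched = Scheduler()
sched.enqueue("bittorrent.exe", "chunk #1")
sched.enqueue("bf4.exe", "game state update")
print(sched.dequeue())  # ('bf4.exe', 'game state update') goes out first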

Packet prioritization is smart, but handling it at the PC level won’t help if your roommate is swamping a shared router with BitTorrent packets. Still, I was curious to see how the software managed traffic on a single machine, so I fired up a Battlefield 4 multiplayer session alongside a BitTorrent download.

Before the download started, BF4 reported a ping time of about 50 ms. The gameplay was smooth, and all was well. With the download active and the Killer NIC’s management mojo at work, the ping time climbed to 100-150 ms. BF4 still felt fine, though. I didn’t detect any noticeable lag, and I didn’t seem to be getting killed more frequently.

After I disabled packet prioritization, BF4 ping times shot up to 180-250 ms. My perception of the gameplay didn’t reflect the increase, though. The game still felt good, perhaps because my reflexes and mad skillz have diminished greatly since my days as a regular online gamer. Or maybe Battlefield 4‘s network code is especially tolerant of this kind of scenario.

Next, I tried Counter-Strike: Global Offensive. Ping times followed a similar pattern across the same scenarios, but I noticed some in-game jerkiness with the BitTorrent download running in the background. Enabling the Killer NIC’s traffic management reduced the jerkiness slightly. Packet prioritization didn’t eliminate the hitching completely, though.

The fact that ping times were reduced in both games shows that traffic management makes a difference. However, my own impressions suggest that the perceived benefits can be more difficult to detect.

I didn’t do any gaming tests with the Sniper’s Intel NIC, but I did compare it to the Qualcomm chip in a quick file copy test. The Killer was clearly the faster of the two; it transferred several gigabytes of mixed files in 82 seconds, while the Intel chip took 13 seconds longer to copy the same folder.

Much of our networking discussion has revolved around wired offerings, but the Sniper has a wireless component, too. Technically, it’s not part of the board. Gigabyte ships the Sniper with a separate wireless card that fits into one of the PCIe slots. That card is just a conduit for the Mini PCIe module that supplies the wireless connectivity. I guess Gigabyte couldn’t find any room to accommodate the mini module directly on the board.

The wireless card is powered by Atheros hardware that supports 802.11n Wi-Fi and Bluetooth 4.0. The lack of faster 802.11ac connectivity is disappointing, especially given the Sniper’s price point. 802.11ac wireless is available on much cheaper Haswell boards.

The G1.Sniper 5 comes with a few extras in addition to the wireless card. CrossFire and SLI bridge connectors are included in the box along with a 3.5″ bay insert with dual USB 3.0 ports.

Most motherboards and cases have only two front-panel USB 3.0 ports, but the Sniper has internal headers for four ports, so the bay insert is definitely useful. I don’t like the white lettering, though. We all know what a USB 3.0 port looks like, and a blank face would be a better match for all-black cases.

Now, I wouldn’t be averse to making a few additions to the bay insert. Some of the features from this corner of the board would be nice to have up front:

The Sniper is equipped with onboard power and reset switches, of course. For the bay insert, I’m more interested in the POST code display and the CMOS reset button. The voltage probing points are pretty cool, too, though you have to be pretty hard-core to probe motherboard voltages manually.

The final extra of note really isn’t an extra at all. Every motherboard comes with an I/O shield, but it’s usually littered with bits of sharp metal that can slice fingers and poke into ports. The Sniper’s shield is nicer to work with; it has a smooth, cushy back that makes installation much easier.

You know what else makes mobo installation much easier? A port block that consolidates the wiring for the case’s front-panel buttons and LEDs. Wiring these connections individually is possibly the single most annoying thing about assembling a new PC. Port blocks effectively remedy the issue, and they cost only pennies to produce, but Gigabyte doesn’t include them with its motherboards. That’s unconscionable for an ultra-high-end product like the G1.Sniper 5.

Firmware and software interfaces

Despite packing a lot more hardware than most of Gigabyte’s Haswell boards, the G1.Sniper 5 has pretty much the same firmware interface and software tuning utility. That’s not necessarily a bad thing; Gigabyte’s 8-series motherboard firmware has the best-looking interface around.

The screenshot doesn’t really do the UI justice. I shrank it to fit on the page, but the actual interface is drawn at 1080p resolution. Everything looks incredibly crisp next to the low-res alternatives on competing boards.

The interface isn’t just a pretty face, either. It’s loaded with mouse-friendly tabs, sliders, and drop-down menus that make navigation a breeze for anyone familiar with modern Windows software. The firmware works with just a keyboard, too. There are shortcuts for switching between the various menus and tabs, and most values can be keyed in directly. Both experts and novices should find the firmware easy to use.

If you don’t like how the menus are organized, up to six tabs can be filled with a custom mix of options. The shortcuts listed in the main screen can also be changed.

This next bit feels like beating a dead horse, but I have to call Gigabyte out once again for engaging in sneaky overclocking. If the firmware’s default memory speed is changed, the Sniper secretly turns up Haswell’s CPU multipliers for multi-core loads. The resulting clock speed never exceeds the maximum Turbo limit for single-core loads. However, that single-core speed is applied to all four cores, even if the processor is fully utilized. This devious tactic is often used to artificially inflate benchmark scores, and there’s no excuse for it.
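
To put rough numbers on the trick, here's a small sketch that uses the Core i7-4770K's commonly cited stock Turbo bins as an assumption (39X with one or two cores active, 38X with three, 37X with four); the values aren't read from this board's firmware. Applying the single-core bin to every load quietly lifts the all-core clock by a couple hundred MHz, which is enough to skew multithreaded benchmark results.

# Illustration of the multiplier tweak described above. The Core i7-4770K's stock
# Turbo bins below are commonly cited figures and are an assumption here.
BCLK_MHZ = 100

stock_bins   = {1: 39, 2: 39, 3: 38, 4: 37}  # multiplier per active-core count
tweaked_bins = {n: max(stock_bins.values()) for n in stock_bins}  # single-core bin everywhere

for cores in sorted(stock_bins):
    stock = stock_bins[cores] * BCLK_MHZ / 1000
    tweak = tweaked_bins[cores] * BCLK_MHZ / 1000
    print(f"{cores} active core(s): stock {stock:.1f} GHz vs tweaked {tweak:.1f} GHz")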

I also have a bone to pick with the firmware fan speed controls, which are a little restrictive. Manual control is limited to an awkwardly named “speed percentage” setting that changes the slope of the fan speed profile. There’s no way to target specific temperatures or fan speeds anywhere along that profile. At least individual controls are provided for five of the nine onboard fan headers. Users can also choose between three pre-baked profiles for each fan.

More extensive fan controls are available in Gigabyte’s latest EasyTune software. This Windows utility was completely overhauled for the Haswell generation, and it’s a big improvement over the company’s previous efforts.

EasyTune includes a calibration function that measures the actual fan speed across the full range of available voltages. It offers custom controls for six onboard headers, and users can manipulate six points along each fan profile. There’s a fixed-speed mode, as well, plus the usual pre-configured profiles.
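
The difference between the two control schemes boils down to the shape of the response curve. The sketch below contrasts a single-slope profile like the firmware's with a multi-point profile like EasyTune's, using piecewise-linear interpolation between user-chosen points. All of the temperatures and duty cycles are made-up example values, and the firmware's real "speed percentage" setting is expressed in PWM counts per degree rather than the simplified percentages used here.

# Toy comparison of the two fan-control schemes described above. All numbers are
# made-up examples, not settings pulled from the board.

def slope_profile(temp_c, slope=1.5, base_duty=30, base_temp=30):
    """Firmware-style control: duty cycle rises linearly with temperature."""
    return min(100, base_duty + slope * max(0, temp_c - base_temp))

def multi_point_profile(temp_c, points):
    """EasyTune-style control: interpolate between user-chosen (temp, duty) points."""
    points = sorted(points)
    if temp_c <= points[0][0]:
        return points[0][1]
    for (t0, d0), (t1, d1) in zip(points, points[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return points[-1][1]

curve = [(30, 25), (40, 30), (50, 40), (60, 55), (70, 75), (80, 100)]  # six example points
for temp in (35, 55, 75):
    print(f"{temp}C: slope profile {slope_profile(temp):.0f}%, "
          f"six-point profile {multi_point_profile(temp, curve):.0f}%")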

EasyTune’s overclocking section is packed with the most common clock, multiplier, and voltage controls. Power-related variables can be tweaked with EasyTune, too, and the whole software interface is easy to use.

That praise aside, the integrated hardware monitor is pretty awful. The real-time graphs are ugly and unnecessarily large, and there’s no way to customize or log what they show. I had to shrink and tightly crop the massive 800×800 monitoring window just to fit it below.

Yeah, forget about monitoring system variables with a discreet window tucked in a corner of your desktop. The main EasyTune interface is even larger, but at least it has a cohesive design. The monitoring window’s white legends and beige horizontal bars completely clash with the rest of the aesthetic.

Overclocking

The G1.Sniper 5 is equipped with numerous overclocking options, so we tested a couple of them. First up: the auto-tuning mechanism built into the EasyTune software. This hands-free feature takes care of the entire overclocking process. It’s also fairly intelligent. Clock speeds are increased incrementally, and stability is tested to determine the optimal configuration for each system.
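
The general approach is easy to picture: step the clock up, stress-test, and fall back to the last stable setting. The sketch below captures that idea in a few lines; it is not Gigabyte's actual EasyTune logic, and the stability check is a stand-in for a real stress test.

# Conceptual sketch of an incremental auto-overclocking loop -- not Gigabyte's
# actual EasyTune code, just the general step-test-repeat idea described above.
def auto_tune(start_multiplier, is_stable, max_multiplier=50):
    """Raise the multiplier one step at a time, keeping the last stable value."""
    best = start_multiplier
    for multiplier in range(start_multiplier + 1, max_multiplier + 1):
        if not is_stable(multiplier):  # in reality: boot, run a stress test, watch temps
            break
        best = multiplier
    return best

# Hypothetical stability model: anything past 46x fails the stress test.
print(auto_tune(39, is_stable=lambda m: m <= 46))  # -> 46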

On our Core i7-4770K, which was strapped to a Corsair H80 water cooler, the auto-overclocker settled on a CPU speed of 4.6GHz with single- and dual-core loads, 4.5GHz with three-core loads, and 4.4GHz with quad-core loads. The chip tops out at 3.9GHz in its stock configuration, so that’s a nice boost for very little effort.

Alas, the auto-tuner was far too heavy-handed with the CPU voltage. It hit the CPU on two fronts, with a higher core voltage and an additional offset, causing the chip to run at nearly 1.55V under load. That’s more voltage than we recommend for Haswell even with a dual-fan radiator attached. CPU temperatures spiked to 95°C, and throttling kicked in immediately.

Auto-overclocking mechanisms can be ideal for newbies, and they can also provide a useful starting point for seasoned enthusiasts. Gigabyte needs to dial this one back to serve both audiences, though.

Since our audience is more of a hands-on crowd, we also overclocked the CPU manually. And we were more successful. The peak all-core speed hit 4.6GHz with zero throttling, and it only took multiplier tweaks and a core voltage of 1.34V to get there. The firmware’s “auto” voltage setting wasn’t very helpful, though. We had to start adjusting the CPU voltage manually at 3.9GHz. With the auto setting, the system kept hard locking under load.

Our system actually made it up to 4.7GHz, but more voltage was required to avoid BSOD errors under load. With more voltage came higher temperatures, which caused the CPU to scale back its clock speed. Tweaking other voltage and power settings didn’t help, and we didn’t have any LN2 on hand, so we called it a day. 4.6GHz is among the fastest overclocks we’ve achieved with this cooler and CPU.

Performance and power consumption

As I alluded to in the intro, modern motherboards typically have little impact on overall system performance. The CPU and GPU generally dictate application and gaming performance, while the storage subsystem—specifically, whether there’s an SSD installed—can influence load times and general responsiveness. That’s it for big-ticket items. Even peripheral performance tends to be pretty consistent from one motherboard to the next.

Instead of running the G1.Sniper 5 through our usual motherboard benchmark suite, we ran a handful of tests to confirm that it’s as fast as the other Z87 boards we’ve reviewed. And it is, at least once the firmware is configured to observe the processor’s correct Turbo behavior. The numbers are so close that there’s really no point to graphing them.

That said, the Sniper boots a few seconds slower than most of the Z87 boards we’ve tested. With all the fast-boot options enabled, it takes about 21 seconds to get to the Windows 8 Start screen. Of course, the Sniper also has more on-board peripherals than most of its peers. Those devices need to be initialized, which tends to lengthen the boot process.

Additional onboard devices also contribute to the Sniper’s power consumption, which we measured at the wall socket with our test system at idle, playing a 1080p YouTube video, and under a full load combining Cinebench rendering with the Unigine Valley demo. Here, the differences are worth highlighting.

The G1.Sniper 5 consumes a fair bit more power than the other Z87 boards we’ve tested. The gap between it and Gigabyte’s Z87X-UD3H is close to 20W in each test. Of course, the PCIe switch alone has a “typical” power rating of 8W. Then there’s the Creative audio controller, the amplifiers, and the additional networking and storage chips.

The Sniper’s beefy power regulation circuitry may contribute to the higher power consumption, as well. For what it’s worth, we also measured power consumption with the Sniper’s power-saving profile enabled. Turning on this firmware-level feature only cut power consumption by a couple of watts, though.

As always, it’s important to keep the figures in perspective. An increase of 20W probably isn’t going to add a lot to the average monthly utility bill. The Sniper’s higher power draw does generate more heat that must be evacuated from the case, but anyone who drops four bills on a mobo probably has sufficient cooling. They just might have to deal with a smidgen more fan noise.
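
For the curious, the arithmetic behind that claim is straightforward. The electricity rate below is an assumed, roughly US-average residential figure, and the calculation assumes the system runs around the clock at the higher draw, which overstates the real-world cost.

# Rough monthly cost of an extra 20W of draw. The $/kWh rate is an assumption
# (roughly a US-average residential rate); adjust for your locale and usage.
EXTRA_WATTS = 20
HOURS_PER_MONTH = 24 * 30
RATE_PER_KWH = 0.12  # assumed $/kWh

extra_kwh = EXTRA_WATTS * HOURS_PER_MONTH / 1000  # 14.4 kWh per month
print(f"~{extra_kwh:.1f} kWh/month, roughly ${extra_kwh * RATE_PER_KWH:.2f} on the bill")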

Detailed specifications

We’ve covered most aspects of the G1.Sniper 5 already. In case we missed anything, here’s a full rundown of the key specifications and firmware options.

Platform Intel Z87 Express, socket LGA1150
DIMM slots 4 DDR3, 32GB max
Expansion slots 4 PCIe 3.0 x16 via CPU and PLX PEX8747 switch

  (x16/x16, x16/x8/x8, x8/x8/x8/x8)

3 PCIe x1 via Z87

Storage I/O 6 SATA RAID 6Gbps via Z87

4 SATA 6Gbps via Marvell 88SE9230

Audio 6-channel HD via Creative Sound Blaster Recon3Di
Wireless 2.4/5GHz Dual-band 802.11n Wi-Fi via Atheros AR5B22

Bluetooth 4.0

Ports 2 HDMI

1 DisplayPort

1 PS/2 keyboard/mouse

6 USB 3.0 via Renesas uPD720210
2 USB 3.0 internal headers via Z87

2 USB 3.0 internal headers via uPD720210
2 USB 2.0 via Z87
6 USB 2.0 internal headers via Z87
1 Gigabit Ethernet via Qualcomm Atheros Killer E2201
1 Gigabit Ethernet via Intel I217-V

1 analog front out

1 analog center out

1 analog rear out
1 analog line in/mic in

1 analog headphone out

1 digital S/PDIF output

Overclocking Per-core Turbo multiplier: 8-80X

Uncore multiplier: 8-80X
Base clock: 80-266.66MHz
DRAM clock: 800-2933MHz

CPU gear ratio: 1.0, 1.25, 1.66, 2.5

CPU voltage: 0.5-1.8V

CPU core offset voltage: -0.3 to +0.4V

IGP voltage: 0.5-1.7V

IGP offset voltage: -0.3 to +0.4V

CPU ring voltage: 0.8-1.8V

CPU ring offset voltage: -0.3 to +0.4V
CPU external override voltage: 1.0-2.905V
System Agent offset voltage: -0.3 to +0.4V
Analog I/O offset voltage: -0.3 to +0.4V
Digital I/O offset voltage: -0.3 to +0.4V
PCH core voltage: 0.65-1.3V

PCH IO voltage: 1.05-1.9V
DRAM voltage: 1.15-2.1V

Fan control All: predefined silent, normal profiles

CPU, CPU Opt, SYS1-4 slope PWM: 0.75-2.5

This might be one of the longest spec tables we’ve ever had in a mobo review. Yikes.

Conclusions

The G1.Sniper 5 is a study in excess on multiple fronts. It supports two-, three-, and four-way CrossFire and SLI graphics configurations, which is a rarity for Haswell boards. It also boasts exotic onboard extras, including one of the most hard-core integrated audio implementations we’ve seen on a motherboard. Don’t forget about the cornucopia of networking options and the extra SATA and USB ports, either. This is without a doubt the most extreme Z87 board we’ve tested to date.

At the same time, it’s also another 8-series motherboard. The Sniper has the same performance and overclocking potential as the other Z87 boards we’ve tested, and the firmware and software are identical to what Gigabyte offers on its other Haswell offerings. That’s not necessarily a bad thing. The Z87 is an excellent platform. Gigabyte’s latest firmware and software utility are great, too, despite a few rough edges here and there. The trouble is, that puts a lot of pressure on the Sniper’s extras to justify the board’s lofty price.

On the networking front, the Killer NIC has fast transfer rates, and its management software should benefit folks who game and download on the same machine. The wireless card is nice, but it’s a separate piece rather than an integrated component, and it’s limited to 802.11n Wi-Fi. Support for the faster 802.11ac standard would be more appropriate for a premium product.

Gigabyte deserves credit for equipping the Sniper with an innovative integrated audio solution. I love the fact that picky listeners can swap OP-amps to suit their tastes, and the OP-amp definitely makes a difference. This is the best-sounding motherboard I’ve heard. At the same time, however, the output quality isn’t as good as that of our favorite budget sound card. The buzzing issue we encountered is also worrisome, though it appears to be limited to certain hardware combinations.

The Sniper does have some nice little extras that are free of caveats. The quad PCIe x16 slots probably deserve to be in that category even if they’re bound by a 16-lane link to the CPU. There’s no way around that limitation with Haswell. The bay insert is a nice touch, and so is the padded I/O shield. Seriously, though, no port blocks? This is a $400 motherboard.

Or it used to be, anyway. As I type this, the Sniper is discounted to $360 at Newegg. Factor in the $100 mail-in rebate that’s good until the end of March, and the effective price drops to only $260. I didn’t start this review expecting the G1.Sniper 5 to be a good value, but that price changes the math completely.

Comments closed
    • Dimscene
    • 6 years ago

    This thing is packed with features. I never had a Gigabyte board and I stayed away from them because I had a 6800 Ultra made by them and it overheated or something within the first few months. I know that their motherboards are reliable, but you know, once you get a bad first impression it’s hard to change your mind.

    • flip-mode
    • 6 years ago

    I wish motherboards reviewed by TechReport had more overlap with motherboards I’ve purchased or would purchase. Also, I find it curious that the only Asrock motherboards I can remember TR reviewing have been m-ITX models. Why no love for Asrock’s mATX and ATX boards? And more reviews of motherboards in the $100 to $150 price range?

    • dashbarron
    • 6 years ago

    What is the outlook for having motherboards with a higher PCI-E link to the CPU? Seems like we’ve been stuck on x16 for awhile, eh?

    • ronch
    • 6 years ago

    This board seems to want audiophiles to love it. Oh the irony.

      • tahir2
      • 6 years ago

      Fact is if you are an audiophile you won’t be using on board sound. If you are really serious you would be using a DAC or AV processor of some sort. Gigabyte tried here but they failed due to the laws of physics. The irony.

        • ronch
        • 6 years ago

        Exactly. No self-respecting audiophile would be caught dead using an onboard audio setup, no matter how many spare op-amps are dying to get plugged in there.

    • tanker27
    • 6 years ago

    I have the G1.Sniper M5 and while the board is just fine, it’s the added gimmicks that fail. The Killer NIC just plain sucks, as does the Creative onboard sound. There just isn’t any way around it. The UEFI is great though.

    • LoneWolf15
    • 6 years ago

    Someday, when Intel improves significantly beyond Sandy/Ivy, I’ll look at a new mainboard. Still rocking a Gigabyte Z68XP-UD5.

    Great review, but it taught me a couple of things I think I’ve been taught repeatedly for a while.

    -Having a sound card is good, and nobody’s got onboard sound right yet, so unless you’re slot-restricted (like an mATX or mITX platform) get a card.
    -Gimmicky NICs are nothing special; at the same time, Realteks aren’t great either, please give us Intel NICs and I’m happy. I don’t need to spend extra money on one with packet prioritization, I’ll do that with my router if I need to. Also, how many of us really need dual-NICs? If I wanted to do that, I’d get an Intel PCIe dual or quad-port.

    Things I might be interested by:
    -All of those fancy CMOS buttons and the LED display? How about considering putting them on a 3.5″ panel instead of the USB3 ports? These days, if I’m paying this much for a mainboard, I probably am buying a quality case with USB3 ports. On the other hand, having all of these buttons mounted at the front of my case might be useful if I’m an enthusiast.

    • cynan
    • 6 years ago

    What is the unamplified line out for the audio comparison? The front panel header?

    The product info for this board (http://www.gigabyte.com/products/product-page.aspx?pid=4487#ov) states “Built-in Front Audio Headphone Amplifier”. I assume that this means the headphone amplifier is in line with the front L/R channel at the rear port cluster, and not with the header pin that connects to the front-panel audio jack? This article claims that the socketed OP AMP is in the output stage for the rear analog jack. Does this mean that the headphone amplifier circuitry is in the same path as the socketed OP AMP?

      • Dissonance
      • 6 years ago

      The unamplified output is the line out in the rear cluster.

      The OP-amp output also goes to the rear cluster, just to a different jack. You effectively have two front-channel outputs in the rear cluster. And there’s a separate amplifier chip dedicated to the front-panel headphone output, as well. That chip is soldered on the board.

        • cynan
        • 6 years ago

        Interesting.

        What would make sense to me is that the socketed OP-amp would be in the output stage of all analog ports (at least both on the rear cluster), while that amplifier chip (to driver higher impedance head phones) would only be in the path of one port on the rear cluster.

        From a design perspective this makes sense, as one port on the rear cluster would essentially be reserved for (hard to drive) headphones while the other port would be for lower impedance line out.

        That suggests to me that the comparisons being tested were between output stages with and without the headphone amp chip – which is why it would have been significantly louder, not between output stages with and without the socketed OP-amp (which is what I sort of understood from the article.

        (Alternatively, the socketed OP-amp was perhaps only in the output stage with the headphone amp, and not the regular line out – which means you would have been comparing a headphone amp + OP-amp combo, not just the OP-amp. Edit: But this config makes less sense, as why wouldn’t you also include the ability to change the “flavor” of the line out by switching OP-amps). Not that Gigabyte’s documentation makes this easy to discern…

        That aside, I appreciate the extra TLC spent reviewing the analog output of this board!

    • squeeb
    • 6 years ago

    I’ve used Gigabyte boards for my last 4 builds. Love the ‘ultra durable’ series.

    • Welch
    • 6 years ago

    *Redacted*

    Asked dumb question, answered self.

    • vargis14
    • 6 years ago

    All I can say is that motherboard is Beautiful.

    Is there any way we can get a in depth review of the Xonar DSX compared to the Sound Blaster X since they can be found for around the same price and seem to be in direct competition with each other ??

    I am using the SB X and I am very very happy with its performance, but I would love to see a nice unbiased in depth review of the 2 cards, perhaps a major sound card review including many sound cards and not just the 2 I listed. I am sure many other would be interested in the differences between the 2 or more and the plus’es and minuses of each card so people can make a decision on what card would best suit their needs.

    I would love to know if a Xonar card can do what my Sound Blaster X can do: take an optical audio signal from a cable set-top box, plug it into the Xonar card, and be able to watch a cable TV show over your PC’s speakers with such a small delay that it is unnoticeable and the lip sync is still close enough that the sound comes out sounding normal without signs of delay.

    If anyone has a Xonar DSX and could try this, I for one would love to know if it can also play sound from the set-top cable box without a delay in the lip sync like the SB X can.

    I think TR is way overdue for a nice multiple sound card review…in fact I cannot remember the last time I have seen a review on multiple sound cards from TR.

    • MadManOriginal
    • 6 years ago

    Pet peeve: calling Solid Polymer Capacitors ‘solid state’ -_-

      • Welch
      • 6 years ago

      Was going to say something about that ahahaha. Also a pet peeve for me. Not as bad as saying you have a Solid State Hard Drive.

    • Chrispy_
    • 6 years ago

    Looks like it has all the bases covered, and I appreciate that you focussed on the software side of things more. The hardware involved is almost all known quantities but poor software implementations often wreck the feeling of quality you should get from a high-end motherboard.

    I think Gigabyte should be commended for their minimalism and tasteful skinning of everything from the motherboard components to the monitoring software. It all looks consistent and focussed for a change.

    Also, whilst I normally agree with you on this issue:
    “Port blocks effectively remedy the issue, and they cost only pennies to produce, but Gigabyte doesn’t include them with its motherboards. That’s unconscionable for an ultra-high-end product like the G1.Sniper 5.”

    I think this board is only going to appeal to multi-GPU owners and as such anything that raises the profile of ports/connectors in the vicinity of double-wide coolers is going to cause a problem. I’ve actually found that Asus’ port blocks aren’t that secure and have stopped using them because stuff has a tendency to come loose more easily – perhaps that’s just me being gorilla-pawed during the builds though.

      • Waco
      • 6 years ago

      It’s not just you – even brand new they aren’t tight enough on ASUS boards.

    • Starfalcon
    • 6 years ago

    I currently have the Sniper 3, the Z77 version. I’ve been pretty happy with mine, and about the only complaint I have is that the version 1 UEFI they used is rather clunky. It runs a stable 4.3 OC on my 3770K, and has been problem free since I got it last year. I’ve been pretty happy with the several Gigabyte boards I bought after being a die hard Abit fan.

      • Voldenuit
      • 6 years ago

      I just got a Gigabyte GA Z87M-D3H*, but my CPU hasn’t arrived in the mail yet, so haven’t been able to test it out.

      From all the reviews I’ve read, though, the windowed UEFI interface is still slow and klunky.

      The best part is all the UEFI screenshots in the manual are from the text-based UEFI interface – apparently whomever wrote the manual must have also found the windowed interface too slow/klunky for their liking and just used the text-based version for their work. 😛

      * Picked it up for $84.99 from newegg, now that’s value for a Z87 board! Too bad it’s up to $95 today.

    • l33t-g4m3r
    • 6 years ago

    There’s a much cheaper version of this board that doesn’t do quad sli.
    http://www.newegg.com/Product/Product.aspx?Item=N82E16813128671

    Might be worth mentioning. I can easily recommend any Gigabyte product because of their reputation for stability, quality components, and ease of use. Gigabyte boards generally just work, and apparently they’ve made some improvements to the bios options. A creative sound card, green color scheme, and killer nic is just icing on the cake.

      • MikkleThePickle
      • 6 years ago

      Thanks for the heads up!

    • Voldenuit
    • 6 years ago

    It’s a shame that tech sites are at the mercy of OEMs when it comes to review products. The OEMs want to show off their best (and most expensive) gear. Most readers probably want the best value they can get, but the sweet spot components are rarely reviewed.

    No offense to Geoff, but I stopped reading the article when I got to the pricetag. I have no doubt it was well written and objectively tested with a thorough review process. But that $400 is only $44 less than I got a CPU, mobo, case, PSU and RAM for just recently, and represents a lavish (and unjustifiable) expense for me.

    Others might beg to differ, of course.

      • Steele
      • 6 years ago

      How’s $260 sound?

      Because that’s what it’ll cost ya with a deal and a rebate over at Newegg until the end of the month, but you wouldn’t know that unless you read the article =)

      To be fair though, that info is presented at the very end of the article.

        • flip-mode
        • 6 years ago

        “How’s $260 sound?”

        Extremely temporary.

        • Voldenuit
        • 6 years ago

        Still a bit rich for me, I’d rather spend that on a good graphics card.

        @l33t-g4mer: yeah, that was a good find. But my $85 Gigabyte Z87 also has USB charging and independent USB fuse protection. I’ve also been reading up on Gigabyte’s Smart Fan, and if it does everything the manual says it does, is finally a worthwhile option for fan controls.

      • l33t-g4m3r
      • 6 years ago

      Gigabyte makes a cheaper version with dual pci-e slots for $168. I posted about this earlier:
      https://techreport.com/discussion/26113/gigabyte-g1-sniper-5-motherboard-reviewed?post=809655

      The features are the same, except for not having quad SLI. I’d say this review is relevant for the matching features on the other board, and the lower price would make it a better choice for normal people. Gigabyte may have wanted to show off their expensive gear, but the same features are included in their cheaper boards, which gives us a pseudo review of the whole Sniper line.

      One thing Gigabyte doesn’t advertise, and probably should, is UASP support, which is heavily advertised with Asus.

      http://www.itvarnews.net/news/analysis/other/gigabyte-to-support-uasp-on-usb-30-motherboards.html/11322

      Apparently it is supported on all their boards, but I haven’t seen direct documentation of it other than a few news articles. Add that to their usb charging capability while off, and independent esd/fuse protection, and they probably have the best board on the market.

    • Wirko
    • 6 years ago

    Buzzing sound? Of course you have buzzing sound. I wouldn’t expect anything else if I had audio circuitry and jacks an inch or two away from powerful VRMs on both the motherboard and the graphics card. Even the conspicuous “moat” highlighted with LEDs doesn’t help here. Proper shielding would likely help, but that’s what manufacturers are saving for next year’s “audiophile” motherboards.

    • wirerogue
    • 6 years ago

    since the primary purpose of most reviews is to help the end user make buying decisions, i find that further research on this board must still be done.

    this review was quite thorough in regards to its features and cpu performance, however, i did find one glaring aspect. since this board is obviously targeted at gamers running triple and quad gpu setups, i would have expected some actual testing with triple and quad setups. simply repeating the manufacturer’s specifications that it can be done, doesn’t make me feel all warm and fuzzy inside. i have had bad experiences with boards that claimed quad gpu support when in truth, they did not.

    certain products will require that you break the standard review template in order to accommodate and test the features that are emphasized in the product. in this case the 4 lime green pci-e slots that dominate the motherboard.

    • derFunkenstein
    • 6 years ago

    So the moral of the story is if you really want discrete sound, your’e better off spending $150-ish on a motherboar, buying a Xonar (or even a Sound Blaster, if you have to have Creative’s crapware!), and paying less in the process. If this was a mini ITX board with really good onboard sound then great – you don’t have to make a sacrifice. But it’s not. On a full ATX board this is ridiculous.

      • bthylafh
      • 6 years ago

      Now I’ve caught a wild pig and her piglets, but I’m confused where I’m supposed to slot in the Xonar.

        • derFunkenstein
        • 6 years ago

        OMG two edits and I still can’t spell motherboard. I’m leaving it, that made me laugh out loud.

      • ALiLPinkMonster
      • 6 years ago

      Unless you’re filling all four of those PCI slots with a graphics card (because you have unlimited money I guess). I think that was the whole point of the “high end” integrated audio. Like others have mentioned though, it would have been MUCH better implemented with proper shielding. It also wouldn’t have been too difficult to squeeze one of those x1 slots into the top position so you’d still have room for a proper sound card.

      • PenGun
      • 6 years ago

      No, you are better off with one of these:

      http://www.amazon.com/dp/B0093KZWRE/ref=pe_385040_30332200_pe_309540_26725410_item

      Just amazing.

        • l33t-g4m3r
        • 6 years ago

        Does that device calculate hrtf, or have any sound processing capability other than amplification?

        Products like that are why pc audio has stagnated for so long. Audiophiles find $300 gold plated monster hdmi cables more attractive than buying a dedicated sound card that supports hrtf. Not saying amplification doesn’t have a purpose, but it should be included with your sound card, and software should never be compromised for the sake of op-amps and other snake oil features.

          • Waco
          • 6 years ago

          All sound processing is done in drivers these days…hardware acceleration of HRTF went out the window with Vista.

          Unless I’m totally confused, which does happen.

            • l33t-g4m3r
            • 6 years ago

            Then what’s True Audio? Just because MS ruined the 3d sound market doesn’t mean you can’t buy a sound card that bypasses the MS audio stack via True Audio / OpenAL, or supports additional 3d features like cmss3d / dolby headphone virtualization.

            Regardless of what brand you buy, amd, xonar, creative, dedicated sound cards will have better positional audio than onboard, due to the additional features supported in their drivers. Sure, a lot of it is in software, especially in cases like the xonar, but it’s software that you won’t get using onboard.

            The sound card market has recovered from Vista, so I think it’s time to start recognizing cards that support additional 3d features make a difference over ones that don’t, regardless if it’s hardware or software.

            • Meadows
            • 6 years ago

            The 3D sound market wasn’t ruined. It was equalized.

            • l33t-g4m3r
            • 6 years ago

            How? You’re full of it. AMD/Xonar/Creative features are not available to generic onboard users, not to mention how much lower quality onboard is in general because of cheap components and emi. MS essentially killed everyone off by eliminating the ds3d extension model, meaning you can no longer extend your own 3d code on top of ds3d, which eliminated every single api based off of it. Sound companies today have sidestepped this with alternative api’s and virtualization, which wasn’t extensively used back then.

            Sure, you can do effects in software, but we’ve had SEVEN YEARS of completely worthless software audio, and nobody is extensively using it. It costs too much cpu, and developers don’t want to bother optimizing for it. Titanfall is a perfect example of how poorly software audio is optimized for pc, although you probably think that’s good. Meanwhile, hardware accelerated audio just works, and doesn’t add any overhead. If you think generic software audio over onboard is even vaguely equal or competitive with dedicated sound cards that support hardware accelerated api’s and headphone virtualization after we’ve had years of experience proving otherwise, there’s something wrong with you.

            Vista’s legacy has created abominations such as 5.1 headsets and games like Titanfall. That’s not progress, that’s regress.

            Next up, Meadows sponsors software mode dx11 graphics. Why? That’s just how his mind works.

            • Meadows
            • 6 years ago

            As I already told you once before: hot air, my son. That’s what you are full of.

            Game developers could’ve coded complete realistic audio solutions in software years ago if they bothered at all. They didn’t bother to. That’s not Microsoft’s fault, or anybody else’s.

            That AMD is again championing “hardware accelerated” audio is simply them being AMD again: having a good idea and being faaar too late with marketing it.

            • l33t-g4m3r
            • 6 years ago

            You haven’t said anything that justifies your position; in fact, it completely validates what I’ve already said.

            “Game developers could’ve coded complete realistic audio solutions in software years ago if they bothered at all. They didn’t bother to.”

            Yeah, and why’s that? Isn’t it because software audio completely sucks, and is performance sapping unoptimized garbage? Every time I hear a developer explaining their audio development, that’s pretty much what they say. Taking away developer choice IS Microsoft’s fault.

            “That AMD is again championing ‘hardware accelerated’ audio is simply them being AMD again: having a good idea and being faaar too late with marketing it.”

            1. You’re admitting it’s a good idea. 2. It’s not too late, rather it’s about time, because people are tired of low quality software audio, plus developers will be more apt to use 3d audio when processing effects are “free”. We aren’t going to get better sound without: A: hardware acceleration. B: Massive improvements to MS’s API that not only are more efficient, but easier for developers to implement.

            • Meadows
            • 6 years ago

            Link some of these interviews, will you? I suspect each time a developer says audio processing is “unoptimized garbage”, they just refer to available licensed libraries. They could optimise the living gobbledygook out of that code if they bothered to.

            They do not, because design-by-committee deems processed audio “not important enough” to spend money and man-hours on, so developers are stuck with several year old pre-baked libraries made by people who weren’t actually invested in the thing.

            Truth is, developers fear the bottom line; they’re afraid of the users still rocking one- or two-core systems with technology from 3-5 years ago, despite the fact that even those computers could do moderate spatial effects, head-related or occlusion filtering, or echo or phase shifting (or anything else) in software without suffering anything close to a major performance hit.

            And another truth is, developers are stupid (or they are commanded to be stupid by lead producers/designers). All you’d need to do is inject a quick hardware test into the game startup sequence that probes CPU performance on first run, and then adjust the “audio processing” option between “simple-normal-realistic” accordingly. That’s all they would have needed to provide. They haven’t even figured this out. I pity the fools.

      • LordVTP
      • 6 years ago

      My D2x still going strong, best sound purchase ever!

      • Thresher
      • 6 years ago

      The sad thing for Creative is that people still are dwelling on the crappy driver packages from the past.

      The cards now are damn good and have no bloatware. You can install the drivers alone or add the control panel.

      I’ve listened to the Xonar cards and I am just not impressed with them. It might just have been a one-off, but the card I listened to just didn’t seem to have any brightness. It was dull, for lack of a better word, and lifeless.

      To each their own, but the Xonar cards just don’t do it for me.

      That said, even a Fiio headphone DAC/AMP will provide better sound if headphones are your thing.

        • derFunkenstein
        • 6 years ago

        Might be sad for Creative, but I hated their fidgety software so much I just quit buying their products, the idea of bloatware eternally etched into my mind.

        Personally, I don’t have a Xonar, though they apparently do better on RightMark, which means they do a better job of representing actual recordings. Softer, brighter, “more space” – that’s all nonsense to me. I want it to sound like it was engineered, and the Xonar (apparently, based on benchmarks) does an excellent job. I’m using an Avid Mbox USB audio interface for everything. I already have it connected to my monitors and that setup plays Windows bloops and beeps fine, and it plays music back the way it was intended. That’s all I expect.

    • Meadows
    • 6 years ago

    [quote<]"First, I compared the unamplified front-channel output to the amplified headphone jack with the stock OP-amp installed. These two were easy to tell apart; the unamplified out sounded dull and muddled, as if it were working with a limited frequency range. With the OP-amp lending a hand, there was more separation between the various elements in each track, and the sound quality improved noticeably overall. Radiohead was more poignant, The Heavy was more soulful, and Neil Young was more engaging. The amplifier added a crisp liveliness and improved clarity, seemingly without any drawbacks."[/quote<] Promise me you will never ever do this again. Empty buzzwords and unspecific qualifiers.

      • Dissonance
      • 6 years ago

      Lemme fix that for you.

      Empty buzzwords and unspecified qualifiers… backed by an objective RMAA analysis that includes over 40 graphs comparing analog output quality.

        • Meadows
        • 6 years ago

        Yes! That part is fine! But [i<]my god[/i<], the *words.*

          • JosiahBradley
          • 6 years ago

          None of those were buzzwords. Those are actual adjectives that correspond directly to an objective metric. You can clearly hear the difference between a clipped frequency response and a full dynamic range.

          It’s basic physics.

            • Meadows
            • 6 years ago

            It’s a *subjective* metric. Music doesn’t sound “poignant”, “soulful” or “engaging.” That’s audiophile-speak, not physics.

            • Chrispy_
            • 6 years ago

            The technical alternatives to poignant, soulful, or engaging are both long-winded and near-meaningless to most people who would be reading this, though.

            Personally, when I describe HiFi kit as punchy, lifeless, empty, or rich, people understand far better than when I talk about bias toward specific frequencies and non-linear response curves. The simple adjectives provoke engagement in the conversation with at least some understanding of the differences. The technowaffle is usually meaningless to people.

            • derFunkenstein
            • 6 years ago

            I don’t want poignant, soulful, engaging, punchy, rich, or whatever other subjective words you can come up with. I want a low noise floor, a flat EQ all across the audible range, and no stereo crosstalk. That’s it. Same thing with speakers.

            • Chrispy_
            • 6 years ago

            Which is why I use studio monitors 😉

            • indeego
            • 6 years ago

            [url<]http://xkcd.com/841/[/url<]

            • NewfieBullet
            • 6 years ago

            They are meaningless adjectives. If not, describe how you would test the difference between poignant and non-poignant audio. What physical property of sound does this buzzword refer to? Can you show the difference between soulful and non-soulful audio on an oscilloscope? What about engaging? They are simply different ways of saying “I liked this, I didn’t like that.”

          • bthylafh
          • 6 years ago

          All those words are terms of art in the audio field.

        • Klyith
        • 6 years ago

        I don’t have as much of an issue with the fluffy words, because how else can you describe a sound? But all of this came from listening with the 555s… Don’t get me wrong, I have a pair of 555s myself; I’d used them for like 7 years until recently & love them to pieces. But they are the very definition of a great headphone that doesn’t really require secondary amplification.

        You didn’t say anything about testing the basic volume difference between the line output and the amped output, but if you didn’t correct for that, it mucks up any plan for neutral comparative listening. A lot of that description sounds like the usual “+3 dB sounds better” effect.

          • cynan
          • 6 years ago

          What makes you think the amplified output would have more volume than the non-amplified one? While it definitely could have, amplification != volume (gain)?

          Edit: Never mind. I guess that’s what Noise Level in the table indicates…

          Late Edit: Original edit made no sense. I stand by my first question.

          • Dissonance
          • 6 years ago

          I should have noted this, but I normalized the volume levels between the different outputs for the listening tests (and for RMAA). The amplified output was much louder than the standard line out.
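          For the curious, one common way to level-match two outputs before a comparison like this is RMS matching on recorded captures. The Python sketch below (NumPy assumed) illustrates the idea with synthetic signals standing in for the two captures; it isn't necessarily the exact method used for the review.

          import numpy as np

          def rms(signal):
              # Root-mean-square level of a capture.
              return np.sqrt(np.mean(np.square(signal, dtype=np.float64)))

          def match_levels(reference, other):
              # Scale `other` so its RMS level equals that of `reference`.
              return other * (rms(reference) / rms(other))

          # Synthetic stand-ins for a quiet line-out capture and a hotter amped one.
          rng = np.random.default_rng(0)
          line_out = 0.1 * rng.standard_normal(48000)
          amped = 0.4 * rng.standard_normal(48000)
          amped_matched = match_levels(line_out, amped)
          print(round(20 * np.log10(rms(amped) / rms(line_out)), 1), "dB apart before matching")
          print(round(20 * np.log10(rms(amped_matched) / rms(line_out)), 1), "dB apart after matching")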

        • keltor
        • 6 years ago

        Unfortunately, RMAA analysis, RTAs, oscilloscopes, and the rest all measure artifacts that your ears cannot discern. When it comes to audio for humans, your ears plus an ABX test are what matters. DACs for computers are a totally solved issue; you don’t need fancy hardware, you just need software that doesn’t fuck up the audio and basic attention to the chip vendor's instructions on layout.
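        To make that concrete, the bookkeeping for an ABX run is trivial; a minimal Python sketch is below. In a real test, A, B, and X would be level-matched clips played back blind, not keyboard prompts, and the trial count and wording here are just placeholders.

        import random

        def run_abx(trials=10):
            # Tally how often the listener correctly identifies X as A or B.
            correct = 0
            for t in range(1, trials + 1):
                x_is_a = random.choice([True, False])
                # A real harness would now play clip A, clip B, and X
                # (a hidden copy of one of them) before asking.
                guess = input(f"Trial {t}: is X the same as A or B? [a/b] ").strip().lower()
                if (guess == "a") == x_is_a:
                    correct += 1
            print(f"{correct}/{trials} correct; ~{trials // 2} is what pure guessing looks like.")

        if __name__ == "__main__":
            run_abx()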

      • TwoEars
      • 6 years ago

      I kind of agree.

      “There was a noticeable increase in sound quality to my ears when using the amplified headphone jack” would have sufficed.

      We don’t want to become Stereophile here; sound is so subjective.

      • bandannaman
      • 6 years ago

      “The music became almost impertinent in its insouciance. It was more effervescent and ineluctable. Here, look at these charts.”

        • Meadows
        • 6 years ago

        I know, right?

      • DancinJack
      • 6 years ago

      +Meadows!

      • tahir2
      • 6 years ago

      To be fair, music is about emotional attachment, and using emotional phrases to convey the sound quality is perfectly fine if you understand the context and the reviewer is consistent. Not everything can be conveyed via numbers and mathematics – ask Minsky.

      You carry on, Geoff. Awesome review, but it needs more discussion of the sound quality. 😉

      Moar emotivism to annoy Meadows would be welcome too.

    • albundy
    • 6 years ago

    Great review. That’s some heavy premium cost, probably associated with the Creative audio. I mostly do passthrough S/PDIF to my receiver via coax and good ol’ Time Warner Cable cable with screw-on coax-to-RCA plugs, so it’s not that important for me to have better audio onboard.

    • StuG
    • 6 years ago

    I refuse to take the step backwards that involves putting small fans on my motherboard heatsinks again. We moved away from that; there should be no need to go back with all the amazing new technology we have.

      • Crackhead Johny
      • 6 years ago

      But you loved lapping the northbridge and strapping a full-blown CPU cooler to it to stabilize 100% overclocks on your BP6, so why would you not want to go back to those days?

        • StuG
        • 6 years ago

        At least with that setup the fan wouldn’t begin to whine in a year and a half, and might actually survive some dust before failing!

        • Forge
        • 6 years ago

        Hey hey hey, everybody put CPU HSFs on their BP6, that sucker got hot.

        Lapping the NB though? Madness. Need a tiny piece of glass and a steady hand.

        Besides, the BP6 was only ever a gimmick looking for a problem to solve. The P2B-D with a pair of P3 600E@800EBs, though? Hawt. The 440GX managed to do all its work while completely uncooled, too.
