Gigabyte’s Aorus GTX 1070 Gaming Box external graphics card reviewed

The recent rise of Thunderbolt 3 ports and external graphics enclosures to go with them has cracked a door to an intoxicating vision of the future: a world where gamers carry thin-and-light notebooks without power-hungry discrete graphics chips inside to class or work. Once they’re back at their desks, those same folks plug in a single cable from an external graphics enclosure and get all of the pixel-pushing power of a desktop graphics card. That’s the dream, anyway. The reality of Thunderbolt 3 and external graphics is a lot more complex.

A while back, Gigabyte offered to let us take a step into this vision of the future with its Aorus GTX 1070 Gaming Box, a pint-size powerhouse of an external graphics enclosure. We happily accepted. Despite measuring just 8.3″ long by 3.7″ wide by 6.4″ deep and weighing in at about five pounds (2.4 kg), the Gaming Box contains a desktop GeForce GTX 1070 that’s paired with a small-yet-highly-efficient 450W power supply. One can get this entire package for $569 right now, or about a $120 premium over a bare GTX 1070 card. Given the usually exorbitant prices of empty Thunderbolt 3 external-graphics enclosures like Razer’s $500 Core v2 and Akitio’s $300 Node, that’s a pretty striking value to start with (so long as you’re willing to sacrifice expansion capacity down the road).

As its name suggests, the Gaming Box is not a complicated piece of hardware. It’s a crinkle-coated steel box with mesh sides and just enough of a PCB inside to support a physical PCIe x16 slot for the graphics card and the ports to the outside world. Two 40-mm fans in front of the power supply help move waste heat from the graphics card, while another tiny fan draws outside air over the internals of the PSU itself.

Gigabyte claims the power supply in this unit, a custom model from reputable maker Enhance, would rate 80 Plus Gold if it were submitted for certification. Without that badge, we can only go by the company’s own “greater than 90%” efficiency figure. Foam gaskets around the air intake for the PSU and the exhaust sides of the fans might be an attempt to avoid cross-contaminating the power supply intake with hot exhaust, though it’s equally possible the foam is just a sound- and vibration-deadening measure.

Because this is 2017, a strip of RGB LEDs under the graphics card provides a bit of extra visual flair to the Gaming Box. Gigabyte’s Aorus Graphics Engine software should allow control of these blinkenlights. Try as I might, though, I was never able to establish control over those LEDs even with multiple software and firmware updates. While I wish I could tweak those lights, the default rainbow cycling mode is inoffensive enough and provides a visual cue that the box is powered on. Hopefully this stubbornness is simply a characteristic of my particular sample and not a broader issue.

On top of its pixel-pushing ambitions, the Gaming Box has what it takes to serve as a basic docking station. To start with, its Thunderbolt 3 port can send up to 100W of charging power to USB Power Delivery 3.0-compliant laptops. Despite Thunderbolt 3’s ability to transport USB 3.1 Gen 2 signals, all of the USB ports on the Gaming Box are good old USB 3.0. That’s probably more than sufficient for the mice, keyboards, or network adapters that might end up connected to this box as a docking station. The three blue ports on this box are the only ones that actually provide data connectivity—the orange port is for power only. Qualcomm Quick Charge 3.0-compliant devices can enjoy a faster top-up from this orange port than devices without. With its peripheral connectivity and charging-dock powers, the Gaming Box’s $120 premium over a bare GTX 1070 seems like a great value.

For folks who want to take both their laptop and the Gaming Box on the go, Gigabyte includes a nicely-padded carrying case that’s discreetly embroidered with the Aorus logo. The case has just enough room for the Gaming Box and its cabling across all of its pockets, so don’t expect it to turn into your go-bag for LAN parties or the like. Still, the carrying case is a nice extra for an accessory that could conceivably hit the road with the laptops it’s poised to augment.

 

Gigabyte turns a shrink ray on the GTX 1070

The GeForce GTX 1070 inside the Gaming Box is a mini-marvel. The conventional wisdom about Mini-ITX graphics cards is that one gives up a lot of performance and quiet manners when building short boards like this one, but the power efficiency of the Pascal architecture is such that Nvidia’s board partners can now put the same chip into a wide range of desktop-friendly form factors without harming performance much, if at all.

Don’t tell anybody I said so, but oversized graphics coolers on Nvidia cards these days (short of those for the GTX 1080 Ti) are often more about making a statement than they are about the actual cooling needs of the chips beneath. Sawed-off coolers like the one Gigabyte employs here can still let the card beneath extract plenty of performance from the GP104 GPU.

Although most owners won’t take apart the Gaming Box like we have, it’s worth touring the graphics card inside to appreciate how Gigabyte pulled off this tiny terror. To make a full-fat GTX 1070 this short, Gigabyte moved most of the card’s power-delivery circuitry to the back of the board and covered it with an aluminum backplate for potentially better heat transfer.

The cooler itself relies on a blow-down-style fan. Three large copper heat pipes run over the GP104 GPU into a dense aluminum fin stack. Despite the sawed-off cooler, Gigabyte still specs this card for 1531 MHz base and 1721 MHz boost speeds—a minor increase over Nvidia’s reference clocks, but an increase nonetheless. The Pascal architecture’s GPU Boost 3.0 feature means those on-paper specs don’t mean much in practice.

In my testing, the card settled in around 1860 MHz and a maximum temperature of 70°C in operation. Those aren’t the highest clock speeds we’ve ever seen from a GP104 graphics card, but they’re still plenty good for one this compact. You can even overclock this thing if you want, given that all of its knobs and dials are unlocked in the Aorus Graphics Engine software. I tried, and it works.

Even better, my entire notebook-and-Gaming-Box setup doesn’t produce more than 36 dBA at one meter—a figure that most will barely notice even in shared spaces. If nothing else, Gigabyte deserves kudos for putting such a high-performance graphics card into a footprint this compact without compromising performance or acoustics.

The only unfortunate thing about the GTX 1070 Gigabyte puts in the Gaming Box is its complement of display outputs. Gigabyte includes two DVI-D connectors, a single DisplayPort 1.4 connector, and a single HDMI 2.0b connector on this card. That’s a strict downgrade, we think, compared to the trio of DisplayPort connectors and single HDMI output that grace the GTX 1070 Founders Edition, among other such cards. Folks looking to daisy-chain Thunderbolt displays off the Gaming Box will be disappointed by its single TB3 connector, too.

So who’s this for?

 Figuring out just who needs an external graphics enclosure set off one of the most heated debates around the TR water cooler that we’ve ever had, and the discussion wasn’t conclusive by any stretch of the imagination. We’re generally a desk-bound lot, and that inclines us toward compact PCs with full-fat desktop hardware inside if space is a concern. A lot of folks can’t live without mobile systems, though, and we figure that anybody shopping for one of these external graphics boxes is trying to maximize the portability and battery life of the laptop portion of the pairing. That approach points toward an ultrabook-type system without a discrete graphics card.

The folks at Ultrabook Review have assembled a nice compendium of thin-and-light systems with Thunderbolt 3 ports, and the sweet spot for such a system appears to be about $1000. If one builds a docking station for that notebook around the Gaming Box and an external monitor—and as we’ll see, an external monitor is mandatory for the best performance from a setup like this—they could easily flirt with $1800 to $2000 or more in total. Those dollar figures will doubtless cause the many system builders in the TR audience to lament just how powerful a gaming desktop one could build for the same money, but that misses the point of a mobile system: convenience.

For many, there would seem to be a lot of value in being able to unplug their laptop from an external graphics box and take it on the go. A dorm dweller might find it a lot easier to unplug their laptop on the way to class in the morning and plug it into the Gaming Box upon returning in the evening. That same dorm dweller might find it a lot easier to pack up an external monitor and the Gaming Box at the end of the semester than a separate desktop PC, too. Mobile folks who call small apartments home base might appreciate the floor or desk space saved by not having even a Mini-ITX PC to go with their laptops. There’s also something to be said for not having to care for and feed an entire separate PC with its own files and Windows license, period. The Gaming Box might not be ideal for everybody, but it certainly has a niche it might fill.

 

Meet our Alienware 13 R3 mothership

Although a thin-and-light notebook without a discrete graphics card inside would seem to be the ideal platform for testing an external graphics enclosure, my own tastes in notebooks run toward something a bit different. As it happens, I picked up a fully-loaded Alienware 13 R3 from Dell’s refurbished stock a little while ago to scratch an itch for an OLED screen that I couldn’t satisfy any other way.

Although this notebook already has a powerful GeForce GTX 1060 6GB inside, its Thunderbolt 3 port makes it an ideal candidate for exploring the performance of the Gaming Box. By pure luck, it turns out that the Alienware 13 likely has the best Thunderbolt 3 configuration imaginable this side of a MacBook Pro. Using HWiNFO64, I verified that the GeForce GTX 1060 6GB inside needs only eight of the CPU’s 16 PCIe 3.0 lanes, so Dell runs four more of those precious lanes directly to the system’s Thunderbolt 3 controller. That means no chipset silicon or DMI link stands between a Thunderbolt 3 device and the CPU or main memory.

All that is important to note because not all Thunderbolt 3 ports are the same. Intel’s Thunderbolt controllers can have as many as four PCIe 3.0 lanes dedicated to them, but system designers are free to use just two PCIe 3.0 lanes if they choose. Witness Dell’s XPS 13 and XPS 15 systems, for just one example. Those lanes also might not come directly from the CPU—they could just as easily come from the chipset, which talks to the CPU over the DMI 3.0 bus. Recall that DMI 3.0 offers roughly the equivalent of four lanes of PCIe 3.0 for all of the traffic from devices and peripherals connected to the chipset. For reasons we’ll examine shortly, the Gaming Box really wants all of the PCIe bandwidth from a Thunderbolt 3 controller that it can get, so it’s important to choose a well-configured system to get the most out of an external graphics box.

Using the Alienware 13 R3 as my test bed also lets me indulge another curiosity in the form of the company’s 100% proprietary Graphics Amplifier external GPU enclosure. This $200 box ($150 when I bought it on sale) doesn’t come with a graphics card inside. What that money does get you is a 460 W custom power supply with two eight-pin PCIe connectors and what would seem like plenty of room for lengthy desktop graphics cards. Alienware also puts a USB 3.0 card in the Graphics Amp that offers four such ports from the back of the unit.

Alienware doesn’t say just how much bandwidth the Graphics Amplifier gives you on the enclosure’s product page, but another quick trip into HWiNFO64 with the entire rig fired up reveals that we’re getting four lanes of PCIe 3.0 connectivity. As it does with the Thunderbolt 3 controller in the Alienware 13 R3, Dell devotes four PCIe lanes directly from the CPU to the Graphics Amp port.

Alienware’s six-foot proprietary cable is much longer than the Gaming Box’s Thunderbolt 3 cable, but it can’t be hot-plugged like the Thunderbolt 3 cable can. Shifting pixel-pushing duties between the Graphics Amp and the internal graphics card requires a system restart.

Aside from being 100% proprietary, the Graphics Amp has some issues. For one, this huge external enclosure is almost as large as a Mini-ITX PC in itself, mostly thanks to the full-size ATX PSU inside. Despite the monster footprint, the minimal headroom around the PCIe bracket in the Graphics Amp means it can really only hold reference versions of common graphics cards. Any graphics card that’s more than two slots tall or has a cooler that extends beyond the edge of the PCIe expansion bracket will probably prevent the Graphics Amplifier from closing.

Being restricted to reference designs isn’t the worst thing in the world if you’re using a power-efficient Nvidia Founders Edition card, but it’s annoying for noisy cards like the Radeon RX Vega 56 and RX Vega 64 where upcoming aftermarket designs might be more desirable. Some lower-end graphics cards don’t even have reference designs available, so buyers would need to carefully eyeball the footprint of those pixel-pushers before pairing them with a Graphics Amplifier. Contrast that with the wide potential compatibility and grab-and-go nature of the Gaming Box, and it’s easy to see the appeal of a Thunderbolt 3 enclosure.

Given its proprietary nature, why include the Graphics Amp in this review at all? Curiosity, really. The Graphics Amp’s dedicated PCIe 3.0 x4 connection lets us see whether and how much Thunderbolt 3 is limiting performance. You see, despite the 40 Gb/s figure that Intel and its partners tout for Thunderbolt 3, the real-world performance of the interface isn’t quite so simple.

Probing Thunderbolt 3 raises more questions than answers

The do-everything, single-plug nature of Thunderbolt 3 conceals the complexity behind a port that can handle data, DisplayPort, and USB 3.1 Gen 2 traffic all at once. The 40 Gb/s figure you’ll incessantly hear regarding Thunderbolt 3 products might not be dedicated to any one of those activities, and the controller can carve up that bandwidth to serve connected devices as needed. The major point to be aware of is that saturating that 40 Gb/s link will usually involve multiple protocols on the same cable, namely DisplayPort plus USB or general data traffic in what I imagine will be the most common scenario.

A breakdown of various Thunderbolt bandwidth allocations in common potential scenarios. Source: Intel

Perhaps because of that great flexibility and multi-protocol readiness, Intel offers conflicting accounts of just how much of that 40 Gb/s will be available to connected “data” devices without competing forms of traffic on the wire. In its technology brief (PDF), the blue team claims that a Thunderbolt 3 link can easily handle 40 Gb/s of bi-directional data traffic, no questions asked. Later on, however, the same document suggests that even without display traffic in the mix, Thunderbolt 3 “data-only” connections are limited to 22 Gb/s of bi-directional bandwidth. I asked Intel for clarification on this point, and the company says that 22 Gb/s figure is “a conservative estimate of the net or effective bandwidth” available from the connection to data devices once overhead is accounted for.

Given our results, I’m not sure I’d call that 22 Gb/s figure conservative. Even with four CPU PCIe 3.0 lanes running to our host laptop’s Thunderbolt 3 controller, the Gaming Box doesn’t appear to consume anything close to the 3.94 GB/s maximum transfer rate those lanes would seem to allow. AIDA64’s GPGPU benchmark and the CUDA-Z utility are favored tools for evaluating the performance of one’s host-to-device link among the external-graphics community, so it seemed only fitting to run those benchmarks as I began my own exploration of the performance of the Thunderbolt 3 interface itself.
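For reference, the theoretical figures above fall out of some quick back-of-the-envelope math. This sketch derives the 3.94 GB/s ceiling from the PCIe 3.0 line rate and encoding, along with the byte-per-second equivalent of Intel's 22 Gb/s figure (the function name is ours, not from any tool used in testing):

```python
def pcie3_bandwidth_gbs(lanes: int) -> float:
    """Theoretical PCIe 3.0 payload bandwidth in GB/s.

    Each PCIe 3.0 lane runs at 8 GT/s with 128b/130b encoding,
    so the usable bit rate per lane is 8 * (128/130) Gb/s.
    Dividing by 8 converts bits to bytes.
    """
    return lanes * 8 * (128 / 130) / 8


# Four lanes, as routed to the Alienware 13 R3's Thunderbolt controller:
print(round(pcie3_bandwidth_gbs(4), 2))   # ~3.94 GB/s

# Intel's "conservative" 22 Gb/s data-only Thunderbolt 3 figure, in GB/s:
print(22 / 8)                             # 2.75 GB/s
```

Note that this ignores protocol overhead from TLP headers and the like, which is part of why measured numbers always land below the theoretical line rate.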

In AIDA64’s GPGPU benchmark, the memory write test is a directed measure of host-to-device bandwidth. By that measure, the benchmark turns in a result of 2218 MB/s with the GTX 1070 in the Gaming Box. Plop that same card in the Alienware Graphics Amplifier, and you get 2841 MB/s—roughly 28% better. That’s still well short of the theoretical maximum we’d expect to see from four lanes of PCIe 3.0, but it’s substantially better than Thunderbolt 3’s result. For perspective, the GTX 1060 6GB notches a 6021 MB/s memory write result with eight lanes of PCIe 3.0 behind it.

The numbers play out similarly in the CUDA-Z utility, which has a host-to-device bandwidth test of its own. With this utility, we get a 2141 MB/s host-to-device bandwidth from the Gaming Box and a 2802 MB/s figure for the Graphics Amplifier. Once again, the GTX 1060 6 GB puts its eight lanes of PCIe 3.0 to good use and turns in a 5806 MB/s result.

In fairness, these are tests of compute performance, not graphics performance, and we don’t see anything approaching saturation of the PCIe 3.0 bus from these benchmarks even with 16 lanes running directly from a Core i9-7900X to a GeForce GTX 1080 Ti. Still, the memory write tests for the Gaming Box from AIDA64’s GPGPU benchmark and the host-to-device bandwidth tests in CUDA-Z land well under even the 2.75 GB/s or so that Intel’s 22 Gb/s guidepost would allow in practice.

Given that the number of lanes devoted to a Thunderbolt 3 controller can vary, and that their source can change from laptop to laptop, some potential characteristics of a Thunderbolt-to-CPU link don’t seem ideal for latency-sensitive applications like graphics. We might have limited bandwidth, multiple controller chips, and several potential hops through intermediate nodes on the way to the CPU and system memory.

Worse, there’s no way to figure most any of this out without a given system in one’s hands. Laptop makers are notoriously opaque about the internals of their systems, and a full run-down of the source and number of PCIe lanes dedicated to a given system’s Thunderbolt controller basically requires an independent analysis. Dell at least provides a table of its Thunderbolt 3 systems and the number of lanes it devotes to their controllers, but not whether they come from the CPU or the chipset. To my knowledge, though, most companies aren’t nearly as forthcoming, and that’s a problem for folks looking for the best performance out of their Thunderbolt 3 peripherals.

With all of these ground rules laid, let’s see how a Thunderbolt 3 external graphics enclosure performs in practice with the Gaming Box.

 

Doom (1920×1080)
Doom remains one of the fastest and most furious games around, so it was a natural choice as we began our explorations of external-graphics performance. Unless otherwise noted, Doom and all of the other games in our test suite were run on an external monitor using the GTX 1070’s DisplayPort connection.

While we’d usually benchmark Doom with the Vulkan renderer, the game wouldn’t start on the Gaming Box with Vulkan enabled. I decided that it was better to have performance data from OpenGL than to not benchmark Doom at all, but that incompatibility is just one of the many unexpected paper cuts one might encounter when gaming on an external graphics enclosure.

Doom gets off to a good start on our external graphics enclosures. Both external graphics boxes offer a nice boost over the Alienware 13 R3’s internal GTX 1060 6GB. Even so, the Gaming Box’s GTX 1070 loses at least 13% of its performance potential to Thunderbolt 3 versus how it runs on the PCIe 3.0 x4 connection of the Graphics Amplifier, and its 99th-percentile frame time increases by 15%.


These “time spent beyond X” graphs are meant to show “badness,” those instances where animation may be less than fluid—or at least less than perfect. The formulas behind these graphs add up the amount of time our graphics card spends beyond certain frame-time thresholds, each with an important implication for gaming smoothness. Recall that our graphics-card tests all consist of one-minute test runs and that 1000 ms equals one second to fully appreciate this data.

The 50-ms threshold is the most notable one, since it corresponds to a 20-FPS average. We figure if you’re not rendering any faster than 20 FPS, even for a moment, then the user is likely to perceive a slowdown. 33.3 ms correlates to 30 FPS, or a 30-Hz refresh rate. Go lower than that with vsync on, and you’re into the bad voodoo of quantization slowdowns. 16.7 ms correlates to 60 FPS, that golden mark that we’d like to achieve (or surpass) for each and every frame, while a constant stream of frames at 8.3-ms intervals would correspond to 120 FPS.
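The bookkeeping behind these graphs is straightforward. Here's a minimal sketch of how such "time spent beyond X" totals can be tallied from a run's frame times, assuming each long frame contributes only its excess over the threshold (the function name and the toy frame-time list are ours, for illustration):

```python
def time_spent_beyond(frame_times_ms, threshold_ms):
    """Sum the portion of each frame time that exceeds threshold_ms.

    A 52-ms frame against a 50-ms threshold contributes 2 ms of
    "badness"; frames at or under the threshold contribute nothing.
    """
    return sum(t - threshold_ms for t in frame_times_ms if t > threshold_ms)


# A toy run: mostly-smooth frames plus one big hitch at 52 ms.
frames = [8.0, 9.0, 16.0, 52.0, 10.0, 35.0]

# Tally badness at the usual thresholds (50, 33.3, 16.7, and 8.3 ms).
for threshold in (50.0, 33.3, 16.7, 8.3):
    print(threshold, round(time_spent_beyond(frames, threshold), 1))
```

Note how the single 52-ms hitch puts time in every bucket at once, which is exactly why a lone spike shows up at the 50-ms and 33.3-ms marks in the graphs below.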

At the 50-ms and 33.3-ms marks, the Gaming Box puts some time in the bucket thanks to one big spike toward the beginning of our run. Its frame-time plot otherwise meshes with the vanishing amount of time all of these cards spend past 16.7 ms. The real story appears at the 8.3-ms mark, where the Gaming Box spends less than half the time under 120 FPS that the internal GTX 1060 6GB does. The Thunderbolt 3 interface keeps the Gaming Box from total dominance, though, as the Graphics Amp spends half the time past 8.3 ms that the Aorus box does. Still, both external graphics boxes offer a major boost over the Alienware 13 R3’s stock guts.

Next, let’s switch over to the Alienware 13 R3’s internal display and see how the choice of monitor affects performance at this resolution.

 

Doom (1920×1080, internal display)




Asking the GTX 1070 to pipe finished frames back to the internal display on the Alienware 13 R3 has a major effect on performance and delivered smoothness. The Gaming Box loses a whopping 31.2% of its performance potential this way, and its 99th-percentile frame time rises by about 35% compared to piping finished frames to an external screen. Those drops are enough to take the Gaming Box’s performance well below that of the internal GTX 1060 6 GB.

The extra bandwidth that the Alienware Graphics Amplifier has on tap isn’t enough to stave off a large performance loss of its own. The PCIe 3.0 x4 connection does lessen the blow, however, resulting in a roughly 20% decrease in average FPS compared to running an external display. 99th-percentile frame times rise by about 18%, but they still stay well below that of the GTX 1060 6 GB, to say nothing of the Gaming Box in this state. Doom is still plenty fast and smooth with this setup, where it doesn’t feel nearly so nice with the Gaming Box.



Apologies for the rather unwieldy graph setup above, but we can’t easily do a side-by-side comparison with our time-spent-beyond data. Piping the output of these cards to the laptop’s internal display has practically no effect on the vanishing amount of time these cards spend past 16.7 ms. Flip over to the 8.3-ms mark, though, and we get a fine picture of just how the round trip out to the graphics card and back to the display goes. The Gaming Box goes from a nice lead over the GTX 1060 6 GB to trailing it by a substantial margin. The Graphics Amp’s degraded performance in this scenario still leaves it with a substantial lead over the GTX 1060 6 GB, to say nothing of the Gaming Box.

Next, let’s flip back to an external monitor to see how Doom performs at 2560×1440 on these setups.

 

Doom (2560×1440)

When we push up the resolution in Doom, both external graphics boxes still deliver impressive frame rates, but the Gaming Box’s 99th-percentile frame times take a dive even past those of the internal GTX 1060 6GB.


Our time-spent-beyond analysis shows just how hard of a dive we’re talking. Despite the high average frame rates it allows the GTX 1070 inside to deliver, the Aorus Gaming Box puts a few milliseconds up past 50 ms and 33.3 ms. Those tough frames further add up to a little over a second and a half spent beyond 16.7 ms for the Gaming Box. That’s still nothing much compared to the nearly seven seconds that the GTX 1060 6GB spends on similarly tough frames, though. A trip to the 8.3-ms graphs shows a wider gap between the external graphics boxes. The Graphics Amp spends nearly seven fewer seconds under 120 FPS than the Gaming Box. Both boxes still offer higher performance than the GTX 1060 6GB, but it’s clear that the higher resolution has a greater-than-expected negative effect on performance with our external graphics contenders.

 

Doom (2560×1440, internal display)




Driving the Alienware 13 R3’s internal display at its native resolution isn’t a fun task for either external graphics enclosure, but it’s especially trying for the Gaming Box. The Thunderbolt 3 enclosure loses 30% of its performance potential to the display-routing decision alone, and its 99th-percentile frame time runs way closer to the 33.3-ms mark than I’d prefer from a graphics card this powerful. You’ll feel that result in distinctly hitchy and loping gameplay at these settings with the Gaming Box.

The Graphics Amplifier softens the blow to the GTX 1070, even if it does still lose about 24% of its performance potential versus an outboard display. Weirdly enough, though, the GTX 1070’s 99th-percentile frame time actually improves in the Graphics Amp versus its behavior with an external monitor. Odd. Even so, that result means the Graphics Amp still delivers the smoothest gaming experience possible for the GTX 1070.



Given the average-FPS ranges we’re working with, it’s most useful to consider how much time past 16.7 ms these cards spend working on tough frames. There’s no polite way to put it: the Gaming Box falls flat on its face when we ask it to drive an internal monitor in this title. The Thunderbolt 3 enclosure racks up more than an extra 10 seconds past 16.7 ms when we switch from an external display to the internal one, and that extra time translates to a less-than-pleasant gaming experience in Doom at these settings.

While we won’t be testing every game in our test suite with both internal and external monitors, our results with Doom do suggest that the performance losses inherent to driving a laptop’s internal display with today’s external-graphics interfaces are so dear that this is simply not the ideal way to enjoy the power of an external-graphics setup. Future interfaces will apparently need to provide much higher bandwidth to bring external graphics enclosures on par with internal cards if a gamer wants to drive a notebook’s internal monitor.

 

Gears of War 4 (1920×1080)
Gears of War 4 is another DirectX 12 title that’s become a staple for modern GPU testing. Thanks to the guiding hand of Microsoft and The Coalition, this title offers a GPU-vendor-neutral DirectX 12 environment from which to draw performance results. To judge its performance, I took a one-minute stroll through a convenient section of “The Raid” at the beginning of the game. I used the Ultra preset to make our cards sweat.

Gears of War 4 at 1920×1080 isn’t the kindest to the Gaming Box. The GTX 1070 loses 22.2% of its performance in the move from a PCIe 3.0 x4 link to Thunderbolt 3, and its 99th-percentile frame time rises by 25%. That means the Gaming Box allows the GTX 1070 to turn in only a slightly smoother and more fluid gaming experience at these settings than our internal GTX 1060 6GB.


For all that, Gears of War 4 turns in neat-and-clean percentile curves and practically no time of note spent past 16.7 ms on any of these cards. The extra performance potential of the Graphics Amplifier comes out at the 8.3-ms mark, though, where the Alienware box spends a whopping 10 fewer seconds under 120 FPS compared to the Gaming Box. As our 99th-percentile frame time numbers suggested, Thunderbolt 3 takes the Gaming Box a lot closer to the performance of an internal GTX 1060 6GB than not.

 

Gears of War 4 (2560×1440)

The move to a higher resolution in Gears of War 4 slows the GTX 1060 6GB enough that the Gaming Box-hosted GTX 1070 opens a considerable lead in both average FPS and 99th-percentile frame times. The Thunderbolt 3 interface still takes 15% off the GTX 1070’s potential with a full four lanes of PCIe 3.0 to work with, though, and 99th-percentile frame times rise 17%. Even with those losses, the Gaming Box still isn’t far off an optimally smooth gameplay experience. Let’s see just how far off with our time-spent-beyond graphs.


At the 16.7-ms mark and on either interface, the GTX 1070 shows its superiority over the GTX 1060 6GB by spending practically no time at all on tough frames that would drop instantaneous frame rates below 60 FPS. That’s a fine result for the Gaming Box. A look at the 8.3-ms mark shows that the PCIe 3.0 x4-attached GTX 1070 can spend almost six seconds less on tough frames compared to the Thunderbolt 3 Gaming Box, though.

 

Hitman (1920×1080)
Hitman’s DirectX 12 renderer can stress every part of a system, so we cranked the game’s graphics settings and got to testing.

Hitman proves no kinder than Gears of War 4 to the Gaming Box at 1920×1080. The Thunderbolt 3 interface barely allows the GTX 1070 to outpace the internal GTX 1060 6GB in my laptop, and it actually causes the more powerful card to turn in a slightly worse 99th-percentile frame time compared to the GTX 1060 6GB. On a PCIe 3.0 x4 interface, the GTX 1070 delivers a third more frames on average and a 30% lower 99th-percentile frame time than it can on Thunderbolt 3.


Our time-spent-beyond analysis shows just how hard Hitman hits the GTX 1070 on Thunderbolt 3. On the PCIe 3.0 x4 bus, the GTX 1070 spends barely any time at all on tough frames that take longer than 16.7 ms to render. Throw Thunderbolt 3 into the mix, and that time increases by almost four seconds. A look at the 8.3-ms mark also shows a rough break for the Gaming Box, as the Thunderbolt 3 interface holds up the GTX 1070 for almost ten seconds longer than the good old PCIe bus does.

 

Hitman (2560×1440)

At 2560×1440, Hitman lets the Gaming Box-enclosed GTX 1070 separate itself from the internal GTX 1060 6GB a lot better, at least in our average-FPS measure of performance potential. Despite its 20% higher frame rate, though, the 99th-percentile frame time of the Gaming Box-hosted GTX 1070 remains a lot closer to that of the GTX 1060 6GB than not. That result means that even though the TB3-attached external card might provide a more fluid gaming experience than the GTX 1060 6GB, it might not be appreciably smoother than the less-powerful card.


That higher 99th-percentile frame time plays out as a lot more time spent past 16.7 ms for the Gaming Box compared to the Graphics Amplifier. The Gaming Box still delivers an impressive reduction in that figure relative to the GTX 1060 6GB, but it’s clear that Thunderbolt 3 is leaving a lot of potential smoothness and fluidity on the table.

 

Rise of the Tomb Raider (1920×1080)
Rise of the Tomb Raider remains a gorgeous game today, and its DirectX 12 renderer means it remains useful for assessing our graphics cards’ performance, too.

Back at 1920×1080, Rise of the Tomb Raider once again causes the GTX 1070 to perform more like a GTX 1060 than not, both in performance potential and delivered smoothness. The wider bus afforded by the Alienware box really lets the GTX 1070 stretch its legs at this lower resolution.


Although the Gaming Box spends nearly three seconds less than the internal GTX 1060 6GB past 16.7 ms, the same card in the higher-bandwidth Graphics Amplifier spends practically no time working on tough frames that take more time than that to render. That means that Thunderbolt 3 is potentially resulting in a less-smooth gaming experience, all else being equal.

 

Rise of the Tomb Raider (2560×1440)

The Gaming Box turns in a much better result with Rise of the Tomb Raider at 2560×1440. Where the GTX 1060 6GB’s average frame rate and 99th-percentile frame times crumble, the TB3-connected GTX 1070 stays strong. It loses less than 10% of its performance potential to the Graphics Amplifier and only ends up slightly behind in the 99th-percentile frame time race.


That slightly higher 99th-percentile frame time translates into a few more seconds spent past 16.7 ms working on tough frames for the Gaming Box compared to when its GTX 1070 is inside Alienware’s Graphics Amp. Still, Rise of the Tomb Raider proves a strong showing for the Gaming Box relative to the GTX 1060 6GB in my laptop.

 

The Witcher 3 (1920×1080)

The Witcher 3 doesn’t like any of these platforms much. The Gaming Box actually falls behind the GTX 1060 6GB in both performance potential and delivered smoothness, and the 99th-percentile frame time of the Graphics Amplifier is worse than we might expect for a 75-FPS average.


Those high 99th-percentile frame times hurt the GTX 1060 6GB and the Gaming Box more than they do the Graphics Amplifier, though. Where the Alienware box spends a little over a second on tough frames that take longer than 16.7 ms to render, the internal GTX 1060 6GB turns in nine seconds past that threshold. The Gaming Box heaps almost four more seconds on top of that.

 

The Witcher 3 (2560×1440)

The course of our testing so far suggests that higher resolutions would help the Gaming Box, but The Witcher 3 runs worse. Much worse, in fact. The GTX 1060 6GB and the Gaming Box turn in about equal measures of performance potential, but the external enclosure’s 99th-percentile frame time falls well behind that of the internal graphics chip. The Graphics Amp remains relatively unperturbed by the increase in resolution, turning in a perfectly playable (if not outstanding) performance.


Unusually for any recent review covering high-end graphics products, it’s most informative to start our analysis at the time-spent-beyond-50-ms mark. You can feel the nearly-a-second that the Gaming Box spends past 50 ms as sharp jerks in frame delivery, and they’re not pleasant. Worse, the Gaming Box spends nearly four seconds longer than the internal GTX 1060 6GB working on tough frames that take more than 33.3 ms to render, suggesting a particularly gritty frame-delivery experience. Where the Graphics Amp-housed GTX 1070 spends a less-than-optimal 10 seconds on tough frames that take more than 16.7 ms to render, both the GTX 1060 6GB and the Gaming Box need about twice that much time.

 

Conclusions
Gigabyte’s Aorus GTX 1070 Gaming Box offers us a tantalizing look into a future where discrete-graphics power is just a single cable away for notebook PCs. This tiny, quiet external graphics card can sit on a desk without taking up any more room than a large hard-drive enclosure, and for predominantly mobile folks who want to game but don’t want to keep a separate PC around, the Gaming Box really does provide desktop-class graphics horsepower to mobile systems.

For all the finesse in Gigabyte’s external-graphics execution, though, Thunderbolt 3 doesn’t quite prove the match it’s made out to be for the Gaming Box. The 40 Gb/s figure commonly cited when discussing Thunderbolt 3 bandwidth just isn’t representative of the throughput available to the Gaming Box in our real-world testing. In those benchmarks, the Gaming Box can tap less than even Intel’s 22 Gb/s “conservative estimate” of typical throughput on the interface. That limited bandwidth would seem to lead to lower performance than we’d expect from the GTX 1070 inside the Gaming Box.
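A back-of-the-envelope comparison puts those figures in context. This sketch only restates the numbers above alongside the published PCIe 3.0 signaling rate; it's arithmetic, not a measurement:

```python
# PCIe 3.0 signals at 8 GT/s per lane with 128b/130b encoding, so a
# four-lane link tops out around 31.5 Gb/s of usable bandwidth.
lanes = 4
pcie3_x4_gbps = lanes * 8 * (128 / 130)

tb3_link_gbps = 40   # the headline Thunderbolt 3 figure
tb3_pcie_gbps = 22   # Intel's conservative estimate for PCIe traffic

print(f"PCIe 3.0 x4:       {pcie3_x4_gbps:.1f} Gb/s")
print(f"TB3 PCIe estimate: {tb3_pcie_gbps} Gb/s "
      f"({tb3_pcie_gbps / pcie3_x4_gbps:.0%} of a direct x4 link)")
```

Even at Intel's 22 Gb/s estimate, the interface hands the graphics card roughly 70% of the bandwidth a direct x4 slot would, and our testing suggests the Gaming Box sees less than that in practice.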

Even with our best-case testing setup using an external monitor, the GTX 1070 in the Gaming Box tends to perform more like a mobile GeForce GTX 1060 6GB at 1920×1080, a resolution where we would really expect the more powerful card to stretch its legs. That’s still a good gaming experience for folks whose laptops might not have a discrete graphics card at all, but it’s undeniable that Thunderbolt 3 leaves quite a bit of the card’s performance on the table at lower resolutions. Enlarging one’s canvas to 2560×1440 seems to let the GTX 1070 realize more (but not all) of its potential from this interface.

Either way, an external monitor is mandatory for getting the most out of a Thunderbolt 3 external graphics setup. Accelerating a laptop's internal display over that interface leads to even greater performance losses—enough so that the Gaming Box trails the internal GTX 1060 6GB in our test laptop. In light of that behavior, we simply can't recommend dropping well over $500 on an external graphics enclosure like this one if you aren't planning to use an external monitor with it.

External graphics performance could vary with a given implementation of a Thunderbolt controller in a notebook, too. Laptop makers that don’t run four lanes of PCIe 3.0 to a Thunderbolt controller might get an earful from enthusiasts wondering why the performance of their external graphics enclosure is falling far short of expectations. Some kind of disclosure requirement on the lane-allocation choices behind a given Thunderbolt 3 implementation would seem to be in order.

Last but not least, some games just don’t seem to like external graphics enclosures or limited bandwidth at all. Witness The Witcher 3‘s performance and the unwillingness of Doom‘s Vulkan renderer to even start on the Gaming Box. We can’t say for certain what games will or won’t perform well with external-graphics setups without extensive hands-on testing, so potential buyers will have to stomach the potential risk of performance pitfalls or outright bugs in some titles.

As with all mobile PC hardware, the Gaming Box's substantial compromises—and, most likely, the compromises of other Thunderbolt 3 external graphics enclosures—are ultimately about convenience. So long as you have a well-configured Thunderbolt 3 notebook and the space for an external monitor, the Gaming Box will allow you to take your notebook out and about and plug in a big shot of gaming power when you're back at your desk. If you can live with the same compromises that the Gaming Box makes in the name of that single-cable convenience, its $569 price tag is an excellent value among today's external graphics enclosures. Just be aware of exactly what you're getting into before you take the plunge.

Comments closed
    • squeeb
    • 2 years ago

    I’m running an R3 w/ AGA (RX 470) – it’s kind of cool that we can do this now. The PSU and fan can be replaced with other 3rd party options. As mentioned, the case is mostly limited to reference models.

    • tootercomputer
    • 2 years ago

    kudos to Gigabyte for taking a stab at this with a finished package, including a tote bag even. They are always good about trying out new hardware configurations, e.g., I recall they had a ram drive prior to SSDs, something like that. Despite some of the limitations you found, a lot of folks might still find this a great solution.

    Second thought is that throughput is always less than advertised. Go back to the first SATA specs, USB, FireWire, and if my memory serves me right, none of them ever sustained their advertised throughput.

      • derFunkenstein
      • 2 years ago

      Yes, Gigabyte always liked doing weird stuff. I believe [url=https://techreport.com/review/9312/gigabyte-i-ram-storage-device<]this doohickey[/url<] is what you're recalling.

        • tootercomputer
        • 2 years ago

        Yep, that’s it. Thanks. January, 2006, really not that long ago, but eons ago in storage terms.

        Here’s an interesting computer they designed a little ways back: [url<]https://www.theverge.com/circuitbreaker/2016/12/27/14092678/gigabyte-brix-compact-gaming-pc-strange-design[/url<] They have had some interesting cases over the years.

    • Belldandy
    • 2 years ago

    I tested the Aorus GTX 1070 Gaming Box with an MSI GS43VR-7re laptop’s Thunderbolt 3 port, which is four lanes off the PCH. I’m getting a Device to Host rate of 2670.71 MiB/s and a Host to Device rate of 2138.58 MiB/s in CUDA-Z.

    It would seem that having the Thunderbolt 3 port on the PCH isn’t necessarily a deal breaker.

    • Firestarter
    • 2 years ago

    Your analysis of the performance with the laptop’s monitor vs the external monitor makes me wonder if we will see “eGPU-ready” laptops that feature a displayport *input*, high/variable refresh display and dedicated lanes for the thunderbolt port. Such a laptop+eGPU bundle would be a much easier sell than if you also had to buy a gaming monitor, I think. I don’t think the “eGPU-ready” laptop itself would have to be very expensive either, as it’s just a case of the right configuration, not necessarily expensive components

    • Kretschmer
    • 2 years ago

    Why wouldn’t you bench against a desktop GTX 1070? Sure, some of the lower res tests would be skewed by the CPU delta, but that’s the only way to prove a value frame of reference to the buyer.

      • tsk
      • 2 years ago

      I saw a comparison vs desktop but I cannot find the article right now. However the 1060 was pretty close in performance, the 1070 started to fall considerably behind and the 1080 barely saw any performance improvement over the 1070 when used via eGFX.

      Will dig for the source.

        • Kretschmer
        • 2 years ago

        I know we could go to other site for that info, but I want the best TR articles possible. 🙂

      • derFunkenstein
      • 2 years ago

      As soon as you bench a desktop vs a notebook you’ve got a whole new set of bottlenecks and unknowns. Is the laptop slower because it is limited by Thunderbolt or because the CPU and RAM are slower? And if they perform equally then obviously you’re running into the graphics card’s limitations. How is that useful, and what does that tell us about eGPUs?

        • The Egg
        • 2 years ago

        In theory you could get a desktop CPU with same architecture and number of cores/threads and downclock the CPU and RAM to be the same, but you still have potential differences in cache and chipset.

        IMO, a large part of testing needs to be done on a desktop so you can remove the card from the enclosure and get control numbers on how it performs in a direct x16 slot (if any different from a reference card). Most importantly, you can see exactly what performance is lost by going through Thunderbolt 3 and each specific enclosure — on identical hardware.

          • derFunkenstein
          • 2 years ago

          I doubt you’re going to find a desktop board with both a x16 slot and eGPU capabilities in the Thunderbolt 3 ports

        • Kretschmer
        • 2 years ago

        Most games benched at 1440P and above will be GPU limited.

        It’s imperative to bench an eGPU enclosure against desktop GPUs, because the #1 question for this tech is “Can a laptop and eGPU efficiently replace my gaming desktop?” If you can’t answer that question, the entire discussion is moot.

          • derFunkenstein
          • 2 years ago

          I’m not saying you can’t bench against desktop GPUs. I’m saying you can’t introduce other changes and have a scientific result, even at 1440p.

    • Lianna
    • 2 years ago

    Thank you for the excellent article, nice research (PCIe lane configurations) and great comparison (direct PCIe extender).

    As suggested in Zotac Amp Box news piece, it could be nice to put a speedy and big SSD like Samsung 960 EVO* and**/or an Aquantia 10/5 GbE card to get some work done faster, especially on light 4-core CPU powered ultrabooks.

    * Considering increased latency and limited bandwidth it would be more for capacity than speed reasons and things like Optane 900 would be probably wasted on it.

    ** Would it support PCIe dual-riser/multiplexer?

    • Shobai
    • 2 years ago

    Why a 450W PSU? What other ancillaries are being supplied? Allowing 100W for TB3, where else is the remaining 350W of power used? TR’s recent 1070Ti review suggests that 450W would be enough to supply the whole tested Vega 64 configuration, and that’s widely derided as power hungry…

      • Jeff Kampman
      • 2 years ago

      – Economies of scale (Gigabyte also makes a GTX 1080 Gaming Box with the same chassis)
      – Efficiency (PSUs are often most efficient near half their rated loads, not near 100%)
      – Quality of service over design lifespan (PSUs often lose some of their rated wattage over time due to physical degradation)
      – Simultaneous operation of the graphics card and USB Power Delivery, as you note

        • Shobai
        • 2 years ago

        Thanks for the detailed response!

        I don’t recall hearing about a GTX 1080 version, but that would definitely affect purchasing considerations and especially in such a low volume item. In light of your other comments, though, such a version doesn’t sound particularly compelling.

        I wonder whether you happened to test this enclosure on your usual wattmeter; it doesn’t look like it could be used in direct comparison to your usual numbers, what with possible under-utilisation and other vagaries, but it would be interesting to see the behaviour of the box. I realise the efficiency curve dips at the top end, but it does at the bottom end as well – I would think that, without the USB-PD component, the box would tend to fall off the front of the curve rather than sit in the middle.

        I guess it’s also implementation-specific, and I may have missed TR’s article on it, but can the enclosure charge a connected laptop which is plugged in to its usual power supply?

        Another point you might list would be to enable overclocking headroom, also.

    • James296
    • 2 years ago

    Hmmm, I wonder if you could do a mini-ITX build in that case?

    • tsk
    • 2 years ago

    Excellent article Jeff. Been waiting for something like this to look at all aspects of these eGFX boxes. It seems that it’s pointless to put anything more powerful than a 1060 class card in this as you start to hit diminishing returns above that. It would be interesting to try with AMD cards to see if the performance is similar.

    • Thac0
    • 2 years ago

    Will you guys consider putting together an article that reviews the diminishing returns of increasingly powerful GPUs in external cases? Say from a 1050 Ti to a 1080 Ti?

      • Jeff Kampman
      • 2 years ago

      If there was a lot of interest (and I mean [i<]a lot[/i<]) I might put an SFF GTX 1050 Ti and the GTX 1060 duo through this exercise, but it's already clear with the GTX 1070 that whatever bandwidth TB3 provides is not sufficient. It also doesn't make a ton of sense in real-world terms to put a $150 graphics card in a box that's at least twice as expensive, if not more so. Gigabyte isn't packaging GTX 1050 Tis in eGPU enclosures.

        • SkyWarrior
        • 2 years ago

        I see that and I am adding one more. A possible mini review with a mac mini or macbook pro. I really wish to see if this makes any sense.

          • derFunkenstein
          • 2 years ago

          The Mac Mini as it is will be an even tighter bottleneck, since it’s stuck with 20Gbps Thunderbolt 2. A current-gen MBP with the full four lanes on its Thunderbolt controller (so any of the four-port options using the ports on the left) should be roughly equivalent to the Alienware tested here. Plus you’re stuck with testing in macOS because those MBPs don’t boot Windows 10 with the eGPU enabled without [url=https://egpu.io/bootcamp-setup-guide-tb3-macbook-pro/<]a bunch of fiddling[/url<]. If Jeff was going to do this, I’d want to see it on a system with a Core i5-8250U or i7-8550U, since those CPUs should let the graphics card stretch its legs.

          edit: looks like the Alienware 13 already has a Core i5-7300HQ or Core i7-7700HQ inside, so that would be an even better test of pushing the CPU. Slower CPUs will only hide some of the delta.

      • Bauxite
      • 2 years ago

      I’m gonna start with a mandatory link for anyone interested in these: [url<]https://egpu.io/[/url<] You can find benches for quite a few cards in the last couple generations of both colors; everything from the 750 Ti to Vega release has probably been done, hell I saw a Titan V the other day.

      Your #1 concern is the model of laptop! Each configuration makes a huge difference: not many have proper x4 connections, almost all of them drive it from the PCH instead of the CPU, and there are other poorly-documented things like U-series CPUs often downclocking the bus even more (GT# OPI). There are a couple recent models that are plug-n-play though.

      Somewhat counter-intuitively, these are better for higher resolutions than lower; they are not as far behind at 4K versus 1080p.

      Intel needs to get out of its own way with this stuff though, ~22Gbps is a rather annoying problem and still no hint of combining bandwidth. 2-4 bridged Thunderbolt controllers straight from CPU lanes seems so obvious where it needs to go, along with a reference device board. (hint: be lazy, use ITX, done) MSI made a proper full 3.0 x16 laptop+dock combo (adapted connector blade with power etc) during the Haswell era but nothing else like it has come along since.

        • Bauxite
        • 2 years ago

        Also, it’s nice to see that the 8550U and other quad-core U-series CPUs are actually pretty up to snuff for this, especially if you run external mode and boot with the iGPU disabled. They have decisively shown the bottleneck is all at the Thunderbolt interface.

    • thill9
    • 2 years ago

    Who cluttered up a picture of a Force of Will judge promo card with some electronic gizmos?

      • Jeff Kampman
      • 2 years ago

      It occurred to me after that I should have pulled out a Lightning Bolt, but c’est la vie…

        • CampinCarl
        • 2 years ago

        Why not an actual [url=http://gatherer.wizards.com/Pages/Card/Details.aspx?multiverseid=4561<]Thunderbolt?[/url<]

          • Jeff Kampman
          • 2 years ago

          [url<]https://twitter.com/jkampman_tr/status/944261024869486593[/url<]
