Intel’s NUC8i7HVK “Hades Canyon” gaming PC reviewed

Back at CES this year, one could say that Hades finished freezing over when Intel revealed the full details of its first products with its own CPUs and Radeon graphics on board. As a brief refresher, eighth-gen Core G-series parts join quad-core Kaby Lake CPUs with a Radeon RX Vega M GH graphics processor and 4 GB of HBM2 RAM, all on one package. Intel added another dash of silicon exotica to G-series CPUs by using one of its Embedded Multi-Die Interconnect Bridges (EMIBs) to snug that HBM2 RAM right up against the Radeon RX Vega M GPU itself.

The Intel Core G-series package.

Intel touted a few design wins for Kaby-G processors at CES, but the most interesting among them was its own Hades Canyon NUC, also known as the NUC8i7HVK. This tiny gaming system displaces just 1.2 L, but it houses the most powerful G-series CPU so far: the Core i7-8809G. With a 100-W package power rating to share between a four-core, eight-thread CPU and Radeon RX Vega M GH graphics, plus fully-unlocked CPU, graphics, and memory multipliers, the i7-8809G is easily the most tantalizing implementation of this unholy union for enthusiasts.

For a system its size, Hades Canyon bristles with more connectivity than even some high-end desktops we’ve tested. Around front, we get a USB 3.1 Gen 2 Type-A port and a USB 3.1 Gen 2 Type-C port. Digging around in HWiNFO64 shows an ASMedia ASM2142 controller behind those ports, connected directly to the CPU by two lanes from one of the x4 groups split off from the CPU’s eight remaining PCIe 3.0 lanes. An SD card reader occupies a x1 lane from the other group, and that group’s remaining three lanes go unused. A front-panel HDMI 2.0 port delivers pixels from the Radeon RX Vega M GH. Intel also positions a consumer IR receiver and a quad-microphone far-field array with beamforming on the front of the unit. That mic array should make Hades Canyon useful with voice assistants like Microsoft’s Cortana.

On the back panel, buyers get four USB 3.0 ports and two Thunderbolt 3 ports powered by an Intel JHL6540 Alpine Ridge controller that’s connected to the HM175 chipset. On top of data transmission, those Thunderbolt 3 ports act as DisplayPort connectors for compatible monitors. They get their pixels from the RX Vega M GH. Two more Mini-DisplayPort 1.2 connectors and an HDMI 2.0 port also run off the Radeon graphics chip. As a result, all of the NUC’s dedicated display connections can go hand-in-hand with FreeSync monitors to deliver a tear-free gaming experience.

The lid atop Hades Canyon is easy enough to remove with the included Allen wrench. Doing so reveals a metal frame that supports the RGB LEDs and diffusers beneath this system’s customizable skull logo. Removing just one more screw lets that frame lift out of the unit, revealing the user-serviceable parts of the NUC.

Since this is a barebones system, builders will need to bring their own M.2 SSDs and RAM to the party. Our review system came with one of Intel’s Optane SSD 800P 118-GB SSDs as its system drive and an Intel SSD 545s 512 GB SATA drive as its auxiliary storage. Some further poking around in HWiNFO64 reveals that Intel hooked up the two M.2 slots inside Hades Canyon to the HM175 chipset, as well. Intel also preinstalls one of its Wireless-AC 8265 cards in the NUC’s M.2 E-key slot, and its internal antennas don’t clutter up the clean exterior of the unit.

While I would have liked to have shown the cooling system that Intel uses to move heat away from the Core i7-8809G, tearing down the NUC further than we have here is a fraught process thanks to the range of antenna and lighting cables snaking around the edges of the motherboard. The brave folks at Gamers Nexus did fully take apart their unit, however, and their bravery shows us that Intel’s heatsink combines a relatively large copper vapor chamber with a pair of fin arrays nestled against a pair of blower fans.

Disassembly challenges aside, I find Intel’s PCIe lane routing choices inside this NUC somewhat peculiar. The CPU’s PCIe lanes aren’t fully tapped even as bandwidth-hungry ports and slots sit behind the DMI 3.0 bus that connects the CPU and chipset. It seems especially odd to me that Intel relegated its gee-whiz Thunderbolt 3 controller to chipset PCIe lanes while putting the SD card reader, of all things, on a direct line to the CPU. Few (if any) games are I/O-bound today, but if gamers ever want to supplement Hades Canyon’s onboard graphics with an external graphics card, it’ll have to contend with traffic from the other devices hanging off the HM175 chipset.

To power the NUC, Intel includes a massive 230-W power brick that’s nearly as large as the unit itself. At 7.8″ long by 3.9″ wide by 1″ thick (19.9 by 9.9 by 2.5 cm), this brick has to hide somewhere if an owner wants to keep the NUC’s desktop footprint to a minimum. This power brick might be the one dent in Hades Canyon’s compact cred, although it’s not atypical for powerful SFF PCs to require bulky bricks like this one.

Here are all of Hades Canyon’s specs, condensed into a handy table:

Intel NUC8i7HVK (Hades Canyon)
  • Processor: Intel Core i7-8809G
  • Chipset: Intel HM175
  • Memory: not included; two DDR4 SO-DIMM slots, 32 GB maximum
  • Graphics: AMD Radeon RX Vega M GH with 4 GB HBM2 RAM
  • Storage: not included; two M.2 slots with SATA and NVMe support
  • Front-panel expansion and display outputs: one USB 3.1 Gen 2 Type-A port, one USB 3.1 Gen 1 Type-A port with charging capability, one USB 3.1 Gen 2 Type-C port, one HDMI 2.0 output, one SD card reader, one 3.5-mm headphone/microphone combo jack
  • Rear-panel expansion and display outputs: four USB 3.1 Gen 1 Type-A ports, two Gigabit Ethernet jacks, two Mini DisplayPort 1.2 connectors, two Thunderbolt 3 ports, one HDMI 2.0 output, one combo audio/optical mini-TOSLINK jack
  • Communications: Intel Wireless-AC 8265 (802.11b/g/n/ac), one Intel I219-LM Gigabit Ethernet controller, one Intel I210 Gigabit Ethernet controller, Bluetooth 4.2, quad-mic far-field array with beamforming, front-panel consumer IR receiver
  • Dimensions (W x H x D): 8.7″ x 1.5″ x 5.6″ (221 x 39 x 142 mm)
  • Power adapter (L x W x H): 230 W; 7.8″ x 3.9″ x 1″ (199 x 99 x 25 mm)
  • Operating system: none included
Intel sells the NUC8i7HVK as a barebones unit without storage, memory, or an operating system pre-installed for $999. That’s a lofty price tag, but it’s not as high as it might seem at first glance. We put together a Mini-ITX PC powered by an unlocked Core i5-8600K CPU, 16 GB of DDR4-3200 memory, a GeForce GTX 1060 3 GB for roughly comparable graphics performance, and a Mini-ITX Z370 motherboard with Thunderbolt 3 support, all capped off with a 1-TB M.2 version of Crucial’s MX500 SSD and a 120-mm closed-loop liquid cooler. Putting 16 GB of DDR4-3200 SO-DIMMs and that same Crucial SSD in a Hades Canyon NUC carries just an 11% premium over those parts, at least at the time we put them together.

What’s more, the Cooler Master Elite 110 we chose to house that build has a volume of 15 liters—more than 10 times that of Hades Canyon. Even if you add in the half-liter that Hades Canyon’s power brick displaces, a Mini-ITX build will still be larger than this NUC by several times. That exercise suggests it’d be difficult, if not impossible, to match Hades Canyon’s potential performance-per-liter with off-the-shelf parts, not to mention that few Mini-ITX motherboards can challenge this system’s massive array of connectivity options. On the whole, Intel has put together a uniquely high-end yet truly compact system in this NUC.
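That volume comparison is easy to sanity-check with the figures from this article. A quick sketch (the half-liter power-brick volume is our approximation, not a manufacturer figure):

```python
# Rough volume comparison between the Hades Canyon NUC and our
# Cooler Master Elite 110 Mini-ITX build (figures from this review;
# the brick's half-liter displacement is approximate).
nuc_volume_l = 1.2          # Hades Canyon chassis volume in liters
brick_volume_l = 0.5        # external 230-W power brick, approximate
elite_110_volume_l = 15.0   # Cooler Master Elite 110 case volume

# Chassis alone, the Mini-ITX case is more than ten times larger.
print(round(elite_110_volume_l / nuc_volume_l, 1))                     # 12.5

# Even counting the brick, the NUC remains several times smaller.
print(round(elite_110_volume_l / (nuc_volume_l + brick_volume_l), 1))  # 8.8
```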

What’s in a name?

We covered the Radeon RX Vega M GPU’s performance potential in depth in our launch coverage, but some questions have arisen recently over just what’s in a name with this graphics processor. Gordon Ung at PCWorld noticed that the AIDA64 utility returns “Polaris 22” as the internal code name for Radeon RX Vega M graphics, and that the DirectX feature-level support of that chip hews closer to Polaris graphics cards than it does to Vega products. Further testing by Paul Alcorn at Tom’s Hardware suggests the RX Vega M GPU doesn’t have Vega’s next-generation compute units inside—or at least that they aren’t fully enabled—since it doesn’t accelerate half-precision workloads with Vega’s Rapid Packed Math capabilities in the same way that desktop Vega graphics chips (and Raven Ridge APUs) do.

Intel’s response to this line of inquiry is that Vega M is “a custom Radeon graphics solution” that’s “similar to the desktop Radeon RX Vega solution with a high bandwidth memory cache controller and enhanced compute units with additional ROPs (Render Output Units).” It’s worth taking Intel at its word here, since we don’t know just what the blue team’s design goals were when it began development of the Vega M GPU with AMD. That said, it would be a bummer if Vega M doesn’t actually support headlining Vega features like Rapid Packed Math, the Draw Stream Binning Rasterizer, or the Next-Generation Geometry Path. Those features could become more useful with time, and it’d be a shame if Vega M can’t take advantage of their benefits.


The levers and dials

The RX Vega M GH is a Radeon through and through, and that means it would normally receive driver support through AMD’s regular software-distribution channels. Those trekking into Hades Canyon will ultimately be greeted by a version of Radeon Settings that’s rebranded for Intel’s one-stop processing powerhouse, though. Intel will also be responsible for qualifying and distributing Radeon drivers for this system.

Aside from the blue hues and Intel branding on the bumper, Radeon Settings retains the same set of controls we’ve come to expect from AMD’s post-Crimson Edition driver releases. Owners can control the clock speeds of the Radeon RX Vega M GH and its memory through Radeon WattMan, enable or disable features like FreeSync, Enhanced Sync, and Radeon Chill, and stream or record gameplay using Radeon ReLive. Folks who have built their own gaming PCs with Radeons inside will feel right at home with this software.

The other dedicated piece of software Intel includes with this system is, of course, the RGB LED control application for Hades Canyon’s backlit skull logo. While it’s easy to roll one’s eyes at the mention of anything RGB LED these days, Intel actually includes a couple of cute tricks on top of the usual range of blinking and breathing animations. The skull can be set to display different lighting patterns depending on whether the system is fully powered on, asleep, or shut down, and those patterns can be static or respond to system states like processor temperature or overall power draw. The three LEDs on the front of the NUC and the one behind the power button are RGB diodes, as well, and they can be similarly tweaked. I found these options both fun and functional, but killjoys can turn off all the running lights if they want.

Intel’s own UEFI offers the full range of settings we would expect from an enthusiast desktop PC, as well. Tweakers can control CPU multipliers, memory multipliers and timings, fan profiles, and pretty much every other feature we would expect to be exposed in a full-fledged desktop motherboard. While I’m not going to delve into every one of those features here, it is worth noting that Hades Canyon’s firmware is quite polished and well-organized, on par with the best I’ve seen from the big four motherboard vendors.

Our testing methods

As always, we did our best to deliver clean benchmarking numbers. Each test was run a minimum of three times, and we took the median of those results.
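The median matters here because it damps one-off outliers better than a simple average does: a single run disturbed by a background task won’t skew the reported result. A tiny illustration with hypothetical scores:

```python
import statistics

# Three hypothetical runs of the same benchmark; the second run was
# disturbed by a background task. The median shrugs off the outlier,
# while the mean gets dragged upward by it.
runs = [142.1, 163.8, 141.7]
print(statistics.median(runs))          # 142.1
print(round(statistics.mean(runs), 1))  # 149.2
```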

Our test systems were configured as follows:

Core i5-8400 test bed
  • CPU: Intel Core i5-8400 (65 W TDP)
  • Motherboard: Gigabyte H370N Wifi
  • Memory: 16 GB (2x 8 GB) Crucial Ballistix Elite DDR4 SDRAM, 2666 MT/s (rated and actual)
  • Graphics: GTX 1050 Ti SC with 4 GB GDDR5 at 7 GT/s
  • Storage: Samsung 960 Pro
  • Operating system: Windows 10 Pro

Alienware 13 R3
  • CPU: Intel Core i7-7700HQ (35 W TDP)
  • Memory: 16 GB (2x 8 GB) DDR4 SDRAM, 2666 MT/s (rated and actual)
  • Graphics: Nvidia GeForce GTX 1060 with 6 GB GDDR5 at 8 GT/s
  • Storage: Samsung PM961
  • Battery: 76 Wh Li-ion
  • Operating system: Windows 10 Pro

Intel NUC8i7HVK
  • CPU: Intel Core i7-8809G (100 W package power)
  • Memory: 16 GB (2x 8 GB) DDR4 SDRAM, 3200 MT/s (rated and actual)
  • Graphics: Radeon RX Vega M GH with 4 GB HBM2 at 800 MHz
  • Storage: Intel Optane SSD 800P 118 GB NVMe SSD; Intel SSD 545s 512 GB SATA SSD
  • Operating system: Windows 10 Pro

Some additional notes on our testing methods:

  • We fully updated each system with the latest firmware, Windows updates, graphics-card drivers, and software versions of the applications we use in our testing before gathering data. As a result, data collected in this review should not be used for comparison purposes with test results gleaned in past TR reviews.
  • All test systems were fully patched against the Spectre and Meltdown vulnerabilities using Windows updates and manufacturer firmware updates. We confirmed those mitigations were in place using the InSpectre utility.
  • We configured all of our test systems to use Windows’ Balanced power plan.
  • Unless otherwise noted, our gaming tests were conducted at a resolution of 1920×1080 and a refresh rate of 60 Hz. Vsync was disabled in the graphics driver control panel for all of our gaming tests.

Our testing methods are generally publicly available and reproducible. If you have questions about our methods or results, hit up our forums or leave a comment on this article to discuss them.


Memory subsystem performance

Let’s get started with some basic memory-throughput benchmarks, courtesy of the AIDA64 utility. We shouldn’t expect any eyebrow-raising results here; all three of our chips use the same basic memory controller. Any differences in these multithreaded tests should be down to core counts and memory speeds.
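That expectation follows from straightforward arithmetic: each DDR4 channel is 64 bits (8 bytes) wide, so peak theoretical bandwidth in a dual-channel setup scales directly with transfer rate. A quick sketch:

```python
def ddr4_peak_bandwidth_gbs(transfer_rate_mts, channels=2):
    """Peak theoretical bandwidth in GB/s for 64-bit-wide DDR4 channels."""
    bytes_per_transfer = 8  # 64 bits per channel
    return transfer_rate_mts * bytes_per_transfer * channels / 1000

# Dual-channel DDR4-3200 (the NUC) vs. DDR4-2666 (our other two systems).
print(round(ddr4_peak_bandwidth_gbs(3200), 1))  # 51.2 GB/s
print(round(ddr4_peak_bandwidth_gbs(2666), 1))  # 42.7 GB/s
```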

The i7-8809G generally takes a lead in these tests thanks to our use of DDR4-3200 SO-DIMMs, but the i5-8400 and its DDR4-2666 RAM aren’t far behind. The mobile Core i7-7700HQ does lag the pack, though.

In memory latency, the i7-8809G proves itself a true desktop-class chip. Even with the relaxed timings from its DDR4-3200 SO-DIMMs, the Hades Canyon NUC beats out the Core i5-8400 and its DDR4-2666 memory here.

Some quick synthetic math tests

AIDA64 offers a useful set of built-in directed benchmarks for assessing the performance of the various subsystems of a CPU. The PhotoWorxx benchmark uses AVX2 on compatible CPUs, while the FPU Julia and Mandel tests use AVX2 with FMA.

The PhotoWorxx benchmark doesn’t seem to benefit from the Core i5-8400’s extra cores, but the Hash benchmark really lets the Coffee Lake CPU strut its stuff versus the four-core, eight-thread competition.

In these tests of floating-point throughput, six is better than four, and the Core i5-8400 pulls far ahead in the single-precision Julia and double-precision Mandel benchmarks alike. Let’s see what these results portend for real-world performance now.



The usefulness of JavaScript benchmarks for comparing browser performance may be on the wane, but these collections of tests are still a fine way of demonstrating the single-threaded performance differences among CPUs.

The story here is quite simple: the Core i5-8400 Turbos up to 4 GHz, and the i7-8809G can hit 4.1 GHz in these tests. Unsurprisingly, the i7-8809G comes out on top much of the time. What’s really disconcerting is how far behind the Core i7-7700HQ falls in these benchmarks. That’s because Alienware chose not to implement Speed Shift in the Alienware 13 R3, even though its i7-7700HQ can clock up to 3.8 GHz with Turbo Boost. Intel’s quick-response clocking tech seems to help a lot in these hard-and-fast sequences of microbenchmarks.

Compiling code in GCC

Our resident code monkey, Bruno Ferreira, helped us put together this code-compiling test. Qtbench records the time needed to compile the Qt SDK using the GCC compiler. The number of jobs dispatched by the Qtbench script is configurable, and we set the number of threads to match the hardware thread count for each CPU.

The Core i5-8400 and its six cores take a win in this test, but the Hades Canyon NUC sure comes close with its eight threads. The Core i7-7700HQ can’t clock as high as these chips in sustained workloads, so it falls well back.

File compression with 7-zip

In the common task of zipping and unzipping compressed archives, the i7-8809G stays hot on the heels of the i5-8400.


In the hardware-accelerated AES portion of this disk-encryption benchmark, the Core i7-8809G takes a surprising lead over the Core i5-8400, perhaps thanks to the greater memory bandwidth available to it from DDR4-3200 RAM. In the pure-elbow-grease Twofish portion of this test, though, the i5-8400 takes a wide lead thanks to its higher physical core count and the apparently limited benefit Hyper-Threading offers in this workload.

Handbrake is a popular video-transcoding app. To see how it performs on these chips, we converted a roughly two-minute 4K source file from an iPhone 6S into a 1920×1080, 30 FPS MKV using the HEVC algorithm implemented in the x265 open-source encoder. We otherwise left the preset at its default settings.

The i7-8809G can’t catch the i5-8400 in Handbrake. It seems that greater physical core counts matter more than logical ones here.



Cinebench

The evergreen Cinebench benchmark is powered by Maxon’s Cinema 4D rendering engine. It’s multithreaded and comes with a 64-bit executable. The test runs first with a single thread and then with as many threads as possible.

The i7-8809G takes a small lead in the single-core portion of Cinebench, but it can’t parlay that performance into a multi-threaded win. The i5-8400 still comes out on top despite its lower thread count.


Corona

Corona, as its developers put it, is a “high-performance (un)biased photorealistic renderer, available for Autodesk 3ds Max and as a standalone CLI application, and in development for Maxon Cinema 4D.”

The company has made a standalone benchmark with its rendering engine inside, so it was a no-brainer to give it a spin on these CPUs.

No surprises. The i7-8809G turns in an ever-so-slightly-faster performance than the Core i5-8400 here.

Indigo Bench

Here’s a new benchmark for our test suite. Indigo Bench is a standalone application based on the Indigo rendering engine, which creates photo-realistic images using what its developers call “unbiased rendering technologies.”

The i5-8400 and i7-8809G once again trade blows at the top of the leaderboard here.


Blender

Blender is a widely-used, open-source 3D modeling and rendering application. The app can take advantage of AVX2 instructions on compatible CPUs. We chose the “bmw27” test file from Blender’s selection of benchmark scenes to put our CPUs through their paces.

Blender happily takes advantage of all the threads available to it, and the i7-8809G comes within striking distance of the i5-8400’s six physical cores thanks to Hyper-Threading.

Fluid dynamics with STARS Euler3D

Euler3D tackles the difficult problem of simulating fluid dynamics. It tends to be very memory-bandwidth intensive. You can read more about it right here. We configured Euler3D to use every thread available from each of our CPUs.

Wow, a meaningful difference in performance! Even with its DDR4-2666 deficit, the Core i5-8400 can still significantly outpace the i7-8809G and its DDR4-3200 complement in Euler3D.

Overall, the i7-8809G delivers CPU performance quite similar to that of the Core i5-8400, a fine CPU in its own right. Folks who want to do somewhat CPU-intensive work on the NUC8i7HVK won’t be displeased in the least. Let’s move on to the much more exciting domain of gaming performance.


Doom

Doom‘s Vulkan renderer has more than earned its place as a staple of our game-testing suite, and I have high expectations for its performance on Hades Canyon’s Radeon RX Vega M GH graphics chip. We chose a mixture of high and ultra settings to see what Doom can do on our contenders’ various graphics processors.

Doom lets the RX Vega M GH get off to a strong start in our gaming tests. You wouldn’t know that the Hades Canyon NUC was anything but an enthusiast’s gaming PC from these figures. With high average frame rates and a 99th-percentile frame time well under the magic 16.7-ms figure we look for, the NUC8i7HVK does Doom‘s fast-paced apocalyptic gameplay justice.

These “time spent beyond X” graphs are meant to show “badness,” those instances where animation may be less than fluid—or at least less than perfect. The formulas behind these graphs add up the amount of time our graphics card spends beyond certain frame-time thresholds, each with an important implication for gaming smoothness. Recall that our graphics-card tests all consist of one-minute test runs and that 1000 ms equals one second to fully appreciate this data.

The 50-ms threshold is the most notable one, since it corresponds to a 20-FPS average. We figure if you’re not rendering any faster than 20 FPS, even for a moment, then the user is likely to perceive a slowdown. 33 ms correlates to 30 FPS, or a 30-Hz refresh rate. Go lower than that with vsync on, and you’re into the bad voodoo of quantization slowdowns. 16.7 ms correlates to 60 FPS, that golden mark that we’d like to achieve (or surpass) for each and every frame. 8.3 ms corresponds to 120 FPS, the lower end of what we’d consider high-refresh-rate gaming.
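For readers who want to apply these metrics to their own frame-time captures, both reduce to a few lines of code. This is our own illustrative sketch, not the exact tooling behind our graphs:

```python
import math

def percentile_frame_time(frame_times_ms, pct=99):
    """Frame time (ms) at or below which pct percent of frames completed."""
    ordered = sorted(frame_times_ms)
    rank = max(0, math.ceil(pct / 100 * len(ordered)) - 1)
    return ordered[rank]

def time_spent_beyond(frame_times_ms, threshold_ms):
    """Total milliseconds of rendering time accumulated past a threshold.

    Thresholds map to instantaneous frame rates as 1000 / threshold:
    50 ms -> 20 FPS, 33.3 ms -> 30 FPS, 16.7 ms -> 60 FPS, 8.3 ms -> 120 FPS.
    """
    return sum(t - threshold_ms for t in frame_times_ms if t > threshold_ms)

# One hypothetical second of animation: 59 quick frames and one long hitch.
frames = [15.0] * 59 + [40.0]
print(percentile_frame_time(frames))              # 40.0
print(round(time_spent_beyond(frames, 16.7), 1))  # 23.3
```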

At the critical 16.7-ms mark, Hades Canyon takes an infinitesimal lead over the mobile GTX 1060 6 GB. The GTX 1060 strikes right back at the 8.3-ms threshold by spending nearly six fewer seconds on tough frames that would drop the instantaneous frame rate below 120 FPS. That translates into a notably more fluid gaming experience from our Alienware notebook. Still, the NUC’s Vega M GH graphics are off to an impressive start for such a compact system. The 100-W package power for the Core i7-8809G is only a bit higher than the nominal 80-W total power the mobile GTX 1060 6 GB enjoys all by its lonesome.


Hitman

Hitman‘s DirectX 12 renderer can stress every part of a system, so we cranked the game’s graphics settings at 1920×1080 and got to testing.

Hitman and Radeons get along great, so it’s no shock that Hades Canyon comes within spitting distance of the GTX 1060 6 GB here.

Our time-spent-beyond graphs show us that the Radeon RX Vega M GH spends a bit over two seconds longer than the GTX 1060 6 GB on tough frames that need more than 16.7 ms to render over our one-minute test run. The GTX 1050 Ti falls far behind. This is excellent performance, and it burnishes the capabilities of Intel’s dynamic power-sharing approach for this compact gaming system.


Rise of the Tomb Raider

Rise of the Tomb Raider doesn’t run quite as swiftly or smoothly on Hades Canyon as it does on the Alienware 13 R3 and its mobile GTX 1060 6 GB. We’re still well within the realm of playability on the RX Vega M GH, but the GTX 1060 6 GB handily outpaces the on-package Radeon and delivers a truly fluid gameplay experience.

Our time-spent-beyond-16.7-ms graph tells us just how much more fluid a gaming experience the beefy Nvidia chip delivers. The GTX 1060 6 GB spends over 11 fewer seconds of our test run on tough frames that force frame rates below 60 FPS. To its credit, the Radeon RX Vega M GH doesn’t put up any time on the board at thresholds like 33.3 ms and 50 ms, so one will still enjoy consistently smooth animation from the NUC here. It’s like discussing the difference between buttery spread and butter.


Prey

Prey uses Crytek’s CryEngine to bring its brain-bending sci-fi horror to life. We maxed out the limited range of settings it offers gamers to see how Hades Canyon handles it.

Despite the Radeon splash logo that graces its loading screen, Prey falls behind even the GTX 1050 Ti in this benchmark. All three of our contenders clear the bar of a 60-FPS average, but the GTX 1060 6 GB flies far ahead in average frame rates. Neither 99th-percentile frame times nor our frame-time plot suggest any roughness or grave inconsistency in the RX Vega M GH’s frame delivery—the chip just can’t churn out the frames like the GTX 1060 6 GB can here.

All of our data suggest the GTX 1050 Ti is delivering an ever-so-slightly-smoother gameplay experience in this test compared to the RX Vega M GH, and our time-spent-beyond-16.7-ms graph bears that out. The difference isn’t stark, at least: the RX Vega M GH spends about seven-tenths of a second longer on tough frames than the GTX 1050 Ti does. Flip over to the time-spent-beyond-8.3-ms mark, though, and the GTX 1060 6 GB demonstrates its vast superiority. All that said, Hades Canyon still delivers a perfectly enjoyable Prey experience.


Grand Theft Auto V
Grand Theft Auto V remains a popular PC title years after its release, and it helps that it can still put its teeth into graphics cards and CPUs alike.

Grand Theft Auto V should run well on any graphics card these days, and the Radeon RX Vega M GH is no exception. Hades Canyon clears both the 60-FPS average and 16.7-ms 99th-percentile frame time we want to see for a truly fluid gaming experience. The GTX 1050 Ti only trails the RX Vega M GH slightly here, but its 99th-percentile frame isn’t quite as impressive. Let’s see just how that plays out in our time-spent-beyond analysis.

Not so bad, as it turns out. Neither the GTX 1060 6 GB nor Hades Canyon spends appreciable amounts of time under 60 FPS, and the GTX 1050 Ti notches just under half a second of our one-minute test run on those tough frames. A click over to the 8.3-ms mark shows that the GTX 1050 Ti and NUC8i7HVK are as closely matched as their average-frame-rate figures suggest. All of these graphics processors are delivering an enjoyable gaming experience, though.


Watch Dogs 2
Watch Dogs 2‘s lovingly-detailed and large-scale reproduction of San Francisco gives us ample opportunity to put the hurt on graphics cards, so we did just that with Hades Canyon.

Watch Dogs 2 is a punishing title for any graphics card, but even so, it’s a bit surprising to see the GTX 1060 6 GB open as wide a lead as it does here. Perhaps power has something to do with it. Both the GTX 1050 Ti and the Radeon RX Vega M GH are up against power limits—the GTX 1050 Ti from its slot power arrangement, and the RX Vega M GH from its package power limit. Recall that the GTX 1060 6 GB has a nominal 80 W power budget all to itself. In any case, neither the GTX 1050 Ti nor the RX Vega M GH notch any ugly spikes in our frame-time chart. Their performance, again, seems to be a matter of just how many frames they can churn out.

Our time-spent-beyond-33.3-ms graph is a good place to start given what we know about our contenders’ performance in Watch Dogs 2. Happily, neither the GTX 1050 Ti nor the RX Vega M GH put up any noticeable time on this chart. Even with lower frame rates, we’re not looking at a rough gaming experience—just a considerably less fluid one than the GTX 1060 6 GB provides.


Ghost Recon Wildlands
Ghost Recon Wildlands offers sweeping South American vistas. We took a one-minute walk down a winding path with much of the game’s open world in view to see how our contenders handled it.

Even though Nvidia cards might claim to be the way Ghost Recon Wildlands is meant to be played, Hades Canyon opens a nice lead on the GTX 1050 Ti here. As we’ve come to expect, though, the NUC can’t quite catch up to the GTX 1060 6 GB. Even so, Intel’s tiny terror turns in a pleasingly flat frame-time plot, and that gels with the smooth gameplay I noted from it in our test run.

Our time-spent-beyond-16.7-ms graph bears out my subjective experience. The NUC spends almost five fewer seconds past 16.7 ms on tough frames in our test run, and that’s a noticeably smoother experience than the GTX 1050 Ti provides in Wildlands. The GTX 1060 6 GB remains in a different league for fluid gameplay, though.


The Witcher 3

In the process of rendering The Witcher 3‘s still-gorgeous visuals, Hades Canyon just misses a 60-FPS average, but its 99th-percentile frame time is pleasingly close to that of the GTX 1060 6 GB despite the Nvidia card’s wide lead in average frame rates. The Witcher 3 is plenty enjoyable to play on the NUC as a result.

Our time-spent-beyond-16.7-ms graph tells us just how much more enjoyable that NUC-powered experience is. Hades Canyon spends just under two seconds on tough frames that drop the instantaneous frame rate below 60 FPS—way less than the roughly nine seconds the GTX 1050 Ti puts on the board by that same measure.


Deus Ex: Mankind Divided

Even this far past its release, Deus Ex: Mankind Divided remains one of the most challenging games one can run on a PC. We put Hades Canyon to the test in this title with a range of high settings.

Deus Ex: Mankind Divided remains one of the most punishing games to run in recent memory, but its teeth aren’t quite so sharp without the settings cranked. As we’ve perhaps come to expect, Hades Canyon slightly outpaces our desktop GTX 1050 Ti, but it can’t come anywhere close to catching the mobile GTX 1060 6 GB in either average frame rates or 99th-percentile frame times.

A quick look at the 16.7-ms threshold does deliver good news in the NUC’s favor, though. The Intel all-in-wonder puts nearly three fewer seconds on the board past 16.7 ms, and that means gamers can expect a smoother gameplay experience than the GTX 1050 Ti in DXMD.


Far Cry 5
Far Cry 5 is one of the more controversial titles of late. Muddled messaging aside, its rendering of the Montana backwoods is rich with light, shadow, and foliage. That makes it a perfect test for the capabilities of our trio of systems.

Even with AMD’s apparent involvement in Far Cry 5‘s development, Hades Canyon and its Radeon RX Vega M GH just can’t open that wide a lead over the GTX 1050 Ti. The GTX 1060 6 GB runs away with the gold once again. That said, the NUC and the GTX 1050 Ti mostly deliver smooth frame-time plots and pleasing 99th-percentile frame times. I had no complaints about the gaming experience the NUC delivered over the course of our test run.

At the 16.7-ms threshold, the NUC proves its advantage over the GTX 1050 Ti by putting four fewer seconds of time spent on tough frames on the board. The GTX 1060 6 GB is just a hair shy of perfect, though. That result may once more draw the bright line between graphics chips with dedicated power budgets and those constrained by slot or package power limits.

Our experience with Far Cry 5 sums up gaming on Hades Canyon in general. Sometimes, the Radeon RX Vega M GH can get within spitting distance of the mobile GTX 1060 6 GB. More often than not, however, Intel’s one-package wonder falls much closer to a desktop GTX 1050 Ti in delivered performance. Whether this is a matter of drivers, power envelopes, or just good old differences in graphics-processing resources will likely become clearer with time, but given how closely the RX Vega M GH and the GTX 1060 6 GB seem to match up on paper, it’s a bit odd that the Radeon chip lands much closer to a GTX 1050 Ti in practice. We probably ought to overclock Hades Canyon to find out for certain, although our quick experiments in doing so spoiled the NUC’s surprisingly pleasant noise character.

Most importantly, it’s clear that Intel’s dynamic power-sharing special sauce doesn’t negatively affect smooth frame delivery. Even if Hades Canyon’s average frame rates might not have been as high as we would have expected, it didn’t trip or stumble much at all in the course of delivering those frames. As any long-time TR reader will tell you, consistent frame times go a long way towards maintaining the illusion of lifelike animation on their own. Pair this NUC’s smooth frame delivery with a FreeSync monitor, and you’d be even more assured of an exceedingly pleasant small-form-factor gaming experience.



Conclusions

For all of the packaging wizardry behind it, Intel’s Hades Canyon NUC needed to deliver on three criteria to make the grade as a small-form-factor gaming PC. One, small-form-factor systems actually need to be small, not just “small.” Two, they need to deliver no-compromises gaming performance; inconsistent frame delivery has spoiled the appeal of many a tiny gaming terror. Finally, they need to be quiet. Good performance isn’t worth much if a tiny system requires its owner to turn their headphones or speakers up to 11 to drown out the shriek of its fans.

We already knew that Hades Canyon would be small. With its 1.2 L volume, this system occupies a footprint barely larger than a hardback novel (massive power brick notwithstanding). Mission accomplished for folks who want their gaming PCs to occupy the least amount of space possible.

It was less obvious at first glance how well Hades Canyon could get in the game, but our tests remove all doubt. With a shared 100-W package power budget to play with, the Core i7-8809G turns in average frame rates and 99th-percentile frame times that fall between those of a desktop GeForce GTX 1050 Ti and a mobile GeForce GTX 1060 6 GB overall. That kind of performance is good for pleasant 1920×1080 gaming at high settings in most of today’s titles, and the Radeon RX Vega M GH’s support for widely-available FreeSync monitors can help smooth out any minor imperfections in frame delivery. Outside of games, the Core i7-8809G’s swift four-core, eight-thread CPU pulls nearly even with a Core i5-8400 in many workloads. Can’t complain about that.

Finally, Hades Canyon is quiet at stock speeds. Our sound meter registered between 36 dBA and 39 dBA at 18″ (0.5 m) from the system. It’s entirely possible to enjoy headphone-free gaming with Hades Canyon if you like, and the system won’t drive other people crazy under load if they’re sharing a space with you. Overclocking this NUC makes a racket, though it’s a wonder that a system this small can be overclocked at all.

Intel NUC8i7HVK

April 2018

Most surprisingly, this fully-unlocked NUC doesn’t even sell for that much of a premium over similarly-specced Mini-ITX systems. By our reckoning, a complete Hades Canyon NUC is just about 11% more expensive than a Mini-ITX PC with similar features, an overclocking-friendly motherboard and processor, and comparable graphics performance. What’s really incredible is that even with a diminutive Cooler Master Elite 110 housing that Mini-ITX system, the NUC is just 1/13 the volume (power supply excluded). That’s amazing performance-per-liter, even if you do have to find somewhere to hide the NUC’s power brick.
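That 1/13 figure is easy to sanity-check. Here’s a minimal sketch, assuming the Elite 110’s published dimensions of roughly 260 × 280 × 208 mm (the 1.2 L volume is Intel’s own figure for the NUC):

```python
# Sanity check on the volume comparison between Hades Canyon (1.2 L,
# Intel's figure) and a Cooler Master Elite 110 Mini-ITX build. The
# Elite 110 dimensions below (~260 x 280 x 208 mm) are approximate.
def volume_liters(width_mm, depth_mm, height_mm):
    """Convert case dimensions in millimeters to liters."""
    return (width_mm * depth_mm * height_mm) / 1_000_000

elite_110_l = volume_liters(260, 280, 208)  # ~15.1 L
hades_canyon_l = 1.2

ratio = elite_110_l / hades_canyon_l
print(f"Elite 110: {elite_110_l:.1f} L -> {ratio:.0f}x the NUC's volume")
```

That ratio comes out to roughly 13, in line with our reckoning above.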

All told, Hades Canyon nails everything we want in a tiny gaming PC and then some. Now that Intel has competent gaming hardware to play with, the company will need to back it up with the day-one driver support gamers expect from AMD and Nvidia. Given that AMD’s driver team is providing support to Intel behind the scenes, the blue team shouldn’t have a major challenge keeping up, but we won’t know for certain until some time passes. For the moment, I’m not aware of anything else quite as small or fast or quiet as the NUC8i7HVK, and Intel’s fine execution on all of those points makes this system an easy TR Editor’s Choice.

Comments closed
    • Shobai
    • 1 year ago

    The colours appear mixed up on the Tomb Raider “beyond 33 ms” plot – the 1050 Ti is blue, etc.


    The Watch Dogs 2 “beyond 33 ms” plot is interesting as well; the colours are OK, though

    [Edit 2]

    The frame-to-frame variance (thickness of line on the frame number plot) for this part is quite impressive, for the most part. It looks like AMD has been having success in tightening that performance up.

    • adisor19
    • 1 year ago

    @apple ! Take all this, pack it up into an aluminum brick and bring out the new MacMini.

    We beg you !


      • chuckula
      • 1 year ago

      Fear not, the RDF generators in Apple’s 2020 miracle chip have been tuned not only to obliterate this pathetic x86 CPU, but to destroy Vega’s GPU as well.

      All will be revealed soon.

      • deruberhanyok
      • 1 year ago

      The Mini could certainly do with an update. They wouldn’t even need to use the 8809G – there’s a few other options for Kaby-G that could work and are rated for much less power draw.

    • deruberhanyok
    • 1 year ago

    I was firmly in the “going to buy one of these” camp when they were announced. I’d had a Skull Canyon NUC as a primary use system for some time and was very happy with it, but something with a little more built-in gaming oomph would have been nice for the occasional trip to Diablotown or HeroesOfTheStormville. A few things have happened since the original announcement that have changed my mind.

    First, I started playing a lot more console games last year and I’m not spending near as much time gaming on PC as I once did.

    I had to decide if a $1200 (with SSD and RAM) investment was worth it for what I was still playing. It was already looking a bit unlikely – that’s a lot to pay for 2-3 games I play casually.

    Next, I put together a little system using a Ryzen 2400G and I’m very happy with its performance. With parts I had on hand (RAM I hadn’t sold yet from my gaming PC part-out, an SSD I picked up used a year or two ago and never got around to using, and an Antec ISK 110), my total out of pocket was about $350 (processor, motherboard, heatsink, pico PSU). I even painted it red. It’s about three times the volume of the NUC (the ISK 110 is 3.7 L-ish), but it’s also all mine and at 1/3 of the price. 🙂

    Now I had to decide if the NUC was worth the performance increase over that system. The lack of comparisons to Raven Ridge in all of these reviews is a bit odd, but ultimately, I don’t think there’s $1200 worth of increased performance. If I really wanted it that badly I could just take what I have, put it in a bigger case (still collecting dust under my desk) and use a regular video card.

    Last, this kerfuffle about the GPU’s actual lineage. I ditched an RX 580 because of continual crashes I was getting in a few Blizzard games that went months without fixes. Seemed to happen on Polaris across the board. I figured Vega wouldn’t have the issue (and my APU build has been solid) but I don’t know if I’d want to jump back a generation in a $1200 system, especially to one that had a series of issues with the few games I still play regularly.

    I can’t deny the appeal of a tiny box with all that I/O but there’s a few too many things working against it for me now. I’m glad there were some delays in getting them out – looks like they’re going to be shipping end of April to mid-May now – since it basically saved me $800.

    Would like to see some more relevant comparisons to other low-end / integrated systems, especially Raven Ridge APUs, but also against things like Zotac’s Zbox Magnus 1060, Gigabyte’s 1050 Ti and 1060 gaming Brix systems, and other OEM SFF boxes that carry similar price tags.

      • Jeff Kampman
      • 1 year ago

      With regard to testing against Raven Ridge, you’re really looking at a whole different class of experience on this machine.

      Let’s take Doom, for example. In this review, the NUC posts nearly a 90-FPS average and a 15.8-ms 99th-percentile frame time at 1920×1080 with ultra settings.

      The Ryzen 5 2400G turned in a 48-FPS average and a 27.8-ms 99th-percentile frame time in our review at 1600×900 and medium settings. It’s really not even in the same league (and I don’t say that to rag on your system build—it’s just what the numbers tell us).

      Again, no offense, but the 2400G is about staying on the right side of a 33.3-ms 99th-percentile frame time in a lot of games at relatively low resolutions and low settings. This NUC is more about coming close to a 16.7-ms 99th-percentile frame time at 1920×1080 and high settings. It’s hard to design a single test that’s meaningful on both proposed systems unless you just enjoy looking at ugly frame-time graphs from the 2400G.

        • MOSFET
        • 1 year ago

        Very nice answer Jeff, and it wasn’t even my question. Would you check triple-monitor functionality on this NUC, if you happen to have a TB3-to-dual-DP/HDMI thingy from StarTech or Plugable?

        When the Kaby Lake NUC7 series came out a year ago, Intel advertised triple monitors via TB3-to-dual plus the HDMI 2.0 – then three months later, they changed their website, as apparently ECS didn’t route two DisplayPorts to the TB3. I bought two for work, plus the StarTech adapters, during the three months they claimed it would work.

        I love the form factor and functionality, but I’ll need to know for sure before buying another one. (It doesn’t help that one of the two works fine and the other is stuck in an infinite no-boot loop.)

        • deruberhanyok
        • 1 year ago

        Thanks for the quick reply Jeff!

        I agree that they’re not really competing parts, but what other integrated GPUs are there for a comparison? I think this counts as an integrated GPU – it even looks like a stretch limo version of Clarkdale!

        Granted, I don’t think this is the beginning of a new wave of significantly higher-end iGPUs, but Iris Pro in Broadwell (or the Skull Canyon NUC) also wasn’t widely available for purchase/use in other systems, and that didn’t stop comparisons with the Kaveri APUs.

        In the case of the Hades Canyon NUC, it’s kind of a niche product – for those who want a tiny box with discrete-class graphics but don’t want the faster, actually-discrete MXM graphics available in ASRock, Zotac, Gigabyte, or Shuttle SFF gaming systems (most of which cost more).

        In isolation, compared to other OEM SFF boxes, I think it’s reasonably priced for the performance on tap. I’m just wondering where a Ryzen APU falls on that particular scale, since, as far as “competent” onboard graphics goes, it’s really the only option for a DIYer today.

    • Andrew Lauritzen
    • 1 year ago

    Great review, thanks!

    Not really a lot else to say… thorough and insightful 🙂

    • Mr Bill
    • 1 year ago

    "That's amazing performance-per-liter"

    A new TR metric!

    • WhatMeWorry
    • 1 year ago

    Does anyone know if Intel plans to sell the Core i7-8809G on its own?

    And why doesn’t AMD do something similar? Ryzen ain’t half bad.

      • DancinJack
      • 1 year ago

      Highly doubt it, in both cases.

      • Bauxite
      • 1 year ago

      It’s BGA, so “no” for end users, but it’s like buying a laptop CPU, so technically yes for OEMs. It’s already in some laptops, and we’ll probably see some other NUC-like machines. Maybe ones with better PCIe arrangements.

      • tipoo
      • 1 year ago

      Just like the last one, this is something of a laptop part that they gave a higher TDP and put in here. It’s used in the XPS 15 2-in-1 and an HP something-or-other, possibly Apple soon, etc. Just like the old Iris Pro one.

      It’s a BGA part so no direct to consumer.

      I think AMD’s difficulty is the lack of EMIB to bridge an APU to eDRAM.

    • tipoo
    • 1 year ago

    On an interesting note, the lack of RPM (Rapid Packed Math) had some questioning its Vega credentials – some wondered if it was scaled-up Polaris rather than scaled-down Vega. Would be an interesting topic.


      • Voldenuit
      • 1 year ago

      Yeah, the intel IGP on this even has RPM enabled, so seems weird that its *cough*Vega*cough* component doesn’t.

      • jarder
      • 1 year ago

      It uses HBM2, doesn’t it? That means it’s Vega with RPM switched off.
      It’s a very minor issue for most people anyway – probably some bean counter didn’t want to fork out the extra for DirectX 12.1 compatibility and thought 12.0 was enough for now.

        • DancinJack
        • 1 year ago

        Not necessarily? Hawaii had HBM. It’s not like companies can’t design products to certain specs different from their originals. I personally don’t think the lack of RPM is that big of a deal, but just flat out saying “this is Vega because it has HBM” is pretty short-sighted.

        The bigger issue IMO is that Intel didn’t even give you a choice of which GPU you want to use in this machine (for any task). Everything is hooked up to the “Vega” GPU.

          • RAGEPRO
          • 1 year ago

          What design used Hawaii with HBM?

            • DancinJack
            • 1 year ago

            Right, I said the wrong one. I clearly meant Fiji, I had just confused the codename for Fury. That wasn’t the real point, though. I was trying to convey that AMD had done an HBM card before and just because it has HBM/2 doesn’t necessarily mean it’s this generation or that, or that revision or this.

    • chuckula
    • 1 year ago

    Today TR answers the question: Is Kaby-G more like Warren-G or Kenny-G?

    WHY NOT BOTH!

      • derFunkenstein
      • 1 year ago

      Just a couple a OGs doin their thang

    • dodozoid
    • 1 year ago

    Hi Jeff,
    nice article and an incredible machine (albeit with some dubious routing choices).
    Any chance you could update it with more competition in the near future?
    (such as an RX 4(5)60 / 4(5)70)
    I wonder how it would compare against my current HTPC at a fraction of its volume.

    • ptsant
    • 1 year ago

    I’d rather have the power brick on the machine, even if it means a slightly bigger footprint (and maybe some minor changes in cooling). I find it’s much better to carry one item of slightly bigger size than two items. And, frankly, it’s so small it would still look small even if the volume almost doubled.

    A cool piece of hardware.

      • TheRazorsEdge
      • 1 year ago

      That could lead to cooling issues.

      Power supplies typically shed heat equal to 10%-25% of the power drawn by the components they feed. If the NUC gets loud under a modest overclock, it’s already near its thermal limits.

      This is part of the reason that power was moved outside of SFF systems in the first place. (The other reason was a lack of power supply standard smaller than ATX, which has since been addressed.)

        • DancinJack
        • 1 year ago

        No offense, but “if the NUC gets loud under a modest overclock, it’s already near its thermal limits” is something that you have no way of knowing. Maybe they’re just loud fans. Maybe the fan curve is just really aggressive.

        Of course the louder it gets, chances are it’s moving towards its thermal limit, but “near its thermal limit” may be a stretch without other info than “gets loud.”

        • EndlessWaves
        • 1 year ago

        I doubt a system like this would be using a 75% efficient power supply. Even my USB charger costing under a tenner is over 80% efficient.

        10% is likely easily doable, even if it has to work with the less efficient voltages of Japan and North America. External power bricks have been available at over 90% efficiency for years.

        So that’s an extra ~12W on the outside of the case that can easily be thermally insulated from the rest of the machine as it only connects with a couple of wires.

        I don’t buy the argument that it’s too much extra heat in close proximity to the case.

        If there are cooling issues I suspect the bigger one is reduced surface area for airflow to exit from the other components.
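        EndlessWaves’ back-of-the-envelope math above can be sketched out explicitly. The ~110 W load and the efficiency figures below are illustrative assumptions, not measurements:

```python
# Heat shed by a power supply = input power minus delivered power.
# The load and efficiency values are illustrative assumptions only.
def supply_heat_watts(delivered_w, efficiency):
    """Return watts dissipated as heat for a given delivered load."""
    return delivered_w / efficiency - delivered_w

load_w = 110  # assumed gaming-load draw of the NUC's internals

# A ~90%-efficient external brick sheds about 12 W...
print(f"{supply_heat_watts(load_w, 0.90):.1f} W")  # ~12.2 W

# ...versus roughly 37 W for a hypothetical 75%-efficient supply.
print(f"{supply_heat_watts(load_w, 0.75):.1f} W")  # ~36.7 W
```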

    • Voldenuit
    • 1 year ago

    Can TR keep the pull-down navigation bar on the last page? Sometimes I want to double back to benchmarks even after I finish reading an article.

      • Mr Bill
      • 1 year ago

      +3 "For The Honor"

    • thecoldanddarkone
    • 1 year ago

    Do we have power usage from these (I might be blind or missed it)? I’m aware they are not directly comparable, but honestly that makes no difference to me.

      • DPete27
      • 1 year ago

      Agreed. There was only a little blurb about system noise on the conclusion page.

      I was interested in knowing if the system throttled at all, since that’s pretty much a staple measurement to take when reviewing any laptop.

      • DancinJack
      • 1 year ago


        • Voldenuit
        • 1 year ago

        Eep. 229.9W under (admittedly synthetic) load is pretty darn close to the max spec of the 230W AC adapter (although that’s power draw at the PSU, not power output, so there should be a few more percent headroom).

        I’d predict real world load usage to be considerably lower, although it’s still not looking great at perf/watt vs Pascal.

          • DancinJack
          • 1 year ago

          They say “at the wall” with a 4K display connected.

            • Voldenuit
            • 1 year ago

            Nearly all sites measure power draw at the wall (a few, like xbitlabs – RIP – or some of the GPU centric sites sometimes splice connectors to measure actual component draw, but these are few).

            The supplied AC adapter is rated for power output, so factor in PSU efficiency and you have some headroom in power draw at the wall.

            Of course, the 4K monitor probably isn’t drawing any power from the NUC’s AC adapter.
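            That headroom can be put in rough numbers; the 88% adapter efficiency below is an assumed figure for illustration, not a spec:

```python
# If the 230 W rating is the adapter's *output* and 229.9 W was read
# at the wall (*input*), conversion losses leave some headroom. The
# 88% efficiency here is an assumption, not a measured spec.
wall_draw_w = 229.9
adapter_rating_w = 230.0
efficiency = 0.88  # assumed

delivered_w = wall_draw_w * efficiency
headroom_w = adapter_rating_w - delivered_w
print(f"delivered: {delivered_w:.1f} W, headroom: {headroom_w:.1f} W")
```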

        • DPete27
        • 1 year ago

        9,000 second testing runs? I have to believe that is a typo.

          • DancinJack
          • 1 year ago

          I don’t think so? 30 minute intervals x 5. At least that’s how I read it.

    • KaosDrem
    • 1 year ago

    “Disassembly challenges aside, I find Intel’s PCIe lane routing choices inside this NUC somewhat peculiar. The CPU’s PCIe lanes aren’t fully tapped even as bandwidth-hungry ports and slots sit behind the DMI 3.0 bus that connects the CPU and chipset. It seems especially odd to me that Intel relegated its gee-whiz Thunderbolt 3 controller to chipset PCIe lanes while putting the SD card reader, of all things, on a direct line to the CPU. Few (if any) games are I/O-bound today, but if gamers ever want to supplement Hades Canyon’s onboard graphics with an external graphics card, it’ll have to contend with traffic from the other devices hanging off the HM175 chipset.”

    Have you reached out to intel directly for an answer to this?

    It seems weird that literally nobody routes Thunderbolt 3 to CPU PEG lanes instead of putting it on the DMI/PCH.

    • Fonbu
    • 1 year ago

    Crack some Skulls (Canyon) on that one!

    • DancinJack
    • 1 year ago

    I’ve been reading a few reviews the past few weeks. Overall, I think it’s a pretty good machine. The power brick is pretty hilarious considering the size of the actual computer, but it is what it is.

    I still think there were some odd choices made. I said this previously, but I think it’s odd that Intel decided to route ALL the display outputs through the “Vega” GPU. Here’s hoping other OEMs give you the option somehow, some way. Really, though, there’s a good combination of I/O on this thing that would be really useful to me. Kudos there.

    I think it’s a great start and gives OEMs a good idea to base their designs off of. I could even do with a tiny bit less GPU power if that brick was a bit smaller.

      • UberGerbil
      • 1 year ago

        What’s the practical basis for objecting to the size of the power brick? I mean, sure, I could see some benefit to the brick being, say, ½ or ¼ the size it is now. But how is a "bit smaller" power brick substantially more portable or less inconvenient than this one?

        • DancinJack
        • 1 year ago

        It’s not so much the practical applicability for me, honestly. It’s more of an “I’m quite nearly offended that the power brick is pretty much the size of the PC” kind of thing. I’d happily live with this machine on my desk, but I don’t necessarily need all the GPU power for some use cases and wouldn’t mind a power brick, say, half that size. Of course, the math doesn’t quite line up (literally cutting it in half gets you ~115 W); I’m just saying it’d be nice.

        The 8706G might fit me better for what I’m thinking about. 65W package compared to this 100W package.

    • chuckula
    • 1 year ago

    TR editor’s choice!?!?

    Ah Haaades no!
    — Jen Hsun
