
Gigabyte’s Brix Pro reviewed

You’ve seen all the bleak headlines: the PC industry is shrinking, collapsing, withering away… It’s certainly true that, overall, PC sales are on a downward trend. However, it’s also true that parts of the market are very much alive and still growing. One of those parts is PC gaming hardware, and another comprises mini-PCs, including Intel’s fabled NUCs.

In a recent interview with the guys at Ars Technica, Intel’s PC Client Group VP Lisa Graff didn’t mince words about the success of NUCs and mini-PCs. “The whole category is growing,” she said. “A million units last year; I think we’re going to see something like 50 percent growth this year. We’ll see what it is, but it’s going to be strong, positive growth… NUCs are growing, our OEMs’ businesses are growing.”

It’s not hard to see the appeal of NUCs. Most of them have a smaller footprint than a CD jewel case. (Kids, ask your parents.) They’re small enough to strap to the back of a monitor with a VESA mount. Yet they offer speedy processors, fast solid-state storage, and relatively plentiful connectivity. For a lot of folks, that’s all that’s needed.

The first, Intel-built NUC (short for Next Unit of Computing) debuted a little over a year ago, and our own Scott Wasson picked it apart at the time. Today, we’re back with a mini-PC that’s based on the same form factor but trades the power-sipping mobile CPU for a quad-core desktop specimen. Say hello to Gigabyte’s Brix Pro:

The Brix Pro packs a surprising amount of power inside of a surprisingly small chassis. The processor under the hood is a Core i7-4770R with Iris Pro 5200 integrated graphics. It has quad CPU cores, each with a peak Turbo speed of 3.9GHz, and its integrated GPU features 128MB of dedicated eDRAM cache. That cache can do wonders for real-time 3D graphics, where rapid access to assets is paramount—and vanilla DDR3 memory can bottleneck performance.

In short, this is a very fast machine for its size. It should be no slouch in games, even without a discrete GPU.

The version of the Brix Pro we’re looking at today is the BXi7-4770R, which sells for $649.99 at Newegg without storage or memory. Intel sent us this system pre-configured with a 240GB 525 Series solid-state drive and eight gigs of DDR3L-1600 RAM. As you can see below, those components nicely round out the Brix Pro’s other hardware:

Processor: 3.2GHz Intel Core i7-4770R (65W)
Graphics: Intel Iris Pro Graphics 5200
Platform hub: Intel HM87 Express
Audio: Realtek ALC269 HD audio
Wireless: 802.11ac Wi-Fi and Bluetooth 4.0 via Asus AW-CB161H
Ports: 1 HDMI, 1 Mini DisplayPort, 4 USB 3.0, 1 Gigabit Ethernet via Realtek RTL8111G, 1 headphone jack with S/PDIF
Expansion: SATA port for 2.5″ hard drive/SSD
Dimensions: 2.4″ x 4.3″ x 4.5″ (62 x 111.4 x 114.4 mm)

That’s definitely a lot of PC for such a small chassis. Really, the dimensions in the spec sheet almost fail to do justice to just how small this thing is. Here it is next to a standard 3.5″ mechanical hard drive:

You wouldn’t think one of Intel’s fastest Haswell desktop CPUs could fit in there—but it does, albeit in a 65W incarnation. The regular Core i7-4770K is rated for 84W and has a slightly higher base clock speed, at 3.4GHz, although its peak Turbo speed is the same: 3.9GHz. The i7-4770R’s Iris Pro graphics, however, aren’t offered on the i7-4770K.

Now, to be fair, the Brix Pro owes some of its diminutiveness to the lack of an integrated power supply. The system gets DC power from an external brick that is, believe it or not, wider than the machine itself:

Yeah, that doesn’t look quite as good as those glamor shots. Ah, well. Chances are, the power brick will spend most of its lifetime under a desk, anyway. The Brix Pro also ships with a VESA mounting bracket, so you can strap it to the back of a monitor, entirely out of sight.

We’re going to look at the Brix Pro’s performance in just a little bit. First, however, I expect many of you are wondering the same things I did when I unpacked this thing: how does all that hardware fit inside, and how easy is it to upgrade? Join me on the next page for a thoroughly documented gutting of the Brix Pro.

Gutting the Brix Pro
More often than not, miniaturization hampers upgradability. Taking apart a phone or a laptop is usually trickier than cracking open a desktop PC. Happily, though, the Brix Pro is quite straightforward to disassemble.

Four Phillips-head screws hold the bottom panel in place. Undo the screws, and the panel comes off, granting access to all of the Brix Pro’s internal slots and ports. (More on those in a second.) The bottom of the panel has an empty drive cage with room for a single 2.5″ drive, either solid-state or mechanical. If I were buying the Brix Pro for myself, I probably wouldn’t think twice about strapping in a 1TB mass-storage mechanical drive, like this one for 60 bucks. I need space for my music collection and other downloads.

Here’s a closer view of the Brix Pro’s expansion area. See that piece of Scotch tape? It holds down a lone Serial ATA power and data connector, which is meant for whatever drive winds up in that 2.5″ cage we talked about. I assume the connector was taped down to prevent it from rattling about, since our sample didn’t ship with a 2.5″ drive.

This part of the motherboard plays host to a couple of SO-DIMM slots (both filled by 4GB modules), an mSATA slot (populated by a 240GB Intel SSD), and a Mini PCIe slot (which accommodates the system’s Wi-Fi and Bluetooth module). It’s hard to see in this picture, but the Mini PCIe slot sits below the mSATA one. The only sign of it here is a black antenna wire snaking below the solid-state drive.

The Brix Pro can be taken apart further, but that requires a few more steps. First, one must slide out the panel that covers the rear I/O ports. Then, one must remove the SSD and unhook the two antenna wires from the wireless card below. Finally, a couple of extra Phillips-head screws must be undone. The screws sit on either side of the I/O port cluster. Once all that is taken care of, the motherboard and everything still fastened to it will come out of the machine without resistance.

Yep. That’s all of it (and my ugly mitt for scale). I’ve left the SSD unplugged in order to reveal the Mini PCIe wireless controller. Even with it installed, the Brix Pro’s innards are impressively small considering the desktop-class processor they house.

Speaking of which…

The top of the board accommodates all the expansion and connectivity; the bottom, pictured here, is where the processor and chipset live. I was too chicken to remove the cooling apparatus, but there wouldn’t be much point. It’s not like you can throw a Thermaltake Frio in there—or any other desktop-style cooler, for that matter.

The lack of support for desktop CPU coolers makes sense, of course, but that sliver of copper still looks awfully slim for a 65W chip. So does the blower fan. Generally, there are only two ways to prevent overheating with an inadequate heatsink surface area: spin up the fan like crazy or throttle the processor’s clock speed. Neither option is exactly ideal.

We’ll see how the Brix Pro fares under a heavy system load in a minute. First, though, let’s have a look at gaming performance.

Gaming benchmarks
We already know Intel’s quad-core Haswell parts are blazing-fast in desktop apps, so that isn’t really worth rehashing. You can look at the numbers from our original Haswell review, and they’ll give you a decent sense of the Brix Pro’s productivity performance.

What we were really curious about was the Brix Pro’s gaming chops. The Core i7-4770R’s Iris Pro 5200 integrated graphics aren’t available in retail-boxed desktop CPUs, and we’ve only ever tested them on a 47W processor—the Core i7-4950HQ—before. The Core i7-4770R has a 65W TDP and should be even quicker.

Let’s start with some inside-the-second tests in a couple of games: Battlefield 4, which is new and fairly demanding, and Borderlands 2, which is a little older and should be easier on a slow GPU. For comparison purposes, I included AMD’s A8-7600 “Kaveri” APU, which also has a 65W TDP and relatively speedy integrated graphics. This isn’t a straight-up, apples-to-apples competitive matchup, since the Kaveri APU is supposed to cost $119, while Intel prices the i7-4770R at $358. Still, the Kaveri part should give us an important frame of reference—and it is, to be fair, the only 65W member of the Kaveri family.
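For readers unfamiliar with our inside-the-second methodology, the idea is to look at individual frame render times rather than average FPS, since a handful of long frames can ruin smoothness without denting the average. Here's a minimal Python sketch of that kind of analysis; the function name, metric choices, and sample data are illustrative, not our actual capture pipeline:

```python
# Sketch of an inside-the-second frame-time analysis. Tools like Fraps
# can log per-frame render times; here we just use a synthetic list.

def frame_time_stats(frame_times_ms, threshold_ms=16.7):
    """Summarize a list of per-frame render times (in milliseconds)."""
    sorted_times = sorted(frame_times_ms)
    # 99th-percentile frame time: all but the worst 1% of frames are
    # rendered at least this quickly.
    idx = min(len(sorted_times) - 1,
              int(round(0.99 * (len(sorted_times) - 1))))
    p99 = sorted_times[idx]
    # Total time spent on frames slower than the threshold, counted
    # beyond the threshold itself ("time spent beyond 16.7 ms").
    beyond = sum(t - threshold_ms for t in frame_times_ms
                 if t > threshold_ms)
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    return {"p99_ms": p99, "time_beyond_ms": beyond, "avg_fps": avg_fps}

# Example: a mostly smooth run (14 ms frames) with two spikes. The
# average FPS looks fine, but the spikes show up in the other metrics.
times = [14.0] * 98 + [30.0, 40.0]
stats = frame_time_stats(times)
```

The point of metrics like "time spent beyond 16.7 ms" is that they capture exactly the stutter an FPS average hides.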

Battlefield 4

At these settings, the Brix Pro’s Iris Pro 5200 IGP fares quite well. Frame times are relatively low and relatively consistent overall, and the occasional spikes aren’t too dire. Subjectively, the game feels surprisingly smooth—even if it doesn’t look its best. The Brix Pro certainly delivers a better experience than the lower-end Kaveri test machine, which teeters on the threshold of playability.

Borderlands 2

Here, we were able to raise the resolution and tack on a little antialiasing without degrading performance on the Brix Pro’s Iris Pro 5200 graphics. Our 65W Kaveri chip didn’t yield playable frame times at these settings, but the Iris Pro 5200 handled itself just fine. Borderlands 2 is plenty playable on the Brix Pro.

Now, there are a couple of caveats to consider.

First, the Brix Pro actually draws more power running Borderlands 2 than the Kaveri test system. At the wall, I measured 83W for the Brix Pro and 69W for Kaveri, although the Brix Pro did draw less power at idle: 15W vs. 24W. (The systems used different power supplies, but their energy efficiency seems to be roughly equivalent, so the numbers should be comparable.) Given the Brix Pro’s skimpy cooling, that comparatively high power draw translates into rather noisy fan whine. We’ll explore that issue in more detail on the next page.

Second, the Brix Pro’s Iris Pro 5200 graphics seemed to cut corners on texture filtering compared to the Kaveri APU’s built-in Radeon. In Borderlands 2, I noticed some weird, circular artifacts on textures that should have been mostly plain:

The Kaveri APU didn’t exhibit any such artifacting. It rendered the gray slabs in the image above smoothly, without bizarro circles.

Perhaps this is a bug specific to Borderlands 2. Other titles, as far as I could tell, didn’t suffer from similar artifacting or filtering issues. Still, this isn’t something I recall ever encountering with AMD and Nvidia graphics processors in recent history. Intel’s Iris Pro 5200 may be fast, but the competition still seems to offer more consistent image quality. (For what it’s worth, I used Intel’s latest beta graphics drivers for all game testing.)

Subjective gaming
In addition to the games on the previous page, I tried a handful of other titles in order to get a better feel for the Brix Pro’s gaming capabilities.

In each case, I tinkered with detail levels to find the best compromise of image quality and performance, and then I played a little bit while keeping an eye on frame rates using Fraps. Frame rates only tell part of the story, of course, but our empirical benchmark data shows the Iris Pro 5200 IGP isn’t prone to egregious frame latency inconsistencies. I didn’t notice any obvious spikes in frame times in the games below, either.

BioShock Infinite‘s Unreal Engine 3 graphics look good but aren’t overly taxing on today’s gaming hardware. Running on the Brix Pro at 1366×768 with the “low” detail preset, the first few levels of the game felt quite smooth and very playable, with frame rates in the 45-70 FPS range. Image quality was surprisingly good despite the detail preset and resolution used.

I used the same 1366×768 resolution with a “normal” detail preset to wander Tomb Raider‘s abandoned mountain villages. Here, too, the Brix Pro performed well. Frame rates did dip a little lower than in BioShock at times (Fraps reported 35-60 FPS), but I’d still say the experience was quite playable.

This is an oldie but a goodie: Counter-Strike: Global Offensive, the latest remake of the popular Half-Life mod. Here, I was able to crank the resolution up to 1080p and max out all the detail settings except for MSAA, which I left disabled. (FXAA did a decent enough job of buffing out jaggies.) Frame rates hovered between 50 and 75 FPS, which was enough to help me secure a number of headshots—and to have quite a lot of fun along the way.

Noise and heat
While the Brix Pro does a commendable job of running both newer and older games, it generates quite a lot of noise in the process. The system is almost whisper-quiet at idle, but under load, that little blower makes the machine sound almost like a hair dryer at full blast. It definitely detracts from the enjoyment of a good 3D shooter. Here’s a video of the Brix Pro running Borderlands 2 at the same settings used for our benchmark:

Yeah, it’s loud. And it sounds worse in real life than in the video, believe it or not. The system does run quieter in basic desktop tasks like web browsing, but the fan still has a tendency to spin up as soon as CPU load increases even a little.

Now, a loud fan can be forgiven if it keeps the processor appropriately cool. Is that the case with the Brix Pro? To find out, I fired up Prime95’s small-FFT test and kept track of both CPU temperatures and clock speeds using AIDA64. I let the system sit idle for five minutes, ran Prime95 for about 10 minutes, and then let the system cool off for another five minutes. The results of that little experiment are summed up in the line graph below:

The Brix Pro’s Core i7-4770R hits 100°C and clearly throttles its clock speed in order to avoid heating up beyond that threshold. I’m not sure what happened in the last few minutes of the cool-down run—perhaps a background process kicked in—but in any event, it seems the Brix Pro’s CPU cooler isn’t up to the task of keeping the processor cool enough to sustain its base clock. Prime95 is admittedly a demanding scenario, but in Borderlands 2, I saw CPU temperatures in the same ballpark (and clocks occasionally dipped below the rated base speed). The Brix Pro runs hot and loud even outside of torture tests.
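The feedback loop at work here—heat flows in from the CPU, flows out through the heatsink, and clocks drop whenever the die hits 100°C—can be illustrated with a toy first-order thermal model. Every constant below is a made-up illustration, not a measured Brix Pro value, and the real firmware logic is certainly more sophisticated:

```python
# Toy thermal model of why an undersized heatsink forces throttling.
# All constants are illustrative guesses, not Brix Pro measurements.

def simulate(seconds, power_full=65.0, power_throttled=45.0,
             ambient=21.0, t_limit=100.0,
             heat_per_watt=2.0, cool_rate=0.02):
    """Return a per-second list of (die_temp_c, running_full_clocks)."""
    temp = ambient
    full_clocks = True
    history = []
    for _ in range(seconds):
        power = power_full if full_clocks else power_throttled
        # Heat in from the CPU; heat out at a rate proportional to the
        # die-to-ambient delta (the heatsink's job).
        temp += heat_per_watt * power / 60.0 - cool_rate * (temp - ambient)
        if temp >= t_limit:
            full_clocks = False   # throttle to shed heat
        elif temp < t_limit - 5.0:
            full_clocks = True    # restore clocks once there's headroom
        history.append((temp, full_clocks))
    return history

# With these constants, the die saturates just under 100C and spends
# most of a 15-minute run throttled, because the throttled steady-state
# temperature never drops low enough to restore full clocks.
history = simulate(900)
```

When the cooler is adequate, the same loop settles below the limit at full clocks; when it isn't, the only remaining variables are fan speed and clock speed, which is exactly the trade-off the Brix Pro is stuck making.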

This appears to be a deliberate design choice by Gigabyte. When I asked the company if the behavior we saw was normal, I was told that Intel’s spec allows for temperatures up to 100°C. Intel confirmed this fact, saying Haswell has a thermal throttle point at 100°C that “keeps the part operating within its intended specifications.” I then asked if running a chip at such a temperature for extended periods would have a notable impact on chip longevity. Intel conceded that “temperature obviously encourages electron migration and doping level changes in the silicon more quickly,” but it went on to say, “At this temperature, the warranty period for part replacement should remain valid and product life should be viable throughout that warranty period.” Intel covers retail-boxed desktop chips for three years. Coverage for OEM CPUs like the Core i7-4770R depends on the system vendor; in the Brix Pro’s case, coverage seems to be one year.

Conclusions
There’s no question Intel has created an exciting new category of PCs with the NUC. The tiny systems push the envelope of what’s possible in a small-form-factor PC, and they do so with potent processors that can deliver very compelling performance.

In the case of the Brix Pro, Gigabyte has pushed the envelope even further. The system is very fast, and its gaming performance is commendable for a machine without discrete graphics. I was also impressed by how easily one can access the expansion slots and, if needed, add an auxiliary hard drive or SSD. That’s a big plus for enthusiasts.

Perhaps the envelope was pushed just a bit too far, though. While the system did operate within spec in our testing, its cooling apparatus was very loud and failed to cool the Core i7-4770R processor enough to avert clock throttling. That throttling didn’t degrade performance in a noticeable way on our test bench, where ambient temperatures were around 21°C (70°F). Nevertheless, the fan noise alone made the system difficult to put up with. I wouldn’t be thrilled to have that kind of noise in my bedroom or home-theater setup.

It’s too bad the Brix Pro’s dimensions weren’t enlarged slightly to make room for a larger heatsink and fan. Even with twice the existing volume devoted to CPU cooling, this machine would still be amazingly tiny, and it would still be an awesome feat of engineering. As it is, though, the Brix Pro is a little too noisy for my taste, at least in this iteration with the 65W Core i7-4770R. Another variant with a slower Core i5-4570R processor does exist, but as far as I can tell, its CPU has the same 65W thermal envelope as the i7-4770R.

Responses to “Gigabyte’s Brix Pro reviewed”

  1. What’s with the hate on old Harold Bluetooth? Great guy. Father of Denmark. His children went on to conquer the civilized world back in 2003 when I played Medieval Total War, Vikings Expansion. Totally deserves to have IEEE 802.15 carried forward in his name.

  2. Watching the noise test video brought back painful memories of the early 2000s, when CPUs were hitting new peaks in power consumption, but quiet computing hadn’t yet caught on as a mainstream concern and overclockers considered it a badge of honour that their PC sounded like a hair dryer. One difference being back then, the noise level would be the same at idle.

  3. Would the person (or people) who came up with the term [b]Next Unit of Computing[/b] please show yourselves? We plan on sending you to the same restaurant flipping burgers with the guys who came up with the name Bluetooth. May you never brand anything again.

  4. [quote]Bulldozer was an experiment[/quote] Oh... my bad; I guess I misunderstood. Looking at all the hype, I thought it was supposed to be a real product.

  5. The last sentence here is relevant to this review... [quote]Intel is also planning to launch a socketed variant of the Core i7-4770R, which is based on the company's Haswell GT3e silicon, which features the Iris Pro 5200 graphics core, with 40 execution units, and 128 MB of L4 cache.[/quote]

  6. [quote]Most of them have a smaller footprint than a CD jewel case. (Kids, ask your parents.)[/quote] Shots fired.

  7. Miserable choice of CPU + Cooling solution. Hair dryer? Come on. Wow. Just wow. I sure won’t be getting one. 🙁

  8. An i3 with a high clock speed and Iris graphics would be pretty interesting. They could call it the i4.

    Of course, I’m not sure why they don’t make Iris standard on the i7 processors.

  9. I can vouch for giving every teacher a PC with a DVD player as the best solution if they have a projector.

    1) You don’t have to train them to rip DVDs.
    2) Easily pocketable equipment gets lost very quickly in a school.
    3) A teacher’s entire day revolves around giving presentations and speaking in front of a class, so don’t make it hard on them by putting a bunch of bureaucracy between them and essential tools of the trade.

  10. Laptops are driving the market. They are by far the most dominant form factor, and it makes sense for manufacturers to focus on that market and expand the tech into other areas.

    It’s more efficient to produce lots of one config that slightly varies than many different configs that widely vary. The hardware market has razor thin margins, so companies have to cut costs where they can.

    Most people aren’t going to miss the expansion capability, and they will appreciate the smaller footprint. Honestly, most PCs never have their config changed from the way it was set at the factory anyway. There are some that will become frankenboxes, but most don’t.

  11. That’s not really apples to apples. Bulldozer was an experiment that wasn’t fully baked versus a well focused effort from a company at the top of its game.

    AMD would still have to pick and choose where they put their resources, so you’re right. Intel’s size helps them.

  12. I am still going to go forward with A8-7600 (when/if it comes out). I can control the noise and it will be much cheaper. Still disappointed in the benchmarks. Wish AMD could keep up.

    The goal is to upgrade this system that I built 2.5 years ago.

  13. Bulldozer and Sandy Bridge didn’t have process parity though. While they both were built on a 32nm process, Intel’s chips at any given process size have pretty much always had less leakage than equivalent chips from TSMC and GlobalFoundries.

    I also admit Intel’s architecture is superior, but I don’t think it’s that big of a gap. An Intel architecture chip built on the process that AMD is using would not be able to compete either.

  14. And that’s why I used the qualifier “sort of” 😉 There are some killer looking cases out there though, but since Intel keeps changing the port layout some of the old ones just aren’t options anymore, no matter how much you want them to be.

  15. While I agree that Intel’s manufacturing advantage helps a lot here, it’s not just that. Sandy Bridge is quite a bit more efficient than Bulldozer while having process parity. So to me it seems that Intel’s CPU architecture is also just plain better… and that’s not really surprising, considering how much larger Intel is.

    In other words, it’s not exactly clear if AMD could compete even if they had access to Intel’s fabs.

  16. That’s why I said “almost”. I understand that it’s a free market and government interference and regulation is the opposite of capitalism, but when the market is monopolistic, anti-competitive, and anti-consumer, sometimes it’s the duty of the government to step in to protect the consumer.

  17. I’m not sure if I am comfortable buying the boards from eBay, even if it is “easy”. All the more popular shopping links I can find are either selling the kits, or they are selling the boards in bulk of 50 or something. 😮

    As for the separate cases, I have looked at the Silverstone one. No power brick included, and they did not mention the specs of the brick that is needed. It is as if they don’t want the info to be published so you end up buying the kit.

  18. [quote]When that's the best performance AMD can squeeze into 65 watts, I almost wish the government would force Intel to allow competitors to use their fabs. The lead Intel has in the manufacturing process makes competition impossible.[/quote] If it's any consolation, I seem to recall that nvidia's mobile GPUs tend to be more power-efficient than HD4xxx/HD5xxx at similar performance levels.

  19. While respecting your right to your own opinion, anytime I hear “government” and “force” in the same sentence I shudder.

  20. [quote]For comparison purposes, I included AMD's A8-7600 "Kaveri" APU, which also has a 65W TDP and relatively speedy integrated graphics. This isn't a straight-up, apples-to-apples competitive matchup, since the Kaveri APU is supposed to cost $119, while Intel prices the i7-4770R at $358.[/quote] When that's the best performance AMD can squeeze into 65 watts, I almost wish the government would force Intel to allow competitors to use their fabs. The lead Intel has in the manufacturing process makes competition impossible.

  21. I’m looking to retire my old Intel Q8200 based HTPC so I appreciate these reviews of mini-pc systems. Considering the tradeoffs I’m seeing though I’m getting the itch to build another itx based system with discrete graphics. Hmm… I’ve been eying the little cedar chest thing the wife has by the TV and I’m thinking maybe a Swifttech Apogee Drive II integrated pump & waterblock and a little Black Ice GTX M184 radiator would fit.

  22. The projectors are ceiling mounted, not on carts (smartboards are often being used here too, requiring good positioning).

  23. I would just get a USB drive and keep it with the projector in that case. Or if it is an actual video DVD, a $30 standalone DVD player on the projector cart.

  24. I inherited this infrastructure as of several months ago. And in education, being able to play back DVDs can be very useful. It’s simple, low bandwidth, and virtually guarantees respect of copyright (note: I’m not barring other sources for obtaining video).

    Not every one of my staff has this, but most of our teaching staff do. And when you inherit an infrastructure someone else set up and built, and you only have so much money at a time to make changes, you never get to do sweeping changes; you have to roll with it.

  25. [quote]Apple would build a better cooling solution. And I'm no apple fan.[/quote] I thought Apple fans were louder?

  26. Looks like it is 802.11ac, not 802.11n. My bad. I’ve just updated the review to reflect that.

    Personally, I’d go the SG05 route. Extremely tiny PCs are undeniably cool, but you can build a pretty small machine with more expansion–and better cooling–these days.

  27. I’m almost positive that I’ll be buying Zotac’s i5-4570R box when it comes out, rather than the i7, because I think the performance & noise will be better on the slower-clocked chip. Iris Pro is intriguing, and it would make a nice little $600 temporary machine (which could then be delegated to tiny-HTPC when I replace it).

    Unless, of course, Zotac [i]does[/i] come out with one sporting a GTX 765M... That'll be a hard decision to make.

  28. Nice little build, can’t say I’m wild about the case (looks a little too much like a first-gen Mac mini, and I can’t say I’m a fan of anodized cases, just personal taste).

    What I find often however is that by the time you get all the components together and pay for them, you might as well have bought the Mac Mini as it usually is cheaper. The other great thing about Mac Minis is when it comes time to upgrade, you can still sell them quickly at a nice premium price.

  29. Deanjo already pointed out that there’s a 47W version of this chip that runs 200MHz slower.

    So they can do it in approx 70% of the power budget. Real-world consumption differences may be lower, but at this point anything helps. edit: my math sucks.

  30. Cyril, great review! I love these NUC-style systems that are popping up all over the place.

    Quick question: the Brix systems (both the i7-4770R and i5-4570R models) are advertised with 802.11ac on newegg – did the one you guys get have that NIC in it or did they ship a different model?

    The i5 one looks tempting – while it is rated for the same TDP, the lower clock speeds (500MHz less base and 700MHz less boost, and 150MHz less peak GPU speed) and lack of hyper-threading might keep it from making as much noise. It’s also $150 cheaper – listed at $500 on newegg – and when you’re talking about spending less than $1000 on a system, $150 is a big chunk of saving.

    Add 8GB of DDR3-1600 and an M500 240GB to the i5 model and you’re looking at about $700 for a nice little system, and one that far outclasses the current Mac Mini at that price. (That could change, if Apple ever updates the Mini, but at this point they may as well wait for Broadwell).

    Unfortunately it seems all of the reviews so far have all been of the i7 model, and newegg reviews of the i5 model seem to vary. It doesn’t seem to spin the fans up for random tasks, like the i7 model does, but still makes a ton of noise when running at load. I guess that’s better than the fan going nuts while streaming a movie.

    Also, something else to consider: the hardware in this, 802.11ac card included, is well supported by Linux/SteamOS, and it’s more capable than the hardware in Intel’s NUC. So it would make for a great mainstream-level miniature gaming box. Think Team Fortress 2, or DOTA 2.

    I’m torn between the i5 model and Intel’s NUC. I like the Intel one with the HD 5000 graphics – especially since they’ve cleaned up the firmware to correct some early issues with Linux compatibility (I run Ubuntu at home, so Linux compatibility is an important point to me). It’s low power and quieter.

    The price difference between the Intel NUC with room for a 2.5″ drive and the i5 Brix system is only $60 – $30 of which would have to go to an Intel 7260 802.11ac wifi card. So for an extra $30 you get a faster processor, more cores, and far more capable GPU? Why would I pass that up?

    Except… noise could be a huge issue. And the NUC without room for a 2.5″ drive is $125 cheaper – $95 if you figure the cost of a wifi card. Suddenly it’s 20% of the overall system price saved, and that’s substantial enough to give me pause. Do I really want Iris Pro? My current laptop has an i5-3337u in it, I think, which is higher clocked than the i5 in Intel’s NUC, so the HD 5000 graphics may not be sufficiently quick to make up the speed difference in the processor. Hmm.

    If I could afford it I’d buy both and test them side by side, I would, but the cash I buried in the backyard hasn’t sprouted into a money bush yet.

    After a certain point it might be easier to just build something into a Sugo SG05 and call it a day, which is yet another tempting option (especially since I could add a discrete video card to it).

  31. I agree that 65W is pushing it a bit too far without better cooling. If they had put the heatsink on top, made it twice as high and had a fan as large as the top of the case on top it might have worked out better. Maybe with the 14nm shrink we might see similar performing NUCs with more reasonable thermals.

  32. Build one yourself? You can get beautiful anodized aluminum itx cases and low profile desktop coolers, and even an internal psu. Well dc-dc psu so you still need an external power brick.

    I did this for my mom and got the system completely inaudible during her normal use, like playing an HD video on YouTube with another browser window running a flash game. I used two high quality sleeve bearing fans and a likewise quiet CPU cooler, both controlled by Asus AI Suite. I also taped the internal mounting points so it would feel more solid, like an Apple product.

    It is bigger than a nuc, but i think that tradeoff is worth it until we get more standardised NUC components.
    Please excuse my shameless plug 🙂

  33. So every one of your staff has a projector that requires a PC with a DVD drive? Sounds like you guys really need to rethink your entire infrastructure and deployment.

  34. If you were an Apple ‘fan’ you’d have insider knowledge of their cooling solutions. Ho ho.

  35. The reverse could be said. Why buy a laptop if it’s just going to be plugged in and on a desk all the time?

    I see more value to these setups than say why they would use APUs on a larger form factor.

  36. Several thoughts:

    - I still can’t understand why Intel is holding Iris back for all but expensive niche processors. At that price, most people will just buy a graphics card. Iris is perfect for beefing up a low-power Core i5 for Ultrabooks, NUCs, etc., and they aren’t doing it. This makes little sense to me.
    - If I could get a Tier 1 vendor version of these (e.g., Dell) with a DVD-RW for a reasonable price (say, Core i3, a single 4GB module), I’d start buying them for work for our staff desktops. Yes, we still need optical drives for playing back DVDs through a multimedia projector.

    Dell did do something like this at one point (the Inspiron Zino) but I think it was just a little too ahead of its time. A revamped Zino with Haswell would be great. As it stands, I’ll probably just go SFF for our staff instead.

  37. I read the review up until I saw the picture of the power brick.

    Cute box though. Utterly pointless in reality at that price.

  38. See, I give Apple credit where credit is due: If they can get the Mac Pro into that form factor using a 130 watt CPU (not to mention two GPUs), then I definitely believe they could come up with a small form factor solution for a 65 watt SoC.

  39. My uncle bought this product for his son…It was returned a few days later.

    The way he configured it (overall cost), he might as well have just bought a laptop…Which he did after returning the Gigabyte solution! (Laptop was a quad core with discrete GPU solution. Much quieter!)

    Suffice it to say, this Gigabyte solution is:
    * Too hot.
    * Too loud.
    * Too expensive.

    About half the cost is because of the processor alone. (Core i7 4770R).

    Having seen this solution personally, I do agree with others. Gigabyte should have just gone with standardized components like Mini-ITX form factor motherboard, etc.

    …It’s a solution looking for a problem. At best, a “concept computer” solution. (Engineering for the sake of an engineering challenge, instead of actually solving a problem for the end user.)

    By the way, Zotac has a similar solution that also uses the Core i7-4770R processor. It’s 20°C cooler than this Gigabyte solution, mainly because they’ve decided to be more aggressive with the CPU/IGP throttling. It takes a notable performance hit!

    As for the 65W rating of the processor: that isn’t the maximum. Max is closer to the mid-70s.

  40. Exactly. This product doesn’t actually serve any purpose if you think about it longer than 2 seconds.

  41. I don’t know why this form factor exists. Just buy a laptop or a tablet. Sheesh.

  42. Expansions will be on Thunderbolt and enthusiast systems will consist of multiple power supplies and multiple little boxes attached together like a decorated Christmas tree with expensive cables.

    I find it very amusing that people will buy the smallest PC possible only to expand it with messy external cables and peripherals.

    I’ll keep buying mega towers and 1000W platinum power supplies.

  43. A better cooling solution could certainly be created. I’m sure Apple would do something like blending the heatsink into the case, using a lot of copper, and putting a big 12cm fan on top of the whole system. Given enough money, better cooling is always possible. Then again, why not just make a microtower and add a discrete GPU? Unless, of course, you live in a closet and can’t afford the space.

  44. Contrary to what is written, I believe the idea comes from Apple (the Mac Mini) rather than Intel (the NUC).

    Anyway, I don’t get the appeal of this form factor. For a primary desktop, few people can’t afford the space under a desk for a proper midi tower with real cooling and expandability. For a home server, you’ll need less processor and much more HD/SSD space. For the living room the brick size is a nice touch, but how many HTPCs do people buy? And, really, why not a mini-ITX that can use discrete GPU if gaming capability is desired?

    Obviously, the average consumer values aesthetics more than I do, but if space really is at a premium I would still recommend a microtower. The BRIX and NUC seem like innovation for innovation’s sake.

  45. Touché.

    edit: for the dingbat that gave the minus, here’s the article about the fix:


  46. Perhaps Gigabyte should change their ‘UD’ designation to ‘Ultra Decibel’.

  47. It wasn’t a technicality. If I had meant “Guess we have to wait for a Mac Mini refresh to have a quiet small i7-4770R setup,” I would have said just that. I specified the graphics, not a specific model of CPU.

    NeelyCam just couldn’t wait to immediately shoot down anything positive due to his irrational hatred for a brand, instead of thinking rationally that Apple has been the largest adopter of Iris graphics so far and the Mini is due for an update.

    He probably would have scoffed at the idea of a 130-watt E5-2697 v2 with two 6GB, 2,048-stream-processor FirePros and a power supply being quiet, or even fitting in an enclosure the size of a coffee can, before the Mac Pro came out.

  48. It is a thing, sort of. There are quite a few [url=<]aftermarket cases for NUC boards[/url<] and the boards themselves are relatively [url=<]easy to buy[/url<] caseless. Cooling mods aren't that tricky either. [url=<]This crazy guy[/url<] was even talking about ways to add a discrete GPU. 😉

  49. The Borderlands issue looks like a shadow biasing problem (not texture filtering per se). For such things it’s hard to judge if it’s an app or driver issue. In this case I’d guess the former since things like the shadow lookup/PCF filter are somewhat loosely defined in DX (one of the few places with some wiggle room still)… it’s definitely possible to set up a bias that works on one or two pieces of hardware but not another due to precision differences.

    Anyways it shouldn’t affect the performance results, but obviously would be nice to see it fixed.
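    The precision argument above can be made concrete with a minimal, hypothetical numeric sketch (made-up depth values and buffer formats, not Borderlands’ actual renderer): a constant depth bias tuned against one depth-buffer precision can fail to hide quantization error at a coarser one, producing shadow acne.

    ```python
    # A fragment is lit when its depth from the light, minus a bias, does not
    # exceed the value stored in the shadow map. Quantizing depth to different
    # bit widths mimics precision differences between hardware paths.

    def shadow_test(frag_depth, stored_depth, bias):
        """True = lit, False = in shadow (wrongly False = shadow acne)."""
        return frag_depth - bias <= stored_depth

    def quantize(depth, bits):
        """Round depth onto a fixed-point grid, like a depth-buffer format would."""
        scale = (1 << bits) - 1
        return round(depth * scale) / scale

    # Hypothetical depth chosen to land just below a 16-bit grid point.
    true_depth = (8091 + 0.49) / 65535
    bias = 2e-6  # tuned against the tiny error of a 24-bit depth buffer

    # 24-bit buffer: quantization error is ~3e-8, well under the bias -> lit.
    lit_24 = shadow_test(true_depth, quantize(true_depth, 24), bias)

    # 16-bit buffer: error here is ~7.5e-6, exceeding the bias -> acne.
    lit_16 = shadow_test(true_depth, quantize(true_depth, 16), bias)
    ```

    In a real engine the usual fix is a slope-scaled depth bias or a wider PCF kernel rather than a larger constant, which is why such artifacts tend to be app-tunable rather than purely a driver matter.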

  50. Three upvotes from me, I’ve been thinking this for a while too.

    We went from AT to Baby-AT to ATX, to uATX, to mITX in rapid succession…

    …and then nothing: BTX flopped, and FlexATX was as niche as the Shuttle XPC.

  51. I’ve been waiting for someone to introduce active noise cancellation to a computer cooling system. With a system like this and today’s cheap electronics, it seems made for the job.

  52. [quote<]The lack of support for desktop CPU coolers makes sense, of course, but that sliver of copper still looks awfully slim for a 65W chip.[/quote<]

    You're still thinking [i<]desktop[/i<]. I've seen 90W laptop coolers that are half that size.

    [i<]- Edit: make that a quarter.[/i<]

  53. Winning on a technicality stated after the fact, the true sign of a seasoned internet debater.

  54. The noise is really disappointing obviously, but I still think the system is really cool. I’d love to see what they could do with a taller form factor for cooling and a discrete graphics chip like one of the 800Ms.

  55. I wouldn’t say it is just Intel. Just look at AMD’s recent efforts (and lack thereof in some areas). Go back and rewatch the AMD DockPort video.

  56. Intel probably defined the motherboard, but the box that it comes in is still of Gigabyte’s choosing.

    I would prefer to see M-ITX boards with these chips just like we’ve already seen with Atom and Kabini though since it opens up more options. That is almost certainly going to happen going forward since this is just the first wave of these devices to hit the market.

  57. If anyone learned from Apple, it’s Intel. Intel is the one defining this form factor, and they’re the one dictating how it develops. Intel is driving this carriage; Gigabyte is just one of the horses in Intel’s harness.

  58. Unfortunately, that is not the trend. The trend in computing is to lock systems down to a set configuration: graphics vendors are being locked out of buses, graphics and I/O are migrating onto the CPU die, RAM is being soldered onto the board, external “modular” expansion is being heavily pushed, and so on. Gigabyte and other manufacturers would rather sell you an updated Brix than let you upgrade on your own without them making any money from it.

  59. Totally agreed. Other than “this is what Gigabyte decided was good”, I don’t see why you’d ever want the 65W CPU in something this size. Since you’re never really breaking 3.2GHz anyway, why not opt for a slower, lower-power CPU?

  60. I don’t recall saying it was going to be a 65-watt part. Hello, i7-4950HQ: 47 watts, which is just two watts more than their current, quiet, top-of-the-line Mac Mini.

  61. shameful.

    all those darn TR golders wasting bandwidth thumbing upvotes! it’s just busyvotes!

  62. I’ll separate the chip itself from the platform here: at 65 watts, the chip ain’t too hard to cool, but I do agree that a cooling solution in a form factor of that size isn’t going to be easy.

    A more standard (but still small) case with a nice 120mm closed-loop cooler however…

    I also agree that we are really waiting for desktop Broadwell to make systems like this one more than just a curiosity at this point.

  63. That is one chunky power brick.

    Gigabyte Brix, you say?

    Intel needs to get past the graphical glitches in its graphics hardware; I thought they were there already. Still, I bet whatever they have in 2015 or the year after really starts to make them a contender in graphics. I wonder how it would cool if it just had more venting.

  64. [quote<]Time for a better cooling setup![/quote<]

    It's kinda hard in that form factor... I'd say a more efficient chip is needed to make the cooling solution sufficient and sufficiently quiet... i.e., "Time for Broadwell!"

    [quote<]P.S. --> To anyone who hasn't subscribed yet, the single-page mode for reading reviews practically makes the subscription worth it all by itself.[/quote<]

    You know, I totally agree.

  65. [quote<]P.S. --> To anyone who hasn't subscribed yet, the single-page mode for reading reviews practically makes the subscription worth it all by itself.[/quote<]

    *cough* Reader mode has done this for a long time *cough*

  66. And — not for the first time — I find myself wishing I could buy the components separately. Standardized motherboards (perhaps with CPUs, if Intel insists), cases (perhaps with bundled power bricks), and cooling systems for this form factor would enable people to make the size / noise / cost trade-off that suits them, rather than whatever a company like Gigabyte can come up with.

  67. I have to say I have been impressed with Intel’s latest thermal throttling. Where I work, we put five i7s in a metal box under the sun with questionable cooling, and although the chips did throttle (a lot!), they refused to break 100°C or crash, which I thought was pretty impressive. Of course, a simple desktop like this should have adequate and quiet cooling. You can rest easy, though, knowing that it is nearly impossible to truly overheat an Intel chip.

  68. Any chance that blower could be nixed for a regular fan?

    I’m sure the airflow would be really constricted for a fan, but just looking at that little leaf blower I can hear its awful screech in my head.

  69. Looks like a person will have to wait for a Mac Mini refresh to get an Iris 5200 in a quiet, small enclosure.

  70. Minus the issue with the cooling fan it looks like a very nice system.

    Time for a better cooling setup!

    P.S. –> To anyone who hasn’t subscribed yet, the single-page mode for reading reviews practically makes the subscription worth it all by itself.

  71. I have an Intel NUC with an i5 on its way here for my kids’ new computer, with 8GB and a 128GB SSD. Can’t wait to get my paws on it… more exciting than the time I bought my first mITX system years ago… some VIA thing?

  72. Gah, a perfectly good idea wrecked by not giving it an extra half-inch of height for some decent cooling. On the other hand, I’ve built* four systems with D34010WYK NUCs and am completely infatuated with them, even the little Intel theme that plays when you open the box. 🙂

    * – if you count installing RAM and an mSATA drive as “building”

About Cyril Kowaliski