Gigabyte’s Brix Gaming BXi5G-760 mini-PC reviewed

We’ve taken a look at a couple of PCs based on Intel’s Next Unit of Computing (NUC) design here at TR: Intel’s original whitebox NUC, and Gigabyte’s NUC-inspired Brix Pro. While both of those machines were competent performers, neither of them had enough GPU horsepower to seriously tempt the PC gaming enthusiast.

With its Brix Gaming BXi5G-760, Gigabyte is looking to change that. The company has stuffed a discrete Nvidia GPU and an Intel Core i5 processor into a case that takes up about as much space as a dense paperback. The amount of computing power per cubic inch here is seriously impressive, at least on paper. Is the Brix Gaming good enough to replace the typical mid-tower gaming PC? Let’s find out.

In the world of computing hardware, small, powerful things rarely come cheap. The Brix Gaming is no exception. It sells for $799.99 at Newegg right now. That asking price gets you an Intel Core i5-4200H CPU, which is a dual-core mobile part with Hyper-Threading enabled. The i5-4200H’s base clock speed is 2.8 GHz, and Turbo Boost can take it as far as 3.4 GHz.

The real point of interest in the Brix Gaming is the discrete GeForce GPU. Nvidia and Gigabyte are calling the part a GeForce GTX 760, but don’t let that name fool you—this isn’t the same product as the regular, desktop GTX 760. A quick look at GPU-Z proves as much:

For comparison, here’s the GPU-Z analysis of the GTX 760 in my personal desktop:

After looking at the numbers above, we reached out to Nvidia for more information. The company clarified the lineage of this GPU for us, and it explained the somewhat confusing choice of name:

In this particular case, [Gigabyte] is using a GK104 chip with 1344 cores and 192-bit memory interface. The “GTX 760” name was chosen because of the performance of this particular GK104 chip. It most closely matched the performance of the GTX 760 in our traditional desktop GPU lineup. Perhaps the 870M designation would’ve been more fitting, although the GPU in Brix doesn’t support Optimus and Battery Boost, so that name wouldn’t have been a great fit either.

Keep that in mind as you read the following pages. While the GPU in this system has the same name as the full-fledged GTX 760, it may not be capable of the same feats of performance. The Brix GPU’s 72 GB/s memory bandwidth deficit is particularly noteworthy in this regard.
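
For the curious, memory bandwidth is just bus width times effective transfer rate. Here’s a quick back-of-the-envelope sketch in Python, assuming the desktop GTX 760’s 6 GT/s memory and an 870M-style 5 GT/s clock on the Brix part (the latter is my assumption, though it lines up with the 72 GB/s figure):

    # Bandwidth in GB/s = (bus width in bits / 8 bits per byte) * effective GT/s
    def mem_bandwidth(bus_bits, rate_gts):
        return bus_bits / 8 * rate_gts

    desktop_gtx760 = mem_bandwidth(256, 6.0)  # ~192 GB/s
    brix_gtx760 = mem_bandwidth(192, 5.0)     # ~120 GB/s, assuming an 870M-style clock
    print(desktop_gtx760 - brix_gtx760)       # ~72 GB/s deficit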

The Brix Gaming’s GTX 760 is backed with a whopping 6GB of GDDR5 memory, though. That’s the kind of capacity I’d expect to see in concert with something branded “Titan,” not the average mid-range GPU. Whether the GTX 760 can make use of such copious memory is questionable, but it does look good on the spec sheet.

Processor: Intel Core i5-4200H (2.8GHz base, 3.4GHz Turbo), dual-core, Hyper-Threading enabled
Graphics: Nvidia GeForce GTX 760 with 6GB GDDR5
Platform hub: Intel HM87
Audio: Realtek ALC269 HD audio
Wireless: 802.11ac Wi-Fi and Bluetooth 4.0 (Azurewave AW-CB161H)
Ports: 2 mini-HDMI, 1 mini-DisplayPort, 4 USB 3.0, 1 Gigabit Ethernet (via Realtek RTL8111G), 1 combination headphone/microphone jack
Expansion: SATA port for 2.5″ hard drive/SSD
Dimensions: 2.3″ x 5″ x 4.5″ (59.6 mm x 128 mm x 115.4 mm)

Gigabyte normally sells the Brix as a barebones machine. However, the company kindly provided us with a 128GB mSATA solid-state drive and 8GB of RAM for testing. Folks who buy this system at retail will need to supply their own memory and storage. I took a quick survey of Newegg prices for an 8GB SO-DIMM and a 128GB mSATA SSD, and found that similar parts would cost around $200. That would bring the total price for a Brix Gaming system to roughly $1,000.

Like the Brix Pro, the Brix Gaming relies on an external power supply. The brick may seem unwieldy, but it will likely live under a desk for most of its life, so it’s not a big deal in practice.

The box also contains a splitter for the combined mic/headphone port, a thumb drive with Windows chipset and graphics drivers, a VESA mounting plate, a mini-HDMI-to-HDMI cable, and a mini-DisplayPort-to-DisplayPort cable. The Brix Gaming comes with almost everything you’ll need to get it up and running, although I did have to provide my own HDMI-to-DVI adapter to connect the Brix to my older LCDs.

Now that introductions have been made, it’s time to take the Brix Gaming apart.


Peeking inside

Much like the Brix Pro, the Brix Gaming comes apart easily. I took four screws out, and I was in:

Inside, I found a 2.5″ hard-drive caddy on the Brix Gaming’s bottom plate. I didn’t have a 2.5″ drive to install, but one can easily see how it’s done: remove the tray, slide your disk of choice under the tabs at the end, secure it with the two screw holes at the bottom, re-attach the tray to the bottom plate, and plug in the combined SATA data and power cable from the Brix Gaming’s motherboard.

All of the Brix Gaming’s expansion slots are on the bottom of the motherboard. The SO-DIMM and mSATA slots are free of obstructions and easily accessible. You can also see the combined SATA power and data cable for the 2.5″ bay taped down in the center of the picture. The wireless card sits under the mSATA slot, and a pair of fans on the left side of the case draw in cooling air, which is then expelled from a vent on the right side.

Removing the port cover allows for a glimpse at the top half of the interior:

There’s a lot of heatsink in there, though not as much as I expected. I would have loved to take the Brix Gaming apart further, but removing the motherboard is difficult, and I didn’t want to break anything. (This was a loaner from Gigabyte.) Based on the placement of the heatsinks, it appears the CPU resides on top of the motherboard, while the GPU is on a daughter board that sits at the very top of the case. You can see its mini-DisplayPort and mini-HDMI outputs in the picture above.

Given the limited heatsink area, I’m a little puzzled by a couple of the exterior design decisions. The enclosure’s top plate is solid, which means it blocks any hot air that might rise from the bottom of the GPU daughter card. Also, on the intake side of the case, one of the fans is partially blocked by an angled section of the fascia:

As we’ll soon see, the Brix Gaming needs all the help it can get on the cooling front. Any obstruction to airflow or heat dissipation, deliberate or otherwise, is bad news.


Gaming performance

With a dual-core mobile CPU, the Brix Gaming isn’t going to break any benchmarking records. That said, we’ve found Haswell-based CPUs to be great performers in general, and in my experience, the Core i5-4200H lives up to that tradition. For more detail, you can check out our overview of the Haswell microarchitecture and its performance here. Instead of retreading those performance numbers, let’s install a few games on the Brix Gaming and see how it holds up.

Watch Dogs

The first game in my test suite is Watch Dogs, which employs a sophisticated game engine called Disrupt to render its techno-dystopian Chicago setting. Watch Dogs can strain even the fastest PCs available today, so it’s a good stress test for the Brix Gaming.

Driving seems to be one of the more resource-intensive activities in this game, so I chose it over wandering around on foot for my testing. To collect data, I drove around a section of the in-game Chicago Loop five times for 90 seconds each, trying to stick as closely to the same route as possible.
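
Those runs get distilled into TR’s usual frame-time metrics, including an average FPS figure and a 99th-percentile frame time. As a rough sketch of how such numbers fall out of a Fraps-style frametimes log (assuming the usual CSV layout of cumulative millisecond timestamps; the helper name and file path are mine):

    import csv

    def frame_metrics(path):
        # Fraps-style "frametimes" log: frame number, cumulative time in ms
        with open(path) as f:
            rows = list(csv.reader(f))
        stamps = [float(row[1]) for row in rows[1:] if len(row) > 1]  # skip header
        # Individual frame times are the deltas between consecutive timestamps
        times = sorted(b - a for a, b in zip(stamps, stamps[1:]))
        avg_fps = 1000 * len(times) / (stamps[-1] - stamps[0])
        pct99 = times[int(0.99 * (len(times) - 1))]  # 99th-percentile frame time, ms
        return avg_fps, pct99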


Lest you come away from these numbers with a bad impression of the Brix Gaming, we should note that Watch Dogs appears to be unusually dependent on CPU resources. Results from an upcoming project in Damage Labs suggest that even powerful desktop CPUs can have a hard time with Watch Dogs, so the GPU may not be the limiting factor in getting good performance in this game. With that in mind, it’s probably not fair to judge the Brix Gaming solely on its performance in Watch Dogs.

CPU demands aside, you’ll recall that Nvidia described this custom GPU as closely matching the performance of desktop versions of the GTX 760. In my experience, it falls short. Part of the problem seems to be that the system has to throttle the CPU and GPU clock speeds under load, reducing overall performance somewhat. The game isn’t entirely smooth, either, even at these low settings. My experience with Watch Dogs involved a fair bit of stuttering and hesitation.

There is one thing that can help the Brix Gaming’s performance, though: Turbo mode. Hidden away in the EFI under the Chipset category is a setting called “System Performance Mode.” The default, “Operational mode,” keeps fan noise in check at the cost of some clock-speed throttling. “Turbo mode” turns the tables. It increases fan speed across the board, which allows the CPU and GPU to remain closer to their normal clock speeds. After discovering this setting, I re-ran my test cycle with Turbo mode enabled, and the results are highlighted in orange above.

For really demanding games like Watch Dogs, keeping Turbo mode enabled is a big help. It increases average FPS and lowers the 99th percentile frame time, making for a noticeably smoother experience. Unfortunately, Turbo mode also makes the system quite loud. I’ll quantify just how loud it is in a bit, but for now, let’s switch things back to Operational mode and try out a few less demanding games.


Subjective gaming performance

I played the following three games at the highest detail settings that still produced smooth gameplay. While I didn’t do any data logging, I left Fraps running and kept an eye on frame rates to bolster my subjective impressions.

BioShock Infinite

BioShock Infinite is based on Unreal Engine 3, so it isn’t too demanding of modern hardware. I was able to use the game’s Normal graphics preset at 1920×1080. At those settings, the Brix Gaming delivered solid frame rates in the 55-60 FPS range. Any stutter and lag were minimal, never distracting me from the gameplay itself.

Dota 2

Dota 2, Valve’s massively popular MOBA, is based on the Source engine, which runs well even on modest hardware. The Brix easily hit and remained at the game’s 60 FPS cap, even with the limited graphics settings maxed at 1920×1200. I never saw even the slightest hint of stutter or lag.

Counter-Strike: Global Offensive

While slightly more demanding than Dota 2, Counter-Strike: Global Offensive is another Source Engine game. Here, the Brix Gaming could maintain 90-100 FPS at 1920×1200 with 8x MSAA, 8x anisotropic filtering, and high-quality textures enabled. Playing CS:GO on the Brix was a pleasure.

Clearly, in games with more modest requirements than Watch Dogs, the Brix Gaming does just fine with its default fan profile. The system can churn out more-than-sufficient frame rates, even at 1080p with medium or high graphics settings. That’s quite a step up from other machines with similar physical footprints. For reference, Cyril found that the Brix Pro, which is powered by integrated Intel graphics, maxes out at about 1366×768 with far less eye candy.


Cooling and acoustics

The Brix Gaming produces a lot of heat in a very small area. The only way for its tiny fans to rid the case of this heat is to spin fast, and that means a lot of noise.

I didn’t have access to TR’s lab-grade decibel meter for my tests, but as with most things these days, there’s an app for that. I downloaded a highly rated app called dB Meter – lux decibel measurement tool for my iPhone and got to work. I wouldn’t claim scientific levels of accuracy for these numbers, but they should provide a rough idea of how loud the Brix gets.

According to this app, the noise floor in my office is about 30dBA with no appliances, computers, or HVAC equipment running. Fire up the Brix, though, and things quickly get rowdy. At idle from a distance of about one foot, the fans produce roughly 37 dBA, and that figure rises to 43 dBA under load. With Turbo enabled, the fans can go quite a bit faster under load, which increases the SPL to 56 dBA.
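
To put those jumps in perspective, remember that the decibel scale is logarithmic: every 10 dB step is ten times the sound power, and a common rule of thumb says it registers as roughly twice as loud to the ear. A quick illustration of the math (my arithmetic, not the app’s):

    # Each 10 dB step is 10x the sound power and roughly 2x the perceived loudness
    def power_ratio(db_from, db_to):
        return 10 ** ((db_to - db_from) / 10)

    print(power_ratio(43, 56))    # ~20x the acoustic power, load vs. Turbo load
    print(2 ** ((56 - 43) / 10))  # ~2.5x as loud to the ear, by the rule of thumb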

dBA measurements don’t tell the whole story, though. A sound’s character is just as important a consideration when describing how annoying it is to the ear. At idle, these tiny fans make a coarse whir that’s difficult to ignore. Under load, the fans produce a high-pitched whine. Turbo mode only makes matters worse. With its more aggressive fan profile enabled, the Brix Gaming in full song is like having a 1U server on my desk.

Here’s a video of the Brix Gaming at idle, under load, and under load with Turbo enabled. Hear it for yourself:

While running my gaming tests, I also logged CPU and GPU frequencies and temperatures. You can see how much the Brix Gaming throttles clock speeds—and how Turbo mode alleviates the issue:


Intel’s spec allows for the Core i5-4200H processor to run at up to 100°C, and Nvidia quotes a maximum temperature of 97°C for the desktop GTX 760, which is based on the same silicon as the part inside the Brix Gaming. Judging by the numbers above, then, the Brix Gaming doesn’t overheat—but in its default fan profile, it does throttle clock speeds a fair amount in order to keep temperatures within spec. Even with Turbo mode enabled, the discrete GPU occasionally dips below its 941MHz base speed.
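
If you want to capture this kind of data yourself, here’s a minimal sketch of one way to log the GPU side with Nvidia’s nvidia-smi utility (the one-second polling interval and log file name are arbitrary choices for this sketch, not necessarily what we used):

    import subprocess, time

    # Append one GPU core-clock/temperature sample per second to a CSV log
    with open("gpu_log.csv", "a") as log:
        while True:
            sample = subprocess.check_output(
                ["nvidia-smi",
                 "--query-gpu=clocks.gr,temperature.gpu",
                 "--format=csv,noheader"],
                text=True).strip()
            log.write("%s,%s\n" % (time.time(), sample))
            log.flush()
            time.sleep(1)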

Toward the end of my time with the Brix Gaming, Gigabyte informed me that a new firmware update was available for the system. According to the company, this update changes the fan profiles for both the default and Turbo modes. I didn’t have time to run the new firmware through the same tests as the old, but I did do a little last-minute testing to see what had changed.

With the new firmware, the default fan profile seems to allow the fans to run a tiny bit faster under load, but the difference is minimal. The updated Turbo profile appears to be the biggest change. The system will now spin its fans as fast as necessary to avoid GPU throttling under load, at the price of further increased noise. According to my imperfect tools, the Brix Gaming can now reach up to 60 dBA under load with Turbo mode enabled. That’s loud, folks. Get your noise-canceling headphones ready.


Conclusions

I wanted to love the Brix Gaming. Even with its flaws, it’s amazing how much raw performance Gigabyte has packed into this tiny package. If your game library consists of older or less demanding titles, the Brix Gaming can easily produce playable frame rates at 1080p with moderate-to-high detail levels. For a system that can easily fit into a backpack or messenger bag, that’s incredible performance. The Brix Gaming is certainly worth a look for anyone who needs maximum portability from their gaming PC. Right now, there’s not much else like it on the market.

This system’s feverish temperament broke my heart, though. The Brix Gaming runs hot, and while its small fans are up to the task of getting rid of the heat, they make quite a din. Turbo mode’s higher fan speeds improve performance, but the fans make far too much noise at those speeds to be tolerable. If Gigabyte could figure out a way of cooling mini PCs like this one both effectively and quietly, it would have a real winner on its hands.

With that in mind, it’s tantalizing to consider what Gigabyte could do with 50 to 100% more interior volume in a system like this. Such a system would still be smaller than mini-ITX PCs, and the added room would allow for bigger heatsinks and larger, quieter fans. For the moment, though, the Brix Gaming trades quiet operation for performance and small size, and while those might be reasonable compromises for some, I like my gaming PCs quiet. If the next Brix Gaming manages to be this fast without all the noise, I’ll be first in line to buy one; but for the moment, the ATX mid-tower under my desk is safe.
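
For scale, the spec-sheet dimensions put the Brix Gaming at under a liter of volume, so even doubling its interior space would leave it far smaller than typical mini-ITX enclosures. My arithmetic:

    # Brix Gaming dimensions from the spec sheet, in millimeters
    w, d, h = 128, 115.4, 59.6
    liters = w * d * h / 1e6
    print(liters)      # ~0.88 L as shipped
    print(liters * 2)  # ~1.76 L with 100% more interior volume
    # Typical mini-ITX cases run several liters and up, so even the
    # hypothetical bigger Brix would remain comparatively tiny.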

Comments closed
    • Mister Blunt
    • 5 years ago

    The fan noise reminds me of a launch XBox 360, and not in a good way.

    • burntham77
    • 5 years ago

    I can’t see a niche where this would fit. People wanting portability would be better served by a laptop. People trying to save desktop space could opt for either a laptop or MSI’s AIO gaming PCs (I have one and it’s fast and quiet).

    I guess if a gamer was fine with the performance level and noise of this thing, plus they had a very specific monitor in mind, they could be happy with this, but there are too many compromises I think.

    • HisDivineOrder
    • 5 years ago

    Why is it that every new graphics-related product category must first go through the “Okay, performance is bad!” stage, then the “Okay, performance is better, but hey dude, I just went deaf!” stage, and only eventually reach the “Okay, maybe too slow AND too loud will not work, so let’s try a balanced approach instead” stage?

    Can’t they just skip to the end? The 360 did it. Our video cards did it (i.e., the FX5800; AMD apparently “caught up” to nVidia recently with the R9 290X reference cooler). Our CPUs did it back in the day (i.e., Delta Blacks, roar in peace (RIP)). Gaming laptops did it for a while (and still do, I guess, in some limited cases).

    And now… here we are. In 2014, we have yet another chance to see it happen. You’d think after all those MANY, MANY instances where the noise was too loud and it got blasted for it that Gigabyte would go, “Hey, maybe this is too loud.”

    Nope. Never one to miss a chance to crap something out in the hope that people won’t notice until they’ve built the system and won’t bother to disassemble it and send it back, Gigabyte goes ahead with its recipe of bad choices.

    The problem is an easy one to fix. Slap this thing into a special cube-shaped case that keeps the depth of the system the same but puts 120/140mm fans on the top and bottom (or front and back, or side to side), slot the motherboard in the middle of the cube, let the air blow across it, and throw in a giant heatsink with lots of room between the fins so airflow dominates. Put the GPU on the bottom, the CPU on the top. A wind tunnel solves everything, including noise: the added size lets you run the fans slower and therefore quieter. You know, the same thing that all of the above learned years ago.

    Or water cool it.

    Or if you’re going to use a shell as big as this BRIX, then how about you just blackmail Intel into releasing more CPUs with their Iris Pro tech instead?

    Or just wait for Broadwell/Maxwell.

    Anything but release this and give us the vacuum cleaner of old.

      • Spunjji
      • 5 years ago

      Not sure why the downvotes; that’s a sensible solution. I like the idea of a left-right wind tunnel and a horizontal CPU/GPU division, allowing the ports to be placed at the front and back. Even better, you could always make the division a 33/67 one and have a taller heatsink on the more power-hungry and throttle-sensitive GPU. Sure, the whole thing gets *slightly* larger, but it’s still tiny.

      EDIT: I just realised that with that setup you also get sealed top, bottom, front, and back plates. Further reduces noise (indirect radiation of sound) and potential for dust problems.

      …can we build this?

    • Anovoca
    • 5 years ago

    Every time I thought up a suggestion to make this run quieter or cooler, it occurred to me that I was just describing the ITX form factor.

    I wouldn’t call this pointless, though. I like that Gigabyte is taking a risk here to explore gaming potential in different form factors. Even if the machine is a bust, there is valuable R&D that will come of it.

    • ronch
    • 5 years ago

    So in this case, it’s not only Intel that has a misleading product model, but Nvidia as well.

    Anyway, what I think Gigabyte needs to do is hire a bunch of artists to give their Brix line some much-needed inspired styling.

    • Metonymy
    • 5 years ago

    This is not a snarky question, as I like the look of the Brix series (though the fugly Intel sticker on the green is hard to take). But… is there really a decent market for people who need/want/like this size enough to put up with the noise level when gaming, when an only slightly larger case could reduce the sound level significantly?

    I’m genuinely wondering.

    • l33t-g4m3r
    • 5 years ago

    You can only make PCs so small before you’re compromising too much. Just buy a Shuttle.

    You want smaller than that? Buy a Shield tablet.

      • fr500
      • 5 years ago

      There is a Zotac box with discrete graphics that fares well in all areas:
      [url<]https://www.youtube.com/watch?v=PaLbUXpXPSY[/url<]

    • JohnC
    • 5 years ago

    Not sure what the point was of putting such a GPU into such a small system… All it does is create a lot of extra noise with very minimal practical benefit.

    You want a “gaming system”? You can build (or buy) a similar one that’s much quieter for about the same money or less, using a larger case and a video card with a bigger heatsink and quieter fans.

    You want a “portable gaming system” (something you need to carry around)? Get a laptop with a better GPU; it would be more compact and much easier to set up and manage, compared to this thing plus a stand-alone monitor/keyboard/peripherals and all the cables for them.

    You want a “compact gaming system” (something permanently mounted that takes very little space)? Buy an “All-in-One” system. MSI makes gaming-oriented AIOs that are already more powerful and take much less overall space (since all of the components are spread around the back side of the monitor), with fewer annoying cables, full wall-mountability, and an optical drive, which is still useful for some tasks.

    You just want a compact HTPC, something to stack next to your receiver/TV/gaming console? You can buy plenty of others that look more aesthetically pleasing, consume less power (because they don’t waste it on such a redundant GPU), and as a result produce less noise and heat.

    • UnfriendlyFire
    • 5 years ago

    If it’s going to be loud, might as well throw in a Dustbuster FX 5800 GPU. Or the Radeon 7990.

    And a 120mm Delta fan to keep everything at almost room temperature: [url<]http://www.youtube.com/watch?v=9Dcj7tB4NCk[/url<]

    • kureshii
    • 5 years ago

    Any chance to do some power consumption measurements?

      • Jeff Kampman
      • 5 years ago

      I don’t have a Kill-A-Watt here, but given the mobile-ish origins of the CPU and GPU, it’s probably not that much of a power hog.

      • lhl
      • 5 years ago

      It’ll hit/sustain 150W+ under CPU+GPU load. It’s about 60W CPU only. If you want to check out some of the tests I ran when I got mine: [url<]https://randomfoo.hackpad.com/GB-BXi5G-760-Testing-OILOc5jPsEn[/url<]

        • kureshii
        • 5 years ago

        Thanks, very informative. That must be quite a chunky power brick then.

    • orik
    • 5 years ago

    Heavily considering buying this and replacing the chassis with something custom that would cool better.

    How does the GPU in this compare to a GTX 750?

      • MadManOriginal
      • 5 years ago

      No point, just make an ITX build instead.

      • UnfriendlyFire
      • 5 years ago

      If you don’t want to lose your sense of hearing, I think you can build a quieter ITX rig for the same price.

        • orik
        • 5 years ago

        I’m sure I could cool this quietly with a custom chassis. 🙂

      • vargis14
      • 5 years ago

      Looks to be a good bit faster than a GTX 750 Ti.

      If I recall correctly, it looks to be in the GTX 670 class, with 1300+ shaders but only a 192-bit memory bus and slightly slower memory speeds, for heat reasons I assume.
      EDIT: Why it has 6GB of memory is beyond me… a waste of chips if you ask me. 4GB is plenty for a card of that speed.

        • vshade
        • 5 years ago

        Maybe there is some performance loss when using asymmetric RAM, so it was better to put in 6GB or 3GB instead of 4GB. Also, given the performance and RAM of the consoles, maybe it’s easier to run console ports if it has 5 or more gigs.

      • lhl
      • 5 years ago

      It’s basically (all the specs and even the PCI ID) an 870M. This might give you an idea of how it stacks up: [url<]http://www.notebookcheck.net/NVIDIA-GeForce-GTX-870M.107792.0.html[/url<]

    • ChangWang
    • 5 years ago

    Sounds like my wife’s hair dryer under load LOL

    • ssidbroadcast
    • 5 years ago

    Huh. Actually 800 dollars doesn’t sound so bad for a complete system in such a small form factor. I would consider buying one if I were in the market for an HTPC/Steambox. As for the noise, if playing games is all you’re using it for just get a louder speaker/soundbar setup or decent headphones.

    I think the only way they could make something this small quieter is if they replaced the fans with a water-cooled block, and that’d probably push the pricetag up another 100-200 dollars.

      • Milo Burke
      • 5 years ago

      But it’s not a complete system. It didn’t come with storage or memory.

        • ssidbroadcast
        • 5 years ago

        Ohhhhhh WEIRD. I zipped past the paragraph explaining that because I thought that the GeForce 760 on the Brix was using 6GB of system RAM, and presumed that the system came with 8GB-16GB of RAM total…

        … the fact that this thing has 6GB of GDDR5 VRAM on a 192-bit bus intended for 1080p displays is super weird.

        Also, yeah as you pointed out since you have to supply your own RAM and disk drive, that really throws a wet blanket on things.

          • Milo Burke
          • 5 years ago

          Actually, I prefer they come without storage and RAM.

          I’ve helped six people buy laptops in the last 18 months, and each time I take out the installed RAM and HDD and install double the RAM and an SSD because it is cheaper to do it from Amazon/Newegg than to buy it from the manufacturer. It would be better still to not be paying for the garbage HDD and RAM.

          But, of course, a computer shipping without storage or RAM should have a lower entry price to match.

            • ssidbroadcast
            • 5 years ago

            Sure, but as the Kampman points out, buying the system RAM and SSD yourself pushes the total price to around $1000. Yuck.

    • DPete27
    • 5 years ago

    I almost wonder how much could be improved with a TIM change and some better (40mm?) fans. At this price premium, it’s a shame that’s not already taken care of. Unfortunately, with a proprietary form factor, there’s not much the average Joe can do to improve the noise and cooling.

    I think of it this way: this is a piece of hardware that’s ahead of its time. It won’t be long before CPUs and GPUs are efficient enough to deliver this much performance in a much-reduced TDP.

    • Milo Burke
    • 5 years ago

    Is this our first review by Jeffrey? I like it!

    Your attention to detail serves you well. You’re inquisitive, and you chase down answers to the questions that reviewers at other sites might ignore, only to have them asked in the comments.

    Your wording was a touch less of a narrative than what we’re used to at TR. I look forward to seeing you grow into your own voice. For the moment, it’s a bit of this: [url<]http://blog.a-b-c.com/wp-content/uploads/2014/07/Just-the-Facts-Maam.jpg[/url<]

    The detailed TR-style performance reporting for Watch Dogs is indispensable, yet I really appreciate that you provided subjective experiences from other games. We learned right away that this will be a very competent machine for someone playing a MOBA game or a last-generation FPS in multiplayer. Good to know. I hope to see another review from you soon!

      • derFunkenstein
      • 5 years ago

      I agree, it was well written and thoughtful.

      • ssidbroadcast
      • 5 years ago

      Yeah a full product review so soon. Well done.

      • Jeff Kampman
      • 5 years ago

      Thanks for the kind words. I’ve also reviewed Rosewill’s RGB80 mechanical gaming keyboard, as seen here: [url<]https://techreport.com/review/26755/rosewill-rgb80-keyboard-reviewed[/url<]

    • Milo Burke
    • 5 years ago

    I can’t get the speed and temperature graph to change when I click the buttons for the turbo graphs. I tried it in Firefox 31, Chrome 35, and Opera 22.

      • Cyril
      • 5 years ago

      My bad. Should be fixed now.

        • Milo Burke
        • 5 years ago

        Thanks!

    • NeelyCam
    • 5 years ago

    Awful coloring. I wonder how much Nvidia paid Gigabyte for that…

    • MadManOriginal
    • 5 years ago

    I know you didn’t run any pure CPU benchmarks, but the system as tested has only one memory channel populated. While it might not make a difference in the gaming that is practical for this system, I would think it matters for overall performance, and possibly for the apparently CPU-dependent Watch Dogs. Do you have an extra SO-DIMM lying around to test with dual-channel memory?

    (Also: the CPU (Turbo) and GPU (Turbo) graph buttons don’t do anything.)

    • Chrispy_
    • 5 years ago

    [quote<]the power supply will likely live under a desk for most of its life, so it's not a big deal[/quote<] Well, with a garish paint job like that and poor noise levels, the Brix itself will live out of sight and out of earshot too, most likely 😉

      • tay
      • 5 years ago

      Heh, agreed. What a terrible box. External power supplies are fugly and I hate them with a passion. This is why Apple has it right: every little thing, including the router and Apple TV, has an internal power supply.

    • chuckula
    • 5 years ago

    [quote<]Lest you come away from these numbers with a bad impression of the Brix Gaming, we should note that Watch Dogs appears to be unusually dependent on CPU resources. Results from an upcoming project in Damage Labs suggest that even powerful desktop CPUs can have a hard time with Watch Dogs, so the GPU may not be the limiting factor in getting good performance in this game. With that in mind, it's probably not fair to judge the Brix Gaming solely on its performance in Watch Dogs.[/quote<] Watch Dogs was ahead of its time.. and a humongous, steaming bowl of elephant piss. [url<]http://snltranscripts.jt.org/05/05mfunhouse.phtml[/url<]

    • jjj
    • 5 years ago

    When they announced this, I was wondering how big the power brick is, since the PC is smaller than a standard ATX PSU. I can see the brick in one pic, but any chance you can share actual dimensions? Doesn’t really matter, just curious.

      • Jeff Kampman
      • 5 years ago

      The system is in the mail now, but the brick is about an inch tall by 7″ long by 3″ wide, if memory serves. It’s pretty large.

        • Chrispy_
        • 5 years ago

        In the mail because it overheated and you’re RMA’ing?

          • d0g_p00p
          • 5 years ago

          Did you read the review? It’s a loaner unit, unless you’re being facetious.

    • derFunkenstein
    • 5 years ago

    The noise is so maddening about these. They’re really trying too hard to keep the physical size down when they could make them a bit bigger with a nicer cooling system and make it not sound like a hair dryer. Or, just as well, keep the same size and replace the GPU with something more tailored to the size/power constraints, like a desktop Maxwell GTX 750 or 750 Ti.

      • Pez
      • 5 years ago

      Not entirely sure why they didn’t go with a full-mesh design. Surely that would have alleviated temperatures somewhat. As long as the mesh was relatively fine, it should be easy to keep mostly free of dust too.

        • Usacomp2k3
        • 5 years ago

        With the top being solid, you could put a monitor on top of it.

          • derFunkenstein
          • 5 years ago

          It’s so small that if you tried to balance a monitor on top of it, I think you’d have a hard time.

      • drfish
      • 5 years ago

      1000x this. Fricken build it like a PSU with a big 120mm fan on the top or bottom (or both) already. We will forgive the small size increase.

      • homerdog
      • 5 years ago

      Absolutely. GM107 would have been a [i<]much[/i<] better fit.

      • MadManOriginal
      • 5 years ago

      That’s what everyone said when this was announced as well; it’s an obvious thought. The only thing I can figure is that Gigabyte was designing this quite some time ahead of the GM107 chip and decided to push ahead with what it had. A v2 might bring a great improvement in the noise department with a much cooler (and cheaper?) Maxwell GPU.

    • Usacomp2k3
    • 5 years ago

    That is a cute little PC. You pay heavily for the size, though, it looks like. What would the equivalent mid-tower be priced at? 33% lower?
