A first look at Gigabyte’s next-gen Intel motherboards

Brace yourself, because a batch of new motherboards based on Intel’s next-gen chipsets is just around the corner. Gigabyte is prepping no fewer than 37 different models, almost all of which will be available stateside. I got a closer look at a bunch of them at a press event earlier this week. I can’t divulge certain details just yet, but I can tell you about some motherboard-specific features. And I can show you pictures—lots of pictures.

The new boards are split into four families, each with a different focus. There are the standard versions, of course, plus gaming-specific mobos, SOC variants optimized for hardcore overclockers, and a whole new line of Black Edition products with longer warranty coverage. Let’s start with the G1.Gaming series, which ditches the green theme of Gigabyte’s previous gaming boards in favor of a black-and-red aesthetic that looks eerily familiar.

Every major motherboard maker seems to be using a similar color scheme for its next-gen gaming gear. They’re not necessarily copying each other, though. According to Gigabyte, system integrators requested the color scheme because it’s easy to match with other components, such as graphics cards and memory.

System builders are partly to blame for the oversized VRM heatsinks, too. Most boards can get by without these coolers, Gigabyte says, but the massive hunks of metal are included for show. Boutique builders apparently think motherboards with smaller heatsinks look too pedestrian, and they’re not interested in them as a result.

Gigabyte claims the market for so-called gaming motherboards is more interested in appearances than hardware specifications. A lot of PC enthusiasts probably don’t share that mindset, but enough people do that all the big mobo makers now have gaming-specific brands. At least the G1 boards are separate from Gigabyte’s more traditional enthusiast fare.

The G1.Gaming line comprises 12 members in all. It has everything from an ultra-high-end monster that supports four-way SLI to a compact Mini-ITX offering that isn’t ready for the spotlight just yet. Get this: the top-of-the-line board actually has two PLX switch chips. One divides the CPU’s Gen3 PCI Express lanes for multi-GPU configs, while the other shares the chipset’s Gen2 connectivity between all the onboard slots, peripherals, and ports.

We don’t yet have a complete feature matrix detailing which features are available on which boards. Expect the gaming stuff to sport Qualcomm Killer networking chips and upgraded integrated audio, though. The G1 flagship pairs Creative’s Sound Core3D audio processor with high-end Nichicon capacitors and a replaceable op-amp. Realtek codecs replace the Core3D chip in the middle and lower tiers, but you still get a handful of perks via Creative’s SBX ProStudio software. Amplified audio outs are available on some of the cheaper offerings, as well.

Some of the gaming boards feature separate power circuitry for a couple of their rear USB ports. These dedicated lines purportedly deliver cleaner power to USB DACs, and there are probably picky audiophiles out there who swear they can hear the difference. Obsessive-compulsive types with self-powered DACs also have the option to cut power to the “DAC-UP” ports, leaving only the audio signals behind.

For hardcore overclockers, Gigabyte is prepping a pair of SOC boards that borrow super-overclocked branding from the company’s hot-clocked graphics cards. If your cooling solution doesn’t employ liquid nitrogen, the SOC boards are probably overkill. But they do a few interesting things for competitive overclockers seeking benchmark records. Individual DIMM and PCIe slots can be disabled via DIP switches, for example, and the memory lines have been subtly massaged. There are more than two dozen onboard buttons, and I couldn’t tell you what most of them actually do. But I know they’re related to running at the ragged edge, where sub-zero temperatures are required and systems need to be stable only long enough to run a benchmark and capture a screenshot.

Gigabyte’s premium OC offering for the 8-series generation used a PLX switch to provide enough PCIe connectivity for four-way SLI configurations. That chip apparently adds a bit of latency that can shave benchmark scores, so Gigabyte has dropped it from the new SOC boards to improve performance with single- and dual-card configs. Quad setups just aren’t popular in extreme overclocking circles, at least among people who buy their own motherboards.

Thanks to firmware tweaks, the SOC boards can be overclocked on the fly within the UEFI, with no need to reboot. This real-time overclocking capability is set to be available on all of Gigabyte’s next-gen boards—or at least on the ones that support CPU overclocking, anyway.

More traditional fare… with a twist

While the SOC and G1.Gaming camps are somewhat specialized, the standard Ultra Durable clan should have broader appeal. These boards are supposed to offer the best combination of features and pricing. Some of them borrow features from the gaming family, but they’re largely more straightforward overall.

Speaking of borrowing, the black-and-gold color scheme on this bunch will look familiar to anyone who has seen Asus’ 8-series boards. The gold tone is richer here, and it’s at least limited to the heatsinks. I’m not surprised to see the look again. Multiple mobo makers have told me that black-and-gold is popular in China, a country that now represents over half of Gigabyte’s global business.

By now, eagle-eyed readers will no doubt have noticed some M.2 slots and SATA Express ports in our pictures.

Ahem.

The arcane rules attached to this particular product embargo forbid me from commenting on the origin of those connectors, but you can probably guess. A certain next-gen chipset has long been rumored to support both standards. Note that it’s a “next-gen” chipset. The name is verboten, but the math isn’t difficult. Look, the secret combination of alphanumeric characters is clearly written on the motherboards I’m allowed to show you:

Last, but not least, we have the Black Edition breed.

Apart from their mostly murdered-out color schemes, these boards are physically identical to products in the standard and gaming families. However, they undergo a week-long burn-in test before shipping to customers, which should cut down on DOA boards and premature failures. They’re also covered by a five-year warranty.

Gigabyte spent more than a million bucks building a dedicated Black Edition test lab at its factory in Nanping, Taiwan. The lab is capable of hammering 3,000 systems simultaneously, and once it’s up to speed, it should be one of the largest Litecoin mining operations in the world. Yep, Gigabyte is going to use cryptocurrency mining to stress not only Black Edition motherboards, but also a new line of graphics cards. The exact test config hasn’t been finalized yet, and it’s unclear whether the onboard storage, networking, and other peripherals will be targeted.

Gigabyte hasn’t decided what to do with the proceeds from the operation, either. Charity is one option—just think of the children. Another is giveaways exclusive to Black Edition owners. I wonder how the virtual cash generated by the lab will compare to the cost of running it. On top of the power consumption and cooling requirements, maintaining a few thousand PCs running at full tilt will require some manpower, especially since the boards and graphics cards will all be swapped out weekly.

In the U.S., Black Edition boards are expected to command about a $10 premium. That doesn’t seem like too much to ask for additional testing and longer warranty coverage, though registration is required for the latter. Registering will also grant Black Edition owners access to a private website and possibly even live video streams from the test lab.

There are new firmware and software features to accompany the updated motherboards, of course. Gigabyte’s updated UEFI supports 19 different languages. It also sports an intro screen and setup guide for newbies, plus a Smart Tweak mode optimized for keyboard-only navigation. The fan controls may be improved, as well, but I didn’t get a definitive answer on that front.

On the software side, EasyTune’s integrated hardware monitor has left the tweaking utility to pursue a solo career as a separate application. The two apparently couldn’t agree on matters of system polling. Gigabyte has also added Game Controller software that lets users program a range of hotkey functions on keyboards that don’t have built-in macro support. There’s even a sniper mode that decreases the mouse sensitivity for precise headshots.

I haven’t played with the new firmware and software yet, but the changes sound more evolutionary than revolutionary. The same goes for the motherboard tweaks. That’s not necessarily a bad thing, though. Gigabyte’s 8-series boards are pretty good, and small refinements can have a big impact on the overall user experience. We’ll have a full report on what it’s like to actually use Gigabyte’s new hotness soon. In the meantime, you can peruse some additional board shots in the image gallery below.

Comments closed
    • LoneWolf15
    • 5 years ago

    I’m hoping Gigabyte takes this chance to go with Intel NICs, skipping both the low-end Realtek and the overrated Killer NICs.

    My last couple of boards have been Gigabyte, and I’ve been pleased with them. Thick PCBs, so no warping, and a solid feel. Stable as all get-out. My current one (Z68XP-UD5) only has base UEFI functions to support >3TB hard drives; it still has the blue/yellow Award BIOS. As I’ve skipped Ivy Bridge and Haswell, if their new UEFI BIOS, board construction and features are good at the UD3/UD5 level, maybe it will be time for an upgrade when Broadwell comes around.

    • MadManOriginal
    • 5 years ago

    So is microATX dead, or just dying and becoming seriously niche? No mATX motherboards in this preview; the only one from Asus I saw at AnandTech was the ROG Gene, which is an overly expensive ‘gamer’ board. I am sure there will be some mATX, but to not preview any… :/

    • Bensam123
    • 5 years ago

    SoundCore3D seems to be becoming more and more common as an integrated solution. It seems like that was Creative’s original intent after aiming for the lower baseline, and now it’s coming to pass. I do miss the days of them making real chips though, like the X-Fi. Perhaps one day it’ll come back around.

    Speaking of which, we haven’t heard anything about TrueAudio in a long time… Curious what’s up with that.

    • Umbragen
    • 5 years ago

    And I was wondering how much it would cost to get a Gigabyte board with no PCI slots.

    • d0g_p00p
    • 5 years ago

    The USB port arrangements and numbering look a little… off? Nice-looking boards, though.

    • f0d
    • 5 years ago

    “System builders are partly to blame for the oversized VRM heatsinks, too. Most boards can get by without these coolers, Gigabyte says, but the massive hunks of metal are included for show”

    i agree that most boards can get by without the vrm heatsinks but as a heavy overclocker its one of the first things i look for on a motherboard

    at 5ghz my 3930k’s vrm’s can get up to 90degrees C after an hour of linx (intelburntest) and i have a bunch of fans directed right at the vrm heatsinks on my x79 sabertooth

    yeah i know im in a minority as a heavy overclocker, still i say theres nothing wrong with having some big heatsinks on the vrms just in case someone decides to overclock with these boards

    • crabjokeman
    • 5 years ago

    The DIP switches also come in handy for setting the IRQ and COM port on your modem.

    • Chrispy_
    • 5 years ago

    These “gamer” boards are all starting to look like “extreme enthusiast tinkerer” boards.

    Gamers don’t want to constantly have their side-panel off, endlessly overclocking using motherboard switches and buttons, running synthetic stress-tests and making sure their LED lighting scheme and colourways match their case fan trims.

    Gamers want to game. Can we call these something else please, Asia. They’re pretty cool but they’re for hardware enthusiasts and overclockers. If there’s any overlap with the gamer demographic it’s a small one, in my opinion. (Dreamhack, Insomnia LANs etc).

    • MadManOriginal
    • 5 years ago

    mmmm….SATA-express. I’ll still wait until a more significant CPU architecture change (Skylake?) to upgrade from Ivy Bridge though.

      • Oscarcharliezulu
      • 5 years ago

      Maybe, but I think faster storage trumps what we’ve been getting lately across CPU generations.

    • sluggo
    • 5 years ago

    “Gigabyte is going to use cryptocurrency mining to stress not only Black Edition motherboards” …

    And the cries of “mother*******!” (or the Mandarin equivalent) arise from a handful of line test technicians who were two steps ahead of their employer in this regard.

    • Voldenuit
    • 5 years ago

    [quote<]Yep, Gigabyte is going to use cryptocurrency mining to stress not only Black Edition motherboards, but also a new line of graphics cards.[/quote<] So if you buy a new Gigabyte Radeon, it's already dead out of the box from mining? Thanks, Litecoin. 😛

      • Wildchild
      • 5 years ago

      This is a pretty common way to test electronics. High-end vacuum tube manufacturers use the same process (though not for litecoins). It’s more of a good thing than you might think.

      There was one occasion where a GTX 560 Ti reared its ugly head during a stress test and, ironically, it was made by Gigabyte!

        • Voldenuit
        • 5 years ago

        I was kinda being facetious here. It’s definitely a good thing for manufacturers to stress test components before selling them, and it’s good to hear that Gigabyte is at least considering donating the proceeds to charity, although I agree that F@H should also be considered. After all, how else can Big Pharm make money if they don’t patent discoveries made by the public sector (here I go being facetious again, but only just).

      • l33t-g4m3r
      • 5 years ago

      I don’t agree with this stereotype because it’s based on FUD. It’s not like they’re overclocking / overvolting, and the card should be operating within warranty-supported specs. Burn-in is also a widely accepted testing method to weed out defective hardware. Anything that passes this test should be rock solid.

      • HisDivineOrder
      • 5 years ago

      I’d rather they use actual graphics tasks to test a video card than some compute function that isn’t a true test of a graphics card’s ability to produce graphics.

      Sure, it’ll run hot, but that doesn’t mean it’ll really stress the parts that are stressed by actual games.

    • psuedonymous
    • 5 years ago

    ‘Low noise’ USB outputs for DACs! Fantastic! For those who have DACs with staggeringly poor power filtering stages, now you can buy an entire motherboard to compensate. Rather than buying a competently designed DAC, of course.

      • mnecaise
      • 5 years ago

      Excellent. This will help improve the sound quality, along with my heavy-gauge oxygen-free mono-crystalline copper litz-wire USB cables with excessively-plated gold connectors.

      Seriously though… I did have a problem with a motherboard introducing noise into my sound system. It was a bitch to deal with, and I did end up spending a couple hundred dollars to work around it. Might have been cheaper to buy a new motherboard in hindsight.

      Edit: for what it’s worth, this was like 8 or 9 years ago. That computer is long gone but I think I still have the stupid expensive M-Audio card I bought to solve the problem.

      • MadManOriginal
      • 5 years ago

      For USB-powered DACs this feature has some vague potential to be meaningful even though they ought to have some power filtering; for something like a USB headset or budget USB soundcards it could help. But anyone who cares enough about audio with a desktop system will have an external self-powered DAC anyway, and these are all ATX motherboards, so they aren’t going into proper LAN boxes (aka ITX builds – we’ll have to see if they have this feature on the ITX boards). I just use optical S/PDIF to mine, which automatically means electrical isolation.

    • DPete27
    • 5 years ago

    It costs a given amount to run the 3,000 boards for stress testing no matter what the stress test is. I actually agree with their decision to do something “productive” (notice I didn’t say useful) with those CPU/GPU cycles. I’d imagine they’re not lawfully able to keep the proceeds for themselves, so don’t put them too high on a pedestal for their “generosity.”

    BTW, I know F@H is supposedly the more philanthropic route to take, but can anybody actually tell me what useful advances have [b<]actually been completed[/b<] as a result of all that computing?

      • colinstu12
      • 5 years ago

      over a hundred papers have been written with the data found from F@H.

      science takes time. the answer to cancer isn’t going to just pop out of someone’s computer one day.

        • mnecaise
        • 5 years ago

        Well, technically, it might.

        But it can take a decade (or two) to go from “hey, this folded protein met the initial requirements and had the top score” to a fully tested and FDA approved drug.

        Of course, that’s probably what you meant by your comment.

      • UberGerbil
      • 5 years ago

      Why wouldn’t they be lawfully able to keep the proceeds?

    • CheetoPet
    • 5 years ago

    M2, check (sorta yay!!!)
    SATA Express, check (double yay!!!)

    What’s that other connector riding the edge of the boards near the SATA ports?

      • Ninjitsu
      • 5 years ago

      I think it says “ATXAP”…AP for additional power? Looks similar to a SATA power connector…

      • jaset
      • 5 years ago

      [quote<]Whats that other connector riding the edge of the boards near the SATA ports?[/quote<] It appears to be ATX4p utilising a SATA power connector.

    • chuckula
    • 5 years ago

    I tend to agree that there’s nothing miraculous in these next-generation chipsets whose names we shall not utter here.

    However, I am happy to see that they did manage to get SATA express ports in there since there was some debate about whether they would actually be supported. M.2 support is good too.

    • stdRaichu
    • 5 years ago

    Yes. YESSSSSSSS. Finally we know that there’s at least some support for M2 and SATA Express coming to a Chipset That Must Not Be Named (But Whose Name Can Be Photographed) near you.

    I’m going to call this next-gen chipset the CTMNBNBWNCBP for the time being until the marketing boys come up with something a bit catchier. Finally I’ll have a good enough reason to upgrade from my trusty 2600K.

    Edit: Looks like Crucial are finally bringing out M2 drives as well. They’re not listed on Crucial’s UK site but are up for pre-order on Scan: [url<]http://www.scan.co.uk/products/480gb-crucial-m500-m2-2280-ssd-6gb-s-20nm-mlc-flash-read-500mb-s-write-400mb-s-80000-iops-max[/url<] I sense an awesome TR M2 storage standoff as soon as the CTMNBNBWNCBP is released.

      • Ninjitsu
      • 5 years ago

      [quote<]CTMNBNBWNCBP[/quote<] Caesar cipher?

    • appaws
    • 5 years ago

    These big, dumb looking (and not really functional) heat sinks on things really bother me. Like the big stupid heat spreaders that are slapped on top of RAM sticks and are completely unnecessary and add nothing to performance.

    I don’t know why gamers=tasteless/gaudy in the minds of hardware manufacturers…?

      • derFunkenstein
      • 5 years ago

      Well, I guess the best explanation is that people buy the hardware. I don’t like gaudy, but fortunately they cater to me with the Ultra Durable boards and there’s plenty of RAM without crazy looking heat sinks.

      • stdRaichu
      • 5 years ago

      It’s a well known fact in computing, especially gaming, that if a product doesn’t have a heatsink, it will catch fire, explode, kill you, have an affair with your wife and forget to feed the cat.

        I’ve just bought an aftermarket heatsink kit to weld to the heatsink I installed on top of my heatsinks to keep my heatsinks cool. It comes in a tasteful metallic puce/lime green stripes colour scheme that complements my Fatal1ty Gaming Runes perfectly (100% performance or your money back).

        • Pez
        • 5 years ago

        You’ve made my morning with that comment, thank you!

        • Chrispy_
        • 5 years ago

        OHHAI!

        I endorse whatever product and/or service this is.

      • DPete27
      • 5 years ago

      The problem is, if the heatsinks are there (regardless of their usefulness), everyone will have an opinion about their looks. Back in the day when most RAM was naked, we didn’t have this problem, and I’d imagine RAM manufacturers were making higher profits because they didn’t have to keep such a huge product spread (aka 20 different heatsinks all with the same DIMM underneath). VRM heatsinks, on the other hand, are something that I actively look for in a mobo. They don’t have to be gaudy and pretty; they just have to be there. I’ve just seen too many mobo reviews where those without VRM heatsinks have stability issues (see FM2+/Kaveri mobo reviews for a current example).

      I think even the most square PC enthusiast can be swayed in his/her purchase by aesthetics as long as the cost difference is minimal and the features/performance are the same. Generally (in my experience), the higher the budget, the more aesthetics comes into play. I know my current rig was loosely based around a chosen color scheme (black/red); ultimately it was pointless once it was inside my windowless case… but that didn’t seem to affect me.

      I do agree with your statement that the gamers = gaudy design taste stereotype is almost always false in real life (in the US anyway). Most people I talk to like the understated stuff, sometimes with a TASTEFUL dash of color. Keep in mind though, most of these companies are Asia-based (no racial stereotypes intended, but they’re going to take their design cues from what’s popular in their home country and/or country that buys the most of their stuff)

      • colinstu12
      • 5 years ago

      exactly what I was going to complain about. take all that crap off and clean up the look of those boards, minimalism is great.

      • f0d
      • 5 years ago

      they are very useful when overclocking – VRM’s can get very hot with high voltage

      sure when not overclocking they probably wont be needed but then the people that dont overclock can do just fine with the cheapest model motherboard anyways and they have the “minimalist” look you want with no heatsinks or “gaudy” looks

        • LoneWolf15
        • 5 years ago

        It depends. Analog VRMs supposedly have higher temps than digital ones.

        Also, some board manufacturers attach the VRM heatsinks with plastic push-pins, which aren’t very secure. Better boards (my Gigabyte Z68XP-UD5 and others) screw them down, creating a much more positive contact that can’t move easily.

      • Chrispy_
      • 5 years ago

      I’m usually the first person to complain about stupid heatsinks that are woefully inefficient and seem to exist solely for artistic or manufacturer logo reasons.

      I’m glad someone else has taken up the torch now that I can’t be bothered to complain on every article 😉
