Gigabyte has an Aorus party at Computex 2019

To the surprise of absolutely no-one, Gigabyte has a big presence at Computex. The company sent over a whole packet of photos of its booth, which was mostly Aorus-flavored. Unfortunately, I'm not at Computex, and if you're reading this, you're probably not either. So instead of the booth, let's talk about the new products Gigabyte has on the way. Besides the seven Socket-AM4 motherboards based on AMD's X570 chipset, there's a trio of monitors and a PCIe 4.0 SSD to talk about.

As far as the motherboards go, I'm going to skim them real quick. If you're after the nitty-gritty, the product pages are up for each model and I'll link them as we go, or if you're impatient, I made the chart above that you can check out with the salient details. Gigabyte also helpfully provided another chart that identifies the exact components used in the construction of each board's power delivery hardware. I'm not too knowledgeable about things at the component level, so I'll let the gerbils dissect that one.

Gigabyte's fanciest X570 board is the X570 Aorus Xtreme. I probably don't need to tell you that this E-ATX monster has everything plus the kitchen sink. Triple M.2 sockets, Intel GbE-plus-Aquantia-powered 10-Gigabit Ethernet, a fancy ESS Sabre 9218 DAC, and seven USB 3.1 ports are the functional highlights of the board. The Xtreme also has dual EPS12V sockets to drive its sixteen PowIRstages, which are in turn cooled by a microfin array heatsink whose heatpipe rests directly on the hardware, just like the best CPU heatsinks. Gigabyte brags that the board uses 5-W/mK thermal pads under the aforementioned VRM heatsink, as well as between the board and its metal backplate.

The X570 Aorus Master is only a small step down from the Xtreme. The Master board has the same triple-M.2-plus-six-SATA storage configuration, but its dual LAN trades the Aquantia chip for a Realtek-powered Gigabit Ethernet to sit alongside the Intel connection. It also steps the ESS DAC down to a 9118; what difference that makes, I do not know. You lose a couple of USB 3.1 ports in the transition, but the Master has a PCIe 4.0 x1 slot that the Xtreme doesn't. It also retains the fancy backplate and VRM cooling from the Xtreme.

The next step down in Gigabyte's X570 product family is the X570 Aorus Ultra. This board, much like other "Aorus Ultra" models before it, is sort of a "premium midrange" offering. It's stripped-down compared to its more expensive siblings, but it still has more or less all the functional parts that they do: triple M.2 sockets, a fancy ALC1220-VB audio codec, and a microfin array VRM heatsink. The majority of things that it misses are overclocker niceties, like diagnostic LEDs, onboard power and reset buttons, and a rear-panel reset button. 

Gigabyte's X570 Aorus Pro comes in Wi-Fi and no-Wi-Fi versions. Otherwise, it's yet another small step down from the Ultra model, and when I say "small" step, I'm serious. They have the exact same power delivery configuration, their stat-lines in the chart above are essentially identical, and ultimately the only real difference I can find between the two boards is that the Pro model loses an M.2 socket compared to its cousin. There's also no grille on the chipset fan, for whatever that's worth.

Naturally, Gigabyte has a mini-ITX board coming: the X570-I Aorus Pro Wifi. There's something about the idea of running a twelve-core, 4.6-GHz processor in a mini-ITX machine that makes me giggle. Anyway, this board makes the necessary sacrifices for the sake of miniaturization, but it's still richly-appointed. A pair of M.2 sockets are joined by a third E-key slot for the Wi-Fi card. This is also the only board in Gigabyte's lineup that includes more than one video connection, in case you should want to use it with a chip that has integrated graphics.

Gigabyte didn't send me a picture of the Elite, so here it is in the booth.

The X570 Aorus Elite is probably the most appealing model to price-conscious gerbils. That's not to say it's cheap—perhaps "uncomplicated" is a nicer way to put it. Despite its positioning in the product stack, the Elite still keeps a pair of M.2 sockets and Intel-powered Ethernet, as well as S/PDIF audio output and a USB 3.1 Gen 2 Type-C port for the front of the case. It does lose the ability to split the CPU's PCIe lanes and hook up a pair of graphics cards with eight lanes each, though. It also doesn't come with Wi-Fi.

Last and likely least among Gigabyte's X570 boards is the X570 Gaming X. Despite the decidedly budget-oriented nature of this board, it should appeal to a certain segment of users. After all, it's the only board in the lineup to have PS/2 ports—and a pair of them, too. It also has three PCIe 4.0 x1 slots mixed in with its dual physical-x16 slots and pair of M.2 sockets. The real letdown of this board is the lack of USB 3.1 Gen 2 ports, although I'm not extremely excited about the single Realtek LAN chip or that ancient ALC887 audio codec, either. Still, it should be cheap.

As you'll know if you read the chart above, all of Gigabyte's X570-based Ryzen boards support ECC memory, which is interesting. Gigabyte is proud of a few of its other features too, like Q-Flash Plus, which lets you "update the BIOS even without installing a processor, memory, graphics card, or booting up the PC." Pretty handy for boards that are likely to see at least one more round of CPU releases. All of these boards also support XMP profiles for easy memory configuration.

Gigabyte's new monitors are the KD25F, CV27F, and CV27Q. We know the least about the CV27Q, so I'll start with that. Gigabyte says that the CV27Q is its first "curved, tactical gaming monitor" and that the 27" display uses a VA LCD in 2560×1440 resolution with 1500R curvature. The monitor supports refresh rates up to 165 Hz as well as FreeSync 2 HDR, and Gigabyte says the CV27Q can reproduce 90% of the DCI-P3 colorspace. Not bad at all.

Gigabyte also says the CV27Q supports a 1-ms response time, but I'm a little dubious on that point because the company reports the CV27F's response time as "1ms (MPRT)." For those who don't know, MPRT is "moving picture response time," and it represents the "felt" response time of a display. When talking about gaming monitors, a response time of "1ms (MPRT)" generally means that the monitor is using blur-reduction tech—usually backlight strobing—to achieve that figure. Besides, a real 1ms response time on a VA LCD is virtually unheard-of.

Other salient specs on the CV27F include a 27" diagonal, 1920×1080 screen resolution, 165-Hz refresh rate, 3000:1 static contrast, 400 cd/m² peak brightness, and FreeSync 2 support. It also has a USB 3.0 hub built-in, VESA mounting support, and like its cousin, the ability to reproduce some 90% of the DCI-P3 colorspace. I'd show you a picture of it, but it looks virtually identical to the CV27Q above, so refer to that.

I could show you the front, but it's not interesting. Check out this wacky stand!

Meanwhile, the KD25F is not like the others. This flat (i.e. non-curved) display is 24.5" from corner to corner, and it uses a TN LCD panel. You might think, "oh, it's a budget display," but you'd be mistaken. This monitor supports a 240-Hz refresh rate, after all. More interestingly, this monitor is the second (after BenQ's XL2546) that we've seen to support blur reduction strobing at 240 Hz. Gigabyte marks this model down for a 0.5ms response time, but that's an MPRT number of course.

If you're concerned about the image quality, you probably needn't be. The KD25F's TN LCD is a true 8-bit panel that can display the full sRGB colorspace, and its backlight allows it to shine at up to 400 cd/m². Typical contrast is average for a TN LCD at 1000:1. Like the other monitors, it supports VESA mounting and has a USB 3.0 hub built-in. Just make sure you sit directly in front of it.

Finally, that SSD I mentioned. Gigabyte paired up an "all-new" PCIe 4.0 SSD controller with similarly-fresh Toshiba BiCS4 NAND flash to make what it calls, simply, the "Aorus NVMe Gen4 SSD." (Maybe it'll get renamed later.) This thing is wreathed on all sides in copper to keep it cool, and that's not that surprising when you hear that it can whip out a full 5 GB/sec on sequential reads. Gigabyte's not talking random performance, unfortunately. Gigabyte's general Computex press release claims the drive comes in an 8TB capacity, but the SSD's own info only lists 500GB, 1TB, and 2TB sizes. We'll ask Gigabyte for clarification.

Gigabyte didn't say when any of this stuff would be available, or for how much. Sorry to get you excited with no payoff. We reckon the majority of the motherboards will be available in early July, around the time the CPUs for them are out. As for the rest of the hardware, it's anybody's guess.

Comments closed
    • confusedpenguin
    • 1 month ago

    I’ve thought about building myself a new computer here pretty soon with a Gigabyte motherboard. Built one with a gigabyte board about 9 years ago right after my divorce and it was filled with good memories of World of Warcraft and Sacred 2 Fallen Angel. Might go with a regular board though and not one with fancy shielding or RGB lighting.

      • LoneWolf15
      • 1 month ago

      The best boards for the money have a little of both, even if you don’t use them. You don’t have to use the RGB, but base your purchase on the voltage regulation/power phase circuitry on the board; mid-price boards are usually where that starts.

    • ronch
    • 1 month ago

    These things are too gaudy for my taste, and that weird dodo and the weird name ‘Aorus’ don’t help at all. I’ll take that M.2 though; looks kinda retro.

    • MOSFET
    • 1 month ago

    [url=https://en.wikipedia.org/wiki/List_of_AMD_chipsets#A-Link_Express_III]990FX was 19.6W[/url]. SB950 was 6W. They were often heat-piped together in the better mobos.

    • LoneWolf15
    • 1 month ago

    I find it very interesting that so few Z390 mainboards have that cooling fan (I think ASUS has one, only one I know of) and so many X570 boards have one.

    Does this chipset really run that warm?

    (agree with everyone, 40mm fans fail easily and have potential for high noise)

      • WaltC
      • 1 month ago

      Depends entirely on the quality of the fan. The scant reading available to date indicates that these fans are the highest-quality obtainable–but that doesn’t really tell me that much, except that early failures are not anticipated…;) The MSI x570 fans are user adjustable–to what extent, I don’t know. From what I’ve seen so far, the x470 chipset requires 4-5W max; the x570 requires ~11W max. Difference is of course in the PCIex4.x lanes the x570 handles (in addition to the 4.x lanes handled by the cpu)–the bandwidth is 2x that of PCIe 3.x, of course. Intel core-logic to date, I believe, does not yet support PCIe4.x.

      [url]https://ark.intel.com/content/www/us/en/ark/products/133293/intel-z390-chipset.html[/url]

        • ronch
        • 1 month ago

        “these fans are the highest-quality obtainable”

        In all the land, eh?

        /chuckles

      • Chrispy_
      • 1 month ago

      It runs warm, but not hot.

      Z390 is a 6W chipset, X570 is an 11W chipset.

      Historically, there have been a great number of CPUs, chipsets, and GPUs that ran above 11W with either no heatsink at all, or with a minimal heatsink that would occupy less space than the 40mm fan alone.

      Here’s a 12W Voodoo Banshee heatsink from 1998:
      [url]http://hw-museum.cz/data/vga/pic/Creative_3D_Blaster_Banshee_CT_6760_F.jpg[/url]

        • LoneWolf15
        • 1 month ago

        The Banshee is a bad example though. They got notoriously hot; I actually had a friend’s card start to melt its BGA GPU package with just a heatsink, and had to rig a 40mm fan to keep it cool and stable. I burned my fingers on the card removing it from his system prior to installing the fan.

    • Ifalna
    • 1 month ago

    Sheesh what happened to the mainboards?
    Ridiculous amount of plastic shrouds and whiny 40mm fans?!

    But at least they have RGB lighting. *chuckles*

    Thanks, I’ll pass. I expect my system to be quiet.

      • rinshun
      • 1 month ago

      And cheap

        • Ifalna
        • 1 month ago

        These days I don’t mind dropping 2K+ on a system that lasts 8 years.

          • Chrispy_
          • 1 month ago

          Uh, an 8-year old processor is pretty slow these days, and that’s even accounting for the stagnation in the CPU sector from 5 years of AMD’s Bulldozer mistake and Intel’s re-re-branding of everything because of 10nm production issues.

          Meanwhile, an 8-year-old graphics card is either Nvidia’s Fermi or AMD’s Terascale 3 – both outdated architectures that have long since fallen off the driver update curve and are forced to run legacy drivers. Not that it matters, because modern 1080p gaming is out of the reach of either of them, unless you like minimum details and low framerates.

          I don’t want to tell you that you’re wrong, but using a $1000 PC for 4 years and then buying another $1000 PC will give you a decent experience across the whole 8 years, rather than providing 2 years of amazing performance, 2 years of good performance, 2 years of okayish performance and 2 years of struggling to cope with modern games/software/operating systems.

            • Ifalna
            • 1 month ago

            You’re definitely right on the GPU front because that still moves quite fast; an update every 4 years (or 2 generations) has its merits.

            Can’t say I agree much on the CPU front but my experience playing brand new titles is pretty limited, so I don’t know how well my 3570K OC’d to 4.6 would fare in those.

            I have to say though: not once did I feel as if my CPU struggled with software or even the OS (I work with systems that do every day, so I know the difference in feel). Usually my CPU waits for me to catch up. I’d probably notice it if I did heavy-duty rendering of video/3D or compiling, but neither is on my to-do list anytime soon.

            • Chrispy_
            • 1 month ago

            You won’t have noticed the CPU performance drop so badly because of the stagnation in that market, but also because an OC’d 3570K has gone from [i]complete overkill > not quite so overkill > good > acceptable.[/i] It's ABSOLUTELY bottlenecking if you bought a modern GPU, but since your GPU is by far the larger bottleneck you wouldn't have noticed the CPU being a problem.

            • Ifalna
            • 1 month ago

            @1080p yep, my 1070 is definitely bottlenecked by my CPU. I noticed that. Doesn’t matter though, because if I get past 60 everything is green.

      • mudcore
      • 1 month ago

      Nearly every X570 chipset motherboard will have a fan.

      There’s actually been a notable improvement in the heatsinks of the shown X570 boards versus the previous couple years. Look at the X370-era boards; they’re way, way worse and most of the X470 ones are similarly bad (some exceptions).

    • Chrispy_
    • 2 months ago

    Ugh, more fans.

    I guess I’m more interested in the B550 chipset when it arrives. There was very little point in buying an X470 over a B450 with the current generation. As far as I can tell the only difference is a couple of old-school SATA ports and SLI support anyway.

      • LoneWolf15
      • 1 month ago

      The X470 should be obtainable without fans. As for the X570 with double the power usage, I think the biggest question is whether PCIe 4.0 offers significant advantages for the average user.

      I can’t think of a reason on my regular systems for 10Gbps Ethernet, and we don’t have any PCIe 4.0 graphics cards yet. If it adds more lanes to allow for M.2 NVME SSDs without taking away SATA ports, that’s my only useful take at this current point.

      I have SLI myself, but went Intel this round. But I’m rooting for AMD too.

    • Derfer
    • 2 months ago

    Throw as many parties as you want but I’m never buying a product with that god awful logo on it. Right up there with the Corsair tramp stamp. What even is it supposed to be? An eagle with the hook they use to pull brains out of mummies? A metal dick? Captain Hook’s arm?

      • Voldenuit
      • 2 months ago

      Dude.
      Don’t diss the bro-bird.

      • Waco
      • 2 months ago

      I *think* it’s supposed to be an eagle flexing its arm? All I know is I hate it as well.

        • Krogoth
        • 2 months ago

        It is supposed to be Horus but I don’t think Gigabyte was able to secure the trademark for it. So they went with the next best thing and change one of the letters around while retaining the phonics. Aorus branding was the result.

          • Waco
          • 2 months ago

          Yes, but the flexing arm part is just weird. The eagle head and the Aorus naming are fine, IMO…it’s just that damn arm thing that makes it odd.

            • Krogoth
            • 2 months ago

            It is all about eGains my friends. 😉

      • The Egg
      • 2 months ago

      It’s an eagle with a Trogdor muscle arm.

      • albundy
      • 2 months ago

      glad someone finally said it. its too bad mobo makers no longer make green motherboards. i prefer it. it looks much better than a black/grey/white multi colored board.

        • Voldenuit
        • 2 months ago

        Remember red ABit boards and brown Gigabyte boards?

          • LoneWolf15
          • 1 month ago

          actually, I remember blue Gigabyte boards with slot, SATA, and RAM socket plastics in various shades of green, blue, orange, red, and purple.

          I owned this one, it was very solid. Just a little garish.
          [url]https://www.anandtech.com/show/2717[/url]

        • Usacomp2k3
        • 1 month ago

        LanParty ftw!

      • LoneWolf15
      • 1 month ago

      Really, it’s pretty unobtrusive. I have a Z390 Aorus board myself, and it doesn’t stand out.

      • K-L-Waster
      • 1 month ago

      If you stick it in a case with solid sides you don’t need to care….

    • NTMBK
    • 2 months ago

    What’s with the enormous hunks of plastic crap around the rear IO cluster? Even on the ITX board! Surely that can’t be good for airflow?

      • LoneWolf15
      • 1 month ago

      Really doesn’t affect it, it’s mainly covering the port cluster, and anchoring an attached I/O shield, which itself is a nice feature.

      The VRM heatsinks are fully exposed and can still be cooled by existing fans and airflow in your case as reasonably as if the plastic shielding wasn’t there. My Aorus Z390 board has this, and my Gigabyte Z97X board without it had nothing in that area generating heat.

    • UberGerbil
    • 2 months ago

    So I’m probably the only person who read that as “seven-Socket AM4 motherboards” ie motherboards with seven sockets. I need my second coffee. Because I blew my first one out my nose.

      • K-L-Waster
      • 2 months ago

      Seven sockets times 12 physical / 24 logical cores equals “who the blue blazes needs that many threads anyway?!?”

    • RAGEPRO
    • 2 months ago

    Guys, the ugly table at the top is my doing; I couldn’t fit the info into one of TR’s HTML tables quickly and easily. That’s a totally new thing, my own invention—let me know if you find it useful or if you think it’s a cluttered mess.

      • Usacomp2k3
      • 2 months ago

      I like the table.

      • The Egg
      • 2 months ago

      I like it. With so many motherboard models, you almost need a table to keep track of what the actual non-RGB/shroud differences are (if any). Can always revise/improve it over time.

      • astrotech66
      • 2 months ago

      Looks good to me … I found it easy to compare the different models.

      • fredsnotdead
      • 2 months ago

      The table serves its purpose well.

      • UberGerbil
      • 2 months ago

      Don’t know what it’s like for people trying to read it on certain lower-res mobile devices (not that TR is particularly mobile-friendly in general) but it’s fine. Though I will say you could’ve pulled the m-ITX board out and treated it separately; then the memory column could’ve been eliminated and handled in text and the table shrunk appropriately. I think I’d prefer to have them separated by form factor (E-ATX, ATX, and m-ITX) anyway. I don’t know about other folks here, but that’s the first filter I apply when board shopping since that’s almost always pre-determined before I’m even deciding Intel vs AMD.

        • Usacomp2k3
        • 2 months ago

        Either that or merge any common fields. I personally would prefer it all on one table.

    • The Egg
    • 2 months ago

    I just scrolled down the page and went “No…no…no…no…no…no…no.” PC builders should consider organizing a boycott of all motherboards with 40mm fans. I know they can be modified and a passive heatsink installed, but why go back to the bad old days. Somehow the message needs to be gotten across to stop cramming s— down our throats that nobody asked for.

      • RAGEPRO
      • 2 months ago

      I feel like you probably have AMD to blame. Whether the dinky little fan is [i]actually[/i] necessary, I couldn't say—I feel like a wide microfin heatsink could do just fine—but I suspect AMD mandated active cooling just in case.

        • The Egg
        • 2 months ago

        Chrispy mentioned 11 watts for the chipset in the Asus motherboard thread. If that’s accurate, they absolutely do not need active cooling. Things were good for about 10 years, with large passive heatsinks being installed on chipsets and power circuitry. Now we’re going backwards.

          • Krogoth
          • 2 months ago

          It is probably a bit more than 11 watts when loaded, and the PCH is near PCIe slot 1, which tends to be occupied by a toasty GPU.

          FYI, the PCH on older platforms using PCIe 2.0/3.0 has been running on the toasty side when loaded. I’m not too surprised that the jump to PCIe 4.0 has pushed it into needing more than a small, low-profile heatsink.

            • The Egg
            • 2 months ago

            Doubt it. My experience with those garbage 40mm fans is that they’re usually attached to cheap, pot-metal heatsinks with no fins, and a passive Zalman ZM-NB47J always did a better job. Even if we’re talking more than 11 watts, then they need to start stretching it across unused areas of the board, using heatpipes, and/or something else innovative. Especially at the prices they’re charging for these boards. 40mm fans are flat out unacceptable.

            • Voldenuit
            • 2 months ago

            [quote]Doubt it. My experience with those garbage 40mm fans is that they're usually attached to cheap, pot-metal heatsinks with no fins, and a passive Zalman ZM-NB47J always did a better job.[/quote] Yep. There are a few X570 board teardowns circulating around, and all the PCH coolers I've seen in them have featured flat, practically finless dissipation areas next to the fan. Also, they could always move the main PCIe x16 slot down one and put a x1 or x4 slot in its place. That should clear up room for cooling solutions.

            • Krogoth
            • 2 months ago

            That Zalman HSF is too tall for most long video card PCBs. They can’t relocate the PCH either, because it needs to be really close to the PCIe slot in order to support 4.0 speeds.

            Heatpipes or waterblocks are the only viable alternatives. I suspect the bottom line is a major factor in skipping the heatpipe route, even though heatpipes were used a decade ago on some higher-end motherboard SKUs. I also suspect it would be hard to fit one if there is an M.2 slot in the way.

            Besides, those 40mm fans don’t necessarily equal the loud, unreliable sleeve-bearing units of yesterday.

            • Redocbew
            • 2 months ago

            Yeah, well you are welcome to be the guinea pig for testing that.

            • Voldenuit
            • 2 months ago

            Back in the day I had a [url=http://www.sidewindercomputers.com/thhrsli.html]Thermalright heatsink[/url] to replace the whiny fan on my nForce 4 chipset. The L shape lets it be positioned to sit above or below the graphics card depending on the layout of your motherboard. It wasn't cheap, but it was a huge improvement over the stock chipset fan, both thermally and acoustically.

            • The Egg
            • 2 months ago

            That’s just one example of a user-mod. Take a look at the [url=https://www.asus.com/Motherboards/P5Q_Deluxe/]P5Q Deluxe[/url] I had 11 years ago if you want to see it done right.

            • Waco
            • 2 months ago

            Ha, there’s a throwback. I haven’t seen the OCC reviewed logo on anything in a while.

        • Voldenuit
        • 2 months ago

        > I feel like you probably have AMD to blame.

        Agreed, but probably not an actual mandate for active cooling. As I understand it, the problem is compounded by a combination of factors:

        1. The PCH has a high power draw – 11W typical and 15W peak
        2. There is no power management, the system isn’t smart enough to turn off or downclock components in between requests
        3. There is no (or minimal) thermal protection system
        4. It’s probably built on some oldass process; AMD ain’t spending the money to build these on TSMC’s 7 nm or Samsung 7nm fabs
        5. Motherboard makers have to make sure the system is stable in a wide array of cases and builds, including systems with inadequate airflow
        6. Motherboard makers are cheap

        The Aorus Extreme proves, or at least suggests, that passive cooling is possible with the right cooling design. If users demand passive cooling, the motherboard makers will eventually have to cave or lose market share. If users just keep buying the lowest common denominator boards, then motherboard makers will keep making the cheapest board they can get away with.

          • UberGerbil
          • 2 months ago

          There’s probably going to be an early-adopter’s tax here also. These are the boards that the board-makers will have on hand for the launch, so they’re the least-refined. Just as we saw the last time around with memory timings (and a few other glitches), it takes a while for the makers to iterate the designs and get rid of the rough edges. I wouldn’t be surprised if there are a lot more passive boards six months from now (especially, as you say, if the enthusiasts avoid the fans — though the companies may just conclude from that they should just make expensive boards, not that they should make cheaper ones without fans). And maybe AMD will eventually release a new chipset on a more efficient process. Otherwise, there’s going to be quite a business in after-market passive heatsinks.

        • Redocbew
        • 2 months ago

        I don’t know if AMD did a reference design for the chipset. If so, perhaps that had active cooling. That would make sense as the reason why most board makers are using active cooling, except for the few cases so far where they don’t. It seems more likely to me that it’s just another sign of a race to the bottom, as others here have said.

        It’s not hard to cool 15 watts with a passive heatsink, but to do so while respecting all the various constraints that Voldenuit listed earlier in this thread can make it seem that way.

      • cmrcmk
      • 2 months ago

      I’m hoping the midrange PCH (B550?) will eschew the active-cooling requirement. Since the connectivity menu in the X570 is so extensive, its little brother should have more than enough for most of us.
