AMD’s 890GX integrated graphics chipset

AMD has been on a bit of a hot streak lately. No, I’m not talking about the dominating performance of its peerless Radeon HD 5870 GPU. I’m not referring to quad-core bargains like the nearly-$100 Athlon II X4 630, either. I speak of something far more exciting: integrated graphics chipsets.

OK, so maybe exciting is a bit of a stretch. But AMD has definitely been on a run in the integrated graphics world, and it all started with the 780G chipset, launched two years ago nearly to the day. With a DirectX 10-class graphics core, Blu-ray decode acceleration logic, and gen-two PCI Express, the 780G quickly became our integrated graphics chipset of choice—and our recommended platform for budget desktops and home-theater PCs.

This past summer, AMD replaced its mainstream integrated graphics chipset with the 785G. A refreshed graphics core with tweaked shader units and an updated video decode block punctuated this release, propelling Gigabyte’s implementation into TR Editor’s Choice territory.

AMD hasn’t been content to confine its integrated graphics chipsets to budget microATX motherboards, though. Some six months after introducing the original 780G, a hopped-up version of the chipset dubbed the 790GX arrived astride mid-range ATX boards targeted at PC enthusiasts, gamers, and overclockers. The 790GX also brought with it a new SB750 south bridge chip with AMD’s first chipset-level RAID 5 implementation and an Advanced Clock Calibration capability that typically gave overclockers an extra few hundred MHz to play with.

Six months have passed since AMD lifted the curtain on the 785G, and right on schedule, an amped-up version is set to debut as the 890GX. Like the 790GX that came before it, the 890GX boasts higher GPU clock speeds and a penchant for full-sized ATX motherboards. It also sports new SB850 south bridge silicon with a 6Gbps Serial ATA controller of AMD’s own design, which is very exciting indeed. Naturally, we had to take a closer look.

The core-logic Swiss Army knife

As someone who has long chastised marketing departments for escalating model numbers without merit, I would be remiss not to take issue with the 890GX’s primary digit. AMD showed admirable restraint when it updated the 780G with the appropriately named 785G. The 790GX made perfect sense as a tuned-up version of the 780G, too. You’d think, then, that a hopped-up 785G would carry a 795GX model designation. But no, AMD apparently couldn’t resist and has dubbed its latest north bridge component the 890GX.

Look past the model number, and you’ll find that the 890GX shares the very same north bridge silicon as the 785G. The chip features just over 200 million transistors and is fabricated on a 55-nm process by TSMC. AMD sorts the chips it gets from the Taiwanese semiconductor firm, reserving only the best for the 890GX, while the rest live on as 785Gs.

The 890GX needs the cream of the crop because its Radeon HD 4290 integrated graphics core runs at 700MHz—200MHz faster than the Radeon HD 4200 in the 785G. Apart from the difference in clock speeds, though, the graphics cores are identical. Both share the same RV620 architecture, which serves up 40 DirectX 10.1-compliant stream processors.
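A quick back-of-the-envelope sketch shows what that 200MHz buys in theoretical shader throughput. This assumes each stream processor can issue one multiply-add (two flops) per clock, which is AMD's usual accounting for this generation of hardware:

```python
# Rough peak-throughput math for the 890GX's IGP vs. the 785G's.
# Assumes 40 stream processors, each issuing one multiply-add
# (2 flops) per clock -- AMD's standard accounting for this core.
def peak_gflops(stream_processors, clock_mhz, flops_per_sp_per_clock=2):
    return stream_processors * clock_mhz * flops_per_sp_per_clock / 1000.0

hd4290 = peak_gflops(40, 700)   # Radeon HD 4290 (890GX)
hd4200 = peak_gflops(40, 500)   # Radeon HD 4200 (785G)
print(hd4290, hd4200)                       # 56.0 vs. 40.0 GFLOPS
print(f"{hd4290 / hd4200 - 1:.0%} faster")  # 40% -- exactly the clock bump
```

Since the shader count is unchanged, peak throughput scales linearly with the clock.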

Like most integrated graphics components, the Radeon HD 4290 is capable of carving out a slice of system memory for its own use. With such an arrangement, the IGP is forced to share memory bandwidth with the rest of the system. Fortunately, motherboard makers also have the option of pairing the 890GX’s integrated Radeon with “sideport” memory. Also referred to by AMD as a DDR3 performance cache, this sideport RAM is typically a single, 128MB DDR3-1333 memory chip. One such chip can be seen sitting next to the 890GX north bridge component in the picture above.

Since graphics chips are responsible for more than 3D pixel pushing these days, I should also note that the Radeon HD 4290’s Universal Video Decoder (UVD) block is fully up to date. The UVD supports dual-stream decode acceleration for high-definition MPEG2, VC-1, and H.264 video, which neatly covers all the formats used by Blu-ray movies. Video output can be piped over HDMI with an accompanying audio stream, but there are a few limitations on that front. The 890GX can’t pass TrueHD, DTS-HD, or uncompressed multi-channel LPCM audio over HDMI, putting it a step behind some integrated graphics platforms.

An 890GX block diagram. Source: AMD

In addition to its graphics core, the 890GX north bridge features second-generation PCI Express logic. Sixteen lanes of connectivity are reserved for discrete graphics cards, and unlike the 785G, the 890GX can split those lanes evenly between a pair of x8 links for CrossFire. The north bridge has an additional six PCIe lanes reserved for expansion slots and peripherals, too.

The rest of the chipset’s connectivity is consolidated in its new SB850 south bridge component, which is connected to the 890GX via an Alink Express III interconnect that offers 4GB/s of bidirectional bandwidth—twice the bandwidth of Intel’s DMI interconnect. (The 2GB/s in the block diagram above refers to one-way speed). Alink Express looks a whole lot like PCIe, and I’d wager the interconnect is little more than four lanes of PCI Express 2.0.
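The numbers support that wager. Here's a sketch of the math, assuming PCIe 2.0's 5GT/s signaling rate and its 8b/10b line coding:

```python
# PCIe 2.0 signals at 5 GT/s per lane, but 8b/10b encoding means
# only 8 of every 10 bits on the wire are payload:
# 4 Gb/s = 500 MB/s per lane, per direction.
def pcie2_bandwidth_gbs(lanes):
    gt_per_s = 5.0                    # PCIe 2.0 signaling rate
    payload_gbps = gt_per_s * 8 / 10  # strip 8b/10b overhead
    return lanes * payload_gbps / 8   # GB/s, one direction

one_way = pcie2_bandwidth_gbs(4)  # four lanes, as Alink Express III appears to be
print(one_way)                    # 2.0 GB/s each way -- the diagram's figure
print(one_way * 2)                # 4.0 GB/s bidirectional -- AMD's quoted number
```

Four gen-two lanes land exactly on AMD's quoted figures, which is why the interconnect looks like rebadged PCI Express.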

The south bridge has two more PCIe lanes for talking to peripherals, giving the chipset 24 lanes in total. Unlike Intel’s P55, H55, and H57 Express Platform Controller Hubs, whose second-gen PCIe lanes signal at the 2.5GT/s rate typical of gen-one implementations, the SB850’s PCI Express lanes each boast a full 5GT/s signaling rate.

By far the most interesting element of the SB850 is its Serial ATA “3.0” controller, which supports transfer rates up to 6Gbps—roughly 600MB/s, with overhead taken into account—and all the usual RAID array configs. This is the first 6Gbps SATA controller we’ve seen make its way into a core-logic chipset, and it’s only the second implementation of the new standard currently in the wild. AMD designed the new SATA controller itself, too, which is a departure from previous south bridge chips that used third-party storage controller logic.
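The "roughly 600MB/s" figure falls out of the same 8b/10b line coding SATA has always used. A quick sketch:

```python
# SATA's quoted 6 Gb/s is the raw line rate, including 8b/10b
# encoding overhead: every 10 bits on the wire carry 8 payload bits.
def sata_payload_mbs(line_rate_gbps):
    line_rate_mbps = line_rate_gbps * 1000  # Gb/s -> Mb/s
    payload_mbps = line_rate_mbps * 8 / 10  # strip encoding overhead
    return payload_mbps / 8                 # bits -> bytes

print(sata_payload_mbs(6.0))  # 600.0 MB/s -- "6Gbps" SATA
print(sata_payload_mbs(3.0))  # 300.0 MB/s -- old-school 3Gbps SATA
```

The jump from 300MB/s to 600MB/s of usable bandwidth is what the new standard actually delivers, before controller and protocol overhead.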

AMD’s older south bridge chips have a history of Serial ATA performance and compatibility issues, particularly in AHCI mode, which is necessary for features like Native Command Queuing. Developing the SB850’s SATA controller itself should give AMD more control this time around, and in a moment, we’ll see whether that paid off.

The dearth of storage solutions—including even high-end SSDs—capable of exceeding the bandwidth available with old-school 3Gbps SATA makes the SB850’s 6Gbps SATA support feel more like forward-looking insurance than a must-have feature. AMD didn’t look to the future when crafting the SB850’s USB controller, though. The controller design has changed from older SB700-series implementations and now features 14 ports instead of 12. But they’re all USB 2.0 rather than SuperSpeed USB 3.0. Given the speed of today’s external storage devices, that strikes me as a little short-sighted.

At least AMD has squeezed a Gigabit Ethernet controller into the SB850 alongside the usual HD audio interface and, surprisingly, an old-school ATA channel. The whole thing is fabricated by TSMC at 65 nm, resulting in a chip that measures about 50 x 70 mm. According to AMD, the SB850 draws just 0.85W at idle, which is a quarter-watt less than the old SB750.

Asus’ M4A89GTD PRO/USB3 motherboard
Next-gen connectivity with a side of throwback

Manufacturer Asus
Model M4A89GTD PRO/USB3
Price (MSRP) $155
Availability Soon

It might seem odd for an integrated graphics chipset to anchor a mid-range motherboard targeted squarely at PC enthusiasts, but that’s what you get with the 890GX. For this market, the chipset’s embedded Radeon is best thought of as a backup display adapter or a source of additional monitor outputs rather than as a primary GPU. Even if the Radeon HD 4290 is the fastest IGP around, anyone who wants to enjoy recent games with all their eye candy turned up at reasonable resolutions will be plugging in a discrete graphics card.

Asus’ M4A89GTD PRO/USB3, then, is really more of a traditional mid-range motherboard than its video outputs might otherwise suggest. That’s a good thing, because over the years, Asus has become pretty proficient at building mid-range enthusiast boards.

The M4A89GTD certainly looks the part of something you might find in an overclocked gaming rig. Asus offsets the dark brown board with a peppering of blue and white slots and ports, and I quite like the understated styling.

Of course, aesthetics won’t make or break a motherboard. The layout can, and Asus has done a good job on that front. All of the onboard slots and ports are intelligently organized to avoid clearance conflicts. Users with upside-down enclosures that put the PSU below the motherboard may need extra-long power cables to reach the board’s auxiliary 12V power connector, but that’s only because the plug is situated next to the top edge of the board, which is our preferred location for traditional enclosures.

Like other mid-range boards, Asus’ M4A89GTD covers its north bridge and voltage regulation circuitry with ornate—but not outlandish—heatsinks. A single heatpipe links the two coolers, which are short enough to avoid clearance conflicts with most aftermarket cooler designs.

Asus uses an 8+2 power-phase design to feed the AM3 CPU socket. The board can reputedly handle processors with TDP ratings up to 140W, which should allow it to support the fastest Phenom II chips, including perhaps the upcoming Phenom II X6. There’s also a core-unlocking switch next to the DIMM slots that will allow Athlon II and Phenom II X3 owners to try their luck at enabling the fourth core on their CPUs.

Those with keen eyes will note that Asus’ spin on the 890GX features solid-state capacitors throughout. Asus squeezes two-ounce copper layers into the four-layer board, as well.

Rather than employing an auxiliary storage controller to feed additional internal SATA ports, Asus makes do with the six 6Gbps Serial ATA ports offered by the SB850. However, Asus has elected to connect the board’s sole IDE port to a JMicron JMB361 storage controller rather than the south bridge. The JMicron chip is also linked to an eSATA port in the rear port cluster.

As you can see, the low-profile south bridge cooler won’t interfere with longer graphics cards. The SATA ports are neatly positioned out of the way of double-wide graphics coolers, as well, which is something you don’t always see even on high-end motherboards. The mix of edge- and surface-mounted SATA ports should ensure that users with extremely tight enclosures that snug the hard drive bay right up next to the mobo will still be able to connect a collection of drives with ease, too.

The M4A89GTD is stacked with half a dozen expansion slots, including dual PCIe x16 slots, one x4, one x1, and a couple of retro PCI slots. The x4 slot’s a nice touch, and I quite like the fact that double-wide CrossFire configs will still leave users with access to it and a standard PCI slot.

The board will automatically split 16 PCIe lanes evenly between its x16 slots when two graphics cards are installed. However, if you want a full 16 lanes of bandwidth running to the primary slot, you have to install an included switch card in the secondary slot. The card takes all of a few seconds to slide into place, which is hardly a hassle. An automatic or BIOS-level switch would have been slicker, though.

The M4A89GTD’s port cluster has all the bases covered: DVI, HDMI, eSATA, FireWire, S/PDIF, and USB 3.0. The blue USB ports offer SuperSpeed connectivity via an NEC controller. A VIA chip is tasked with FireWire, while Realtek’s new ALC892 codec chip handles audio. Interestingly, Asus employs a Realtek Gigabit Ethernet chip rather than taking advantage of the GigE MAC built into the SB850. We’ve seen mobo makers snub the built-in GigE offered by Intel’s chipsets for years, and it appears AMD is getting the same treatment.

As one might expect from an Asus board, the M4A89GTD’s BIOS is packed to the gills with overclocking and tweaking options. Multipliers, clock speeds, memory timings, and voltages can all be adjusted with ease and pushed well beyond reason. All of the voltages and most clock speeds can be keyed in directly rather than selected from a list, which makes trial-and-error tweaking a lot quicker for folks who know what they’re doing. For those who don’t, the BIOS also sports an auto-overclocking feature that does all the dirty work. Auto-overclocking schemes are relatively new in the motherboard world, and I like that Asus has implemented this one in the BIOS rather than tying it to auxiliary Windows software.

For me, though, the real star of the BIOS is the fan control section. I’ve long complained that rudimentary fan speed controls were inadequate for enthusiast-oriented motherboards, and Asus has finally taken notice. With the M4A89GTD, the user can set minimum and maximum fan speeds for the CPU and system fans. One can also control the temperatures at which those fans kick into high gear. The low-temperature limit is greyed out, but Asus tells me these fan controls are still a work in progress, so we could see it unlocked eventually. Props to Asus for putting some effort into a section of the BIOS that’s been largely ignored by mobo makers for far too long.

Gigabyte’s GA-890GPA-UD3H motherboard
Undercutting the competition

Manufacturer Gigabyte
Model GA-890GPA-UD3H
Price (MSRP) $140
Availability Soon

We’ve noticed an interesting trend develop in the motherboard world over the last little while. In an aggressive bid to increase its share of the North American retail motherboard market, Gigabyte has been selling its mobos for less than equivalent models from Asus. This pricing strategy is evident with the GA-890GPA-UD3H, whose suggested retail price is $15 cheaper than the Asus board despite the fact that both offer similar feature sets. Indeed, even Asus’ M4A89GTD PRO, which lacks USB 3.0 connectivity, is slated to sell for $5 more than Gigabyte’s SuperSpeed-equipped UD3H.

When all else is equal—including reputation and expected reliability—we’ll recommend a cheaper board that offers better value ten times out of ten. The question, of course, is whether all else is equal with this first batch of 890GX boards.

On the surface, that looks to be the case. The UD3H is another full-sized ATX board aimed at gamers and overclockers. Like Asus, Gigabyte has figured out that enthusiasts’ tastes have outgrown the days when a clashing, neon rainbow of colors was considered an acceptable palette. Bravo.

For the most part, the UD3H’s layout is uncluttered and free of potential problems. Again, though, users with upside-down cases will want to make sure that their PSUs have long auxiliary 12V cables.

Those keeping score in the power-phase pissing match between mobo makers will want to note that the Gigabyte board uses a 4×1 phase arrangement—half the number of phases available on Asus’ 890GX offerings. But the UD3H still supports 140W CPUs, and as we’ve observed with numerous other motherboards in the past, more CPU power phases isn’t necessarily better.

Like Asus, Gigabyte uses two-ounce copper layers that purportedly offer lower impedance than typical one-ounce layers. There are solid-state capacitors across the board, as well, and nerdy puns at no extra charge.

The stumpy heatsinks for the north bridge and voltage circuitry do a good job of staying out of the way, which should allow users to run larger CPU coolers without issue. The low-profile south bridge heatsink shouldn’t interfere with expansion cards, either.

Massive graphics cards like those in AMD’s Radeon HD 5800 series can stretch all the way across an ATX motherboard, creating all sorts of clearance problems for SATA cabling. Gigabyte neatly avoids the issue by lining up all eight of the board’s SATA ports along the board’s edge, where they’ll tuck just under longer cards and coolers. This arrangement isn’t without potential for peril, though. A hard drive cage mounted right next to the motherboard tray may not leave enough room to plug into edge-mounted SATA ports.

Gigabyte squeezes an extra PCIe x1 slot into its stack, but the close proximity of the north bridge cooler may complicate compatibility with longer expansion cards. Of course, there are still two x16 slots, two more x1 slots, and a couple of PCI slots from which to choose. Unlike the Asus, this board doesn’t require a switch card to juggle lanes between the PCIe x16 slots, either. When the secondary slot is empty, all 16 lanes are automatically routed to the primary slot. Install a graphics card into the secondary slot, and the board will split the lanes in a dual-x8 config.

Look familiar? The Gigabyte board’s port cluster nearly mirrors what we saw from Asus. The only difference that really matters is the UD3H’s lack of eSATA connectivity. One could argue the presence of those blue USB 3.0 ports makes external Serial ATA ports unnecessary for this board. However, for the overwhelming majority of users, I suspect an eSATA port would’ve been more useful than the seventh and eighth internal Serial ATA connectors, especially if it was one of those fancy new hybrid eSATA/USB ports.

Although the Asus and Gigabyte boards both use the same Realtek ALC892 codec chip, only the latter appears to have implemented support for real-time Dolby Digital Live encoding, which allows multi-channel game audio to be passed to a compatible digital receiver or speakers over a single S/PDIF cable. Without real-time encoding, only source material with pre-encoded audio tracks, such as movies, can take advantage of multi-channel digital audio output.

For the most part, the first 890GX boards from Asus and Gigabyte offer identical features. Their interfaces differ, but the two boards’ BIOSes serve up similar tweaking and overclocking functionality. Both include embedded BIOS flashing utilities and support for multiple configuration profiles, too.

So where do they differ? In the automatic overclocking department, for one. You won’t find a BIOS-based auto-overclocking utility on the UD3H. Gigabyte doesn’t even ship the board with Windows software that’ll turn up your CPU’s clock speed automatically, although it has done so in the past with other boards.

By far the biggest difference between the two BIOSes comes when we look at fan speed controls. Those on the UD3H look positively prehistoric. The user has the option of turning automatic fan speed control on or off for the CPU and system fan headers, and one can toggle whether the CPU fan is a three- or four-pin model. That’s it. We’ve been asking Gigabyte for control over temperature thresholds and actual fan speeds or voltages for years now, and nothing has changed. Apparently, the ability to tweak obscure system voltages by hundredths of a volt is more important than meaningful fan speed controls.

The devil’s in the details

This wouldn’t be TR motherboard coverage without a painstakingly detailed assessment of each board’s BIOS options and specifications. These details don’t exactly lend themselves to eloquent prose, but you should be able to find what you need in the tables below.

Asus M4A89GTD PRO/USB3 vs. Gigabyte GA-890GPA-UD3H

Clock speeds
  Asus: Base: 100-600MHz in 1MHz increments; DRAM: 800-1600MHz in 266MHz increments; PCIe: 100-150MHz in 1MHz increments; HT: 200-2000MHz in 200MHz increments; CPU NB: 1400-2000MHz in 200MHz increments; GPU: 400-1500MHz in 1MHz increments; Sideport: 1333, 1400-1820MHz in 30MHz increments
  Gigabyte: Base: 200-500MHz in 1MHz increments; PCIe: 100-150MHz in 1MHz increments; GPU: 200-2000MHz in 1MHz increments; Sideport: 667, 800, 1067, 1333, 1400-2000MHz in 30-40MHz increments

Multipliers
  Asus: CPU: 4X-14.5X in 0.5X increments
  Gigabyte: CPU: 5X-14.5X in 0.5X increments; CPU NB: 5X-10X in 1X increments; DRAM: 4-8X in 1.33X increments; HT: 5X-10X in 1X increments

Voltages
  Asus: CPU: 0.7-2.1V in 0.003125V increments; CPU NB: 0.475-1.875V in 0.003125V increments; CPU VDDA: 2.2-2.9V in 0.00625V increments; DRAM: 1.2-2.5V in 0.00625V increments; HT: 0.8-1.4V in 0.00625V increments; NB: 0.8-2.0V in 0.00625V increments; NB 1.8V: 1.8-2.1V in 0.05V increments; SB: 1.1-1.4V in 0.05V increments; Sideport: 1.5-1.8V in 0.1V increments
  Gigabyte: CPU: -0.6-+0.6V in 0.025V increments; CPU NB: -0.6-+0.6V in 0.025V increments; CPU PLL: 2.22-3.1V in 0.02V increments; DRAM: 1.275-2.245V in 0.015V increments; NB: 0.9-1.6V in 0.02V increments; NB PLL: 1.45-2.1V in 0.01V increments; Sideport: 1.37-1.8V in 0.05V increments

Monitoring
  Both: Voltage, fan status, and temperature

Fan speed control
  Both: CPU, system

Gigabyte prefers that you adjust clock speeds via explicit multipliers, while Asus gives users control over actual clock speeds. You say tomato, I say, uh, tomato. To Asus’ credit, the M4A89GTD does have a couple of extra voltage knobs to twirl. Asus’ voltage controls also offer more granularity than Gigabyte’s, although that doesn’t strike me as a difference that has much practical import.

The multiplier options listed above are what’s presented with an Athlon II X4 635 processor. Expect support for higher multipliers if you’re using a Black Edition CPU with an unlocked upper multiplier.
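Those multiplier ranges map onto actual CPU frequency via the 200MHz reference clock, which is easy to sanity-check. A sketch (the 240MHz base clock in the second example is a hypothetical overclock, not a tested setting):

```python
# AM3 CPU frequency = base (reference) clock x CPU multiplier.
# The Athlon II X4 635's stock 14.5X multiplier on a 200MHz base
# clock lands exactly on its rated 2.9GHz.
def cpu_mhz(base_clock_mhz, multiplier):
    return base_clock_mhz * multiplier

print(cpu_mhz(200, 14.5))  # 2900.0 MHz -- stock X4 635
# Overclocking a locked chip means raising the base clock instead:
print(cpu_mhz(240, 14.5))  # 3480.0 MHz, if the chip and board cooperate
```

This is also why a Black Edition CPU's unlocked upper multiplier matters: it lets you raise the CPU clock without dragging the HT, CPU NB, and memory clocks up along with the base clock.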

Asus M4A89GTD PRO/USB3 vs. Gigabyte GA-890GPA-UD3H

CPU support
  Both: Socket AM3-based Athlon II and Phenom II processors

North bridge
  Both: AMD 890GX

South bridge
  Both: AMD SB850

Interconnect
  Both: Alink Express III (4GB/s)

Graphics
  Both: Integrated Radeon HD 4290 with 128MB DDR3-1333 sideport memory

Expansion slots
  Asus: 2 PCI Express x16; 1 PCI Express x4; 1 PCI Express x1; 2 32-bit/33MHz PCI
  Gigabyte: 2 PCI Express x16; 3 PCI Express x1; 2 32-bit/33MHz PCI

Memory
  Both: 4 240-pin DIMM sockets; maximum of 16GB of DDR3-1066-1866 SDRAM

Storage I/O
  Asus: Floppy disk; 1 channel ATA/133 via JMicron JMB361; 6 channels 6Gbps Serial ATA with RAID 0, 1, 10, 5 support via SB850
  Gigabyte: Floppy disk; 1 channel ATA/133 via GSATA2; 6 channels 6Gbps Serial ATA with RAID 0, 1, 10, 5 support via SB850; 2 channels 3Gbps Serial ATA with RAID 0, 1 support via GSATA2

Audio
  Both: 8-channel HD audio via Realtek ALC892 codec

Ports
  Asus: 1 PS/2 keyboard; 1 HDMI; 1 VGA; 1 DVI; 4 USB 2.0 with headers for 8 more; 2 USB 3.0 via NEC D720200F1; 1 eSATA via JMicron JMB361; 1 RJ45 10/100/1000 via Realtek RTL8111E; 1 1394a FireWire via VIA VT6308P with header for 1 more; 1 analog front out; 1 analog bass/center out; 1 analog rear out; 1 analog surround out; 1 analog line in; 1 analog mic in; 1 digital S/PDIF out (TOS-Link)
  Gigabyte: 1 PS/2 keyboard/mouse; 1 HDMI; 1 VGA; 1 DVI; 4 USB 2.0 with headers for 8 more; 2 USB 3.0 via NEC D720200F1; 1 RJ45 10/100/1000 via Realtek RTL8111D; 1 1394a FireWire via TI TSB43AB23 with headers for 2 more; 1 analog front out; 1 analog bass/center out; 1 analog rear out; 1 analog surround out; 1 analog line in; 1 analog mic in; 1 digital S/PDIF out (TOS-Link)

Lots of similarities here. Asus and Gigabyte differ on a few auxiliary peripheral chips but little else.

Our testing methods

The 890GX is really the only mid-range integrated graphics chipset on the market. Direct competition simply doesn’t exist, but we can cobble together a competent rival using Intel’s latest Clarkdale platform. Intel puts an integrated graphics processor—the Graphics Media Accelerator HD—right next to the processor core on its Core i3 and i5 CPUs. Slap one of those into an H55 or H57 Express-based motherboard with video output ports, and you’ve got yourself an integrated graphics platform.

With four 2.9GHz cores and a price tag around $120, the Athlon II X4 635 is one of AMD’s most attractive CPUs and a perfect match for the 890GX. On the Intel side, the most appropriate competition is probably the Core i3-530, which is the same price as the X4 635 and features two 2.93GHz cores that can process four threads in parallel thanks to Hyper-Threading. We’ve paired the i3-530 with an H55 Express motherboard. The H55 chipset should offer equivalent performance to the H57, since the only major difference between the two appears to be support for multi-drive RAID arrays, which we won’t be using today.

I conducted our application, gaming, and video playback tests with only the Gigabyte 890GX board, but all the other tests were run on both the Asus and Gigabyte 890GX mobos. We used the integrated graphics processor for each platform during testing. Windows 7’s power plan was set to “balanced” for all but a subset of our power consumption tests.

With few exceptions, all tests were run at least three times, and we reported the median of the scores produced. For IOMeter, we’ve reported average rather than median scores. Also, our power consumption tests were only run once.
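The median-of-three reporting is simple to express; the run scores below are made up purely for illustration:

```python
# We report the median of (at least) three runs, which shrugs off a
# single outlier run far more gracefully than a mean would.
from statistics import mean, median

runs = [58.1, 57.9, 49.2]    # hypothetical fps scores; one bad run
print(median(runs))          # 57.9 -- what we'd report
print(round(mean(runs), 1))  # 55.1 -- dragged down by the outlier
```

That robustness is the whole argument for the median here: one run disturbed by background activity doesn't skew the reported score.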

Processor
  890GX boards: AMD Athlon II X4 635 2.9GHz
  H55 board: Intel Core i3-530 2.93GHz

Motherboard
  Asus M4A89GTD PRO/USB3; Gigabyte GA-890GPA-UD3H; Gigabyte GA-H55M-USB3

BIOS revision
  0211 (Asus); F3 (Gigabyte 890GX); F1 (Gigabyte H55)

North bridge
  AMD 890GX; AMD 890GX; Intel H55 Express

South bridge
  AMD SB850; AMD SB850; none (the H55 is a single-chip design)

Chipset drivers
  890GX boards: Catalyst 10.3 chipset drivers, 8.70RC1 AHCI drivers
  H55 board: 9.1.1.1025 chipset drivers, 8.9.0.1023 AHCI drivers

Memory size
  4GB (2 DIMMs) on all boards

Memory type
  890GX boards: OCZ OCZ3G1600LV6GK DDR3 SDRAM at 1333MHz
  H55 board: OCZ OCZ2G8008GQ DDR2 SDRAM at 800MHz

Memory timings
  7-7-7-20-1T on all boards

Audio
  890GX boards: Realtek ALC892 with 2.42 drivers
  H55 board: Realtek ALC889 with 2.42 drivers

Graphics
  890GX boards: Integrated Radeon HD 4290 with Catalyst 10.3 drivers
  H55 board: Integrated GMA HD with 15.16.5.64.2021 drivers

Hard drive
  Western Digital Raptor WD1500ADFD 150GB

Power supply
  OCZ GameXStream 700W

OS
  Microsoft Windows 7 Ultimate x64

We’d like to thank Western Digital for sending Raptor WD1500ADFD hard drives for our test rigs.

We used the following versions of our test applications:

The test systems’ Windows desktop was set at 1280×1024 in 32-bit color at a 60Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests.

All the tests and methods we employed are publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Memory performance

Memory bandwidth doesn’t always dictate real-world performance, but it’s a good place to start when testing systems whose integrated graphics processors consume a chunk of main memory. Both of our 890GX boards feature 128MB of dedicated sideport video memory, however, which should reduce bandwidth sharing.

Despite its lack of dedicated video memory, the H55 Express platform delivers higher memory bandwidth than either 890GX board. Score one for the Core i3-530. The Asus and Gigabyte 890GX boards offer nearly equivalent memory bandwidth, with the Gigabyte board having a slight edge overall.

Shift your attention to memory latency, and the Asus 890GX board edges out the Gigabyte by a nanosecond. The real story there is how much the H55 Express trails—its memory access latency is a full 30 nanoseconds slower than the 890GX boards.

The following graphs are a little indulgent, but they paint the latency picture in three dimensions, across multiple block and step sizes. I’ve arranged the graphs in order of highest latency to lowest. Yellow represents L1 cache, light orange is L2, red is L3, and dark orange is main memory.

Here we’re focused on how different processor platforms compare, so I’ve omitted scores for the 890GX board. The H55 Express and its Core i3-530 may have much higher memory access latencies than the 890GX and Athlon II X4 635 combo, but thanks to the Core i3’s L3 cache, that platform doesn’t have to hit main memory until the block size exceeds 4MB. The AMD system starts dipping into main memory once we pass 512KB, giving it higher access latencies than the Intel rig at block sizes between 512KB and 4MB.
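Latency curves like these are conceptually measured with a pointer chase: walk a randomly permuted buffer so each load depends on the last, and the time per hop jumps as the working set outgrows each cache level. A minimal illustration of the technique follows; Python's interpreter overhead swamps the actual cache effects, so treat this as a sketch of the method rather than a usable benchmark:

```python
import random
import time

def chase_ns_per_hop(n_elems, hops=200_000):
    """Time dependent loads through a random cyclic permutation."""
    # nxt[i] says where to jump from element i; shuffling the order
    # defeats any hardware prefetcher that likes sequential strides.
    order = list(range(n_elems))
    random.shuffle(order)
    nxt = [0] * n_elems
    for a, b in zip(order, order[1:] + order[:1]):
        nxt[a] = b
    i = 0
    start = time.perf_counter()
    for _ in range(hops):
        i = nxt[i]  # each hop is a load that depends on the previous one
    return (time.perf_counter() - start) / hops * 1e9

for size in (1 << 10, 1 << 16, 1 << 22):  # small, medium, large working sets
    print(size, chase_ns_per_hop(size))
```

On real hardware (in C, with actual pointers), the per-hop time would step up at each cache-capacity boundary, which is exactly the shape of the graphs above.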

Application performance

Core-logic chipsets don’t have a huge impact on application performance, but we’ve whipped up a diverse suite of application tests to give you a sense of how our test platforms stack up when their respective CPUs become the bottleneck. Most of these tests are effectively multi-threaded, so keep in mind that we’re testing a proper quad-core Athlon II X4 against a dual-core, four-thread Core i3.

Cinebench’s rendering test is handled by the CPU, and the 890GX platform has a sizable lead over the H55. The results of the OpenGL modeling test are even more striking. In that test, the Radeon HD 4290 easily outruns the Intel GMA HD.

The Core i3-530 is hardly a slouch when it comes to HD video encoding, but the Athlon II X4 635 is faster—by a healthy margin, too.

In another highly multi-threaded test, the 890GX system prevails once more.

TrueCrypt isn’t even close. Core i3 and i5 processors do have new instructions designed to improve encryption performance, but TrueCrypt doesn’t yet support them.

The Panorama Factory’s stitch operation is nicely multithreaded, and the Athlon II X4’s four cores are faster than the i3-530’s two.

Gaming

The 785G is certainly a competent integrated graphics processor, but it’s hardly a recommended solution for gamers looking to play the latest titles with the details turned up at reasonable resolutions. The 890GX’s Radeon HD 4290 is clocked 40% higher, so it should fare better. But is it really good enough to manage playable frame rates with the latest games? To find out, we tested a handful of recent releases at a charitably low resolution of 1024×768. We used the in-game timedemo features built into Left 4 Dead 2, DiRT 2, and Borderlands, and relied on FRAPS to log gameplay sessions in Modern Warfare 2. Median scores were taken from five runs of DiRT 2 and Modern Warfare 2, while three runs were used with the other games, whose scores were more consistent.

Lesson 1: even the latest incarnation of Intel’s Graphics Media Accelerator has serious issues. Our H55 Express system crashed repeatedly in Left 4 Dead 2 regardless of whether we were running a timedemo or just trying to play the game. DiRT 2 didn’t work, either, presenting us a blank screen instead of the game’s usual menu. Both are recent, popular titles that really should work.

With the GMA HD pulling up lame in two games, it’s easy for the Radeon HD 4290 to look good. But there is some bad news. I had to run DiRT 2 at the lowest in-game detail levels to manage playable frame rates at 1024×768. Fortunately, Left 4 Dead 2 was much more accommodating. The Source-engine title yielded decent frame rates even with all details turned up, albeit with extras like antialiasing and anisotropic filtering disabled.

Modern Warfare 2 ran acceptably on the 890GX, too, but only after I disabled antialiasing and a couple of in-game effects. Amusingly, the GMA HD’s average frame rate is the same as the Radeon HD 4290’s low point.

The only game of the bunch that I wouldn’t deem playable at 1024×768 on either platform is Borderlands. For whatever reason, this game doesn’t seem to scale well down to middling graphics hardware. Even with the lowest in-game detail levels, the Radeon HD 4290 only managed an average of 20 frames per second. This is why real gamers run discrete graphics cards—even sub-$100 models are a heck of a lot faster than either of these IGPs.

Blu-ray playback

We conducted our Blu-ray playback tests across three high-bitrate movies covering the major formats available on the market. 28 Days Later was used to represent the H.264 camp, Nature’s Journey for VC-1, and Click (which it pains me to even admit that we purchased on Blu-ray) for MPEG2. The latest version of PowerDVD, which supports the decode acceleration built into both the 890GX and GMA HD, was used for testing. Playback was run full-screen over HDMI at 1080p resolution.

Both systems played back our three movie samples flawlessly. Based on the scores above, the H55’s decode logic looks to be much more efficient. However, the Core i3-530 doesn’t lower its clock speed below 2.93GHz when playing the movies, while the Athlon II X4 635 drops its CPU multiplier from 14.5X to 4X, taking the CPU clock from 2.9GHz down to just 800MHz. 20% of 800MHz (160MHz worth of cycles) is still less than 11% of 2.93GHz (roughly 322MHz).
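Normalizing each utilization reading by its clock speed makes that comparison concrete. A quick sketch; the 20% and 11% utilization figures are the ones discussed above:

```python
# Normalize CPU utilization by clock speed to compare cycles actually spent.
# Utilization figures (20% and 11%) are the ones discussed above.
def effective_mhz(clock_mhz, utilization_pct):
    """Cycles consumed per second, expressed in MHz-equivalents."""
    return clock_mhz * utilization_pct / 100.0

athlon = effective_mhz(800, 20)    # Athlon II X4 635, throttled to 800MHz
core_i3 = effective_mhz(2930, 11)  # Core i3-530, holding 2.93GHz

print(athlon, core_i3)  # 160.0 322.3 -- the Athlon spends fewer cycles
```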

Serial ATA performance — IOMeter

We’ll begin our storage tests with IOMeter, which subjects our systems to increasing multi-user loads. We used IOMeter’s workstation and database test patterns, since those are more relevant to desktop systems than the file or web server test patterns. This particular test makes good use of the Native Command Queuing capability built into the AHCI specification.

Drives capable of taking advantage of the SB850’s 6Gbps SATA controller are few and far between. The best candidate is currently Crucial’s new RealSSD C300, which is the first 6Gbps solid-state drive to hit The Benchmarking Sweatshop. Naturally, we couldn’t resist testing with it. But plenty of folks also use mechanical drives, and likely will for some time, so we’ll kick things off with a look at SATA performance using a Western Digital VelociRaptor.

AMD has a history of poor storage controller drivers, so in addition to testing the Gigabyte 890GX board with AMD’s own drivers, we tested it with the Microsoft AHCI drivers included with Windows 7.

When they’re both using AMD’s AHCI drivers, the Asus and Gigabyte 890GX boards offer nearly identical transaction rates. Performance levels off after we hit 32 concurrent I/O requests, which just happens to be the maximum queue depth supported by Native Command Queuing. Interestingly, there’s no performance plateau when we combine the 890GX with Microsoft’s own AHCI drivers. The H55 Express’ transaction rates don’t trail off after 32 I/Os, either.
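That plateau is consistent with a simple queuing model: by Little’s law, throughput is concurrency divided by per-request latency, and NCQ caps the drive-side queue at 32 commands. A first-order sketch, using a hypothetical 5ms average service time (real drives reorder queued requests, so this is only a rough approximation):

```python
# First-order queuing model of the plateau: by Little's law, throughput is
# concurrency divided by per-request latency, and NCQ caps the drive-side
# queue at 32 commands. The 5ms service time below is hypothetical.
def iops(queue_depth, service_time_ms, ncq_depth=32):
    """Rough IOPS estimate once the host piles up queue_depth requests."""
    effective = min(queue_depth, ncq_depth)  # drive only sees 32 at once
    return effective / (service_time_ms / 1000.0)

print(iops(16, 5.0))  # 3200.0
print(iops(32, 5.0))  # 6400.0
print(iops(64, 5.0))  # 6400.0 -- no gain beyond the 32-deep NCQ queue
```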

Regardless of which drivers are used, the Gigabyte 890GX board uses relatively more CPU time than the Asus. The H55’s CPU utilization is lower than both boards, although we’re still looking at less than 4% CPU utilization overall.

Switching over to a high-end SSD exposes a weakness with Microsoft’s AHCI drivers, at least at lower loads. The H55 Express still manages slightly higher transaction rates than any of our 890GX configurations—and that’s with a 3Gbps SATA controller.

Again, the Gigabyte 890GX board exhibits higher CPU utilization than the Asus. Both consume relatively more CPU cycles than our H55 system, but the differences don’t amount to much as the load scales upward. I didn’t expect CPU utilization to peak with the fewest outstanding I/O requests, but that’s what happened with all four configurations.

Serial ATA performance — HD Tach

We used HD Tach 3.01’s 8MB zone test to measure basic SATA throughput and latency.

The RealSSD has some of the fastest read burst speeds we’ve ever measured, and it’s clearly quicker on the 890GX than it is on the H55 Express. Even the VelociRaptor has higher burst speeds on the 890GX, but only with the Gigabyte motherboard. The Asus board’s VelociRaptor burst speeds are nearly 20MB/s slower than those of the Gigabyte.

Here’s where things start to get a little weird. With the VelociRaptor, average read speeds don’t vary all that much from one platform to the next. However, the RealSSD’s read speeds are all over the map, with the H55 Express wedged between the Asus and Gigabyte 890GX boards. The read speeds on each platform were quite consistent from one run to the next, making the differences between platforms all the more puzzling.

We haven’t had much time to work with the RealSSD, but before testing it with each configuration, I ran a secure erase and a full-disk HD Tach write speed test on the drive to establish an even, used-state playing field. This should prevent things like the block-rewrite penalty from affecting our results.

The intrigue continues when we look at average write speeds, which again find the Asus and Gigabyte boards separated by quite a margin. Asus is fastest again, but neither 890GX board is quick enough to catch the H55 Express.

More troubling than the wide gap in SSD write speeds between the Asus and Gigabyte 890GX boards is the fact that both pull up lame with the VelociRaptor. The H55 Express’s average write speeds are 30MB/s faster than the 890GX’s when the mechanical drive is installed.

HD Tach doesn’t quote a margin of error for its random access time test, but I can’t help but wonder if it’s more than a tenth of a millisecond. Things are pretty even here, with the obvious exception that the RealSSD’s access times are more than an order of magnitude shorter than the VelociRaptor’s.

There is a +/- 2% margin of error for HD Tach’s CPU utilization tests, but the differences in CPU utilization are much greater than that. With both the VelociRaptor and the RealSSD, the Gigabyte 890GX scores much higher than the Asus in HD Tach’s CPU utilization test.

USB performance

Our USB transfer speed tests were conducted with a USB 2.0/FireWire external hard drive enclosure connected to a 7,200-RPM Seagate Barracuda 7200.7 hard drive. We tested with HD Tach 3.01’s 8MB zone setting.

The SB850 may have a rearchitected USB controller, but it’s not as fast as the one inside the H55 Express. The 890GX boards pull up short in the burst and average read speed tests, and the Gigabyte trails behind the leaders with writes, too.

Matters get worse for the Gigabyte board when we look at CPU utilization, which is notably higher than the Asus. The 890GX boards are both running the “Balanced” Windows power plan with Cool’n’Quiet enabled, so there shouldn’t be this much of a difference between them.

When asked about the SB850’s slower USB transfer rates, AMD suggested that real-world transfers shouldn’t be affected. The company also indicated that Cool’n’Quiet can react oddly to CPU utilization tests, although that wouldn’t explain the difference in CPU utilization between the Asus and Gigabyte boards.

PCI Express performance

We used NTttcp to test PCI Express Ethernet throughput using a Marvell 88E8052-based PCI Express x1 Gigabit Ethernet card.

PCI performance

To test PCI performance, we used the same NTttcp test methods and a PCI Intel GigE NIC.

A Gigabit Ethernet controller may not be the most bandwidth-intensive peripheral to throw at an expansion interface, but it’s certainly the most common. All of our system configurations do well in the throughput tests, but the 890GX rigs have higher CPU utilization than the H55 Express. I suspect we’re seeing the Athlon II’s clock throttling in action again, but that doesn’t explain why the Asus board has lower CPU utilization than the Gigabyte in the PCI test.

Power consumption

That covers the chipset-specific portion of today’s festivities. Now it’s time to switch gears to exploring variables more dependent on motherboard attributes than core-logic components. First up, we have power consumption tests. We measured system power consumption, sans monitor and speakers, at the wall outlet using a Watts Up Pro power meter. Readings were taken at idle and under a load consisting of a Cinebench 11.5 render alongside the rthdribl HDR lighting demo. We tested with Windows 7’s High Performance and Balanced power plans.

Motherboard makers usually ship their boards with energy-saving software that’s supposed to lower power consumption without impeding performance. We’ve tested each board with and without this software installed. Gigabyte’s H55 Express board uses Dynamic Energy Saver software, while the company’s 890GX offering uses a new app called EasySaver. The Asus board uses an EPU app that must be configured in “auto” mode to avoid performance-sapping clock throttling.

Even with fewer power phases than its Asus counterpart, the Gigabyte 890GX board draws notably more power. Running each company’s power-saving software is good for a watt or two, but that’s about it.

As one might expect, our H55 system has the lowest idle power draw of the lot. The Core i3-530 is more power-efficient than the Athlon II X4 635, and the Intel CPU is also running on a smaller microATX motherboard.

Under load, the Intel system has even more of a power-efficiency advantage. It’s not even close.

Between the 890GX boards, the Asus draws less power under load by about 10W. Power-saving software has more of an effect here than it did at idle, particularly on the Asus board.
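To put that 10W load-power gap in perspective, it can be converted into a yearly electricity cost. A rough sketch; the duty cycle and the $0.12/kWh rate are assumptions for illustration, not figures from our testing:

```python
# Converting the measured ~10W load-power gap into a yearly electricity
# cost. Duty cycle and the $0.12/kWh rate are assumptions for illustration.
def annual_cost_usd(delta_watts, hours_per_day, usd_per_kwh=0.12):
    """Yearly cost of a constant power-draw difference."""
    kwh_per_year = delta_watts * hours_per_day * 365 / 1000.0
    return kwh_per_year * usd_per_kwh

# 10W difference, 8 hours a day under load:
print(round(annual_cost_usd(10, 8), 2))  # 3.5
```

A few dollars a year won’t sway a purchasing decision on its own, but it compounds with the idle-power differences measured above.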

Overclocking

I had wanted to dip into IGP overclocking with this review, but there simply wasn’t time. And then I got to thinking and figured there wasn’t a point, either. Sure, the Asus and Gigabyte 890GX boards both give users the ability to tweak Radeon HD 4290 clock speeds, but if you’re that desperate for graphics performance, you’re better off saving your pennies, shoveling a few driveways, and buying a real graphics card, even if it’s a generation or two old.

Fortunately, I did have time to run a few quick overclocking tests on the motherboards themselves. First, I experimented with the auto-overclocking utility built into the Asus BIOS.

There wasn’t much to it: select the option in the BIOS, wait for the reboot, and see what you get. My system settled on a 233MHz base clock, which combined with a 14.5X multiplier, yielded a 3.4GHz CPU clock speed—not bad for auto-tuning.

Since the Gigabyte board lacks an in-BIOS auto-overclocking utility and doesn’t come with Windows software that accomplishes the same task, I kicked it old school with some manual base clock overclocking, starting with the Asus. First, I lowered the CPU and memory multipliers to take those components out of the equation. Next, I turned up the base clock speed, checking for stability along the way using a four-core Prime95 load.

The Asus board cruised up to a 300MHz base clock speed with ease, but it would go no further. 310MHz wouldn’t post, even with extra voltage applied to the CPU and chipset. Still, it’s hard to complain about a 50% boost for the base clock.

Next up: the GA-890GPA-UD3H.

Much like the Asus board, the Gigabyte didn’t put up a fuss as I turned up the base clock—but only up to 280MHz. Try as I might, I couldn’t get the system to post at 290MHz. A 280MHz base clock speed is still capable of taking an Athlon II X4 635 up to an even 4GHz, which ain’t half bad.
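For reference, the resulting CPU speeds are simple arithmetic: final clock equals base clock times multiplier. A quick check, with the values reproduced from the results above:

```python
# CPU frequency is simply base clock x multiplier; the values below are
# reproduced from the overclocking results described above.
def cpu_mhz(base_clock_mhz, multiplier):
    """Resulting CPU clock in MHz."""
    return base_clock_mhz * multiplier

print(cpu_mhz(233, 14.5))  # 3378.5 -- the Asus auto-tune result (~3.4GHz)
print(cpu_mhz(280, 14.5))  # 4060.0 -- the Gigabyte's base clock ceiling
```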

Motherboard peripheral performance

Core logic chipsets integrate a wealth of peripherals, but they don’t handle everything. FireWire, Ethernet, USB 3.0, and audio are farmed out to auxiliary chips, for example. To provide a closer look at the peripheral performance you can expect from the motherboards we’ve tested today, we’ve compiled Ethernet, Serial ATA, USB 3.0, FireWire, and audio performance results below.

HD Tach FireWire performance

                 Read burst     Average read   Average write  CPU utilization
                 speed (MB/s)   speed (MB/s)   speed (MB/s)   (%)
Asus 890GX       40.2           34.6           16.4           5
Gigabyte 890GX   40.0           34.0           19.8           7
Gigabyte H55     32.6           28.9           19.8           2

The 890GX boards have similar FireWire performance, which is interesting considering that they use completely different controller chips. CPU utilization is a little higher on the Gigabyte, but the scores are within that test’s +/- 2% margin of error.

HD Tach USB 3.0 performance

                 Read burst     Average read   Average write  CPU utilization
                 speed (MB/s)   speed (MB/s)   speed (MB/s)   (%)
Asus 890GX       152.4          81.0           75.6           19
Gigabyte 890GX   142.5          40.6           52.3           17
Gigabyte H55     152.5          119.1          123.7          7

USB 3.0 may be the new hotness, but it just isn’t stable on the Gigabyte 890GX board. Scores were wildly inconsistent from one run to the next, and the system even locked up a couple of times during testing. Using the exact same SuperSpeed USB hard drive, I saw much more consistent performance on the other two boards.

All three boards use the very same NEC USB 3.0 controller, yet the H55 has much higher average read and write speeds than the Asus 890GX. Clearly, some implementations are superior to others.

HD Tach Serial ATA performance

                        Read burst    Average read  Average write  Random access  CPU utilization
                        speed (MB/s)  speed (MB/s)  speed (MB/s)   time (ms)      (%)
Asus 890GX              210.2         110.3         81.3           7.3            10
Gigabyte 890GX (AMD)    228.5         106.6         79.5           7.5            24
Gigabyte 890GX (GSATA)  169.8         108.2         76.1           7.3            18
Gigabyte H55 (Intel)    218.8         109.4         110.2          7.4            7
Gigabyte H55 (GSATA)    179.5         110.5         80.0           7.0            3

We’ve already covered the interesting SATA results, but these fill out some missing scores for the auxiliary storage controllers on each board. The scores above were all obtained with the VelociRaptor serving as the test drive.

NTttcp Ethernet performance

                 Throughput     CPU utilization
                 (Mbps)         (%)
Asus 890GX       940.4          18.5
Gigabyte 890GX   940.7          26.9
Gigabyte H55     926.8          9.6

The Asus and Gigabyte 890GX boards use Realtek’s RTL8111E and 8111D Gigabit Ethernet controllers, respectively. We don’t see much difference in throughput between the two, but again, the Gigabyte turns in a higher CPU utilization score.
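Those ~940Mbps throughput figures are essentially wire speed. As a sanity check, the theoretical ceiling for TCP payload over gigabit Ethernet can be computed from standard frame overheads; this sketch assumes a 1500-byte MTU:

```python
# Theoretical TCP payload ceiling on gigabit Ethernet: payload bits divided
# by total bits on the wire per frame. Assumes a 1500-byte MTU.
def gige_tcp_max_mbps(mss=1460, ip_tcp_headers=40, eth_overhead=38):
    """Max TCP payload rate in Mbps on GigE.

    eth_overhead: 14B header + 4B FCS + 12B inter-frame gap + 8B preamble.
    """
    wire_bytes = mss + ip_tcp_headers + eth_overhead
    return 1000.0 * mss / wire_bytes

print(round(gige_tcp_max_mbps(), 1))                         # 949.3
print(round(gige_tcp_max_mbps(1448, ip_tcp_headers=52), 1))  # 941.5
# The second figure assumes 12 bytes of TCP timestamp options, which lines
# up nicely with the ~940Mbps scores above.
```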

RightMark Audio Analyzer audio quality

                 Overall  Frequency  Noise  Dynamic  THD  THD +  IMD +  Stereo     IMD at
                 score    response   level  range         Noise  Noise  crosstalk  10kHz
Asus 890GX       4        5          4      4        5    3      5      5          5
Gigabyte 890GX   4        5          4      4        5    3      5      5          5
Gigabyte H55     5        5          5      5        5    3      5      5          5

The 890GX boards score identically in our 24-bit, 192kHz RMAA loopback test. That said, the Gigabyte H55 board scores one point higher in two of the eight component tests, as well as overall.

Conclusions

As with the 790GX that came before it, I just don’t get the 890GX. Sure, part of me marvels at the idea of a jack-of-all-trades chipset with the fastest integrated graphics component on the planet, next-gen 6Gbps SATA connectivity, loads of interconnect bandwidth, and plenty of PCI Express 2.0 connectivity. But AMD expects 890GX boards to sell for between $130 and $180, which is a whole lot more than I’d advise anyone to spend on an integrated graphics platform. Heck, you can pick up a microATX 785G board for $80 and get the very same north bridge chip and identical Blu-ray decode capabilities. The 785G’s Radeon HD 4200 won’t be as fast as the HD 4290 in games, but that’s sort of like saying a standard Smart car isn’t as fast as a turbo-charged one. Neither is quick enough if you’re looking for speed, just like neither integrated Radeon is sufficient if you really want to play games.

So what else does the 890GX give you that the 785G doesn’t? Dual-x8 CrossFire support, which is nice, but that makes the integrated GPU even more of a waste. Then there’s the full-sized ATX motherboard rather than a microATX model, but the 785G is also available on ATX boards that cost less than $100. Hmmm. Maybe you want the new SB850 south bridge, and specifically, its 6Gbps SATA controller.

If we ignore, just for a moment, that the only storage devices likely to be capable of taking advantage of 6Gbps SATA are extremely expensive SSDs, then yes, I can see being tempted by the 890GX just to get a taste of the SB850. But I can’t ignore that the SB850 appears to have a few kinks that need ironing out. Our testing has exposed weaknesses in AMD’s AHCI drivers and with the SB850’s sustained write performance. The fact that the Asus and Gigabyte boards exhibited wildly different SATA performance in some of our tests is reason for concern, as well. Neither board impressed in our USB performance tests, which is another traditional area of weakness for AMD.

Perhaps BIOS and driver updates could smooth out the SB850’s rough edges over time. If and when that happens, the 890GX may start to make more sense. Or it may make even less sense, because it didn’t take long for AMD to bring the 790GX’s then-new SB750 south bridge over to high-end 790FX motherboards. Surely, the SB850 will migrate to high-end FX territory before long.

Still, I suppose the 890GX makes a certain kind of sense for AMD. It puts the integrated graphics crown and whatever bragging rights that’s worth even further out of Intel’s reach. The 890GX also provides a mid-range replacement for the aging 790GX, which can currently be found on a whole lot of mid-range motherboards, even if few of their integrated Radeons will ever be called into action. All AMD had to do was sort out some of its better 785G north bridge chips and incorporate its new south bridge. That south bridge can now be distributed across AMD’s chipset lineup.

Although the 890GX doesn’t quite add up for me, it’s much easier to pass judgment on the two motherboards we’ve looked at today. I’ve liked a lot of the Gigabyte motherboards I’ve seen over the past few years, but the GA-890GPA-UD3H has some issues that must be addressed. The board’s USB 3.0 performance was flaky at best, and its CPU utilization was consistently higher than the Asus model across multiple peripheral performance tests. We’ve contacted Gigabyte about those problems, and while the company is looking into the issues, they’ve yet to be resolved. Those problems may be easier to fix than the UD3H’s comparatively high power consumption and its lackluster BIOS-level fan speed controls, making the Gigabyte board difficult to recommend, even if it costs $15 less than Asus’ M4A89GTD PRO/USB3.

Normally, I wouldn’t dream of paying $15 more for a board with essentially the same feature set, but not all else is equal this time around. The M4A89GTD offers better peripheral performance, lower power consumption, and useful BIOS features like automatic overclocking and robust fan speed controls. I wish it supported real-time Dolby Digital Live encoding and that its eSATA port was of the hybrid USB variety, but that’s really all I’d change.

In the end, though, I’m left with too many lingering concerns to wholeheartedly recommend any motherboard based on the 890GX. If you’re looking for integrated graphics on a budget, I think the 785G is still the best game in town. Asus and Gigabyte both have a number of excellent 785G boards from which to choose. If you were hoping to get in on next-gen SATA with a new AMD south bridge, hold tight. The SB850 may yet prove its worth, but it’s not ready for prime time yet.

Comments closed
    • demani
    • 10 years ago

    How about Hulu and Netflix performance? Can you scale a stream to 1080p and still have it be watchable? That’s the kind of mainstream usage that makes IGPs more interesting to me (I’d rather spend an extra $25 on an IGP than on a separate video card with another fan and more power draw).

    Anyone have any info on that?

      • MadManOriginal
      • 10 years ago

      I think you’d be hard-pressed to find a modern dual core CPU that will go in these boards that can’t handle it on its own. The only advantage of acceleration might be lower power draw.

    • DiRRRtyFlip
    • 10 years ago

    Considering these boards for my next Phenom II build, but I was looking to run two GTX 260s in SLI, except this board exclusively supports CrossFire!? I have looked at other reviews, but from what I’ve gathered, AMD’s 890GX doesn’t allow for SLI. 🙁

    • GabrielG
    • 10 years ago

    I’ve a question I hope isn’t off topic.
    I wanna buy an AMD Phenom II board. I was inclined to buy a 785G one, as 890GX boards are not available yet and may be too expensive.
    But most important of all, I’m concerned with the fact that AMD chipset-based motherboards show poor SATA write performance compared to NVIDIA ones (750a, 780a). (Check TechReport’s 790GX review, for instance, and the 785G one too.)
    I don’t play games, and I must do backups from one HD to two other HDs at least twice a week (50GB to 100GB). And I really hate poor write performance. Should I be concerned about that with AMD chipsets? Should I prefer NVIDIA’s? I was thinking about the Asus M4N82 Deluxe or the not-so-expensive M3N-HT Deluxe.
    The 890GX has not yet solved this problem with the appropriate care, I think. I’ll be using Windows XP. Would Windows 7 drivers solve the problem?

    Thanks for your opinion.

    • FuturePastNow
    • 10 years ago

    Disappointed that the northbridge is just a rebadge of the 785G. There’s really nothing they could have changed in the northbridge except the IGP, but they made no effort to improve that. Ho hum.

    As for the southbridge, SATA 6Gbps is great, but no AHCI improvements? Come on.

    • pogsnet
    • 10 years ago
      • maroon1
      • 10 years ago

      890GX didn’t offer playable performance (except in MW2) even though all the games tested were running at 1024×768 with low settings

      So, what is the point of having a faster IGP that doesn’t offer reasonable or playable performance in any modern game? The 890GX is not good for gaming, and most people who use IGPs are probably not gamers at all.

      For non-gamers, for people who don’t use computer for gaming (those people are the majority), AMD IGP has absolutely no advantage over Intel IGP. And if you are gamer then you should get something better than AMD IGP because you won’t get playable performance in most games even if you play at low settings.

      Also, none of the AMD fanboys here mentioned that Intel’s IGP was better in Blu-ray playback

      Let us not forget that Intel Clarkdale GPU can bitstream Dolby TrueHD, DTS HD-MA and 8-channel LPCM over HDMI

        • Kaleid
        • 10 years ago

        re: bluray, from the article

        “Both systems played back our three movie samples flawlessly. Based on the scores above, the H55’s decode logic looks to be much more efficient. However, the Core i3-530 doesn’t lower its clock speed below 2.93GHz when playing the movies. The Athlon II X4 635 drops its CPU multiplier from 14.5X to 4X, taking the CPU clock from 2.9GHz down to just 800MHz. 20% of 800MHz is less than 11% of 2.93GHz.”

        • OneArmedScissor
        • 10 years ago

        “So, what is the point of having a faster IGP that doesn’t offer a reasonable or playable performance in any modern game?”

        The point is that it works for the other over 9,000 people who play less than “modern” games on their PCs, as opposed to the three or so people who actually ever played Crysis.

        “890GX is not good for gaming, and most people who probably use IGP are not gamers at all.”

        Most people who play PC games play WoW and The Sims at low to middle of the road resolutions.

      • NeelyCam
      • 10 years ago

      Awesome! That’ll make the comparison with the MSI H55M-ED55 more fair.

    • pogsnet
    • 10 years ago
    • Ryhadar
    • 10 years ago

    This… is an extremely peculiar review, both because of the often huge performance discrepancy between the ASUS and Gigabyte boards and the performance discrepancy from other TR reviews.

    Maybe it’s because of the lack of direct comparison, but if you look back at your own 785G review both the 785G and the G41 do better in USB performance than the H55 and 890GX. So yeah, USB performance is disappointing but how did it get worse on both chipsets?

    Moreover, if you look at that same review the SB850 does much, much better with AMDs own AHCI drivers in IOmeter. Yes, there is some weakness in average writes but otherwise it’s a significant improvement over the last generation. I can’t say that for sure, since they’re different reviews, though.

    I dunno, something just smells fishy. This is a chipset I’d definitely like to see compared in a future review when BIOS and drivers are more mature.

    Oh, and I’m surprised you guys didn’t mention that ACC support is dropped with this chipset.

    *edit*

    Forgot to mention that it’s a shame that, once again, AMD ignored the market’s request for 8-channel LCPM audio support.

      • Meadows
      • 10 years ago


        • MadManOriginal
        • 10 years ago

        Want to try again nublet? There are quite a few AM3 (DDR3) motherboards with ACC.

          • grantmeaname
          • 10 years ago

          ACC doesn’t do anything for Phenom IIs anyways, as it’s entirely implemented inside the processor and the motherboard never has to deal with any of it. It’s a nonissue.

            • insulin_junkie72
            • 10 years ago

            Unless you’re planning on buying a 8xx series board and unlocking cores.

            If everyone eventually follows Asus’ lead and comes up with a homebrewed solution, it might not be such a big deal.

            • pogsnet
            • 10 years ago
            • derFunkenstein
            • 10 years ago

            I’ve never heard that, but yes, I wouldn’t ever buy an SB850-equipped motherboard just because I have an unlockable CPU.

          • Meadows
          • 10 years ago

          Those have older chipsets and the AM3 socket for them is just a kludge.

            • insulin_junkie72
            • 10 years ago

            ?

            785g boards with the SB710 had ACC, and before the 890GX, that was as new as AMD’s chipsets got.

            • Meadows
            • 10 years ago

            Your comment is not relevant.

            • MadManOriginal
            • 10 years ago

            Ok Wrongy McWrong.

    • Helmore
    • 10 years ago

    “The whole thing is fabricated by TSMC at 65 nm, resulting in a chip that measures about 50 x 70 mm”

    That’s one very very big chip you got there. At 50 × 70 mm this chip is 3500 mm² in size. Fermi would seem small compared to a chip of this size. I think you meant to say the chip is 5.0 × 7.0 mm?

      • Meadows
      • 10 years ago

      Edit: I was mistaken, and probably.

      As an aside, Fermi is 42×42 mm on a 40 nm process.

    • crochat
    • 10 years ago

    The article’s thoughts on power efficiency at load are really misleading. The figures show power consumption, not efficiency. As AMD had a clear lead in Cinebench, efficiency might actually be better with AMD.

    In my opinion, every benchmark should be paired with a power efficiency analysis, or at least benchmarks simulating usage scenarios, e.g. everyday multitasking, gaming, multimedia, etc.

    • sluggo
    • 10 years ago

    So let me make sure I understand this: same CPU, same north bridge, same south bridge, same disk drive, same OS, same drivers … but the CPU utilization during SATA I/O on the two AMD-based boards differs by as much as 60 percent??!?

    I’m having trouble believing this. These two boards had to be throttling the CPU differently during this test suite.

    • Prion
    • 10 years ago

    Just wanted to say, that Hayabusa-powered Smart ForTwo was a beast

    • Voldenuit
    • 10 years ago

    Same, same.
    But different!

      • NeelyCam
      • 10 years ago

      Just add another 100 to the number. And it’s brand new!

    • rube798
    • 10 years ago

    It seems that in Panorama Factory the 890GX scores better (25 seconds vs 32 seconds), yet the review claims “a win for the H55”…

      • Dissonance
      • 10 years ago

      Fixed.

    • OneArmedScissor
    • 10 years ago

    Wait a minute, am I reading the BIOS settings right? It looks like you can undervolt the NB in the BIOS on these boards.

    I’ve always wondered if that would accomplish anything for idle power. I think the 785G chipset idles at 3w, so it’s probably irrelevant on a desktop, but you never know until you try.

    At least they finally have lower RAM voltages. I don’t think any 785G boards did lower than 1.5v. Probably not terribly important, either, but very goofy when DDR3 has continually been dropping lower than that for a while.

    Those could be interesting things for laptops using the 800 series chipsets.

      • NeelyCam
      • 10 years ago

      3W? I guess all the big-power stuff is already on the CPU… How much power does HT consume?

      I’m wondering how much I can save by removing the CPU fan. That could take some 3W by itself…

        • OneArmedScissor
        • 10 years ago

        I’ve unplugged my CPU fan, and several other different fans, and measured the power difference. It’s pointless. If it’s not spinning fast, its power use is negligible.

        And yes, I also wonder about HT speed, though I think it’s the CPU’s that has the most impact there. I believe Athlon 64s idled at lower levels because theirs were much slower, and that’s why AMD still use them as their low power CPUs.

          • NeelyCam
          • 10 years ago

          Got it. I won’t have a CPU fan anyways (for noise reasons), but I was hoping it would score me a watt or two in idle reduction. Oh well…

          Yeah, I think DDR3 undervolting could reduce idle a bit. Maybe uncore (HT) undervolting helps as well. AMD doesn’t powergate everything yet, so idle power could reduce through undervolting, but with Intel CPUs undervolting mainly helps with load power.

          I’m thinking that best idle power reductions could be had through appropriate PSU sizing. Maybe switching to “green” HDDs or SSDs.

            • OneArmedScissor
            • 10 years ago

            I played with RAM voltages and clock speeds on my computer with 2GB of DDR2 and it’s definitely enough to impact laptop battery life. I saw differences of a few watts at idle in some cases, which I can’t gauge completely accurately since the PSU isn’t so great, but it’s there.

            DDR3 may be a bit lower voltage, but the clock speeds are pretty much universally higher. There should still be some wiggle room there, too. It’s interesting that there’s not yet something like 1.0v 667 MHz DDR3, but I guess we’ll see that with the DDR3 update of Atom.

            I know AMD desktop boards don’t reduce chipset voltages at idle, but it looks as if that doesn’t really matter. I guess that’s one of those things where maybe on a laptop, it could save 0.5w, but it just increases the cost, otherwise. I’m still going to wonder unless I can play with it myself, though, which is why I pointed that out.

            Now I’m really wondering about the HT links, though. That will be something else to tinker with at another time…

            • NeelyCam
            • 10 years ago

            As far as I know, the DDR3 voltages are specification-driven. Also, I don’t think they are CPU/chipset-limited, but DIMM limited. DDR chips are done with processes that are not tuned for performance… 1.0V DDR3 I/O would be pretty easy to make on a CPU process.

            I don’t know much about HT, but I doubt I would gain much by lowering the supply on the Clarkdale in-package QPI link – the link is so short that power consumption is probably pretty low already.

            • OneArmedScissor
            • 10 years ago

            The new Atom is supposed to be DDR3 667 MHz, so it will become prevalent soon enough.

    But since an 800 MHz DDR3 JEDEC spec has been official all along, it surprises me no one goes with that for normal laptops.

            I don’t think they’d have any problem making it extremely low voltage at low clock speeds. It’s just that no one is asking for it…yet. There’s already relatively high speed DDR3 that runs at 1.25v for desktops, and there’s much more of it for servers.

    • derFunkenstein
    • 10 years ago

    Man, this almost makes nVidia’s chipsets look attractive. Almost. At least they don’t have these I/O transaction performance issues.

      • Kurotetsu
      • 10 years ago

      I was thinking that I’d be able to jump back to AMD when I first heard about the 800-series chipsets around 1 or 2 years ago (I had been using AMD since my first computer, up until the P35 chipset came out). It doesn’t look like that’s going to happen now.

      The 890FX, 890X, 870, and 880G/885G chipsets haven’t come out yet, though; hopefully they’ll have some of what’s missing here.

        • derFunkenstein
        • 10 years ago

        They’ll still have the same southbridge. If the 890GX is any indication it’ll just be a rebadged northbridge and this terrible southbridge.

    • NeelyCam
    • 10 years ago


      • MadManOriginal
      • 10 years ago

      Well maybe you shouldn’t have been called an idiot (although your Intel bias probably doesn’t help there) but seeing what else is on offer is never a bad idea if one is able to wait and the timeframe isn’t too long. If you knew ahead of time that those features wouldn’t be present that’s another thing.

        • NeelyCam
        • 10 years ago

        Me saying that Intel is good qualifies me as an idiot? This forum is just too loaded with AMD-biased bigots…

        Although I’m fully aware that you’re not one of them

          • Meadows
          • 10 years ago

          No, saying that something is good doesn’t make you an idiot.
          Being prime1ey about it does.

          • MadManOriginal
          • 10 years ago

          In this specific case, I think the point is that no one knew the details of the upcoming AMD chipset at the time. What you said might not have been wrong if it was just about features, but a comparison to an unknown isn’t really a comparison, and combined with your apparent bias, saying ‘Intel is better’ against an unknown brought out the insults.

            • NeelyCam
            • 10 years ago

            No; my comparison was to the current AMD option (785/790G), and I pointed out why I thought Clarkdale was better. It was others who wanted to compare it to the unknown (890G)

      • Palek
      • 10 years ago

      q[

        • insulin_junkie72
        • 10 years ago

        I was at least expecting the IGP performance to be a bit better than the 790GX’s, but the game performance is identical.

          • NeelyCam
          • 10 years ago

          Would you really play games on an IGP?

      • OneArmedScissor
      • 10 years ago

      Something tells me they really just don’t care. It’s hard for them to miss a boat that was never coming their way to begin with. How many people are going to buy an AMD board specifically because of those features?

      This update is disappointing, but not because of techno babble bullet points like that. The big thing is that the graphics aren’t any better. Something tells me that Nvidia’s exit from the motherboard market is to blame there.

        • MadManOriginal
        • 10 years ago

        Uh, anyone who wants to put together an inexpensive HTPC would want those features.

          • OneArmedScissor
          • 10 years ago

          Yeah, point being, how many of those people are there, really, and on top of that, how many are actually looking for something new? Obviously, AMD didn’t think it was enough to dump money into making changes just for them.

          As time goes on, HTPCs become less and less interesting, and they weren’t that interesting to most people to begin with. Rather than buying a new computer, you can just stream things straight to your TV countless new ways as time goes on.

            • MadManOriginal
            • 10 years ago

            Look, it’s an HTPC feature that others have and this chipset doesn’t. You only asked ‘How many people are going to buy an AMD board specifically because of those features?’ The answer is HTPC builds, and then you moved the bar because you didn’t like the answer. Since the northbridge here is literally just a 790GX chip, they didn’t dump i[

            • OneArmedScissor
            • 10 years ago

            I didn’t move any bar. Both times I said, “How many people?” This is business.

            The return has to pay for the investment. AMD are not designing a new chip just to attempt to satisfy a few people who are probably just going to buy something else, anyways. The end.

            All they really did here was swap the southbridge for one that will become prevalent in the future.

            Why design a new northbridge when they’re about to dump it altogether and can’t reuse it down the road?

            • NeelyCam
            • 10 years ago

            Are you saying they /[

            • NeelyCam
            • 10 years ago

            And how many people will attempt to play games on an IGP?

            IGP performance doesn’t matter much – features do.

    • Skrying
    • 10 years ago

    Every system should include an “integrated” graphics solution of some sort, be it in the northbridge, as is the case with current AMD setups, or on the CPU, as with the new Intel platform. I can’t overstate how wonderful it was having my integrated graphics as a backup when my dedicated card didn’t want to behave.

    It is extremely useful, one of the most useful features a chipset can have IMO.

      • SomeOtherGeek
      • 10 years ago

      I totally agree. Not only for that, but in testing too. Testing without a video card is awesome!

      • OneArmedScissor
      • 10 years ago

      It’s especially nice if you start using a computer for something else when it gets older and you don’t need a graphics card that would otherwise be wasting power.

        • hermanshermit
        • 10 years ago

        Been building every PC for years with an integrated core, regardless of whether I add a graphics card. It’s basically “free,” so why not?

        The main reason being that when these are passed on, I can keep my graphics card, and of course, when a PC is retired to more basic duties, who wants a power-sucking card doing only basic desktop work anyway?

      • crabjokeman
      • 10 years ago

      Only if power consumption when disabled is the same as not having it at all. Sadly, that’s usually not the case with IGPs (at least it isn’t when I compare my AMD 770 board to my old 780G).

      • shank15217
      • 10 years ago

      Or you could have a cheap discrete card as a spare. Your reasons don’t make much sense.

        • flip-mode
        • 10 years ago

        His reason makes plenty of sense to me. “Free” is less than “cheap”, for one. For another, IGP means you don’t have to open up the computer case. Also, IGP means you don’t have to safely store a spare card. And IGP will typically use less power than a discrete card, if that matters to you.

        It’s just too damn convenient, and I agree, IGP should be on every mobo. With the advent of Clarkdale, it essentially will be.

        IGP makes sense, and it doesn’t hurt a darn thing, even on top end mobos. If I could choose between two top end mobos – one with IGP and one without – and all else was equal or if the only difference was a couple percent of overclock lost, I’d take the IGP every time.

          • SomeOtherGeek
          • 10 years ago

          Yes, that was one of the reasons I was so torn between Intel and AMD before the Clarkdale era. Having integrated graphics was useful in more ways than one.

          • indeego
          • 10 years ago

          1. IGP: costs more than a non-IGP board, all things being equal.
          2. Graphics cards rarely die. At least, I’ve never seen one die.
          3. If your graphics card dies, there’s a good chance your IGP might die also, which might kill the MB. IGP may add a level of complexity to the build.

          I guess I don’t see the need for IGP + dedicated with the intention of redundancy. Seems like waste more than useful. It takes less time to pop a case open and swap a card/cable than to go into the BIOS, turn the IGP back on, and reconfigure a new driver set (and uninstall an old driver set <.<)

            • clone
            • 10 years ago

            historically, IGP motherboards sell for notably less than non-IGP motherboards… the added complexity/cost is likely negated by sales volumes, given how much OEMs love to bundle them.

            it used to be that IGP motherboards didn’t overclock well, which for enthusiasts was a deciding factor, but those days are gone.

            I wouldn’t run out and buy an 890G today because of price, but last year I was buying 690G and 740G motherboards for $50.00, and I’ve since switched to 780G and 785G motherboards for $80.00… when the 890G comes down and the BIOS and later board revisions address the “gremlins,” it’ll be hard to pass on this series.

            I’d like to see AMD double the available IGP perf….. or offer sideport memory slots if it wasn’t overly spendy.

    • insulin_junkie72
    • 10 years ago

    I wonder if AMD killing ACC on the 890GX (per Anand, and requiring mobo makers to figure out workarounds like Asus did) is going to be the way forward for AMD chipsets now? Make it more difficult to unlock disabled cores and all that…

    • bdwilcox
    • 10 years ago

    Is it just me, or does not having an SB7x0 southbridge in the mix make this review feel incomplete?

      • Veerappan
      • 10 years ago

      Definitely agreed. I really would’ve liked to have seen the 890GX motherboards compared to a 785G (or similar) along with the Intel board. The big thing I was wondering about was comparative SATA/AHCI performance between the 780G (SB700) and 785G (SB750) boards I have at home and the new SB850 controller included on these boards.

    • Jambe
    • 10 years ago

    If SATA 6Gbps were that important to you, you could just get the 2-port PCI-E x4 adapter Asus makes for $26.

    There are plenty of great 785G boards selling for $60-80, and even less with promos and such. Hell, there’s a JetWay with sideport memory for $70 shipped. So, $96 vs. $140/150, or $106 vs. $140/150 if you get the adapter with 2x USB3 on it as well.

    • Flying Fox
    • 10 years ago

    Looking at the IOMeter graphs, I actually think the SB750 has improved a lot when it comes to AHCI using AMD’s own drivers. As for the cap beyond 32 outstanding I/Os, I don’t think many people will hit queue depths that high. So what other AHCI driver issues are you talking about, Dissonance?

      • Dissonance
      • 10 years ago

      That is the AHCI driver issue. Regardless of whether you’ll often hit a queue depth of greater than 32, the fact remains that AMD’s drivers cap performance and Microsoft’s don’t. The limitation isn’t the controller, so this should be something AMD can fix.

      The 890GX may be a huge improvement over its predecessor on this front, but the SB750’s dismal AHCI performance with AMD’s drivers hardly set the bar high.
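      For readers curious what a queue-depth sweep of the kind Geoff describes actually looks like, here's a minimal sketch. It is not TR's IOmeter setup: it fakes outstanding I/Os with a thread pool against a temp file, and the block size, file size, and I/O counts are illustrative assumptions. A driver cap like the one discussed would show up as throughput flattening (or falling) once the depth passes 32.

      ```python
      # Sketch: random-read throughput vs. number of outstanding I/Os.
      # Assumptions: POSIX-style os.pread, a small temp file as the target.
      import os
      import time
      import random
      import tempfile
      from concurrent.futures import ThreadPoolExecutor

      BLOCK = 4096          # 4 KiB reads, a common IOmeter transfer size
      FILE_BLOCKS = 2048    # 8 MiB test file

      def make_test_file():
          """Create a temp file filled with random data and return its path."""
          f = tempfile.NamedTemporaryFile(delete=False)
          f.write(os.urandom(BLOCK * FILE_BLOCKS))
          f.close()
          return f.name

      def random_read(fd):
          """Issue one 4 KiB read at a random aligned offset."""
          off = random.randrange(FILE_BLOCKS) * BLOCK
          return len(os.pread(fd, BLOCK, off))

      def iops_at_depth(path, depth, total_ios=2000):
          """Approximate IOPS with `depth` concurrent outstanding reads."""
          fd = os.open(path, os.O_RDONLY)
          try:
              start = time.perf_counter()
              with ThreadPoolExecutor(max_workers=depth) as pool:
                  list(pool.map(lambda _: random_read(fd), range(total_ios)))
              return total_ios / (time.perf_counter() - start)
          finally:
              os.close(fd)

      if __name__ == "__main__":
          path = make_test_file()
          for depth in (1, 4, 32, 64):
              print(f"QD{depth}: {iops_at_depth(path, depth):,.0f} IOPS")
          os.unlink(path)
      ```

      On a real test you'd point this at a raw device and use direct I/O so the page cache doesn't hide the controller; this sketch only illustrates the shape of the sweep.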

    • KGA_ATT
    • 10 years ago

    After reading all the links in the Daily Bread ‘Systems and Storage’ section, I am a little perplexed by the forward-looking comments/analysis. Along with the review here (TR), Techgage and Hardware Canucks were the most candid on AMD’s 890GX chipset and the accompanying motherboards from Asus, Gigabyte, and MSI.

    I think AMD could have captured some wind in their sails if there had been a competent implementation of SATA 6Gbps performance. It’s like a boxer in the 11th round who hasn’t won any of the previous ten and can’t land a decisive blow. That may be a bit dramatic, but they could have had a better opening act for their upcoming Thubans.

    I think the speed boost to the IGP is great. It’s apparently even compatible with the HD 5450, and possibly another low-end Radeon HD 5xxx product, for Hybrid CrossFire and additional performance. But now, perhaps this release in a microATX board with an Athlon II and a competent $40-50 SATA 6Gbps controller card can keep many or all of the i3s on the ropes.

    TR may have to redo the match-up review article “Core i3 takes on Athlon II”.

    • paulWTAMU
    • 10 years ago

    I *like* mobos with IGPs; that way, if I’m having problems, I can rule out the GPU and PCIe slot.

    • JoshMST
    • 10 years ago

    Glad I wasn’t the only reviewer that noticed the piss-poor write performance of the SB850. I honestly wonder if it wasn’t tuned more for SSD usage, throwing spinning disks under the bus. You would think the string of misses with southbridges would have inspired AMD to hire some extra designers to iron out the kinks by now…

    • grantmeaname
    • 10 years ago

    “with the H55 Express wedged between the r[

      • Dissonance
      • 10 years ago

      Doh, results got switched transcribing from Excel. Fixed.

    • flip-mode
    • 10 years ago

    Oh my, lots to talk about here.

    First, I’m going to repeat the same thing I say every time TR reviews one of AMD’s xxxGX boards: why complain about an IGP on a mid-range (or even high-end, though that’s not what we have here) motherboard? Think of it as an extra feature that, if not useful now, will be useful down the road when the mobo transitions from your main rig to a secondary rig. Why complain about dual PCIe just because the board has an IGP, and why complain about an IGP just because the board has dual PCIe or costs $180? I just don’t get how an IGP can be a bad thing, and I don’t buy the argument that it’s bad because it could hurt overclocking, not to any degree that matters. Just my opinion.

    Second, the above notwithstanding, I totally agree with Geoff’s final assessment of the board: no reason to choose an 890GX over a 785G board unless you really want that dual PCIe.

    Third, and most disappointing, the storage controller is still a weak link, and there is simply no excuse for it anymore. It’s beyond ridiculous.

    Essentially, this is a totally unimpressive product, as far as I can tell.

      • NeelyCam
      • 10 years ago

      Yes – this is disappointing.

      • ssidbroadcast
      • 10 years ago

      AMD has really been pooing on your dad, lately.

    • Ushio01
    • 10 years ago

    AMD needs to go buy Nvidia’s chipset division; at least their performance could match Intel’s.

      • pogsnet
      • 10 years ago
        • Ushio01
        • 10 years ago

        They may not now, but they did, and Nvidia’s southbridge has been superior to AMD’s for quite a while.

    • obarthelemy
    • 10 years ago

    I’m starting to be worried about AMD. Even though for the last 5 yrs I’ve been true to them, especially for low-end systems, it seems their value proposition is fading:

    – Intel’s IGP is still inferior, but not by much; 10-20% doesn’t really matter, and office + video jobs are still OK.
    – Intel’s power consumption is much better across the board. As a consequence, silence, too.
    – Peripheral buses (SATA, USB…) are much better on Intel’s side: reliability, throughput, CPU utilization…
    – Intel has more Mini-ITX boards, and for single- or dual-spindle IGP configs, Mini-ITX makes a lot of sense.

    The one remaining advantage is lower purchase cost, but Intel’s lower power consumption makes up for that quickly.

    If I had to buy or recommend yet another cheap PC today, it would probably be i3-530 based.
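    The power-vs-price trade-off above is easy to put numbers on. A back-of-the-envelope sketch, where the wattage delta, daily hours, and electricity rate are illustrative assumptions rather than measured figures:

    ```python
    # Sketch: how long a lower-power platform takes to pay back a higher
    # purchase price. All inputs are assumed-for-illustration values.
    def payback_years(price_delta, watt_delta, hours_per_day=8, rate_per_kwh=0.12):
        """Years until the energy savings cover the extra purchase cost."""
        kwh_per_year = watt_delta * hours_per_day * 365 / 1000
        yearly_savings = kwh_per_year * rate_per_kwh
        return price_delta / yearly_savings

    # e.g. a board/CPU combo $30 cheaper but drawing 20 W more,
    # at 8 h/day and $0.12/kWh:
    print(f"{payback_years(30, 20):.1f} years")  # → about 4.3 years
    ```

    So whether the efficiency gap "makes up for" the price gap depends heavily on usage; a box that runs 24/7 closes the gap about three times faster than one used 8 hours a day.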

    • MadManOriginal
    • 10 years ago

    Going to repost an idea from the 5830 review for something I’d like to see:

    Hey Scott or Geoff, in the interest of ‘looking at old hardware,’ there’s something that’s been on my mind for a while. I haven’t really played any truly modern games (the newest being at least two years old) and have actually picked up some older games with system requirements that are low by modern standards, thanks to Steam sales (sysreqs like a 6600GT or lower). Now it’s easy to say ‘this card is faster than an old one,’ but what isn’t so easy is figuring out what new card, or *especially* what IGP, might be equal to an old one. I say IGP especially because I’ve been thinking of going with an IGP in the future (AMD DX11/Llano, Sandy Bridge IGP, whatever NV comes up with) for the little lightweight gaming I do on my next system build.

    The problem is that almost any IGP review uses more modern games, but at very low resolutions and/or settings. Great, we know IGPs are bad for newer games… not the most useful tests. It would, however, be very useful to know what older cards today’s low-end GPUs and IGPs are equivalent to, for the sake of playing older games. Even just one such article would be useful, because then we could extrapolate from it for newer IGPs in the future.

      • obarthelemy
      • 10 years ago

      I second that. There are plenty of very good older games. I’d love to know which ones are actually playable on today’s IGPs. Detail level doesn’t matter, but resolution does, since it can’t really be tuned down, and my PCs all run at least 1680×1050, most often 1920×1200.

    • dpaus
    • 10 years ago

    I’m wondering if you’re looking at it the wrong way – the integrated graphics is just a “gimme”, a throw-away – /[

      • UberGerbil
      • 10 years ago

      Isn’t that what he implied?

      g[

        • dpaus
        • 10 years ago

        Well, I’m thinking of his line: “I just don't get the 890GX.... dual-x8 CrossFire support.... is nice, but that makes the integrated GPU even more of a waste.” The IGP is only a waste if you think it's supposed to be an important part of the value proposition in the first place. I'm suggesting that it's there because there was an otherwise-unused corner of the wafer...

    • Buzzard44
    • 10 years ago

    I’m confused. SATA 6Gbps coupled with USB 2.0 in the south bridge?

    But why?
