
AMD’s 890GX integrated graphics chipset

AMD has been on a bit of a hot streak lately. No, I’m not talking about the dominating performance of its peerless Radeon HD 5870 GPU. I’m not referring to quad-core bargains like the nearly-$100 Athlon II X4 630, either. I speak of something far more exciting: integrated graphics chipsets.

Ok, so maybe exciting is a bit of a stretch. But AMD has definitely been on a run in the integrated graphics world, and it all started with a 780G chipset launched two years ago nearly to the day. With a DirectX 10-class graphics core, Blu-ray decode acceleration logic, and gen-two PCI Express, the 780G quickly became our integrated graphics chipset of choice—and our recommended platform for budget desktops and home-theater PCs.

This past summer, AMD replaced its mainstream integrated graphics chipset with the 785G. A refreshed graphics core with tweaked shader units and an updated video decode block punctuated this release, propelling Gigabyte’s implementation into TR Editor’s Choice territory.

AMD hasn’t been content to confine its integrated graphics chipsets to budget microATX motherboards, though. Some six months after introducing the original 780G, a hopped-up version of the chipset dubbed the 790GX arrived astride mid-range ATX boards targeted at PC enthusiasts, gamers, and overclockers. The 790GX also brought with it a new SB750 south bridge chip with AMD’s first chipset-level RAID 5 implementation and an Advanced Clock Calibration capability that typically gave overclockers an extra few hundred MHz to play with.

Six months have passed since AMD lifted the curtain on the 785G, and right on schedule, an amped-up version is set to debut as the 890GX. Like the 790GX that came before it, the 890GX boasts higher GPU clock speeds and a penchant for full-sized ATX motherboards. It also sports new SB850 south bridge silicon with a 6Gbps Serial ATA controller of AMD’s own design, which is very exciting indeed. Naturally, we had to take a closer look.

The core-logic Swiss Army knife
As someone who has long chastised marketing departments for escalating model numbers without merit, I would be remiss not to take issue with the 890GX’s primary digit. AMD showed admirable restraint when it updated the 780G with the appropriately named 785G. The 790GX made perfect sense as a tuned-up version of the 780G, too. You’d think, then, that a hopped-up 785G would carry a 795GX model designation. But no, AMD apparently couldn’t resist and has dubbed its latest north bridge component the 890GX.

Look past the model number, and you’ll find that the 890GX shares the very same north bridge silicon as the 785G. The chip features just over 200 million transistors and is fabricated on a 55-nm process by TSMC. AMD sorts the chips it gets from the Taiwanese semiconductor firm, reserving only the best for the 890GX, while the rest live on as 785Gs.

The 890GX needs the cream of the crop because its Radeon HD 4290 integrated graphics core runs at 700MHz—200MHz faster than the Radeon HD 4200 in the 785G. Apart from the difference in clock speeds, though, the graphics cores are identical. Both share the same RV620 architecture, which serves up 40 DirectX 10.1-compliant stream processors.
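That clock bump translates directly into theoretical shader throughput. A quick sketch of the arithmetic, assuming each stream processor retires one multiply-add (two flops) per clock—a common figure for this class of Radeon shader hardware, not one quoted by AMD here:

```python
# Peak shader throughput for the two integrated Radeons. The 40-SP count
# and clock speeds come from the article; two flops per SP per clock
# (one multiply-add) is an assumption about the shader architecture.
def igp_gflops(stream_processors, clock_mhz, flops_per_sp_per_clock=2):
    """Theoretical peak shader throughput in GFLOPS."""
    return stream_processors * flops_per_sp_per_clock * clock_mhz / 1000

hd4200 = igp_gflops(40, 500)  # 785G's Radeon HD 4200
hd4290 = igp_gflops(40, 700)  # 890GX's Radeon HD 4290

print(f"HD 4200: {hd4200:.0f} GFLOPS")             # 40 GFLOPS
print(f"HD 4290: {hd4290:.0f} GFLOPS")             # 56 GFLOPS
print(f"Clock-for-clock gain: {700/500 - 1:.0%}")  # 40%
```

With identical shader counts, the 890GX's entire theoretical advantage comes from that 40% clock increase.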

Like most integrated graphics components, the Radeon HD 4290 is capable of carving out a slice of system memory for its own use. With such an arrangement, the IGP is forced to share memory bandwidth with the rest of the system. Fortunately, motherboard makers also have the option of pairing the 890GX’s integrated Radeon with “sideport” memory. Also referred to by AMD as a DDR3 performance cache, this sideport RAM is typically a single, 128MB DDR3-1333 memory chip. One such chip can be seen sitting next to the 890GX north bridge component in the picture above.
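To put the sideport arrangement in perspective, some back-of-the-envelope math helps. The 16-bit bus width of the single sideport DRAM chip is my assumption—the article only specifies a 128MB DDR3-1333 part—as is the dual-channel DDR3-1333 system memory configuration:

```python
# Rough bandwidth comparison: sideport memory versus shared system RAM.
# Bus widths here are assumptions for illustration, not AMD-quoted specs.
def mem_bandwidth_gbs(bus_bits, transfers_mts):
    """Peak bandwidth in GB/s for a given bus width and transfer rate."""
    return bus_bits / 8 * transfers_mts / 1000

sideport = mem_bandwidth_gbs(16, 1333)   # single chip, dedicated to the IGP
system = mem_bandwidth_gbs(128, 1333)    # dual-channel DDR3-1333, shared

print(f"Sideport: {sideport:.1f} GB/s")  # ~2.7 GB/s, uncontended
print(f"System:   {system:.1f} GB/s")    # ~21.3 GB/s, shared with the CPU
```

The sideport chip's raw bandwidth is modest; its real value is giving the IGP a pool of memory it never has to arbitrate with the CPU for.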

Since graphics chips are responsible for more than 3D pixel pushing these days, I should also note that the Radeon HD 4290’s Universal Video Decoder (UVD) block is fully up to date. The UVD supports dual-stream decode acceleration for high-definition MPEG2, VC-1, and H.264 video, which neatly covers all the formats used by Blu-ray movies. Video output can be piped over HDMI with an accompanying audio stream, but there are a few limitations on that front. The 890GX can’t pass TrueHD, DTS-HD, or uncompressed multi-channel LPCM audio over HDMI, putting it a step behind some integrated graphics platforms.

An 890GX block diagram. Source: AMD

In addition to its graphics core, the 890GX north bridge features second-generation PCI Express logic. Sixteen lanes of connectivity are reserved for discrete graphics cards, and unlike the 785G, the 890GX can split those lanes evenly between a pair of x8 links for CrossFire. The 890GX has an additional six PCIe lanes reserved for expansion slots and peripherals, too.

The rest of the chipset’s connectivity is consolidated in its new SB850 south bridge component, which is connected to the 890GX via an Alink Express III interconnect that offers 4GB/s of bidirectional bandwidth—twice the bandwidth of Intel’s DMI interconnect. (The 2GB/s in the block diagram above refers to one-way speed). Alink Express looks a whole lot like PCIe, and I’d wager the interconnect is little more than four lanes of PCI Express 2.0.
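If that guess is right, the math checks out. A minimal sketch of the lane arithmetic, assuming PCIe 2.0's 5GT/s signaling rate and 8b/10b line encoding:

```python
# Sanity-checking the "four lanes of PCIe 2.0" guess for Alink Express III.
def pcie2_bandwidth_gbs(lanes):
    """Per-direction bandwidth in GB/s for a gen-two PCIe link."""
    gt_per_s = 5.0      # PCIe 2.0 signaling rate per lane
    encoding = 8 / 10   # 8b/10b line code: 8 data bits per 10 bits on the wire
    bytes_per_gt = 1 / 8
    return lanes * gt_per_s * encoding * bytes_per_gt

per_direction = pcie2_bandwidth_gbs(4)
print(f"{per_direction:.0f} GB/s each way")        # 2 GB/s, matching the diagram
print(f"{2 * per_direction:.0f} GB/s aggregate")   # 4 GB/s bidirectional
```

Four gen-two lanes yield exactly the 2GB/s per direction shown in AMD's block diagram.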

The south bridge has two more PCIe lanes for talking to peripherals, giving the chipset 24 lanes in total. Unlike Intel’s P55, H55, and H57 Express Platform Controller Hubs, whose second-gen PCIe lanes signal at the 2.5GT/s rate typical of gen-one implementations, the SB850’s PCI Express lanes each boast a full 5GT/s signaling rate.

By far the most interesting element of the SB850 is its Serial ATA “3.0” controller, which supports transfer rates up to 6Gbps—roughly 600MB/s, with overhead taken into account—and all the usual RAID array configs. This is the first 6Gbps SATA controller we’ve seen make its way into a core-logic chipset, and it’s only the second implementation of the new standard currently in the wild. AMD designed the new SATA controller itself, too, which is a departure from previous south bridge chips that used third-party storage controller logic.
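The roughly-600MB/s figure follows from Serial ATA's line encoding: like PCIe, SATA uses 8b/10b, so only 80% of the raw bit rate carries data. A quick sketch:

```python
# Where the ~600MB/s figure for 6Gbps SATA comes from: 8b/10b encoding
# means 8 payload bits per 10 bits transmitted.
def sata_mbs(line_rate_gbps):
    """Usable payload bandwidth in MB/s for a given SATA line rate."""
    return line_rate_gbps * 1e9 * (8 / 10) / 8 / 1e6

print(f"SATA 3Gbps: {sata_mbs(3):.0f} MB/s")  # 300 MB/s
print(f"SATA 6Gbps: {sata_mbs(6):.0f} MB/s")  # 600 MB/s
```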

AMD’s older south bridge chips have a history of Serial ATA performance and compatibility issues, particularly in AHCI mode, which is necessary for features like Native Command Queuing. Developing the SB850’s SATA controller itself should give AMD more control this time around, and in a moment, we’ll see whether that paid off.

The dearth of storage solutions—including even high-end SSDs—capable of exceeding the bandwidth available with old-school 3Gbps SATA makes the SB850’s 6Gbps SATA support feel more like forward-looking insurance than a must-have feature. AMD didn’t look to the future when crafting the SB850’s USB controller, though. The controller design has changed from older SB700-series implementations and now features 14 ports instead of 12. But they’re all USB 2.0 rather than SuperSpeed USB 3.0. Given the speed of today’s external storage devices, that strikes me as a little short-sighted.

At least AMD has squeezed a Gigabit Ethernet controller into the SB850 alongside the usual HD audio interface and, surprisingly, an old-school ATA channel. The whole thing is fabricated by TSMC at 65 nm, resulting in a chip that measures about 50 x 70 mm. According to AMD, the SB850 draws just 0.85W at idle, which is a quarter-watt less than the old SB750.

Asus’ M4A89GTD PRO/USB3 motherboard
Next-gen connectivity with a side of throwback

Manufacturer Asus
Price (MSRP) $155
Availability Soon

It might seem odd for an integrated graphics chipset to anchor a mid-range motherboard targeted squarely at PC enthusiasts, but that’s what you get with the 890GX. For this market, the chipset’s embedded Radeon is best thought of as a backup display adapter or a source of additional monitor outputs rather than as a primary GPU. Even if the Radeon HD 4290 is the fastest IGP around, anyone who wants to enjoy recent games with all their eye candy turned up at reasonable resolutions will be plugging in a discrete graphics card.

Asus’ M4A89GTD PRO/USB3, then, is really more of a traditional mid-range motherboard than its video outputs might otherwise suggest. That’s a good thing, because over the years, Asus has become pretty proficient at building mid-range enthusiast boards.

The M4A89GTD certainly looks the part of something you might find in an overclocked gaming rig. Asus offsets the dark brown board with a peppering of blue and white slots and ports, and I quite like the understated styling.

Of course, aesthetics won’t make or break a motherboard. The layout can, and Asus has done a good job on that front. All of the onboard slots and ports are intelligently organized to avoid clearance conflicts. Users with upside-down enclosures that put the PSU below the motherboard may need extra-long power cables to reach the board’s auxiliary 12V power connector, but that’s only because the plug is situated next to the top edge of the board, which is our preferred location for traditional enclosures.

Like other mid-range boards, Asus’ M4A89GTD covers its north bridge and voltage regulation circuitry with ornate—but not outlandish—heatsinks. A single heatpipe links the two coolers, which are short enough to avoid clearance conflicts with most aftermarket cooler designs.

Asus uses an 8+2 power-phase design to feed the AM3 CPU socket. The board can reputedly handle processors with TDP ratings up to 140W, which should allow it to support the fastest Phenom II chips, including perhaps the upcoming Phenom II X6. There’s also a core-unlocking switch next to the DIMM slots that will let owners of triple-core Athlon II and Phenom II chips try their luck at enabling a dormant fourth core.

Those with keen eyes will note that Asus’ spin on the 890GX features solid-state capacitors throughout. Asus squeezes two-ounce copper layers into the four-layer board, as well.

Rather than employing an auxiliary storage controller to feed additional internal SATA ports, Asus makes do with the six 6Gbps Serial ATA ports offered by the SB850. However, Asus has elected to connect the board’s sole IDE port to a JMicron JMB361 storage controller rather than the south bridge. The JMicron chip is also linked to an eSATA port in the rear port cluster.

As you can see, the low-profile south bridge cooler won’t interfere with longer graphics cards. The SATA ports are neatly positioned out of the way of double-wide graphics coolers, as well, which is something you don’t always see even on high-end motherboards. The mix of edge- and surface-mounted SATA ports should ensure that users with extremely tight enclosures that snug the hard drive bay right up next to the mobo will still be able to connect a collection of drives with ease, too.

The M4A89GTD is stacked with half a dozen expansion slots, including dual PCIe x16 slots, one x4, one x1, and a couple of retro PCI slots. The x4 slot’s a nice touch, and I quite like the fact that double-wide CrossFire configs will still leave users with access to it and a standard PCI slot.

The board will automatically split 16 PCIe lanes evenly between its x16 slots when two graphics cards are installed. However, if you want a full 16 lanes of bandwidth running to the primary slot, you have to install an included switch card in the secondary slot. The card takes all of a few seconds to slide into place, which is hardly a hassle. An automatic or BIOS-level switch would have been slicker, though.

The M4A89GTD’s port cluster has all the bases covered: DVI, HDMI, eSATA, FireWire, S/PDIF, and USB 3.0. The blue USB ports offer SuperSpeed connectivity via an NEC controller. A VIA chip is tasked with FireWire, while Realtek’s new ALC892 codec chip handles audio. Interestingly, Asus employs a Realtek Gigabit Ethernet chip rather than taking advantage of the GigE MAC built into the SB850. We’ve seen mobo makers snub the built-in GigE offered by Intel’s chipsets for years, and it appears AMD is getting the same treatment.

As one might expect from an Asus board, the M4A89GTD’s BIOS is packed to the gills with overclocking and tweaking options. Multipliers, clock speeds, memory timings, and voltages can all be adjusted with ease and pushed well beyond reason. All of the voltages and most clock speeds can be keyed in directly rather than selected from a list, which makes trial-and-error tweaking a lot quicker for folks who know what they’re doing. For those who don’t, the BIOS also sports an auto-overclocking feature that does all the dirty work. Auto-overclocking schemes are relatively new in the motherboard world, and I like that Asus has implemented this one in the BIOS rather than tying it to auxiliary Windows software.

For me, though, the real star of the BIOS is the fan control section. I’ve long complained that rudimentary fan speed controls were inadequate for enthusiast-oriented motherboards, and Asus has finally taken notice. With the M4A89GTD, the user can set minimum and maximum fan speeds for the CPU and system fans. One can also control the temperatures at which those fans kick into high gear. The low-temperature limit is greyed out, but Asus tells me these fan controls are still a work in progress, so we could see it unlocked eventually. Props to Asus for putting some effort into a section of the BIOS that’s been largely ignored by mobo makers for far too long.

Gigabyte’s GA-890GPA-UD3H motherboard
Undercutting the competition

Manufacturer Gigabyte
Model GA-890GPA-UD3H
Price (MSRP) $140
Availability Soon

We’ve noticed an interesting trend develop in the motherboard world over the last little while. In an aggressive bid to increase its share of the North American retail motherboard market, Gigabyte has been selling its mobos for less than equivalent models from Asus. This pricing strategy is evident with the GA-890GPA-UD3H, whose suggested retail price is $15 cheaper than the Asus board despite the fact that both offer similar feature sets. Indeed, even Asus’ M4A89GTD PRO, which lacks USB 3.0 connectivity, is slated to sell for $5 more than Gigabyte’s SuperSpeed-equipped UD3H.

When all else is equal—including reputation and expected reliability—we’ll recommend a cheaper board that offers better value ten times out of ten. The question, of course, is whether all else is equal with this first batch of 890GX boards.

On the surface, that looks to be the case. The UD3H is another full-sized ATX board aimed at gamers and overclockers. Like Asus, Gigabyte has figured out that enthusiasts’ tastes have outgrown the days when a clashing, neon rainbow of colors was considered an acceptable palette. Bravo.

For the most part, the UD3H’s layout is uncluttered and free of potential problems. Again, though, users with upside-down cases will want to make sure that their PSUs have long auxiliary 12V cables.

Those keeping score in the power-phase pissing match between mobo makers will want to note that the Gigabyte board uses a 4+1 phase arrangement—half the number of phases available on Asus’ 890GX offerings. But the UD3H still supports 140W CPUs, and as we’ve observed with numerous other motherboards in the past, more CPU power phases isn’t necessarily better.

Like Asus, Gigabyte uses two-ounce copper layers that purportedly offer lower impedance than typical one-ounce layers. There are solid-state capacitors across the board, as well, and nerdy puns at no extra charge.

The stumpy heatsinks for the north bridge and voltage circuitry do a good job of staying out of the way, which should allow users to run larger CPU coolers without issue. The low-profile south bridge heatsink shouldn’t interfere with expansion cards, either.

Massive graphics cards like those in AMD’s Radeon HD 5800 series can stretch all the way across an ATX motherboard, creating all sorts of clearance problems for SATA cabling. Gigabyte neatly avoids the issue by lining up all eight of the board’s SATA ports along the board’s edge, where they’ll tuck just under longer cards and coolers. This arrangement isn’t without potential for peril, though. A hard drive cage mounted right next to the motherboard tray may not leave enough room to plug into edge-mounted SATA ports.

Gigabyte squeezes an extra PCIe x1 slot into its stack, but the close proximity of the north bridge cooler may complicate compatibility with longer expansion cards. Of course, there are still two x16 slots, two more x1 slots, and a couple of PCI slots from which to choose. Unlike the Asus, this board doesn’t require a switch card to juggle lanes between the PCIe x16 slots, either. When the secondary slot is empty, all 16 lanes are automatically routed to the primary slot. Install a graphics card into the secondary slot, and the board will split the lanes in a dual-x8 config.

Look familiar? The Gigabyte board’s port cluster nearly mirrors what we saw from Asus. The only difference that really matters is the UD3H’s lack of eSATA connectivity. One could argue the presence of those blue USB 3.0 ports makes external Serial ATA ports unnecessary for this board. However, for the overwhelming majority of users, I suspect an eSATA port would’ve been more useful than the seventh and eighth internal Serial ATA connectors, especially if it was one of those fancy new hybrid eSATA/USB ports.

Although the Asus and Gigabyte boards both use the same Realtek ALC892 codec chip, only the latter appears to have implemented support for real-time Dolby Digital Live encoding, which allows multi-channel game audio to be passed to a compatible digital receiver or speakers over a single S/PDIF cable. Without real-time encoding, only source material with pre-encoded audio tracks, such as movies, can take advantage of multi-channel digital audio output.

For the most part, the first 890GX boards from Asus and Gigabyte offer identical features. Their interfaces differ, but the two boards’ BIOSes serve up similar tweaking and overclocking functionality. Both include embedded BIOS flashing utilities and support for multiple configuration profiles, too.

So where do they differ? In the automatic overclocking department, for one. You won’t find a BIOS-based auto-overclocking utility on the UD3H. Gigabyte doesn’t even ship the board with Windows software that’ll turn up your CPU’s clock speed automatically, although it has done so in the past with other boards.

By far the biggest difference between the two BIOSes comes when we look at fan speed controls. Those on the UD3H look positively prehistoric. The user has the option of turning automatic fan speed control on or off for the CPU and system fan headers, and one can toggle whether the CPU fan is a three- or four-pin model. That’s it. We’ve been asking Gigabyte for control over temperature thresholds and actual fan speeds or voltages for years now, and nothing has changed. Apparently, the ability to tweak obscure system voltages by hundredths of a volt is more important than meaningful fan speed controls.

The devil’s in the details
This wouldn’t be TR motherboard coverage without a painstakingly detailed assessment of each board’s BIOS options and specifications. These details don’t exactly lend themselves to eloquent prose, but you should be able to find what you need in the tables below.

Asus M4A89GTD PRO/USB3
Clock speeds: Base: 100-600MHz in 1MHz increments; DRAM: 800-1600MHz in 266MHz increments; PCIe: 100-150MHz in 1MHz increments; HT: 200-2000MHz in 200MHz increments; CPU NB: 1400-2000MHz in 200MHz increments; GPU: 400-1500MHz in 1MHz increments; Sideport: 1333, 1400-1820MHz in 30MHz increments
Multipliers: CPU: 4X-14.5X in 0.5X increments; HT: 5X-10X in 1X increments; DRAM: 4-8X in 1.33X increments
Voltages: CPU: 0.7-2.1V in 0.003125V increments; CPU NB: 0.475-1.875V in 0.003125V increments; CPU VDDA: 2.2-2.9V in 0.00625V increments; DRAM: 1.2-2.5V in 0.00625V increments; HT: 0.8-1.4V in 0.00625V increments; NB: 0.8-2.0V in 0.00625V increments; NB 1.8V: 1.8-2.1V in 0.05V increments; SB: 1.1-1.4V in 0.05V increments; Sideport: 1.5-1.8V in 0.1V increments
Monitoring: Voltage, fan status, and temperatures
Fan speed control: CPU, system

Gigabyte GA-890GPA-UD3H
Clock speeds: Base: 200-500MHz in 1MHz increments; PCIe: 100-150MHz in 1MHz increments; GPU: 200-2000MHz in 1MHz increments; Sideport: 667, 800, 1067, 1333, 1400-2000MHz in 30-40MHz increments
Multipliers: CPU: 5X-14.5X in 0.5X increments; HT: 5X-10X in 1X increments
Voltages: CPU: -0.6-+0.6V in 0.025V increments; CPU NB: -0.6-+0.6V in 0.025V increments; CPU PLL: 2.22-3.1V in 0.02V increments; DRAM: 1.275-2.245V in 0.015V increments; 0.9-1.6V in 0.02V increments; 1.45-2.1V in 0.01V increments; Sideport: 1.37-1.8V in 0.05V increments
Monitoring: Voltage, fan status, and temperatures
Fan speed control: CPU, system

Gigabyte prefers that you adjust clock speeds via explicit multipliers, while Asus gives users control over actual clock speeds. You say tomato, I say, uh, tomato. To Asus’ credit, the M4A89GTD does have a couple of extra voltage knobs to twirl. Asus’ voltage controls also offer more granularity than Gigabyte’s, although that doesn’t strike me as a difference that has much practical import.

The multiplier options listed above are what’s presented with an Athlon II X4 635 processor. Expect support for higher multipliers if you’re using a Black Edition CPU with an unlocked upper multiplier.

Asus M4A89GTD PRO/USB3
CPU support: Socket AM3-based Athlon II and Phenom II processors
North bridge: AMD 890GX
South bridge: AMD SB850
Interconnect: Alink Express III (4GB/s)
Graphics: Integrated Radeon HD 4290 with 128MB DDR3-1333 sideport memory
Expansion slots: 2 PCI Express x16; 1 PCI Express x4; 1 PCI Express x1; 2 32-bit/33MHz PCI
Memory: 4 240-pin DIMM sockets; maximum of 16GB of DDR3-1066-1866 SDRAM
Storage I/O: Floppy disk; 1 channel ATA/133 via JMicron JMB361; 6 channels 6Gbps Serial ATA with RAID 0, 1, 10, 5 support via SB850
Audio: 8-channel HD audio via Realtek ALC892 codec
Ports: 1 PS/2 keyboard; USB 2.0 with headers for 8 more; 2 USB 3.0 via NEC D720200F1; 1 eSATA via JMicron JMB361; 1 RJ45 10/100/1000 via Realtek RTL8111E; 1 1394a FireWire via VIA VT6308P with header for 1 more; 1 analog front out; 1 analog bass/center out; 1 analog rear out; 1 analog surround out; 1 analog line in; 1 analog mic in; 1 digital S/PDIF out (TOS-Link)

Gigabyte GA-890GPA-UD3H
CPU support: Socket AM3-based Athlon II and Phenom II processors
North bridge: AMD 890GX
South bridge: AMD SB850
Interconnect: Alink Express III (4GB/s)
Graphics: Integrated Radeon HD 4290 with 128MB DDR3-1333 sideport memory
Expansion slots: 2 PCI Express x16; 3 PCI Express x1; 2 32-bit/33MHz PCI
Memory: 4 240-pin DIMM sockets; maximum of 16GB of DDR3-1066-1866 SDRAM
Storage I/O: Floppy disk; 1 channel ATA/133 via GSATA2; 6 channels 6Gbps Serial ATA with RAID 0, 1, 10, 5 support via SB850; 2 channels 3Gbps Serial ATA with RAID 0, 1 support via GSATA2
Audio: 8-channel HD audio via Realtek ALC892 codec
Ports: 1 PS/2 keyboard/mouse; USB 2.0 with headers for 8 more; 2 USB 3.0 via NEC D720200F1; 1 RJ45 10/100/1000 via Realtek RTL8111D; 1 1394a FireWire via TI TSB43AB23 with headers for 2 more; 1 analog front out; 1 analog bass/center out; 1 analog rear out; 1 analog surround out; 1 analog line in; 1 analog mic in; 1 digital S/PDIF out (TOS-Link)

Lots of similarities here. Asus and Gigabyte differ on a few auxiliary peripheral chips but little else.

Our testing methods
The 890GX is really the only mid-range integrated graphics chipset on the market. Direct competition simply doesn’t exist, but we can cobble together a competent rival using Intel’s latest Clarkdale platform. Intel puts an integrated graphics processor—the Graphics Media Accelerator HD—right next to the processor core on its Core i3 and i5 CPUs. Slap one of those into an H55 or H57 Express-based motherboard with video output ports, and you’ve got yourself an integrated graphics platform.

With four 2.9GHz cores and a price tag around $120, the Athlon II X4 635 is one of AMD’s most attractive CPUs and a perfect match for the 890GX. On the Intel side, the most appropriate competition is probably the Core i3-530, which is the same price as the X4 635 and features two 2.93GHz cores that can process four threads in parallel thanks to Hyper-Threading. We’ve paired the i3-530 with an H55 Express motherboard. The H55 chipset should offer equivalent performance to the H57, since the only major difference between the two appears to be support for multi-drive RAID arrays, which we won’t be using today.

I conducted our application, gaming, and video playback tests with only the Gigabyte 890GX board, but all the other tests were run on both the Asus and Gigabyte 890GX mobos. We used the integrated graphics processor for each platform during testing. Windows 7’s power plan was set to “balanced” for all but a subset of our power consumption tests.

With few exceptions, all tests were run at least three times, and we reported the median of the scores produced. For IOMeter, we’ve reported average rather than median scores. Also, our power consumption tests were only run once.
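For the curious, the difference between those two reporting choices is easy to see with a toy example—the numbers below are invented, not benchmark results:

```python
# Why report the median of several runs rather than the mean: one outlier
# run (a background task, a cold cache) barely moves the median.
from statistics import mean, median

runs = [61.2, 60.8, 45.0]  # hypothetical fps scores; one run hit a hiccup
print(f"mean:   {mean(runs):.1f}")    # dragged down by the outlier
print(f"median: {median(runs):.1f}")  # 60.8, reflecting typical behavior
```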

Processor: AMD Athlon II X4 635 2.9GHz (890GX boards); Intel Core i3-530 2.93GHz (H55 board)
Motherboard: Asus M4A89GTD PRO/USB3; Gigabyte GA-890GPA-UD3H; Gigabyte GA-H55M-USB3
BIOS revision: 0211; F3; F1
North bridge: AMD 890GX; AMD 890GX; Intel H55 Express
South bridge: AMD SB850; AMD SB850
Chipset drivers: Catalyst 10.3 with AHCI 8.70RC1 (both 890GX boards)
Memory size: 4GB (2 DIMMs) on all three boards
Memory type: DDR3 SDRAM at 1333MHz (890GX boards); DDR3 SDRAM at 800MHz (H55 board)
Memory timings: 7-7-7-20-1T on all three boards
Audio: Realtek ALC892 with 2.42 drivers (890GX boards); Realtek ALC889 with 2.42 drivers (H55 board)
Graphics: Integrated Radeon HD 4290 with Catalyst 10.3 drivers (890GX boards); GMA HD (H55 board)
Hard drive: Western Digital Raptor X 150GB
Power supply: OCZ GameXStream 700W
Operating system: Microsoft Windows 7 Ultimate x64
We’d like to thank Western Digital for sending Raptor WD1500ADFD hard drives for our test rigs.


The test systems’ Windows desktop was set at 1280×1024 in 32-bit color at a 60Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests.

All the tests and methods we employed are publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Memory performance
Memory bandwidth doesn’t always dictate real-world performance, but it’s a good place to start when testing systems whose integrated graphics processors consume a chunk of main memory. Both of our 890GX boards feature 128MB of dedicated sideport video memory, however, which should reduce bandwidth sharing.

Despite its lack of dedicated video memory, the H55 Express platform delivers higher memory bandwidth than either 890GX board. Score one for the Core i3-530. The Asus and Gigabyte 890GX boards offer nearly equivalent memory bandwidth, with the Gigabyte board having a slight edge overall.

Shift your attention to memory latency, and the Asus 890GX board edges out the Gigabyte by a nanosecond. The real story there is how much the H55 Express trails—its memory access latency is a full 30 nanoseconds slower than the 890GX boards.

The following graphs are a little indulgent, but they paint the latency picture in three dimensions, across multiple block and step sizes. I’ve arranged the graphs in order of highest latency to lowest. Yellow represents L1 cache, light orange is L2, red is L3, and dark orange is main memory.

Here we’re focused on how different processor platforms compare, so I’ve omitted scores for the 890GX board. The H55 Express and its Core i3-530 may have much higher memory access latencies than the 890GX and Athlon II X4 635 combo, but thanks to the Core i3’s L3 cache, that platform doesn’t have to hit main memory until the block size exceeds 4MB. The AMD system starts dipping into main memory once we pass 512KB, giving it higher access latencies than the Intel rig at block sizes between 512KB and 4MB.

Application performance
Core-logic chipsets don’t have a huge impact on application performance, but we’ve whipped up a diverse suite of application tests to give you a sense of how our test platforms stack up when their respective CPUs become the bottleneck. Most of these tests are effectively multi-threaded, so keep in mind that we’re testing a proper quad-core Athlon II X4 against a dual-core, four-thread Core i3.

Cinebench’s rendering test is handled by the CPU, and the 890GX platform has a sizable lead over the H55. The results of the OpenGL modeling test are even more striking. In that test, the Radeon HD 4290 easily outruns the Intel GMA HD.

The Core i3-530 is hardly a slouch when it comes to HD video encoding, but the Athlon II X4 635 is faster—by a healthy margin, too.

In another highly multi-threaded test, the 890GX system prevails once more.

TrueCrypt isn’t even close. Core i3 and i5 processors do have new instructions designed to improve encryption performance, but TrueCrypt doesn’t yet support them.

The Panorama Factory’s stitch operation is nicely multithreaded, and the Athlon II X4’s four cores are faster than the i3-530’s two.

The 785G is certainly a competent integrated graphics processor, but it’s hardly a recommended solution for gamers looking to play the latest titles with the details turned up at reasonable resolutions. The 890GX’s Radeon HD 4290 is clocked 40% higher, so it should fare better. But is it really good enough to manage playable frame rates with the latest games? To find out, we tested a handful of recent releases at a charitably low resolution of 1024×768. We used the in-game timedemo features built into Left 4 Dead 2, DiRT 2, and Borderlands, and relied on FRAPS to log gameplay sessions in Modern Warfare 2. Median scores were taken from five runs of DiRT 2 and Modern Warfare 2, while three runs were used with the other games, whose scores were more consistent.

Lesson 1: even the latest incarnation of Intel’s Graphics Media Accelerator has serious issues. Our H55 Express system crashed repeatedly in Left 4 Dead 2 regardless of whether we were running a timedemo or just trying to play the game. DiRT 2 didn’t work, either, presenting us a blank screen instead of the game’s usual menu. Both are recent, popular titles that really should work.

With the GMA HD pulling up lame in two games, it’s easy for the Radeon HD 4290 to look good. But there is some bad news. I had to run DiRT 2 at the lowest in-game detail levels to manage playable frame rates at 1024×768. Fortunately, Left 4 Dead 2 was much more accommodating. The Source-engine title yielded decent frame rates even with all details turned up, albeit with extras like antialiasing and anisotropic filtering disabled.

Modern Warfare 2 ran acceptably on the 890GX, too, but only after I disabled antialiasing and a couple of in-game effects. Amusingly, the GMA HD’s average frame rate is the same as the Radeon HD 4290’s low point.

The only game of the bunch that I wouldn’t deem playable at 1024×768 on either platform is Borderlands. For whatever reason, this game doesn’t seem to scale well down to middling graphics hardware. Even with the lowest in-game detail levels, the Radeon HD 4290 only managed an average of 20 frames per second. This is why real gamers run discrete graphics cards—even sub-$100 models are a heck of a lot faster than either of these IGPs.

Blu-ray playback
We conducted our Blu-ray playback tests across three high-bitrate movies covering the major formats available on the market. 28 Days Later was used to represent the H.264 camp, Nature’s Journey for VC-1, and Click (which it pains me to even admit that we purchased on Blu-ray) for MPEG2. The latest version of PowerDVD, which supports the decode acceleration built into both the 890GX and GMA HD, was used for testing. Playback was run full-screen over HDMI at 1080p resolution.

Both systems played back our three movie samples flawlessly. Based on the scores above, the H55’s decode logic looks to be much more efficient. However, the Core i3-530 doesn’t lower its clock speed below 2.93GHz when playing the movies. The Athlon II X4 635 drops its CPU multiplier from 14.5X to 4X, taking the CPU clock from 2.9GHz down to just 800MHz. 20% of 800MHz is less than 11% of 2.93GHz.
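Those percentages only compare fairly once clock speed is factored in; converting utilization into absolute cycles consumed makes the point concrete. The 20% and 11% figures below are the approximate readings discussed above:

```python
# CPU utilization percentages from two chips at different clocks aren't
# directly comparable; utilization times clock gives absolute cycles consumed.
def cycles_consumed_mhz(utilization, clock_mhz):
    """Effective clock cycles spent per second, in MHz-equivalent terms."""
    return utilization * clock_mhz

athlon = cycles_consumed_mhz(0.20, 800)    # X4 635 throttled down to 800MHz
core_i3 = cycles_consumed_mhz(0.11, 2930)  # i3-530 holding 2.93GHz

print(f"Athlon II: {athlon:.0f} MHz-equivalent")   # 160
print(f"Core i3:   {core_i3:.0f} MHz-equivalent")  # ~322
```

In absolute terms, the Athlon II system is actually spending fewer cycles on playback, despite its higher utilization percentage.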

Serial ATA performance — IOMeter
We’ll begin our storage tests with IOMeter, which subjects our systems to increasing multi-user loads. We used IOMeter’s workstation and database test patterns, since those are more relevant to desktop systems than the file or web server test patterns. This particular test makes good use of the Native Command Queuing capability built into the AHCI specification.

Drives capable of taking advantage of the SB850’s 6Gbps SATA controller are few and far between. The best candidate is currently Crucial’s new RealSSD C300, which is the first 6Gbps solid-state drive to hit The Benchmarking Sweatshop. Naturally, we couldn’t resist testing with it. But plenty of folks also use mechanical drives, and likely will for some time, so we’ll kick things off with a look at SATA performance using a Western Digital VelociRaptor.

AMD has a history of poor storage controller drivers, so in addition to testing the Gigabyte 890GX board with AMD’s own drivers, we tested it with the Microsoft AHCI drivers included with Windows 7.

When they’re both using AMD’s AHCI drivers, the Asus and Gigabyte 890GX boards offer nearly identical transaction rates. Performance levels off after we hit 32 concurrent I/O requests, which just happens to be the queue depth for Native Command Queuing. Interestingly, there’s no performance plateau when we combine the 890GX with Microsoft’s own AHCI drivers. The H55 Express’ transaction rates don’t trail off after 32 I/Os, either.

Regardless of which drivers are used, the Gigabyte 890GX board uses relatively more CPU time than the Asus. The H55’s CPU utilization is lower than both boards, although we’re still looking at less than 4% CPU utilization overall.

Switching over to a high-end SSD exposes a weakness with Microsoft’s AHCI drivers, at least at lower loads. The H55 Express still manages slightly higher transaction rates than any of our 890GX configurations—and that’s with a 3Gbps SATA controller.

Again, the Gigabyte 890GX board exhibits higher CPU utilization than the Asus. Both consume relatively more CPU cycles than our H55 system, but the differences don’t amount to much as the load scales upward. I didn’t expect CPU utilization to peak with the fewest outstanding I/O requests, but that’s what happened with all four configurations.

Serial ATA performance — HD Tach
We used HD Tach 3.01’s 8MB zone test to measure basic SATA throughput and latency.

The RealSSD has some of the fastest read burst speeds we’ve ever measured, and it’s clearly quicker on the 890GX than it is on the H55 Express. Even the VelociRaptor has higher burst speeds on the 890GX, but only with the Gigabyte motherboard. The Asus board’s VelociRaptor burst speeds are nearly 20MB/s slower than those of the Gigabyte.

Here’s where things start to get a little weird. With the VelociRaptor, average read speeds don’t vary all that much from one platform to the next. However, the RealSSD’s read speeds are all over the map, with the H55 Express wedged between the Asus and Gigabyte 890GX boards. The read speeds on each platform were quite consistent from one run to the next, making the differences between platforms all the more puzzling.

We haven’t had much time to work with the RealSSD, but before testing it with each configuration, I ran a secure erase and a full-disk HD Tach write speed test on the drive to establish an even, used-state playing field. This should prevent things like the block-rewrite penalty from affecting our results.

The intrigue continues when we look at average write speeds, which again find the Asus and Gigabyte boards separated by quite a margin. Asus is fastest again, but neither 890GX board is quick enough to catch the H55 Express.

More troubling than the wide gap in SSD write speeds between the Asus and Gigabyte 890GX boards is the fact that both pull up lame with the VelociRaptor. The H55 Express’s average write speeds are 30MB/s faster than the 890GX’s when the mechanical drive is installed.

HD Tach doesn’t quote a margin of error for its random access time test, but I can’t help but wonder if it’s more than a tenth of a millisecond. Things are pretty even here, with the obvious exception that the RealSSD’s access times are more than an order of magnitude shorter than the VelociRaptor’s.

There is a +/- 2% margin of error for HD Tach’s CPU utilization tests, but the differences in CPU utilization are much greater than that. With both the VelociRaptor and the RealSSD, the Gigabyte 890GX scores much higher than the Asus in HD Tach’s CPU utilization test.

USB performance
Our USB transfer speed tests were conducted with a USB 2.0/FireWire external hard drive enclosure connected to a 7,200-RPM Seagate Barracuda 7200.7 hard drive. We tested with HD Tach 3.01’s 8MB zone setting.

The SB850 may have a rearchitected USB controller, but it’s not as fast as the one inside the H55 Express. The 890GX boards pull up short in the burst and average read speed tests, and the Gigabyte trails behind the leaders with writes, too.

Matters get worse for the Gigabyte board when we look at CPU utilization, which is notably higher than the Asus. The 890GX boards are both running the “Balanced” Windows power plan with Cool’n’Quiet enabled, so there shouldn’t be this much of a difference between them.

When asked about the SB850’s slower USB transfer rates, AMD suggested that real-world transfers shouldn’t be affected. The company also indicated that Cool’n’Quiet can react oddly to CPU utilization tests, although that wouldn’t explain the difference in CPU utilization between the Asus and Gigabyte boards.

PCI Express performance
We used NTttcp to test PCI Express Ethernet throughput using a Marvell 88E8052-based PCI Express x1 Gigabit Ethernet card.

PCI performance
To test PCI performance, we used the same NTttcp test methods and a PCI Intel GigE NIC.

A Gigabit Ethernet controller may not be the most bandwidth-intensive peripheral to throw at an expansion interface, but it’s certainly the most common. All of our system configurations do well in the throughput tests, but the 890GX rigs have higher CPU utilization than the H55 Express. I suspect we’re seeing the Athlon II’s clock throttling in action again, but that doesn’t explain why the Asus board has lower CPU utilization than the Gigabyte in the PCI test.

Power consumption
That covers the chipset-specific portion of today’s festivities. Now it’s time to switch gears to exploring variables more dependent on motherboard attributes than core-logic components. First up, we have power consumption tests. We measured system power consumption, sans monitor and speakers, at the wall outlet using a Watts Up Pro power meter. Readings were taken at idle and under a load consisting of a Cinebench 11.5 render alongside the rthdribl HDR lighting demo. We tested with Windows 7’s High Performance and Balanced power plans.

Motherboard makers usually ship their boards with energy-saving software that’s supposed to lower power consumption without impeding performance. We’ve tested each board with and without this software installed. Gigabyte’s H55 Express board uses Dynamic Energy Saver software, while the company’s 890GX offering uses a new app called EasySaver. The Asus board uses an EPU app that must be configured in “auto” mode to avoid performance-sapping clock throttling.

Even with fewer power phases than its Asus counterpart, the Gigabyte 890GX board draws notably more power. Running each company’s power-saving software is good for a watt or two, but that’s about it.

As one might expect, our H55 system has the lowest idle power draw of the lot. The Core i3-530 is more power-efficient than the Athlon II X4 635, and the Intel CPU is also running on a smaller microATX motherboard.

Under load, the Intel system has even more of a power-efficiency advantage. It’s not even close.

Between the 890GX boards, the Asus draws less power under load by about 10W. Power-saving software has more of an effect here than it did at idle, particularly on the Asus board.

I had wanted to dip into IGP overclocking with this review, but there simply wasn’t time. And then I got to thinking and figured there wasn’t a point, either. Sure, the Asus and Gigabyte 890GX boards both give users the ability to tweak Radeon HD 4290 clock speeds, but if you’re that desperate for graphics performance, you’re better off saving your pennies, shoveling a few driveways, and buying a real graphics card, even if it’s a generation or two old.

Fortunately, I did have time to run a few quick overclocking tests on the motherboards themselves. First, I experimented with the auto-overclocking utility built into the Asus BIOS.

There wasn’t much to it: select the option in the BIOS, wait for the reboot, and see what you get. My system settled on a 233MHz base clock, which combined with a 14.5X multiplier, yielded a 3.4GHz CPU clock speed—not bad for auto-tuning.
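For those keeping score, the math is simple: on this platform, CPU frequency is just the base clock multiplied by the CPU multiplier. A quick sketch (the 233MHz result and 14.5X multiplier are from the article; the 200MHz stock base clock is the platform default):

```python
# CPU frequency = base clock x multiplier, reported in GHz.
def cpu_clock_ghz(base_mhz, multiplier):
    return base_mhz * multiplier / 1000

stock = cpu_clock_ghz(200, 14.5)  # Athlon II X4 635 stock: 2.9GHz
tuned = cpu_clock_ghz(233, 14.5)  # auto-tuned: 3.3785GHz, "3.4GHz" rounded

print(stock, round(tuned, 1))  # 2.9 3.4
```
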

Since the Gigabyte board lacks an in-BIOS auto-overclocking utility and doesn’t come with Windows software that accomplishes the same task, I kicked it old school with some manual base clock overclocking on both boards, starting with the Asus. First, I lowered the CPU and memory multipliers to take those components out of the equation. Next, I turned up the base clock speed, checking for stability along the way using a four-core Prime95 load.

The Asus board cruised up to a 300MHz base clock speed with ease, but it would go no further. 310MHz wouldn’t post, even with extra voltage applied to the CPU and chipset. Still, it’s hard to complain about a 50% boost for the base clock.

Next up: the GA-890GPA-UD3H.

Much like the Asus board, the Gigabyte didn’t put up a fuss as I turned up the base clock—but only up to 280MHz. Try as I might, I couldn’t get the system to post at 290MHz. A 280MHz base clock speed is still capable of taking an Athlon II X4 635 up to an even 4GHz, which ain’t half bad.

Motherboard peripheral performance
Core logic chipsets integrate a wealth of peripherals, but they don’t handle everything. FireWire, Ethernet, USB 3.0, and audio are farmed out to auxiliary chips, for example. To provide a closer look at the peripheral performance you can expect from the motherboards we’ve tested today, we’ve compiled Ethernet, Serial ATA, USB 3.0, FireWire, and audio performance results below.

HD Tach FireWire performance

                Read burst    Average read  Average write  CPU
                speed (MB/s)  speed (MB/s)  speed (MB/s)   utilization (%)
Asus 890GX      40.2          34.6          16.4           5
Gigabyte 890GX  40.0          34.0          19.8           7
Gigabyte H55    32.6          28.9          19.8           2

The 890GX boards have similar FireWire performance, which is interesting considering that they use completely different controller chips. CPU utilization is a little higher on the Gigabyte, but the scores are within that test’s +/- 2% margin of error.

HD Tach USB 3.0 performance

                Read burst    Average read  Average write  CPU
                speed (MB/s)  speed (MB/s)  speed (MB/s)   utilization (%)
Asus 890GX      152.4         81.0          75.6           19
Gigabyte 890GX  142.5         40.6          52.3           17
Gigabyte H55    152.5         119.1         123.7          7

USB 3.0 may be the new hotness, but it just isn’t stable on the Gigabyte 890GX board. Scores were wildly inconsistent from one run to the next, and the system even locked up a couple of times during testing. Using the exact same SuperSpeed USB hard drive, I saw much more consistent performance on the other two boards.

All three boards use the very same NEC USB 3.0 controller, yet the H55 has much higher average read and write speeds than the Asus 890GX. Clearly, some implementations are superior to others.

HD Tach Serial ATA performance

                        Read burst    Average read  Average write  Random access  CPU
                        speed (MB/s)  speed (MB/s)  speed (MB/s)   time (ms)      utilization (%)
Asus 890GX              210.2         110.3         81.3           7.3            10
Gigabyte 890GX (AMD)    228.5         106.6         79.5           7.5            24
Gigabyte 890GX (GSATA)  169.8         108.2         76.1           7.3            18
Gigabyte H55 (Intel)    218.8         109.4         110.2          7.4            7
Gigabyte H55 (GSATA)    179.5         110.5         80.0           7.0            3

We’ve already covered the interesting SATA results, but these fill out some missing scores for the auxiliary storage controllers on each board. The scores above were all obtained with the VelociRaptor serving as the test drive.

NTttcp Ethernet performance

                Throughput (Mbps)  CPU utilization (%)
Asus 890GX      940.4              18.5
Gigabyte 890GX  940.7              26.9
Gigabyte H55    926.8              9.6

The Asus and Gigabyte 890GX boards use Realtek’s RTL8111E and 8111D Gigabit Ethernet controllers, respectively. We don’t see much difference in throughput between the two, but again, the Gigabyte turns in a higher CPU utilization score.
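For perspective, those throughput numbers sit just shy of Gigabit Ethernet’s wire speed. A quick sanity check (the 1000Mbps line rate is from the GigE spec; the measured figure is the Asus result from the table above):

```python
# Gigabit Ethernet moves at most 1000 megabits per second on the wire,
# and TCP/IP framing overhead eats a slice of that.
LINE_RATE_MBPS = 1000

measured_mbps = 940.4                  # Asus 890GX result from NTttcp
payload_mb_per_s = measured_mbps / 8   # megabits -> megabytes per second

efficiency = measured_mbps / LINE_RATE_MBPS
print(payload_mb_per_s)  # roughly 117.6MB/s of actual payload bandwidth
```

In other words, all three boards are within about 6% of the practical ceiling here, which is why throughput alone can’t separate them and CPU utilization becomes the interesting number.
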

RightMark Audio Analyzer audio quality

                Overall  Frequency  Noise  Dynamic  THD  THD +  IMD +  Stereo     IMD at
                score    response   level  range         Noise  Noise  Crosstalk  10kHz
Asus 890GX      4        5          4      4        5    3      5      5          5
Gigabyte 890GX  4        5          4      4        5    3      5      5          5
Gigabyte H55    5        5          5      5        5    3      5      5          5

The 890GX boards score identically in our 24-bit, 192kHz RMAA loopback test. That said, the Gigabyte H55 board scores one point higher in two of the eight component tests—and in the overall score.

As with the 790GX that came before it, I just don’t get the 890GX. Sure, part of me marvels at the idea of a jack-of-all-trades chipset with the fastest integrated graphics component on the planet, next-gen 6Gbps SATA connectivity, loads of interconnect bandwidth, and plenty of PCI Express 2.0 lanes. But AMD expects 890GX boards to sell for between $130 and $180, which is a whole lot more than I’d advise anyone to spend on an integrated graphics platform. Heck, you can pick up a microATX 785G board for $80 and get the very same north bridge chip and identical Blu-ray decode capabilities. The 785G’s Radeon HD 4200 won’t be as fast as the HD 4290 in games, but that’s sort of like saying a standard Smart car isn’t as fast as a turbocharged one. Neither is quick enough if you’re looking for speed, just as neither integrated Radeon is sufficient if you really want to play games.

So what else does the 890GX give you that the 785G doesn’t? Dual-x8 CrossFire support, which is nice, but that makes the integrated GPU even more of a waste. Then there’s the full-sized ATX motherboard rather than a microATX model, but the 785G is also available on ATX boards that cost less than $100. Hmmm. Maybe you want the new SB850 south bridge, and specifically, its 6Gbps SATA controller.

If we ignore, just for a moment, that the only storage devices likely to be capable of taking advantage of 6Gbps SATA are extremely expensive SSDs, then yes, I can see being tempted by the 890GX just to get a taste of the SB850. But I can’t ignore that the SB850 appears to have a few kinks that need ironing out. Our testing has exposed weaknesses in AMD’s AHCI drivers and with the SB850’s sustained write performance. The fact that the Asus and Gigabyte boards exhibited wildly different SATA performance in some of our tests is reason for concern, as well. Neither board impressed in our USB performance tests, which is another traditional area of weakness for AMD.

Perhaps BIOS and driver updates could smooth out the SB850’s rough edges over time. If and when that happens, the 890GX may start to make more sense. Or it may make even less sense, because it didn’t take long for AMD to bring the 790GX’s then-new SB750 south bridge over to high-end 790FX motherboards. Surely, the SB850 will migrate to high-end FX territory before long.

Still, I suppose the 890GX makes a certain kind of sense for AMD. It puts the integrated graphics crown and whatever bragging rights that’s worth even further out of Intel’s reach. The 890GX also provides a mid-range replacement for the aging 790GX, which can currently be found on a whole lot of mid-range motherboards, even if few of their integrated Radeons will ever be called into action. All AMD had to do was sort out some of its better 785G north bridge chips and incorporate its new south bridge. That south bridge can now be distributed across AMD’s chipset lineup.

Although the 890GX doesn’t quite add up for me, it’s much easier to pass judgment on the two motherboards we’ve looked at today. I’ve liked a lot of the Gigabyte motherboards I’ve seen over the past few years, but the GA-890GPA-UD3H has some issues that must be addressed. The board’s USB 3.0 performance was flaky at best, and its CPU utilization was consistently higher than the Asus model across multiple peripheral performance tests. We’ve contacted Gigabyte about those problems, and while the company is looking into the issues, they’ve yet to be resolved. Those problems may be easier to fix than the UD3H’s comparatively high power consumption and its lackluster BIOS-level fan speed controls, making the Gigabyte board difficult to recommend, even if it costs $15 less than Asus’ M4A89GTD PRO/USB3.

Normally, I wouldn’t dream of paying $15 more for a board with essentially the same feature set, but not all else is equal this time around. The M4A89GTD offers better peripheral performance, lower power consumption, and useful BIOS features like automatic overclocking and robust fan speed controls. I wish it supported real-time Dolby Digital Live encoding and that its eSATA port was of the hybrid USB variety, but that’s really all I’d change.

In the end, though, I’m left with too many lingering concerns to wholeheartedly recommend any motherboard based on the 890GX. If you’re looking for integrated graphics on a budget, I think the 785G is still the best game in town. Asus and Gigabyte both have a number of excellent 785G boards from which to choose. If you were hoping to get in on next-gen SATA with a new AMD south bridge, hold tight. The SB850 may yet prove its worth, but it’s not ready for prime time yet.

Responses to “AMD’s 890GX integrated graphics chipset”

  1. Thanks for all your answers, especially “flip-mode”’s answer! I rechecked the reviews and I was wrong. The performance problems I saw were never reported with the 785G chipset. I got an Asus 785G V EVO for a friend of mine. Here in Portugal it’s one of the cheapest (sub-)mid-range motherboards (87.5 euros as of last week). It will be coupled with a 3.2GHz Phenom II X4 955 (OCZ Vendetta 2 cooler), Geil 2x2GB 1600MHz CL7-8 RAM, and a 1TB WD1001FALS hard drive… Hope it rocks!… BTW, Intel never seduced me. Their processors are great, but when I check performance/cost, I always find that I can get around pretty well with an AMD system equivalent to an Intel system. I think they have pretty equal performance in sub-mid-price systems.

  2. $20 difference in an immediate purchase? Naturally, that counts.

    $20 total difference

  3. The new Atom is supposed to be DDR3 667 MHz, so it will become prevalent soon enough.

    But since an 800 MHz DDR3 JEDEC spec has been official all along, it surprises me that no one goes with that for normal laptops.

    I don’t think they’d have any problem making it extremely low voltage at low clock speeds. It’s just that no one is asking for it…yet. There’s already relatively high speed DDR3 that runs at 1.25v for desktops, and there’s much more of it for servers.

  4. I have a feeling part of it is that there’s a lot of IP and licensing that would be needed to make an immediately competitive performance graphics part. I can only imagine how complex it is to go through all the hoops these days to keep from infringing on something.

  5. As far as I know, the DDR3 voltages are specification-driven. Also, I don’t think they are CPU/chipset-limited, but DIMM limited. DDR chips are done with processes that are not tuned for performance… 1.0V DDR3 I/O would be pretty easy to make on a CPU process.

    I don’t know much about HT, but I doubt I would gain much by lowering the supply on the Clarkdale in-package QPI link – the link is so short that power consumption is probably pretty low already.

  6. I played with RAM voltages and clock speeds on my computer with 2GB of DDR2 and it’s definitely enough to impact laptop battery life. I saw differences of a few watts at idle in some cases, which I can’t gauge completely accurately since the PSU isn’t so great, but it’s there.

    DDR3 may be a bit lower voltage, but the clock speeds are pretty much universally higher. There should still be some wiggle room there, too. It’s interesting that there’s not yet something like 1.0v 667 MHz DDR3, but I guess we’ll see that with the DDR3 update of Atom.

    I know AMD desktop boards don’t reduce chipset voltages at idle, but it looks as if that doesn’t really matter. I guess that’s one of those things where maybe on a laptop, it could save 0.5w, but it just increases the cost, otherwise. I’m still going to wonder unless I can play with it myself, though, which is why I pointed that out.

    Now I’m really wondering about the HT links, though. That will be something else to tinker with at another time…

  7. I think you’d be hard-pressed to find a modern dual-core CPU that will go in these boards that can’t handle it on its own. The only advantage of acceleration might be lower power draw.

  8. Are you questioning the journalistic integrity of The Tech Report, but somehow using it as an attack on my post?

    The two systems benchmarked are clearly different, which you neglected to mention. The AMD slide uses exclusively timedemos and not real-world situations. And finally, the AMD platform wins 9 of the 10 benchmarks. 9/10 is a massacre, not a tie. The one they don’t win, Supreme Commander, is an RTS — the loss is due to the AMD processor being slower, not due to the amazing performance of the i5-660 IGP.
    You can’t just look at the data. You have to read everything around the data. That’s the only thing that gives those little numbers and the relative heights of the bars any value.

  9. This point has been equally irrelevant in every single post in which you’ve brought it up. I’ve counted three but I could be missing some…
    The Intel board is really hard to find, and not available from a reputable retailer that I can find. It starts at $140 from those few retailers that stock it. The i5-660 is $200 on Newegg. For comparison, the Asus motherboard is inexplicably $140 online, and the processor it comes with is $120. Going with the Gigabyte board lowers the price of the motherboard to $90, meaning the AMD platform is already $130 cheaper (or $80 with the Asus). At the $20/year you quoted, that upfront difference takes 6.5 years to recoup with the Gigabyte (likely slightly less, as it has slightly higher power requirements) or 4 years with the Asus.
    Even in the absolute best-case scenario, the $20/year you quoted doesn’t add up to anything but a break even across the entire life of the PC, and that’s only at best and if you keep the PC for a really long time.

  10. so you’re telling me you think that the desktop would run 10 to 20 percent slower on an Intel IGP? Where did those numbers come from? Did you pull them out of your nether regions? This, combined with the use of “improvement” in the first sentence contrasting the use of “still” in the second, implied that you weren’t referring to performance in desktop applications. I read it as “Intel’s IGP is still inferior to AMD’s in games, but not by much — the 10-20% delta between them doesn’t really count, and office+video jobs are still OK”. Given the poor grammar and wording of your sentence, that’s not that unreasonable.

    Notice that you forgot to end your question with a question mark in a post in which you challenged my reading comprehension. Good day sir!

  11. I’m not trolling at all. He said the power savings made it worth it to get the Intel board, and I demonstrated mathematically that that statement is garbage. There’s nothing troll-like at all there.

    EDIT: I see your post #127 calling mine a troll. Now I can respond to your specific criticisms.

  12. How about Hulu and Netflix performance? Can you scale a stream to 1080p and still have it be watchable? That’s the kind of mainstream usage that IGPs are more interesting to me for (I’d rather spend an extra $25 on an IGP than a separate video card with another fan and more power draw).

    Anyone have any info on that?

  13. One thing that you didn’t mention is the upfront cost difference, which could go a long way toward the $60 difference. (The 890 boards seem overpriced atm unless one is desperate for SATA 6Gb/s, but 7-series boards can be had cheap.)

    Also you quite frankly exaggerate the difference or assume a specific use model – go from 24/7 to 8/7 and the difference becomes 1/3 of what you said – and you also multiply it out over 3 years.

  14. Considering these boards for my next Phenom II build, but I was looking to use two GTX 260s in SLI, except this board supports only CrossFire!? I have looked at other reviews, but from what I’ve gathered, AMD’s 890GX doesn’t allow for SLI. 🙁

  15. Got it. I won’t have a CPU fan anyways (for noise reasons), but I was hoping it would score me a watt or two in idle reduction. Oh well…

    Yeah, I think DDR3 undervolting could reduce idle a bit. Maybe uncore (HT) undervolting helps as well. AMD doesn’t powergate everything yet, so idle power could reduce through undervolting, but with Intel CPUs undervolting mainly helps with load power.

    I’m thinking that best idle power reductions could be had through appropriate PSU sizing. Maybe switching to “green” HDDs or SSDs.

  16. Yes, true. I feel that the Intel processors are the best pick for the overclocker, but if you’re leaving the CPU at stock then both companies’ CPUs are good choices at any given price point.

    And yeah, as you said, I think that at this point all of the boards offer sufficient write speeds, but it is the job of a review to pick winners and losers, and, well, that’s what they’re doing.

    I’m plenty happy with my 785G’s write performance in Windows, though I haven’t used other OSs on it yet. My 690G’s write speed was definitely not as good, and it was absurdly poor in Ubuntu. The 690G also had the AHCI problems that I’m not going to detail; suffice it to say that the 785G was a meaningful improvement and offers satisfactory performance for me… I just keep wondering why AMD hasn’t fully caught up with Intel, and I sure wish it would. It gets tiresome to keep hoping for the next time when I’ve been doing that since before AMD bought ATI, back when it was ATI’s 400-series chipset.

  17. I’ve unplugged my CPU fan, and several other different fans, and measured the power difference. It’s pointless. If it’s not spinning fast, its power use is negligible.

    And yes, I also wonder about HT speed, though I think it’s the CPU’s link that has the most impact there. I believe Athlon 64s idled at lower levels because their HT links were much slower, and that’s why AMD still uses them as its low-power CPUs.

  18. Or a cheap Phenom II. I think it’s overlooked that you can get them for $130-140. Nobody is making you dole out for a 965 as all the PC sites imply.

    Both companies have pretty compelling options at all sorts of price points, though. It’s nothing to complain about. Just get what fits your use scenario. There is no “best,” just what works for you and your budget.

    The read/write speed differences are blown out of proportion and only matter if you are using a very fast SSD, and the computer’s purpose depends on that.

    As you can see in TR’s review, the Velociraptor shows no difference. At Anandtech, they show that even Indillinix-based SSDs show no difference, either.

    The handful of super expensive SSDs with new SandForce and Micron controllers will find their home in the server market, not desktops.

  19. Gabe, the fact is that write performance is pretty good on these boards and if you are enamoured with the Phenom II then you shouldn’t be put off. The Asus board in this review actually does very well even against the H55 board, especially with SSD, but even with mechanical.

    The real zinger, at this point, is the CPU. The Clarkdales make just about any AMD CPU hard to recommend. The power consumption is great, the stock performance is great, turbo boost delivers real performance gains, and those things are dream overclockers (just like all Intel CPUs, heh).

    I have a Phenom II myself, but if I were buying today I’d almost certainly be getting an i5-750 or one of the Clarkdales.

    Life is ruff for those that prefer AMD. AMD’s savior is the Athlon II at this time, IMO.

  20. I’ve a question; I hope it’s not off topic.
    I wanna buy an AMD Phenom II board. I was inclined to buy a 785G one, as 890GX boards are not available yet and may be too expensive.
    But most important of all, I’m concerned by the fact that AMD chipset-based motherboards show poor SATA write performance compared to NVIDIA ones (750a, 780a). (Check TechReport’s 790GX review, for instance, and the 785G one too.)
    I don’t play games, and I must do backups from one HD to two other HDs at least twice a week (50GB to 100GB). And I really hate poor write performance. Should I be concerned about that with AMD chipsets? Should I prefer NVIDIA’s? I was thinking about the Asus M4N82 Deluxe or the not-so-expensive M3N-HT Deluxe.
    The 890GX has not yet solved this problem with the appropriate care, I think. I’ll be using Windows XP. Would Windows 7 drivers solve the problem?

    Thanks for your opinion.

  21. So, are you going to tell me what the most efficient 890GX board is, so we can look at the comparison again?

  22. Absolutely.

    With XP, I couldn’t get the “wake-on-lan” thingy to work right, ever. Is it working well in Win7?

    If only I could make the computer go to sleep automatically, and wake up when I need it to – and ONLY when I needed it to – that would be awesome.

  23. $20 is not nothing in these forums, where people are bitching about an i5 costing $20 more than some AMD tri-core.

    This all started from my response to grantmeaname who trolled around saying that i5 is crap. But the obvious AMD-bias has yet again blinded everyone and gotten me labeled as a troll.

    You guys should try to be a bit less biased and a bit more factual, m’kay?

  24. I was just correcting the numbers.

    Oh, and please check your first post in this thread – you’re the troll here.

  25. Respectfully – unless you’re running a server, your computer should not be running while you’re asleep or while you’re out of the house. Turning off even the more efficient of the two would save 53 watts for more than half the day, which would equal EVEN MORE than 23% of a full-size tree.

    If you’re running a file or web or other such server that doesn’t need to actually do much processing but just be running 24/7, then you should be using a NAS, which will get your power usage WAY down.

    If you have the need for a powerful server running 24/7, then you’re not going to be looking at i5 / Phenom, you’re going to be looking at whole systems and much more complex sets of requirements, capabilities, usage models, and so on.

    In sum, the i5 and H55 have terrific power consumption profiles, and they should be given all due credit for that. But in the context of a home computer, I think bickering over 20 watts or 11 watts is ridiculous. If you’re going to be rolling out multiple desktops in a business environment, then even these small wattage differences start being relevant.

  26. Ergo, saving 20 W constantly over three years is worth absolutely nothing. Nice one.

  27. They may not now but they did and Nvidia’s southbridge has been superior to AMD’s for quite a while.

  28. >7W sounds a lot more impressive than <8W, doesn’t it?

    I’m so tired of moronic trolls.

  29. The difference in the example was >7W (after correcting my math), and my point was that an inefficient Intel motherboard was selected for this comparison.

    Comparing the most efficient 890GX system and the most efficient H55 system, the difference could be some 20W at the wall. Maybe 20W doesn’t sound like much, but saving 20W 24/7 for three years yields a $60 difference – money that could be spent towards that SSD, larger pron drive etc. Not to mention, this might save 23% of a full-size tree.

  30. “So, what is the point of having a faster IGP that doesn’t offer a reasonable or playable performance in any modern game?”

    The point is that it works for the other over 9,000 people who play less than “modern” games on their PCs, as opposed to the three or so people who actually ever played Crysis.

    “890GX is not good for gaming, and most people who probably use IGP are not gamers at all.”

    Most people who play PC games play WoW and The Sims at low to middle of the road resolutions.

  31. re: bluray, from the article

    “Both systems played back our three movie samples flawlessly. Based on the scores above, the H55’s decode logic looks to be much more efficient. However, the Core i3-530 doesn’t lower its clock speed below 2.93GHz when playing the movies. The Athlon II X4 635 drops its CPU multiplier from 14.5X to 4X, taking the CPU clock from 2.9GHz down to just 800MHz. 20% of 800MHz is less than 11% of 2.93GHz.”

  32. The 890GX didn’t offer playable performance (except in MW2) even though all the games tested were running at 1024×768 with low settings.

    So, what is the point of having a faster IGP that doesn’t offer reasonable or playable performance in any modern game? The 890GX is not good for gaming, and most people who use an IGP are probably not gamers at all.

    For non-gamers – people who don’t use a computer for gaming (and those people are the majority) – AMD’s IGP has absolutely no advantage over Intel’s IGP. And if you are a gamer, then you should get something better than AMD’s IGP, because you won’t get playable performance in most games even if you play at low settings.

    Also, none of the AMD fanboys here mentioned that Intel’s IGP was better in Blu-ray playback.

    Let us not forget that Intel Clarkdale GPU can bitstream Dolby TrueHD, DTS HD-MA and 8-channel LPCM over HDMI

  33. You missed my point entirely and that is that percentages sound great until you realize in some cases the absolute difference is something like 5W. Big freaking deal.

  34. So, are you going to tell me what the most efficient 890GX board is, so we can compare to Intel’s best offering?

    Or is the Asus one reviewed here the best pick?

  35. Disappointed that the northbridge is just a rebadge of 785G. There’s really nothing they could have changed in the northbridge except the IGP- but they made no effort to improve that. Ho hum.

    As for the southbridge, SATA 6Gbps is great- but no AHCI improvements? Come on.

  36. Talking only in percentages is the same marketing BS used by ‘green’ products that have low power draw to start with. (‘GreenPower’ drives save 20% power, but that’s actually only a few watts.) You have to consider absolute amounts too, and at such low DC loads the absolute AC draw difference is very small.

  37. Yeah, except we don’t see the efficiency at “<5% load” (<40W), which is where these systems would be idling. The only review I’ve seen so far that looks at PSU efficiency at very low loads is the one from SilentPCReview:

    §[<<]§ The otherwise-efficient PSU dips to around 65% efficiency at 5% load.

  38. They’ll still have the same southbridge. If the 890GX is any indication it’ll just be a rebadged northbridge and this terrible southbridge.

  39. I’ve never heard that, but yes, I wouldn’t ever buy an SB850-equipped motherboard just because I have an unlockable CPU.

  40. I picked the lower power consumption of the two boards; that’s not that out-there or fallacious of a concept. Even with a Gigabyte board, the argument is essentially the same; run the numbers if you want. It’ll probably be more significant, something in the high teens or low twenties per year, but definitely not so much that it pays for itself.
    Of course I would pick the balanced plan for both boards; anyone concerned with both power and performance would run the balanced plan (gee, do you think that’s how it got its name?) as it balances the need for performance at times with the need for low power consumption at idle. I already explained this in my post, but I see you’ve chosen to ignore that.
    If you REALLY wanted to save power, as you say, then why would you go out and buy a really efficient PSU instead of opening control panel and clicking ‘power options’>’balanced’? Your scenario assumes that the PC is running in high performance mode, which I see no compelling reason for, especially if now in our hypotheticals we’ve gone out and bought a new PSU.

  41. No, the OP stated “The one remaining advantage is lower purchase cost, but Intel’s lower power consumption makes up for that quickly.” The point of my post was not to demonstrate that he was right. Clearly if the power consumption is less, the power cost will be less. I demonstrated that he would only be saving $6 a year, so even over the entire life of the PC that money would not be made up. THAT was the point.

  42. The first 890GX/SB850-based mATX board has shown up already on this side of the pond (Japan), from MSI:
    §[<<]§

    It will initially cost around 16,980 yen (roughly $190), which is way too expensive for a motherboard, "fancy" sideport RAM-assisted IGP notwithstanding. Bring it down to under 12,000 yen and I might even look at it... (Source: §[<<]§ )

  43. Thanks for the correction – minor math mishap.

    I used “amplify” because the true power delta was larger than it initially seemed to be, “caused” by the difference of PSU efficiency – sort of “amplifying” the delta. But yeah – the delta is what it is; it doesn’t get amplified – “masking” is a better word for this.


  44. Please let me know what the most efficient 890G board is, so we can take a look at the comparison again.

    You cherry-picked your power consumption numbers.

    My point was that if you REALLY wanted to save power, you would pick a PSU that’s more efficient at low power loads. THEN you would see the true efficiency potential of the different platforms.

    You showed one DOES save money by going with the lower-power platform. Your claim that you proved it incorrect is itself incorrect. In contrast, I showed that there are unknowns in this power-savings analysis, and the whole thing seems pointless, since the PSU selected was a power hog tuned for far higher loads.

  45. historically IGP motherboards sell for notably less than non IGP motherboards….. the added complexity/cost is likely negated by sales volumes given how much OEM’s love to bundle them.

    it used to be that IGP motherboards didn’t overclock well which was for enthusiasts a deciding factor but those days are gone.

    I wouldn’t run out and buy an 890g today because of price but last year I was buying 690g and 740g motherboards for $50.00 and I’ve since switched to 780 and 785g motherboards for $80.00……. when 890g comes down and the bios and later board revisions address the “gremlins” it’ll be hard to pass on this series.

    I’d like to see AMD double the available IGP perf….. or offer sideport memory slots if it wasn’t overly spendy.

  46. 1. IGP: Costs more than a non-IGP board, all things being equal.
    2. Graphics cards rarely die. At least, I’ve never seen one die.
    3. If your graphics card dies, there’s a good chance your IGP might die also, which might kill the MB. IGP may add a level of complexity to the build.

    I guess I don’t see the need for IGP + dedicated with the intention of redundancy. Seems like waste more than useful. It takes less time to pop a case open and swap a card/cable than to go into the BIOS, turn the IGP on again, and configure a new driver set (and uninstall an old driver set <.<)

  47. ?

    785g boards with the SB710 had ACC, and before the 890GX, that was as new as AMD’s chipsets got.

  48. Unless you’re planning on buying a 8xx series board and unlocking cores.

    If everyone eventually follows Asus’ lead and comes up with a homebrewed solution, it might not be such a big deal.

  49. As I stated: “office jobs + video”. Who on earth plays games on an IGP, especially recent games like the ones tested? (Those tests don’t make much sense to me.)

    Please L2R

  50. ACC doesn’t do anything for Phenom IIs anyways, as it’s entirely implemented inside the processor and the motherboard never has to deal with any of it. It’s a nonissue.

  51. Want to try again nublet? There are quite a few AM3 (DDR3) motherboards with ACC.

  52. Edit: I was mistaken, and probably.

    As an aside, Fermi is 42×42 mm on a 40 nm process.

  53. This… is an extremely peculiar review, both because of the often huge performance discrepancy between the ASUS and Gigabyte boards and the performance discrepancy from other TR reviews.

    Maybe it’s because of the lack of direct comparison, but if you look back at your own 785G review both the 785G and the G41 do better in USB performance than the H55 and 890GX. So yeah, USB performance is disappointing but how did it get worse on both chipsets?

    Moreover, if you look at that same review, the SB850 does much, much better with AMD’s own AHCI drivers in IOmeter. Yes, there is some weakness in average writes, but otherwise it’s a significant improvement over the last generation. I can’t say that for sure, though, since they’re different reviews.

    I dunno, something just smells fishy. This is a chipset I’d definitely like to see compared in a future review when BIOS and drivers are more mature.

    Oh, and I’m surprised you guys didn’t mention that ACC support is dropped with this chipset.


    Forgot to mention that it’s a shame that, once again, AMD ignored the market’s request for 8-channel LPCM audio support.

  54. “53W vs. 64 W is not 3W; it’s 13W. … Difference: 18.3W”

    How many more times do you have to misfire to realise that 64-53=11? Also, efficiency problems of a PSU will usually work to mask large usage deltas in the computer, rather than “amplify” them, going by your example. If Intel has more people like you on their side, what with your master’s degree in statistical maths, then you shouldn’t be surprised that this site is also filled with AMD users, considering this site is also filled with smart people.

  55. Then why aren’t we seeing motherboards with the SB750 coming out “unlocking” SATA 3.0 like we do for unlocking cores? I just think it was an oversight on AMD’s part.

  56. “The whole thing is fabricated by TSMC at 65 nm, resulting in a chip that measures about 50 x 70 mm”

    That’s one very very big chip you got there. At 50 × 70 mm this chip is 3500 mm² in size. Fermi would seem small compared to a chip of this size. I think you meant to say the chip is 5.0 × 7.0 mm?

  57. 53W vs 56W is 3W. I suggest you stop cherry-picking only those statistics which favor you. Almost all PCs are left in “balanced” mode at all times, so it really doesn’t make sense to assume the PC would randomly be left in High Performance mode, despite the fact that even a normal PC user would leave their PC in balanced, and that any PC user as concerned as we are would definitely not run the system in high performance.

    You don’t pay for the kWh as it goes into the computer from the PSU. You pay for the kWh as it leaves the wall and goes into the PSU. The OP was referring to saving money on the electricity bill, and I showed why this was incorrect. Once again, your argument doesn’t really relate to my post in any way.

    It’s not a fair comparison to select the most efficient Intel board in existence and compare it to a typical 890GX board; in fact, the variation within the Intel boards is many, many times greater than the differences between platforms, as you’ve just demonstrated.

  58. Maybe that suggests it’s more firmware than actual hardware differences for Sata 3.0

  59. Yes, that was one of the reasons I was so torn between Intel and AMD before the Clarkdale era. Having integrated graphics was useful in more ways than one.

  60. Definitely agreed. I really would’ve liked to have seen the 890GX motherboards compared to a 785G (or similar) along with the intel board. The big feature here I was wondering about was comparative SATA/AHCI performance between the 780G (SB700) and 785G (SB750) boards I have at home, and the new SB850 controller included on these boards.

  61. His reasons make plenty of sense to me. “Free” is less than “cheap”, for one. For another, an IGP means you don’t have to open up the computer case. Also, an IGP means you don’t have to safely store a spare card. And an IGP will typically use less power than a discrete card, if that matters to you.

    It’s just too damn convenient, and I agree, an IGP should be on every mobo. With the advent of Clarkdale, it essentially will be.

    IGP makes sense, and it doesn’t hurt a darn thing, even on top-end mobos. If I could choose between two top-end mobos – one with IGP and one without – and all else was equal, or if the only difference was a couple percent of overclock lost, I’d take the IGP every time.

  62. Or you could have a cheap discrete card as a spare. Your reasons don’t make much sense.

  63. Only if power consumption when disabled is the same as not having it at all. Sadly, that’s usually not the case with IGP’s (at least it isn’t when I compare my AMD 770 board to my old 780G).

  64. Been building every PC for years with an integrated core, regardless of whether I add a graphics card. It’s basically “free”, so why not?

    The main reason being that when these PCs are passed on, I can keep my graphics card – and of course, when a machine is retired to more basic duties, who wants a power-sucking card doing only basic desktop work anyway?

  65. The article’s thoughts on power efficiency at load are really misleading. The figures show power consumption, not efficiency. As AMD had a clear lead in Cinebench, the efficiency might be better with AMD.

    In my opinion, every benchmark should be paired with a power efficiency analysis, or at least benchmarks simulating usage scenarios, e.g. everyday multitasking, gaming, multimedia, etc.

  66. Good post.

    I would add that playing HD video on youtube/hulu/etc. is pretty important; any HTPC platform should be able to do it without issues. I’m not sure Intel has figured that one out yet…

  67. So let me make sure I understand this: same CPU, same north bridge, same south bridge, same disk drive, same OS, same drivers … but the CPU utilization during SATA I/O on the two AMD-based boards differs by as much as 60 percent??!?

    I’m having trouble believing this. These two boards had to be throttling the CPU differently during this test suite.

  68. 3W? I guess all the big-power stuff is already on the CPU… How much power does HT consume?

    I’m wondering how much I can save by removing the CPU fan. That could take some 3W by itself…

  69. 53W vs. 64 W is not 3W; it’s 11W, not 13W as I originally wrote (thanks, Meadows!). Also, the efficiency of the PSU is not fixed; it’s a function of load. We don’t know how bad the PSU is, but for the sake of the argument, let’s say it runs at 50% efficiency when drawing 53W AC, and at 70% efficiency when drawing 64W off the wall. As a result, the actual platform power consumptions (i.e., with the PSU’s inefficiency factored out) are:

    53W × 0.5 = 26.5W
    64W × 0.7 = 44.8W
    Difference: 18.3W

    The load-dependent efficiency of the PSU can “amplify” the power consumption difference.

    Finally, if we really want to compare “platform” power efficiency, we should use the most efficient mobos available – the MSI H55 is very efficient, and Intel’s own boards are even better. Anandtech: §[<<]§ Intel’s H57 and i5-661 idle at 34W, and that includes the 950W PSU’s inefficiency. With a better-suited PSU: §[<<]§ After calibrating out the PSU efficiency, the system idle power was 18W.
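    (The arithmetic in the comment above can be sketched as follows; the 50% and 70% efficiency figures are the comment’s hypotheticals, not measured values.)

```python
# DC (platform) power is AC (wall) power times PSU efficiency at that load.
def dc_power(ac_watts, efficiency):
    """Power actually delivered to the platform, net of PSU losses."""
    return ac_watts * efficiency

low_load = dc_power(53, 0.50)   # hypothetical 50% efficiency at 53W AC
high_load = dc_power(64, 0.70)  # hypothetical 70% efficiency at 64W AC
print(f"AC delta: {64 - 53}W, DC delta: {high_load - low_load:.1f}W")
```

The point of the sketch: an 11W gap at the wall can correspond to a much larger (or smaller) gap in what the platforms actually draw, depending on where each load sits on the PSU’s efficiency curve.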

  70. And how many people will attempt to play games on an IGP?

    IGP performance doesn’t matter much – features do.

  71. No; my comparison was to the current AMD option (785/790G), and I pointed out why I thought Clarkdale was better. It was others who wanted to compare it to the unknown (890G).

  72. In this specific case, I think the point is that no one knew the details of the upcoming AMD chipset at the time. What you said might not have been wrong if it was just about features, but a comparison to an unknown isn’t really a comparison – and that, combined with your apparent bias, is what brought out the insults when you said ‘Intel is better’.

  73. No, saying that something is good doesn’t make you an idiot.
    Being prime1ey about it does.

  74. It is interesting how a choice of benchmarks can paint a different picture; it is probably better to take in as much data as possible before drawing a conclusion.

    Even AMD’s own press slides do not show a ‘50%’ advantage:
    §[<<]§ 890GX Desktop Chipset & ATI Radeon HD 4290 GPU

    Notably, other reviews show these two IGPs trading blows – one wins here, the other wins there. In the grander scheme, an objective analysis would call it a draw, or at best a slight advantage for the 890GX.

    Nonetheless, all the reviews are pretty disingenuous with IGPs in general, because this is not the typical usage model. Almost all of the reviewers put these performance metrics in the context of high-end games, which are only ‘playable’, not ‘enjoyable’, as they must be diluted down to low resolutions and low quality settings just to get above 30 FPS. No one is seriously going to opt for an IGP for high-end 3D gaming; rather, a typical user would likely look toward more moderate games – the Sims series, WoW, etc. In that case, both IGPs would likely yield a decent frame rate even at high resolutions and good quality settings.

    The value, though, is not in gaming, again because IGPs are not really gaming-geared components of these chipsets. Let’s take your examples: COD MW2 is certainly an 890GX win, but Borderlands – meh, who cares, 16 vs. 20; neither will be fun, and I would suppose neither would COD MW2, as the details will be way, way down.

    The real value is in other applications; HTPC is a commonly cited one. On features, one dominant component is Dolby TrueHD passthrough – one has it, the other does not. Both give dual video streams, both HW decode, but one is certainly more suitable for an HTPC for the video/audiophile. One is much lower power (thus quieter). Then there are the other performance considerations critical to a chipset: SATA, RAID stability, USB performance, yada yada…

    In the end, it is to each his own. Overall, though, your original thesis is not supported by the larger body of data.

  75. Me saying that Intel is good qualifies me as an idiot? This forum is just too loaded with AMD-biased bigots…

    Although I’m fully aware that you’re not one of them

  76. I didn’t move any bar. Both times I said, “How many people?” This is business.

    The return has to pay for the investment. AMD are not designing a new chip just to attempt to satisfy a few people who are probably just going to buy something else, anyways. The end.

    All they really did here was swap the southbridge for one that will become prevalent in the future.

    Why design a new northbridge when they’re about to dump it altogether and can’t reuse it down the road?

  77. It’s especially nice if you start using a computer for something else when it gets older and you don’t need a graphics card that would otherwise be wasting power.

  78. I totally agree. Not only for that, but in testing too. Testing without a video card is awesome!

  79. Look, it’s an HTPC feature that others have and this chipset doesn’t. You only asked ‘How many people are going to buy an AMD board specifically because of those features?’ and the answer is HTPC builds; then you move the bar because you don’t like the answer. Since the northbridge here is literally just a 790GX chip, they didn’t dump it.

  80. It seems that in Panorama Factory the 890GX scores better (25 seconds vs. 32 seconds), yet the review claims “a win for the H55”…

  81. Yeah, point being, how many of those people are there, really, and on top of that, how many are actually looking for something new? Obviously, AMD didn’t think it was enough to dump money into making changes just for them.

    As time goes on, HTPCs become less and less interesting, and they weren’t that interesting to most people to begin with. Rather than buying a new computer, you can just stream things straight to your TV countless new ways as time goes on.

  82. Uh, anyone who wants to put together an inexpensive HTPC would want those features.

  83. What’s weird is that the Anandtech article from the other day for mini-ITX H55s used TWO very high power PSUs, and they were still idling in the 30-40w range.

    I guess some of them don’t have a problem with that, but it still makes me wonder what it would do with a power brick, instead.

    You are kind of reinforcing his point, though. If the efficiency is worse on the high power PSU, and it’s still only a 3w difference, it’s going to be even less with something more sensible.

  84. Yeah, this seems like a moot point to me.

    Sub $100 boards already have USB 3.0. What is the real complaint?

  85. Something tells me they really just don’t care. It’s hard for them to miss a boat that was never coming their way to begin with. How many people are going to buy an AMD board specifically because of those features?

    This update is disappointing, but not because of techno babble bullet points like that. The big thing is that the graphics aren’t any better. Something tells me that Nvidia’s exit from the motherboard market is to blame there.

  86. I was at least expecting the IGP performance to be a bit better than the 790GX’s, but it’s identical game performance.

  87. Wait a minute, am I reading the BIOS settings right? It looks like you can undervolt the NB in the BIOS on these boards.

    I’ve always wondered if that would accomplish anything for idle power. I think the 785G chipset idles at 3w, so it’s probably irrelevant on a desktop, but you never know until you try.

    At least they finally have lower RAM voltages. I don’t think any 785G boards did lower than 1.5v. Probably not terribly important, either, but very goofy when DDR3 has continually been dropping lower than that for a while.

    Those could be interesting things for laptops using the 800 series chipsets.

  88. I was thinking that I’d be able to jump back to AMD when I first heard about the 800-series chipsets 1 or 2 years ago (I had been using AMD since my first computer, up until the P35 chipset came out). That doesn’t look like it’s going to happen now.

    The 890FX, 890X, 870, and 880G/885G chipsets haven’t come out yet, though; hopefully they might have some of what’s missing with this one.

  89. Man, this almost makes nVidia’s chipsets look attractive. Almost. At least they don’t have these I/O transaction performance issues.

  90. But yet they managed to squeeze 6Gbps SATA in there. There’s really no excuse.

  91. Yeah, I hate browsing articles at SPCR; they force you to click through each page with no quick-jump article index nor any table of contents. They’re just trying to maximize their page views, but doing so makes me feel like I’m browsing a website from 2002.

  92. Well maybe you shouldn’t have been called an idiot (although your Intel bias probably doesn’t help there) but seeing what else is on offer is never a bad idea if one is able to wait and the timeframe isn’t too long. If you knew ahead of time that those features wouldn’t be present that’s another thing.

  93. If I had to make a total guess, I’d say that if AMD had to choose one or the other, they may have felt that integrating SATA 6Gbps into the chipset itself was more important than integrating USB 3.0. You can hang USB 3.0 off PCIe lanes well enough, but doing so with SATA 6Gbps ports would take a lot of PCIe lanes.

  94. Yeah, and looking at idle power of some 50W with a 700W PSU makes so much sense…

    Please head out to Anandtech, or better yet: SilentPCReview, to get a better idea of the true idle power of H55/Clarkdale.

  95. Every system should include an “integrated” graphics solution of some sort, be it in the northbridge, as is the case with current AMD setups, or on the CPU, as with the new Intel platform. I can’t overstate how wonderful it was having my integrated graphics as a backup when my dedicated card didn’t want to behave.

    It is extremely useful, one of the most useful features a chipset can have IMO.

  96. I wonder if AMD killing ACC on the 890GX (per Anand, and requiring mobo makers to figure out workarounds like Asus did) is going to be the way forward for AMD chipsets now? Make it more difficult to unlock disabled cores and all that…

  97. That is the AHCI driver issue. Regardless of whether you’ll often hit a queue depth of greater than 32, the fact remains that AMD’s drivers cap performance and Microsoft’s don’t. The limitation isn’t the controller, so this should be something AMD can fix.

    The 890GX may be a huge improvement over its predecessor on this front, but the SB750’s dismal AHCI performance with AMD’s drivers hardly set the bar high.

  98. #17,
    “Glad I wasn’t the only reviewer that noticed the piss-poor write performance of the SB850. I honestly wonder if it wasn’t tuned more for SSD usage, throwing spinning disks under the bus. You would think that the misses when it comes to southbridges would have inspired AMD to hire some extra designers to iron out the kinks by now…”
    Check: §[<<]§

    #8, “I’m starting to be worried about AMD. Even though for the last 5 yrs I’ve been true to them, especially for low-end systems, it seems their value proposition is fading”

    Yeah, AMD’s platform has few advantages over Intel’s. When Intel gets a decent IGP (they’ve been making progress), AMD will have little value left in its platform (video performance, I/O performance, power usage, features, etc.)

  99. Is it just me, or does not having an SB7x0 southbridge in the mix make this review feel incomplete?

  100. If SATA 6Gbps were that important to you, you could just get the 2-port PCI-E x4 adapter Asus makes for $26.

    There are plenty of great 785G boards selling for $60-80, and even less with promos and such. Hell, there’s a JetWay with Sideport memory for $70 shipped. So, $96 vs $140/150, or $106 vs, if you get the adapter with 2xUSB3 on it as well.

  101. Looking at the IOmeter graphs, I actually think the SB850 has improved a lot over the SB750 when it comes to AHCI using AMD’s own drivers. I don’t think many people will hit queue depths beyond 32 I/Os. So what other AHCI driver issues are you talking about, Dissonance?

  102. Well, I’m thinking of his “I just don’t get the 890GX....dual-x8 CrossFire support.... is nice, but that makes the integrated GPU even more of a waste”

    The IGP is only a waste if you think it’s supposed to be an important part of the value proposition in the first place. I’m suggesting that it’s there because there was an otherwise-unused corner of the wafer...

  103. From all the links in the Daily Bread ‘Systems and Storage’ section, I am a little perplexed after reading the comments and analysis – along with the reviews here (TR), at Techgage, and at Hardware Canucks, which were the most candid on AMD’s 890GX chipset and the accompanying motherboards from Asus, Gigabyte, and/or MSI.

    I think AMD could have captured some wind in their sails if there had been a competent implementation of SATA 6Gb/s performance. It’s like a boxer in the 11th round who hasn’t won any of the previous rounds and can’t deliver a decisive blow to the competitor. That may be a bit dramatic, but they could have had a better event opener for their upcoming Thubans.

    I think the speed boost to the IGP is great. It’s apparently even compatible with the HD 5450, and possibly another low-end Radeon HD 5xxx product, for Hybrid CrossFire and additional performance. Perhaps this release in a mATX board with an Athlon II and a competent SATA 6Gb/s controller card for $40-$50 can keep many or all of the i3s on the ropes.

    TR may have to redo the match up review article “Core i3 takes on Athlon II”.

  104. I *like* mobos with IGPs; that way, if I’m having problems, I can rule out the GPU and PCIe slot.

  105. Intel’s IGP was 50% inferior in the two games that actually started correctly. The other two were literally unplayable.

    Intel’s platform uses 3 fewer watts at idle and thirty fewer at load than AMD’s. Assuming the PC is at load 10% of the time (which is, if anything, more generous than a typical situation), at idle the other 90%, and left running at all times, that’s an average difference of 5.7 W. To add up to a kilowatt-hour of power, you’d have to run the PC for 175 hours, which is more than a week. Across the course of a year, you’d save just under six dollars.

    Thanks for playing.
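    (The duty-cycle arithmetic in the comment above checks out; here’s a minimal sketch, where the $0.12/kWh electricity rate is an assumption – the comment never states one.)

```python
# Average watts saved, weighted by a 90% idle / 10% load duty cycle,
# with the PC left on 24/7. The price per kWh is an assumed figure.
idle_delta, load_delta = 3, 30          # watts saved at idle and at load
avg_delta = 0.9 * idle_delta + 0.1 * load_delta

hours_per_kwh = 1000 / avg_delta        # hours of uptime to save one kWh
yearly_dollars = avg_delta * 24 * 365 / 1000 * 0.12
print(f"{avg_delta:.1f}W average, {hours_per_kwh:.0f}h per kWh, "
      f"about ${yearly_dollars:.2f} per year")
```

At 5.7 W average, one kWh takes about 175 hours to accumulate, and a year of 24/7 uptime works out to roughly six dollars at the assumed rate.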

  106. Glad I wasn’t the only reviewer that noticed the piss-poor write performance of the SB850. I honestly wonder if it wasn’t tuned more for SSD usage, throwing spinning disks under the bus. You would think that the misses when it comes to southbridges would have inspired AMD to hire some extra designers to iron out the kinks by now…

  107. Oh my, lots to talk about here.

    First, I’m going to repeat the same thing I say every time TR reviews one of AMD’s xxxGX boards: why complain about an IGP on a mid-range (or even high-end, though that’s not what we have here) motherboard? Think of it as an extra feature that, if not useful now, will be useful down the road when the mobo transitions from your main rig to a secondary rig. Why complain about dual PCIe just because the board has an IGP, and why complain about an IGP just because the board has dual PCIe or costs $180? I just don’t get how an IGP can be a bad thing, and I don’t buy the argument that it’s bad because it could hurt overclocking – not to any degree that it’s a big deal. Just my opinion.

    Second, the above notwithstanding, I totally agree with Geoff’s final assessment of the board – there isn’t any reason to choose 890GX over 785G boards unless you really want that dual PCIe.

    Third, and most disappointing, the storage controller is still a weak link, and there is simply no excuse for it anymore. It’s beyond ridiculous.

    Essentially, this is a totally unimpressive product, as far as I can tell.

  108. AMD needs to go buy Nvidia’s chipset division; at least their performance could match Intel’s.

  109. The SB850 was already completely finished in summer 2008, with the first silicon samples popping up in late 2008. Definitely too late for USB 3.0.
    Back in June 2008, its originally planned release date was actually Q2 2009.

  110. I second that. There are plenty of very good older games. I’d love to know which ones are actually playable on today’s IGPs. Detail level doesn’t matter, but resolution does since it can’t really be tuned down, and my PCs all have at least 1680×1050 minimum, 1920×1200 most often.

  111. And it’s not like that’s the first look AMD would get at the USB spec. AMD is a corporate member of the USB-IF and has 40 something employees (most of them engineers) listed as members, so they would have had access and input to the spec as it evolved. AMD may have good reasons for not including USB 3.0 in its latest southbridge, but access to the specification is not one of them.

    This is just speculation, but it feels to me like AMD thought Fusion would be shipping sooner, and/or this SB would be as well, making the lack of USB 3.0 in its expected timeframe less important.

  112. I’m starting to be worried about AMD. Even though for the last 5 yrs I’ve been true to them, especially for low-end systems, it seems their value proposition is fading:

    – Intel’s IGP is still inferior, but not by much. 10-20% don’t really count, office+video jobs are still OK
    – Intel’s power consumption is much better across the board. As a consequence, silence, too.
    – Peripheral buses (SATA, USB…) are much better on Intel’s side: reliability, throughput, CPU utilization…
    – Intel has more Mini-ITX boards, and for single- or dual-spindle IGP configs, Mini-ITX makes a lot of sense.

    The one remaining advantage is lower purchase cost, but Intel’s lower power consumption makes up for that quickly.

    If I had to buy or recommend yet another cheap PC today, it would probably be i530 based.

  113. The USB 3.0 specification was made available on November 17, 2008. The SATA 6Gbps spec was released on May 27, 2009. For what it's worth, I actually asked AMD why they opted for 6Gbps SATA over USB 3.0 and was told that they expected the market to use the faster SATA interface before SuperSpeed USB.

  114. Going to repost an idea from the 5830 review for something I’d like to see:

    Hey Scott or Geoff, in the interest of ‘looking at old hardware,’ there’s something that’s been on my mind for a while. I haven’t really played any truly modern games (the newest being at least two years old) and have actually picked up some older games, thanks to Steam sales, whose system requirements are low by modern standards – like a 6600GT or lower. Now it’s easy to say ‘this card is faster than an old one,’ but what isn’t so easy is figuring out what new card or *especially* what IGP might be equal to an old one. I say IGP especially because I’ve been thinking of going with an IGP in my next build (AMD DX11/Llano, Sandy Bridge IGP, whatever NV comes up with) for the little lightweight gaming I do.

    The problem is that most any IGP review uses more modern games, but at very low resolutions and/or settings. Great, we know IGPs are bad for newer games…not the most useful tests. It would, however, be very useful to know which older cards today’s low-end GPUs or IGPs are equivalent to for the sake of playing older games. Even just one such article would be useful, because then we could reference off of it in the future, extrapolating for newer IGPs.

  115. Because the final USB 3.0 specifications were finished too late for AMD to be able to implement them in the southbridge.
    The southbridge for Fusion will have USB 3.0 and will be released about a year from now. That’s also the reason why the IGP was recycled in this chipset. A year from now, motherboard IGPs will die once and for all on AMD’s side.

  116. I’m wondering if you’re looking at it the wrong way – the integrated graphics is just a “gimme”, a throw-away –