Nvidia’s GeForce 8300 chipset

Just a few short years ago, home theater PCs were pretty cutting-edge. You pretty much had to be an enthusiast to even know such a thing was possible, and setting up a suitable system wasn’t cheap—especially if you wanted to make the most of a high-definition TV. But as is often the case in this industry, cutting edge features and capabilities quickly trickle down to the mainstream. Even today’s run-of-the-mill home theater PCs are leagues ahead of the once-impressive media rig that I assembled several years ago and still use today.

Several factors have conspired to make home theater PCs so capable and popular. Microsoft deserves some credit for bringing a 10-foot GUI to Windows, making it easier for folks to control their PCs from the couch without having to mess with additional software. The industry trend toward lower power consumption has helped, too, delivering scores of cool-running chips that can get by with the kind of near-silent cooling you want in your living room. Integrated graphics chipsets have also stepped up in a big way, offering credible gaming chops and an arsenal of advanced video decoding tricks.

For a few months now, AMD’s 780G has reigned as the only integrated graphics chipset capable of handling high-definition video decoding. Now it has company in the form of Nvidia’s new GeForce 8300. This single-chip core logic package features a graphics core derived from the GeForce 8400 GS, full Blu-ray decode acceleration, a HyperTransport 3.0 processor link primed for Phenom processors, PCI Express 2.0 connectivity, Gigabit Ethernet, loads of SATA RAID, and an even dozen USB ports. Impressive specs, no doubt, but can the GeForce 8300 unseat the 780G as our integrated graphics chipset of choice? Read on to find out.

New hotness all around

Nvidia has taken to calling the integrated graphics components of its chipsets motherboard GPUs, or mGPUs for short. This is a part of a larger strategy to put a graphics core into every one of its core logic chipsets, providing a broader installed base for the Hybrid SLI technologies recently introduced with the nForce 780a SLI chipset.

Interestingly, the nForce 780a SLI’s mGPU architecture is essentially identical to what you’ll find inside the GeForce 8300. This graphics core was derived from Nvidia’s GeForce 8400 desktop GPU, so it’s fully compliant with DirectX 10 and Shader Model 4.0. A total of 16 stream processors can be found inside the mGPU, and those SPs run at 1.5GHz, while the rest of the GPU core is clocked at 500MHz. The GeForce 8200 chipset is also available with a slightly slower shader clock of 1.2GHz.

Nvidia formally introduced the GeForce 8200 back at the Consumer Electronics Show in January, and the company made no mention of the GeForce 8300 at the time. I suspect this higher-clocked 8300 derivative is a response to the strong integrated graphics performance of AMD’s 780G chipset.

Even with a higher shader clock, the GeForce 8300 mGPU’s pixel-pushing horsepower is still relatively modest. However, the chip’s video decoding capabilities are top notch. The GeForce 8300 includes a PureVideo HD decoding block that Nvidia says can offload 100% of the Blu-ray decoding process with MPEG2, AVC (H.264), and VC-1 content. You’ll need the faster HyperTransport link present in Phenom processors to take advantage of this decode acceleration, though. Nvidia also points out that while video decoding is handled in hardware, calculations associated with HDCP copy protection schemes must still be crunched by the CPU.

The GeForce 8300’s HD video decoding capabilities are well-suited for home theater PC applications, so it’s only fitting that the chipset offers support for HDMI 1.3a in addition to standard VGA and DVI outputs. Audio can be passed over HDMI, but while 8-channel LPCM bitstreams are supported, TrueHD and DTS-HD Master Audio formats are not. Digital video output is also limited to a single connection, so you can’t use the chipset’s DVI and HDMI outputs simultaneously.

Of course, the mGPU is just one component of the GeForce 8300 chipset—well, chip, actually. While most core logic chipsets (including AMD’s 780G) split functionality between separate north and south bridge chips, Nvidia has squeezed the 8300 onto a single piece of silicon. The chip itself is fabricated by TSMC on an 80nm process.

If the GeForce 8300’s mGPU just isn’t cutting it for you, the chipset also provides 19 lanes of second-generation PCI Express connectivity. Plug in a compatible graphics card, and you can reap the benefits of Hybrid SLI, or more specifically, its GeForce Boost and HybridPower component parts. The GeForce Boost side of Hybrid SLI improves 3D performance via cooperative rendering just like, well, SLI. For this scheme to work, however, you have to use a discrete GPU with capabilities comparable to those of the mGPU—in this case, a lowly GeForce 8400. HybridPower is more interesting; clever display routing allows the user to run a powerful discrete graphics card that can be literally switched off at idle—where the mGPU takes over display duties—to conserve power. The list of Hybrid SLI-compatible GeForce graphics cards is a short one at the moment. However, the recent Green Light Special on the GeForce 9800 GTX has finally brought HybridPower-capable cards down to a mid-range price point.

Sorry to slip back into graphics there for a moment. The mGPU really is the GeForce 8300’s raison d’être, and it’s loaded with interesting stuff. They called this a GeForce rather than an nForce for a reason, you know.

                         AMD 780G                    Nvidia GeForce 8300
Processor interface      16-bit/2GHz HyperTransport  16-bit/2GHz HyperTransport
PCI Express 2.0 lanes    26*                         19
Multi-GPU support        CrossFire                   SLI
Chipset interconnect     PCIe 1.1 x4                 NA
Interconnect bandwidth   2GB/s                       NA
Serial ATA ports         6                           6
AHCI                     Y                           Y
Native Command Queuing   Y                           Y
RAID 0/1                 Y                           Y
RAID 0+1/10              Y                           Y
RAID 5                   N                           Y
ATA channels             2                           1
Max audio channels       8                           8
Audio standard           AC’97/HDA                   HDA
Ethernet                 N                           10/100/1000
USB ports                12                          12

* Four of the 780G’s PCIe lanes are reserved for its chipset interconnect.

On to the rest of the 8300’s core logic package. With 16 PCI Express 2.0 lanes reserved for a graphics card, only three remain for x1 slots and peripherals. The GeForce looks to be at quite a disadvantage here next to the 780G, but keep in mind that four of the AMD chipset’s PCIe lanes are reserved for its chipset interconnect. In reality, the GeForce is only three lanes short—effectively two, since its integrated Gigabit MAC saves motherboard makers from having to burn a PCIe lane on a GigE controller.

Otherwise, the 8300 matches up well against the 780G. Both offer six Serial ATA ports, and Nvidia even kicks in RAID 5 support for those looking to build a storage server on the cheap. The GeForce doesn’t abandon “parallel” ATA, either, although it only supports two old-school IDE drives, while the 780G can handle four.

Motherboard by Zotac
Your next home theater PC platform?

Manufacturer Zotac
Model GeForce 8300
Price (Estimated) $85-90
Availability July

Our first look at the GeForce 8300 comes courtesy of a motherboard from Zotac. You may not have heard of the company, but it’s a subsidiary of industry giant PC Partner and thus rather well connected. Zotac makes graphics cards and motherboards based on Nvidia chips, and those products are available in North America through Newegg. Earlier this year, we gave Zotac’s GeForce 8800 GT Amp! Edition an Editor’s Choice award in our mid-range graphics card round-up, so expectations are understandably high this time around.

As one might expect, Zotac builds its GeForce 8300 on a Micro ATX form factor. The board pictured below is draped in a blueish turquoise hue, but it’s an early sample. Production boards are black, which makes them that little bit more menacing.

Micro ATX real estate doesn’t leave a whole lot of room for onboard components, and that may be why Zotac chose less-than-ideal locations for the board’s power plugs. Both the primary and auxiliary 12V connectors are mid-way down the board where cabling, particularly from the beefier primary plug, can obstruct airflow between the CPU socket and where most chassis locate their exhaust fans. We think power plugs are best located along the edges of the board where their associated cabling won’t get in the way.

Zotac keeps the board’s socket area relatively clean, which would leave plenty of clearance for larger aftermarket heatsinks were it not for the close proximity of the DIMM slots. The huge Scythe Ninja we use for heatsink compatibility testing blocks the first slot—a common problem for many Socket AM2 motherboards.

Speaking of the socket, Zotac says its GeForce 8300 board is compatible with Phenom X4 9850 Black Edition processors. The Black Edition’s 125W TDP has proven too high for many Micro ATX motherboards. That’s a shame, because with an unlocked multiplier and relatively low price, the 9850 is the most attractive Phenom in AMD’s lineup. We can confirm that the Black Edition does indeed work on this board, but under load, there’s a definite squeal emanating from the board’s power regulation circuitry. After not even an hour crunching Prime95, that circuitry got hot enough to melt a hole in the foam pad we place under mobos during testing—not a good sign. This GeForce 8300 board may be compatible with Black Edition Phenoms, but we wouldn’t recommend running one on it.

Zotac hides the GeForce 8300 chip beneath a beefy cooler. The heatsink really doesn’t offer that much surface area considering its footprint, though. At least the low-profile design won’t interfere with longer expansion cards.

Longer graphics cards will block access to a couple of Serial ATA ports, however. That’s to be expected given Micro ATX’s diminutive dimensions, and the board does have four more ports tucked nicely out of the way.

The Zotac board’s slot stack is pretty standard fare for Micro ATX. PCI Express connectivity is split between an x16 and x1 slot, and Zotac throws in a couple of standard PCI slots for good measure.

Around the outside edge, we find four USB ports, Firewire, Ethernet, and a full suite of analog audio ports. You also get a digital S/PDIF output and both VGA and DVI video output options. There’s actually room in the cluster for more ports, but Zotac stops there, denying users a digital audio input, TOS-Link S/PDIF output, and eSATA. HDMI is conspicuously missing from the port cluster, too, but Zotac has that angle covered.

Included with the board is a DVI-to-HDMI adapter that allows users to choose between digital video outputs. The adapter nicely sidesteps the chipset’s inability to output DVI and HDMI at the same time, too.

Busting into the BIOS

Those used to the cornucopia of tweaking and overclocking options present on most enthusiast-oriented motherboards will have to adjust their expectations when dipping into Micro ATX territory. Micro ATX boards are generally designed with budget systems and mainstream users in mind, and those folks are probably best kept out of the BIOS altogether.


Bus speeds          CPU base clock: 200-600MHz in 1MHz increments
                    PCIe: 100-200MHz in 1MHz increments
                    DRAM: 400, 533, 667, 800, 1066MHz
                    NB link: 200MHz-2.6GHz in 200MHz increments
Bus multipliers     NA
Voltages            DRAM: 1.8-2.2V in 0.1V increments
                    Chipset: 1.2-1.35V in 0.05V increments
Monitoring          Voltage, fan status, and temperature monitoring
Fan speed control   CPU

Zotac at least serves up a few overclocking options, including the ability to set the processor’s base clock between 200 and 600MHz in 1MHz increments. A number of memory speed options are available, as well, in addition to control over the PCI Express and HyperTransport links. However, CPU multiplier control isn’t available at all, even with unlocked Black Edition processors.

Another limiting factor for overclockers is the complete lack of CPU voltage control. The BIOS does let users manipulate DRAM and chipset voltages, but even those options are relatively limited.

Memory timing control was also missing from the initial BIOS included with our board. Fortunately, Zotac was able to add basic latency controls to a subsequent BIOS release, at our request. We really shouldn’t have to ask for such a basic feature, though.

Fan speed control is particularly important for those looking to build quiet home theater PCs, and we’re pleased to report that Zotac serves up a handful of processor target temperature options. We’d ideally like to see automatic fan speed control extended to the board’s system fan headers, but few boards go that far. Most don’t even provide temperature target control.

If you don’t want to poke around in the BIOS, Nvidia’s excellent System Utility software is fully compatible with the GeForce 8300 chipset. Unfortunately, though, the Zotac board isn’t.

The system utility’s monitoring capabilities are severely constrained by the board’s failure to report voltages, fan speeds, and temperatures.

These missing variables all but ruin the system utility’s device rules feature. Tweaking options are also limited by the lack of voltage and fan speed controls. You can, however, adjust the CPU base clock and a handful of memory timings. That’s better than nothing, I suppose, but we’d prefer to see Zotac take full advantage of the system utility software that Nvidia makes freely available.

Specifics on specifications

As usual, we’ve consolidated all of the motherboard’s vital specifications in a handy chart. Enjoy.


CPU support         Socket AM2+/AM2-based Athlon and Phenom processors
Chipset             Nvidia GeForce 8300
Expansion slots     1 PCI Express x16
                    1 PCI Express x1
                    2 32-bit/33MHz PCI
Memory              4 240-pin DIMM sockets
                    Maximum of 8GB of DDR2-667/800/1066 SDRAM
Storage I/O         Floppy disk
                    1 channel ATA/133
                    6 channels 300MB/s Serial ATA with RAID 0, 1, 0+1, 5 support
Audio               8-channel HD audio via Realtek ALC888 codec
Ports               1 PS/2 keyboard
                    1 PS/2 mouse
                    1 VGA
                    1 DVI
                    4 USB 2.0 with headers for 8 more
                    1 RJ45 10/100/1000
                    1 1394a Firewire via VIA VT6307 with header for 1 more
                    1 analog front out
                    1 analog bass/center out
                    1 analog rear out
                    1 analog surround out
                    1 analog line in
                    1 analog mic in
                    1 digital coaxial S/PDIF output

With the GeForce 8300 integrating a Gigabit Ethernet controller, only the board’s audio and Firewire components require auxiliary peripheral chips. VIA supplies the Firewire chip, while audio duties predictably fall to a Realtek codec. The ALC888 isn’t the crab’s swankiest offering, and unlike some of the company’s other codecs, it doesn’t support real-time DTS or Dolby Digital Live encoding—features that would be particularly useful for home theater PCs.

Our testing methods

Today we’ll be exploring the GeForce 8300’s performance against its closest rival, the AMD 780G. We’re using a B2-stepping Phenom 9600 here because we don’t have a 780G board compatible with our 9850 Black Edition CPU. AMD’s performance-robbing TLB erratum patch was disabled on both boards for testing. We’re also using slightly looser memory timings than normal, since we had to begin testing before Zotac came up with a BIOS that includes memory timing controls.

Because of the numerous problems associated with running the 780G’s disk controller in AHCI mode, we settled on IDE mode, which doesn’t support Native Command Queuing. AHCI has been an issue for AMD chipsets for a while now, so we’re not inclined to cut them any slack. By contrast, the GeForce 8300’s AHCI implementation appears to work flawlessly.

All tests were run three times, and their results were averaged.

Processor                 AMD Phenom 9600 2.3GHz
System bus                1.8GHz HyperTransport
Motherboard               Zotac GeForce 8300                  Gigabyte GA-MA78GM-S2H
BIOS revision             AD4PR929                            F4C
North bridge              Nvidia GeForce 8300                 AMD 780G
South bridge                                                  AMD SB700
Chipset drivers           ForceWare 175.16                    Catalyst 8.5
Memory size               2GB (2 DIMMs)
Memory type               Corsair TWIN2X2048-8500C5 DDR2 SDRAM at ~800MHz
CAS latency (CL)          5                                   5
RAS to CAS delay (tRCD)   5                                   5
RAS precharge (tRP)       7                                   7
Cycle time (tRAS)         18                                  18
Audio codec               Realtek ALC888 with 1.92 drivers    Realtek ALC889A with 1.92 drivers
Hard drive                Western Digital Raptor WD1500ADFD 150GB SATA
OS                        Windows Vista Ultimate x86 with Service Pack 1

Thanks to Corsair for providing us with memory for our testing.

All of our test systems were powered by OCZ GameXStream 700W power supply units. Thanks to OCZ for providing these units for our use in testing.

Finally, we’d like to thank Western Digital for sending Raptor WD1500ADFD hard drives for our test rigs.

We used the following versions of our test applications:

The test systems’ Windows desktop was set at 1280×1024 in 32-bit color at an 85Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests.

All the tests and methods we employed are publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Memory performance

Memory subsystem performance doesn’t always track with real-world applications, but it’s a good place to start with integrated graphics chipsets that cannibalize a portion of system memory and therefore bandwidth.

The GeForce is quick out of the gate, delivering slightly higher bandwidth and lower access latencies than the 780G. But does that lead hold when we populate all of our motherboards’ DIMM slots?

It does. Memory bandwidth is a little higher than with only two DIMMs installed, but then so are access latencies.

The following latency graphs are a little indulgent, so I won’t be offended if you skip them. They show access latencies across multiple block and step sizes, painting a fuller picture of memory controller performance with each chipset. I’ve arranged the graphs in order of highest latency to lowest. Yellow represents L1 cache, light orange is L2, red is L3, and dark orange is main memory.

With both chipsets relying on the Phenom’s integrated memory controller, there aren’t many differences to see here.
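For readers curious about how this sort of latency sweep works, the basic technique is a dependent pointer chase: build a chain of pointers spanning a given block size, walk it so each load depends on the previous one, and time the average access as the block outgrows each level of cache. The C sketch below illustrates the idea only; it is not the benchmark used for the graphs above, and the stride, block sizes, and iteration count are arbitrary placeholders.

/* Minimal pointer-chasing latency sketch (illustrative only, not the
   benchmark used above). A randomized, single-cycle chain of pointers is
   walked so every load depends on the previous one; as the block size
   outgrows L1, L2, and L3, the average access time steps up toward main
   memory latency. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define STRIDE 64          /* bytes between chain entries (one cache line) */
#define ITERS  (1 << 24)   /* dependent loads per measurement              */

static double chase_ns(size_t block_bytes)
{
    size_t  count = block_bytes / STRIDE;
    char   *buf   = malloc(block_bytes);
    size_t *order = malloc(count * sizeof(size_t));

    /* Sattolo's shuffle: a random permutation that forms a single cycle,
       so the chain visits every entry and defeats simple prefetching. */
    for (size_t i = 0; i < count; i++)
        order[i] = i;
    for (size_t i = count - 1; i > 0; i--) {
        size_t j = (size_t)rand() % i;
        size_t t = order[i]; order[i] = order[j]; order[j] = t;
    }

    /* Entry i stores a pointer to entry order[i]. */
    for (size_t i = 0; i < count; i++)
        *(void **)(buf + i * STRIDE) = buf + order[i] * STRIDE;

    void  **p     = (void **)buf;
    clock_t start = clock();
    for (long i = 0; i < ITERS; i++)
        p = (void **)*p;                    /* serialized, dependent loads */
    clock_t end = clock();

    if (p == NULL)                          /* keep the loop from being    */
        puts("unreachable");                /* optimized away              */

    free(order);
    free(buf);
    return (double)(end - start) / CLOCKS_PER_SEC / ITERS * 1e9;
}

int main(void)
{
    for (size_t kb = 4; kb <= 64 * 1024; kb *= 2)
        printf("%6zu KB block: %.2f ns per access\n", kb, chase_ns(kb * 1024));
    return 0;
}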

STARS Euler3d computational fluid dynamics

Few folks run fluid dynamics simulations on their desktops, but we’ve found this multi-threaded test to be particularly demanding of memory subsystems, making it a good link between our memory and application performance tests.

With a superior performance in our memory subsystem tests, it’s no surprise that the GeForce comes out ahead in Euler3D.

WorldBench

WorldBench uses scripting to step through a series of tasks in common Windows applications. It then produces an overall score. WorldBench also spits out individual results for its component application tests, allowing us to compare performance in each. We’ll look at the overall score, and then we’ll show individual application results alongside the results from some of our own application tests.

It doesn’t get any closer than this, folks. The GeForce 8300 exactly ties the 780G in WorldBench.

Results from WorldBench’s individual application tests confirm that these chipsets offer nearly identical performance with common desktop applications. The GeForce and 780G trade the lead back and forth in a couple of tests, but neither manages to stretch its lead to more than a few seconds.

Gaming

We’ve had more than one game developer decry integrated graphics as the bane of their existence, and it’s easy to see why. With today’s latest titles, you have to back off most in-game detail levels and drop the resolution way down to get frame rates that are even close to what we’d consider playable. Crysis just isn’t the same with a low graphics detail setting, and the others don’t look so hot, either. But on the GeForce 8300 and 780G, games at least run without visual anomalies, even if you have to turn off most of the eye candy.

Note that we have several sets of results here. Naturally, we’ve tested the integrated graphics offered by the GeForce 8300 and 780G. The 8300 was also tested with a GeForce 8400 GS graphics card running on its own and in a GeForce Boost SLI configuration. With the 780G, we popped in a discrete Radeon HD 3450 (which runs about the same price as the 8400 GS) and tested it as a single card and also as part of a Hybrid CrossFire config.

Focusing just on integrated graphics, the GeForce 8300 is faster than the 780G in Quake Wars, but slower in Crysis, Episode Two, and Call of Duty. Even with its cranked shader clock, the GeForce 8300 just can’t keep up.

The tables turn a little when we throw hybrid multi-GPU solutions into the mix. GeForce Boost delivers a big payoff in Crysis, vaulting the GeForce 8300 into the lead. However, we didn’t observe any performance scaling in Quake Wars, allowing the 780G’s CrossFire config to steal the top spot. The 8300’s multi-GPU performance also lags behind that of the 780G in Call of Duty and Episode Two, and in the latter, the gap is significant.

AMD and Nvidia are both bullish when it comes to their respective integrated graphics multi-GPU solutions, and there’s certainly additional performance to be had with the addition of a $50 discrete graphics card. However, if you’re really serious about improving game performance, you’re better off shelling out some extra cash for a more powerful graphics card like the GeForce 9600 GT, which can easily handle these games at high resolutions and with detail levels cranked up.

HD Video playback

With HD DVD essentially dead, we confined our video playback tests to Blu-ray movies with the highest bitrates we could find for each of the format’s three encoding types. For VC-1 encoding, we settled on Nature’s Journey, which is packed with ridiculously gorgeous loops of nature scenes. On the AVC front (otherwise known as H.264), the highest bitrates we could get our hands on came with the fast zombie flick 28 Days Later. We had to scrape the bottom of the barrel for MPEG2, eventually settling on Click. For whatever reason, an Adam Sandler comedy is encoded with a higher bitrate than other MPEG2 movies.

We used PowerDVD 8 Ultra for playback and enabled hardware acceleration within the application. CPU utilization was logged during 60 seconds of playback with each movie, and the results were averaged. The movies were played back in full-screen mode with the desktop resolution set to 1920×1440 to make things as difficult as possible for the IGPs.
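Logging that utilization figure is straightforward to reproduce. As a rough sketch (a plain Win32 tool sampling once per second, rather than our actual logging setup), system-wide CPU utilization can be read via GetSystemTimes() and averaged over the 60-second playback window:

/* Rough sketch: average system-wide CPU utilization over a 60-second
   window, sampled once per second via GetSystemTimes(). Illustrative
   only; not the exact logging tool used for the numbers below. */
#include <stdio.h>
#include <windows.h>

static unsigned long long ft_to_ull(FILETIME ft)
{
    ULARGE_INTEGER u;
    u.LowPart  = ft.dwLowDateTime;
    u.HighPart = ft.dwHighDateTime;
    return u.QuadPart;
}

int main(void)
{
    FILETIME idleFt, kernFt, userFt;
    const int samples = 60;                     /* 60 one-second samples */
    double sum = 0.0;

    GetSystemTimes(&idleFt, &kernFt, &userFt);
    unsigned long long prevIdle  = ft_to_ull(idleFt);
    unsigned long long prevTotal = ft_to_ull(kernFt) + ft_to_ull(userFt);

    for (int i = 0; i < samples; i++) {
        Sleep(1000);
        GetSystemTimes(&idleFt, &kernFt, &userFt);

        unsigned long long idle  = ft_to_ull(idleFt);
        /* Kernel time includes idle time, so kernel + user = total time. */
        unsigned long long total = ft_to_ull(kernFt) + ft_to_ull(userFt);

        sum += 1.0 - (double)(idle - prevIdle) / (double)(total - prevTotal);

        prevIdle  = idle;
        prevTotal = total;
    }

    printf("Average CPU utilization: %.1f%%\n", 100.0 * sum / samples);
    return 0;
}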

Although the GeForce 8300’s Blu-ray playback is smooth and fluid, its CPU utilization is measurably higher than that of the 780G. Both companies claim 100% decode acceleration for each format, but AMD’s approach more fully unburdens the CPU.

HD HQV video quality

We’ve added Silicon Optix’s latest HD HQV benchmark to the mix to assess each chipset’s HD video playback quality. HD HQV tests noise reduction, antialiasing, and various resolution loss scenarios, culminating in an overall score out of 100. The scoring system for HD HQV is entirely too subjective for my liking, and the weighting is a little odd, but it’ll have to do for now.

The GeForce 8300 comes out way ahead here thanks to half-resolution processing cropping up on the 780G in the film resolution loss test, which costs the AMD chipset a whopping 25 points. To my eyes, the GeForce and 780G both fall a little short in the antialiasing test, as well.

Serial ATA performance

The Serial ATA disk controller is one of the most important components of a modern core logic chipset, so we threw each platform a selection of I/O-intensive storage tests using a Western Digital Raptor WD1500ADFD. The 780G is at a disadvantage here because we’re running it in native IDE mode, which doesn’t support Native Command Queuing.

IOMeter

We’ll begin our storage tests with IOMeter, which subjects our systems to increasing multi-user loads. Testing was restricted to IOMeter’s workstation and database test patterns, since those are more appropriate for desktop systems than the file or web server test patterns.

No doubt thanks to its proper AHCI implementation, the GeForce 8300’s transaction rates scale much better in IOMeter than those of the 780G.

The GeForce’s response times are lower, too, although not by significant margins.

IOMeter CPU utilization is pretty much even between the two contenders.

HD Tach

We used HD Tach 3.01’s 8MB zone test to measure basic SATA throughput and latency.

Disk controllers that support Native Command Queuing tend to perform better in HD Tach’s sustained write speed test, which explains the GeForce’s huge lead there. Otherwise, disk performance is pretty close.

HD Tach access times are too close to call, and the CPU utilization results are well within the application’s +/- 2% margin of error for that test.

USB performance

Our USB transfer speed tests were conducted with a USB 2.0/Firewire external hard drive enclosure connected to a 7200RPM Seagate Barracuda 7200.7 hard drive. We tested with HD Tach 3.01’s 8MB zone setting.

USB performance has always been a weakness of AMD chipsets, so it’s no surprise to see the GeForce out in front here. The 8300 is faster across the board, and it has a particularly impressive lead in the read speed tests.

Ethernet performance

We evaluated Ethernet performance using the NTttcp tool from Microsoft’s Windows DDK. The docs say this program “provides the customer with a multi-threaded, asynchronous performance benchmark for measuring achievable data transfer rate.”

We used the following command line options on the server machine:

ntttcps -m 4,0,192.168.1.25 -a

...and the same basic thing on each of our test systems acting as clients:

ntttcpr -m 4,0,192.168.1.25 -a

Our server was a Windows XP Pro system based on Asus’ P5WD2 Premium motherboard with a Pentium 4 3.4GHz Extreme Edition (800MHz front-side bus, Hyper-Threading enabled) and PCI Express-attached Gigabit Ethernet. A crossover CAT6 cable was used to connect the server to each system.

The boards were tested with jumbo frames disabled.

We like Nvidia’s decision to integrate a Gigabit Ethernet MAC into its chipsets because this prevents motherboard makers from using lousy auxiliary GigE controllers that deliver poor throughput, high CPU utilization, or a little of both. That said, the GeForce’s integrated Gigabit MAC doesn’t deliver higher throughput or lower CPU utilization than a good GigE chip like Realtek’s new 8111C.

PCI Express performance

We used ntttcp to test PCI Express Ethernet throughput using a Marvell 88E8052-based PCI Express x1 Gigabit Ethernet card.

PCI performance

To test PCI performance, we used the same ntttcp test methods and a PCI VIA Velocity GigE NIC.

We don’t see much difference in PCI or PCI Express performance between the GeForce 8300 and 780G.

Power consumption

We measured system power consumption, sans monitor and speakers, at the wall outlet using a Watts Up Pro power meter. Power consumption was measured at idle and under a load consisting of a multi-threaded Cinebench 10 render running in parallel with the “rthdribl” high dynamic range lighting demo. Results that fall under “No power management” were obtained with Windows Vista running in high performance mode, while those with power management enabled were taken with Vista in its balanced performance mode.

Despite its swanky single-chip package, the GeForce 8300 appears to consume more power than the two-chip 780G. The GeForce board’s power consumption is 10W higher at idle and nearly 20W higher under load.

Overclocking

We didn’t expect Zotac’s GeForce 8300 board to overclock well given the BIOS’s relative lack of tweaking options, but we gave it a spin with a Phenom X4 9850 Black Edition CPU anyway. Unfortunately, since the BIOS lacks CPU multiplier control, we had to confine our testing to ramping up the CPU base clock.

Our board had no problems booting with a 220MHz base clock, but our four-way Prime95 stress test quickly spit out errors, even with additional chipset voltage applied. However, we did manage to get the board stable and error-free with a 210MHz base clock and the default chipset voltage.
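For perspective, the 9850’s stock 2.5GHz clock implies a 12.5X multiplier, so with no multiplier control, a 210MHz base clock works out to roughly 2.63GHz, while the 220MHz setting we couldn’t stabilize would have pushed the chip to 2.75GHz.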

Obviously, this board isn’t built for overclocking. If you really want to turn up the clocks on a Phenom, we’d recommend a standard desktop motherboard with support for the Black Edition’s unlocked upper multiplier.

Peripheral performance

Core logic chipsets integrate a wealth of peripherals, but they don’t handle everything. Firewire and audio are farmed out to auxiliary chips, for example. To provide a closer look at the peripheral performance you can expect from the motherboards we’ve tested today, we’ve compiled Firewire and audio performance results below. We’ve used motherboard rather than chipset names here because these performance characteristics reflect the auxiliary peripheral chips used on each board rather than the performance of the core logic chipset.

HD Tach Firewire performance

                          Read burst     Average read    Average write   CPU
                          speed (MB/s)   speed (MB/s)    speed (MB/s)    utilization (%)
Gigabyte GA-MA78GM-S2H    41.7           37.4            23.3            1.0
Zotac GeForce 8300        41.3           37.2            23.8            0.3

The Zotac board’s VIA Firewire chip looks pretty good in our performance tests.

RightMark Audio Analyzer audio quality

                          Overall  Frequency  Noise  Dynamic  THD  THD +  IMD +  Stereo     IMD at
                          score    response   level  range         Noise  Noise  Crosstalk  10kHz
Gigabyte GA-MA78GM-S2H    4        5          5      5        3    1      3      6          3
Zotac GeForce 8300        4        5          4      4        3    1      3      5          3

Despite scoring lower in three of RightMark Audio Analyzer’s component tests, the Zotac board’s overall score still matches that of the Gigabyte. We conducted these tests using RMAA’s loopback option, which routes the motherboard’s stereo output through its line input.

Conclusions

The GeForce 8300 brings Nvidia into the next generation of integrated graphics in style, with a single-chip design that squeezes graphics and core logic onto one piece of silicon. This consolidation hasn’t forced Nvidia to skimp on features, either, with the 8300 delivering a competent DirectX 10-class graphics core, effective Blu-ray decode capabilities, PCI Express 2.0, and a slew of integrated peripherals. However, as slick as the single-chip implementation may be from a design standpoint, the number of chips has little bearing on performance or the end-user experience. If the GeForce 8300 is to unseat the incumbent 780G, it will have to impress elsewhere.

And it does, sometimes. The GeForce 8300’s USB performance is excellent and its disk controller works properly, which is more than can be said of the 780G. We like the integrated Gigabit Ethernet, too, if only because it should prevent motherboard makers from skimping in the networking department. That’s about it for highlights, though.

Sure, the GeForce offers decent gaming performance and effective Blu-ray playback acceleration. But the 780G is better on both fronts, and its power consumption appears to be lower. As our WorldBench results illustrate, the two chipsets offer identical performance with common desktop applications, too. There isn’t much for Nvidia to hang its hat on this time around, then.

As for the Zotac motherboard we used for testing, it, too, has issues. The BIOS definitely needs work, both to improve tweaking and overclocking options and to add the necessary hooks to improve compatibility with Nvidia’s System Utility software. The board could use eSATA connectivity, too, and an audio codec that supports Dolby Digital Live or DTS encoding. The GeForce 8300 is primed for home theater applications, after all, and motherboard makers should be mindful of what’s important to that market. Zotac expects its GeForce 8300 board to arrive in July and sell for between $85 and $90, which will put it up against better-equipped 780G boards like Gigabyte’s GA-MA78GM-S2H.

I suppose the GeForce 8300 would have an easier time were it not facing such formidable competition from AMD’s 780G. The 8300 is undoubtedly a fine chipset, and one I’d be happy to have sitting at the heart of a home theater PC. But it’s not quite as good as the 780G.

Comments closed
    • pogsnet
    • 11 years ago
    • Peldor
    • 11 years ago

    I think you missed an opportunity to test Nvidia’s HybridPower scheme. Other reviews have shown it to have a nice power savings at idle by shutting down the discrete video card. There is a small performance hit, but it might well be worth the $ savings.

    • eitje
    • 11 years ago

    on page 4, your sisoft & hdtach links appear to be bad.

    they’ve been that way several of the recent reviews, actually.

    • hermanshermit
    • 11 years ago

    “You’ll need the faster HyperTransport link present in Phenom processors to take advantage of this decode acceleration”

    So given that these are megawatt CPUs, most of the advantage is lost? Really for media we are looking at AMDs 45W dual CPUs, if they can’t do the job, then there is surley little point?

    Also why no power use comparison with 780G? both the chipset itself and it’s support for power saving technologies.

    Seeing as the e7200 and e8200 are sipping around 28W at full tilt and below 10W at idle, I don’t give AMD a much chance against G45 (if it works correctly!) in the media segment. Intel mean business in this segment with that m-ITX eaglelake board.

    AMD really need to get Phemon to 45nm and bring the power consumption down if they want to stay in the game.

      • deruberhanyok
      • 11 years ago

      http://www.silentpcreview.com/article807-page8.html

      That line about needing HT3 only applies to this new nvidia chipset. 780g works just fine with an Athlon X2.

        • Deli
        • 11 years ago

        i second this thought. 780G is a superb chipset despite the shortcomings of sb700. power consumption as a whole for an all AMD system is extremely good compared to an all intel system for HTPC right now due to the high power consumption of the intel onboard chipsets. I believe Anandtech had an article on that.

        • hermanshermit
        • 11 years ago

        I was referring to this chipset, not the 780G. I have a 780G, it even works with and older, single core CPU. My point was this chipset is pointless for HTPC as it stands.

        I also stand by what I said about Intel. The 45W CPUs nowhere near the low heat from the very latest e7200 and e8200 – check out the tiny bundled heatsink. that’s before whatever intel release in their 45nm line up as pentiums and celerons at lower speeds and smaller cache. AMD need a kick up the butt for their fab tech.

        In real terms these CPUs are using half to 2/3rd the juice of AMD’s best. The 780G chipset also isn’t that low power, the current G33 while inferior technically uses significantly less. It depends on what happens with G45.

        I’d recommend anyone looking to build a HTPC to hold off until the reviews are in. A few 10s of watts really matter in small cases.

    • sycomonkey
    • 11 years ago

    I personally use crystalcpuid to overclock my Athlon X2 5000+ BE, since my motherboard doesn’t have mutliplier adjustments either. It says “AMD K6/K7/K8/GeodeLX Multiplier/Voltage Control”, with a noticeable absence of K10. I wonder if it would just work anyway…

    • glynor
    • 11 years ago

    I wonder if these too are going to suffer from the just announced “weak material set of die/package combination” and “system thermal management designs” causing “MCP and GPU products [to fail] in the field at higher than normal rates”.

    The initial announcement was just for notebook integrated GPUs and MCPs, but they also said “[t]here can be no assurance that we will not discover defects in other MCP or GPU products. ”

    http://biz.yahoo.com/e/080702/nvda8-k.html

    I certainly wouldn't jump onto one of these (or frankly any piece of Nvidia silicon right now) until they get this issue straightened out. For right now, it seems like they either aren't sure what the problem is, or they are pretty sure what it is, but haven't decided how bad it is yet. The fact that they dropped this news late in the evening certainly doesn't bode well. After hours trading has not been kind.

    • h22chen
    • 11 years ago

    When is the Intel version of the 8300 coming out? Intel Integrated is not up to par.

    • A_Pickle
    • 11 years ago

    Is this the first review that you guys have done with AMD’s SB700 south bridge? Gah. It looks a little bit better than the SB600, but… still. AMD really needs to fix it, those average write speeds are abysmal…

    • Deli
    • 11 years ago

    with the update to 780G coming in July/Aug timeframe, it makes the geforce 8300 even less attractive. Hopefully, we get a review on SB750.

    • Hattig
    • 11 years ago

    Hmm, if nVidia went to 55nm to make the chipsets instead of 80nm, they could have a cooler chipset, and/or an mGPU with 32 shaders instead of 16… I expect this will happen later this year or early next.

    Yeah, AMD CPUs for media centres, no questions about it now. Two great chipset options and low cost.

    • Prototyped
    • 11 years ago


      • Damage
      • 11 years ago

      I think you are confusing mGPU with PCIe.

        • Prototyped
        • 11 years ago

        I’m not, though. The GeForce 8200/nForce 730a doesn’t support dual discrete graphics cards in SLI either, while the IGP-less 750a and 780a do (and provide a much larger number of PCI express lanes).

        I’m interested in seeing what nVidia produces to counter the (delayed) AMD 790G/790GX, which supports dual (or better) discrete GPU CrossFire/CrossFire X.

          • Dissonance
          • 11 years ago

          The limitation is the fact that those chipsets tie their PCIe lanes up in a single x16 link that can’t be split between a pair of graphics cards. The 750a and 780a are not IGP-less designs. Same graphics core (with Blu-ray offload, no less), just paired with a different PCIe lane config to suite budget market segments.

          Look for a full 750a review soon, BTW.

    • kvndoom
    • 11 years ago

    Blu-Ray drives are approaching the magic 100$ mark. I’m probably gonna overhaul my system next year after I get the car paid off. Looks like I won’t have problems playing BD movies on the PC if I choose to, especially with some help from Slysoft. 🙂

    • derFunkenstein
    • 11 years ago

    This is actually a pretty decent advancement in integrated graphics. You won’t be playing Supreme Commander on these, but I’m willing to bet it’s a pretty good Vista Aero processor.

      • Meadows
      • 11 years ago

      Integrated GeForce 6150 SE (and all the other varieties – I just had experience with that particular one) was a splendid “Vista Aero processor”. This thing is more capable than that. You could even play Supreme Commander, just in a modest fashion.

    • ssidbroadcast
    • 11 years ago

    But, will it run Diablo 3?

      • derFunkenstein
      • 11 years ago

      honestly, that’s what I’m thinking. I’ve pretty well given up the high-end gaming scene and I’d love to have a cheap-cheap box with an integrated GPU that could do SC2 and D3. I’d probably be better off to just pick up a Radeon 3650 for $65, since those are faster than 2600XT’s, which is pretty much faster than an 8600GT.

        • ssidbroadcast
        • 11 years ago

        It’d be pretty neat to sit on the couch on a Saturday and play D3 on a 40″ 1080p DLP. That would be the life.

          • paulWTAMU
          • 11 years ago

          My next PC is going to get hooked up to my 50″ HDTV. It’ll be nearly perfect.

        • shank15217
        • 11 years ago

        your assumption is that blizzard game dont tax graphics and i/o. Warcraft 3 was pretty demanding in 4 vs 4 when it first came out what makes you think large games of sc2 wouldn’t tax your system similarly. In rts games response times are realy important for micro-management, a fast gfx card may make all the difference.

      • Meadows
      • 11 years ago

      With fast dual channel memory, yes, it should. The situation should be positively rosy once you add an 8400 in for “SLI”, which is a dirt cheap upgrade (and should allow the freedom of running more games well, too).

      • Krogoth
      • 11 years ago

      Nope, it cannot handle Multics version of Pong.

      • swaaye
      • 11 years ago

      maybe at 800×600. My 780G can’t even do Guild Wars well above 1280×800. Doom3 is a stutterfest. I thought that this IGP would be able to handle 1920×1200 Jedi Knight 2, but it can’t really do it very smoothly. And, Dark Messiah isn’t playable above 1280×800 either.

      If I had to choose a discrete GPU that I’ve used that is comparable , I’d name the Radeon 9600 Pro. A Radeon 9700 Pro is definitely faster. Of course, you can overclock it to like a 90% core clock increase….. And if you can crank the HT bus way up it helps equally. I got my Athlon X2 up to 270 MHz FSB / 1350 MHz HT and the GPU at 960 MHz. That allowed me to run Guild Wars at 1680×1050 fairly smooth. 20-30 fps perhaps. A Radeon 9700 Pro would still be faster though, believe it or not.

      The HT bus is a huge bottleneck for these GPUs. It only offers around 4GB/s up/down with a Athlon X2’s 1000 MHz 16-bit HT bus, although there are dedicated lines each way so it’s an “aggregate” 8GB/s. That’s decidedly sad memory bandwidth for the GPU even with Phenom’s faster HT bus.

      This IGP is basically a bit less than a HD 2400 Pro, if you want to look that up. That’s assuming you pair it with DDR2 800+ and a Phenom CPU though. It’s just another example of how poor the first generation of budget DX10 GPUs is. Better to stick with a older generation card if you want to get something very, very cheap. It is the best IGP out there though.

      • d0g_p00p
      • 11 years ago

      I don’t see why not. Blizzard has always made games that run on mid level hardware. I think it’s one reason why their games sell so well. You can pick up any Blizzard title and pretty much know that you can run the game. Valve is the same way.

    • MadManOriginal
    • 11 years ago

    This is a chipset begging for a better mobo.

      • madgun
      • 11 years ago

      I concur…

    • MattMojo
    • 11 years ago

    You know, I built a 690G board with a 1.9 Ghz X2 (low power) coupled with a 8600GT and 2 Gig of RAM and it can do anything I want — play games, Blu-Ray (perfectly btw) and Vista’s media center app is smooth on this machine.

    If I were to upgrade I would stick with AMD on this one — based on my experience with the previous generation.

    Nice show nVidia but you make chipsets AND video cards but this is the best you could muster — along with the 200 series debacle, I find myself hard to recommend nVidia these days. Get off your butt and produce something tangible again… please.

    Mojo

      • henfactor
      • 11 years ago

      You do realize that their next generation GPU will blow everything out of the water (it better anyway)

        • flip-mode
        • 11 years ago

        GT300? Isn’t that a way’s off?

          • Meadows
          • 11 years ago

          Depends on how they name the die-shrunk, boosted version of the GT200.

        • MattMojo
        • 11 years ago

        That is what was said about the 200 series. I am not impressed. I like nVidia as a company but management and design sat on their behind for way too long. How can you double “the guts” of a video card and still come away with similar performance of the previous…

        • deruberhanyok
        • 11 years ago

        It’s sad that only a couple of weeks after the new cards have been launched someone is already saying “well, GT300 will be the awesomest!”

          • MattMojo
          • 11 years ago

          Tell me about it!!!

    • Usacomp2k3
    • 11 years ago

    Honestly, the most averse part is that it requires a quad-core processor for video decoding. So, it’s not really useful for a cheap media-PC. I’d prefer something that’d take the likes of the e2180 (or rather the AMD equivalent of that, which is the x2 5600, IIRC).

      • derFunkenstein
      • 11 years ago

      I’d say a 2.4GHz 4600+ is probably fast enough to match up to (or at least be in close proximity of) the 1.8GHz 2180. Price-wise they kind of match up, but performance wise I think AMD might actually have the better CPU at that $80 range.

      • Dissonance
      • 11 years ago

      Actually, it doesn’t require a quad-core processor–just a HyperTransport 3 interconnect. Bandwidth is the issue here, not horsepower.

      And if you’re already shelling out $160+ on a Blu-ray drive, surely you can afford to throw a little cash at a cheap Phenom.

        • Usacomp2k3
        • 11 years ago

        Are there any dual-core (read: cheap) processors with HT3?

        EDIT: I guess an x3 phenom for $125 isn’t bad at all.

          • ssidbroadcast
          • 11 years ago

          Do the X3’s count as AM3 processors?

            • BobbinThreadbare
            • 11 years ago

            HT3, not AM3, and it would be pretty silly to put that kind of segmentation into an already struggling product.

            • ssidbroadcast
            • 11 years ago

            HT3, thanks. And i dunno, if you’re building a budget HTPC, it might be relevant to know whether the X3’s are HT3 compatible or not.

        • Palek
        • 11 years ago

        The HT3 requirement still seems a bit odd. Does the 780G from AMD have the same requirement?

        [EDIT]

        I quickly checked your 780G review. In your test back then you used the 4850e which is an Athlon X2 at 2.5 GHz. The CPU utilization scores were around 50% for MPEG-2 and H.264, and 40% for VC-1. In that article you said you were a bit underwhelmed by the utilization scores because you expected lower figures. You also swapped in a Phenom which lowered the utilization figures to less than half of the Athlon X2. At the time you assumed that this could be due to the doubling of the cores, but that does not sound right – especially in light of your current article. If the IGP does most of the heavy lifting for video decoding then the scheduling and housekeeping tasks left to the CPU would not be highly parallelizable, anyway. Based on what you said in this GeForce 8300 review it is entirely possible that the 780G also needs HT3 to fully utilize its video decode acceleration features.

        Can you guys look into this? I am sure other TR readers would be interested to find out.

        [/EDIT/]

        Also, $160 for a Blu-ray drive is as good as it gets now – there is no cheaper alternative, you cannot cut corners anywhere. However, with the CPU you have plenty of better priced options than even the cheapest Phenom which are still more than capable of handling Blu-ray, so why spend more money unless you really have to? You can get a dual-core Athlon for well under $100. Besides, power consumption and heat generation should weigh in heavily on component selection when building an HTPC, so a 45W Athlon X2 seems like a much better choice than a 65W Phenom.

          • Dissonance
          • 11 years ago

          The 780G doesn’t require an HT3 for HD decoding, but it does for post-processing *cough* making pretty for HD HQV.

            • Palek
            • 11 years ago

            Geoff, I updated my previous post, could you please comment on it?

            • deruberhanyok
            • 11 years ago

            I’m also curious about this, specifically how well the 8300 performs with an Athlon X2 instead of a Phenom. I’ve no interest in nvidia hardware but it seems odd that they would require one of the new processors for such playback instead of beefing up the onboard decode capabilities.

            Perhaps it’s a side effect of building the chip at 80nm? Not enough room to squeeze in improved decode logic?
