
Preview: NVIDIA’s GeForce 6600 series GPUs

Geoff Gasior, Former Managing Editor

WHEN NVIDIA LAUNCHED its GeForce 6800 series graphics processors back in April, company CEO Jen-Hsun Huang emphasized that the GeForce 6 architecture was highly scalable and would soon power a top-to-bottom line of graphics products. As is often the case with new graphics architectures, GeForce 6 was launched at the top with the NV40 GPU and GeForce 6800 series graphics cards. These cards now retail for between $300 and $500, depending on whether you’re looking at a vanilla, GT, or Ultra card.

Although hard-core gamers and performance-oriented enthusiasts might not blink at dropping $300 or more on a graphics card, mainstream markets generally prefer something a little cheaper. Heck, there are plenty of cash-strapped gamers and enthusiasts looking for a deal in the $200 range, too.

To extend its GeForce 6 technology down to mid-range markets and more affordable price points, NVIDIA is now introducing the GeForce 6600 series. Based on an eight-pipe implementation of GeForce 6 technology, flavors of the 6600 will arrive at $149 and $199 price points. Sound tantalizing? Read on for more.

The NV43 graphics processor
The GeForce 6600 series is based on NVIDIA’s new NV43 GPU. NV43 inherits all of NV40’s core features, including support for Shader Model 3.0, 64-bit texture filtering and blending, 32-bit pixel shader precision, and the UltraShadow II technology that accelerates DOOM 3’s shadow rendering. Where NV43 differs from NV40 is in the number of pixel pipelines and vertex shaders. NV43 has eight pixel pipelines and three vertex shaders, half the sixteen pipelines and six vertex shaders of NV40.

The “half NV40” mantra is handy for describing NV43’s pixel pipelines and vertex shaders, and it’s also useful for describing the chip’s memory interface. Where NV40 enjoys a wide 256-bit memory bus, NV43 must make do with a 128-bit memory interface. Halving the bus width halves the available memory bandwidth at a given memory clock, but that’s nothing new in the graphics world. Both ATI and NVIDIA segment their lineups this way, pairing high-end cards with 256-bit memory interfaces and mid-range cards with 128-bit ones.

While we’re on the subject, I should mention that NV43 supports up to 256MB of memory. 128MB implementations will probably be the norm, but board vendors have a habit of offering boards with more memory to differentiate themselves from the competition. In the past, mid-range graphics cards with 256MB of memory haven’t been much faster than 128MB cards. However, NV43 could have the horsepower to exploit the extra memory at higher resolutions with antialiasing and anisotropic filtering enabled. Newer games like DOOM 3, whose Ultra High detail level prefers graphics cards with 512MB of memory, may run smoother with 256MB rather than 128MB.

NV43 will be fabbed by TSMC using a low-k 0.11-micron manufacturing process. 0.11-micron is just an optical shrink of 0.13-micron tech, so it’s not a huge leap. However, using a smaller fabrication process will allow NVIDIA to squeeze more chips onto each wafer, which is always good for profitability. NVIDIA won’t be the first to use 0.11-micron fab tech, though. The RV370 GPU that powers ATI’s Radeon X300 series graphics cards is also made on a 0.11-micron process.

Like ATI’s RV370, NV43 has a native PCI Express interface. NV43 is NVIDIA’s first native PCI-E chip, and cards based on the GPU will initially be PCI Express-only. Although NVIDIA isn’t ready to talk about AGP implementations of NV43 just yet, the company will use its bi-directional High Speed Interconnect (HSI) chip to bridge NV43 to AGP at some point after the GeForce 6600’s release.

All right, enough about the GPU. Let’s talk about how NV43 will arrive on the market.

 

Enter the GeForce 6600 series
NV43 will make its way to store shelves in two flavors: the GeForce 6600 and the GeForce 6600 GT. The GT will be clocked at 500MHz for both the graphics core and memory, while the vanilla 6600 will carry a core clock speed of 300MHz. Board vendors will be free to set their own clock speeds for the 6600’s memory, so we could see some variety there. As far as memory types are concerned, the GeForce 6600 GT’s higher clock speeds demand GDDR3, while the vanilla 6600 can get by with plain old DDR memory.

Priced at $149 and $199, respectively, the PCI-Express GeForce 6600 and 6600 GT will face off against ATI’s Radeon X600 Pro and X600 XT, and possibly even the X300 XT. Here’s how they all compare when it comes to fill rates and memory bandwidth:

|                 | Core clock (MHz) | Pixel pipelines | Peak fill rate (Mpixels/s) | Texture units per pixel pipeline | Peak fill rate (Mtexels/s) | Memory clock (MHz) | Memory bus width (bits) | Peak memory bandwidth (GB/s) |
|-----------------|------------------|-----------------|----------------------------|----------------------------------|----------------------------|--------------------|-------------------------|------------------------------|
| Radeon X300     | 325              | 4               | 1300                       | 1                                | 1300                       | 200                | 128                     | 6.4                          |
| Radeon X600 Pro | 400              | 4               | 1600                       | 1                                | 1600                       | 300                | 128                     | 9.6                          |
| Radeon X600 XT  | 500              | 4               | 2000                       | 1                                | 2000                       | 370                | 128                     | 11.8                         |
| GeForce 6600    | 300              | 8               | 2400                       | 1                                | 2400                       | NA                 | 128                     | NA                           |
| GeForce 6600 GT | 500              | 8               | 4000                       | 1                                | 4000                       | 500                | 128                     | 16.0                         |
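The peak figures above fall straight out of the clock speeds and unit counts. A quick sketch of the arithmetic (the factor of two in the bandwidth figure assumes double-data-rate memory, which applies to every card in the table):

```python
def fill_rate_mpixels(core_mhz, pipelines):
    # Peak pixel fill rate: one pixel per pipeline per clock.
    return core_mhz * pipelines

def bandwidth_gbs(mem_mhz, bus_bits):
    # DDR/GDDR3 transfers data twice per clock; bus width converted to bytes.
    return mem_mhz * 2 * (bus_bits / 8) / 1000

# GeForce 6600 GT: 500MHz core, 8 pipes, 500MHz GDDR3 on a 128-bit bus
print(fill_rate_mpixels(500, 8))          # 4000 Mpixels/s
print(bandwidth_gbs(500, 128))            # 16.0 GB/s

# Radeon X600 XT: 500MHz core, 4 pipes, 370MHz memory on a 128-bit bus
print(fill_rate_mpixels(500, 4))          # 2000 Mpixels/s
print(round(bandwidth_gbs(370, 128), 1))  # 11.8 GB/s
```

The same formulas reproduce every row of the table; the vanilla GeForce 6600’s memory figures stay “NA” only because board vendors will pick their own memory clocks.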

Theoretical fill rate and memory bandwidth peaks aren’t the be-all and end-all of graphics performance, but they can hint at a card’s potential. As far as potential goes, the GeForce 6600s have plenty. Although their core clock speeds are unremarkable (or even low, in the case of the GeForce 6600) compared with ATI’s Radeon X600 line, NV43’s eight pixel pipelines give the 6600s a significant fill rate advantage over the competition. The GeForce 6600 GT offers twice the fill rate of an X600 XT, and the vanilla 6600’s fill rate is 50% higher than the X600 Pro’s.

Things aren’t quite as uneven in the memory bandwidth department. Still, the 6600 GT delivers 36% more memory bandwidth than the Radeon X600 XT thanks to a higher memory clock. Since GeForce 6600 board vendors will have considerable freedom setting memory clock speeds, I can’t comment on how well the vanilla GeForce 6600 will stack up against the Radeon X600 Pro. Both cards have a 128-bit memory bus, so it will come down to clock speeds. NVIDIA’s board partners should have no problems matching the X600 Pro’s 300MHz memory clock.


The GeForce 6600 GT
Source: NVIDIA

As far as board characteristics go, the GeForce 6600s look potentially well-behaved. Both cards can get by with a single-slot cooler, which is especially impressive for the 500MHz 6600 GT. Neither card requires an auxiliary power connector, in part thanks to the extra power delivered by the PCI Express interface. However, AGP versions of the 6600 GT may require a little extra juice.


SLI in action
Source: NVIDIA

SLI, baby
Although it’s not an option on the vanilla GeForce 6600, 6600 GT owners will be able to run a pair of cards in SLI. NVIDIA has already announced SLI for its GeForce 6800 family, and you can read more about the graphics card teaming technology here.

Running a pair of 6600 GT cards in tandem should yield a nice performance boost, but there’s a catch. You’ll need a motherboard with two PCI Express graphics slots. Currently, the only mobos available with dual PCI-E graphics slots are workstation-class dual Xeon boards that are priced well over $500. Xeon processors aren’t exactly cheap, either. That’s hardly a practical platform for mid-range SLI with $199 graphics cards.

Fortunately for NVIDIA, they’re also in the chipset business. It seems entirely likely that NVIDIA has nForce core logic in the works with enough PCI Express lanes for two PCI-E graphics slots. When that chipset will arrive is anyone’s guess. Until we see reasonably priced single-processor boards with dual PCI-E graphics slots, SLI probably isn’t going to take off with the GeForce 6600 GT.

It’s really a shame that more affordable SLI platforms don’t exist, because a pair of GeForce 6600 GTs could be scary fast. SLI also offers an incremental upgrade path that’s sure to intrigue users with an aversion to dropping a lot of cash on a single graphics card purchase.

Video goodies
Since the GeForce 6600 family is targeted at mainstream consumers, NVIDIA is also hyping the cards’ advanced video features. The 6600s support advanced de-interlacing, inverse 3:2 pull down, WMV9 acceleration, and motion estimation, just like the GeForce 6800 family. These video features could be particularly valuable should NVIDIA offer a version of its Personal Cinema based on the 6600 family. My home theater PC beckons.

 

Conclusions
According to internal testing conducted by NVIDIA, which should be taken with a hearty helping of salt, the GeForce 6600 GT offers up to three times the performance of ATI’s Radeon X600 XT in DOOM 3. The GT apparently also offers between 1.7 and 2.5 times the performance of the X600 XT in a wider range of games and benchmark applications that includes 3DMark03, AquaMark3, Call of Duty, Halo, and Unreal Tournament 2004. NVIDIA’s GeForce 6600 presentation makes no mention of relative performance in Far Cry, though.

Because the GeForce 6600 GT has massive fill rate and healthy memory bandwidth advantages over the Radeon X600 XT, NVIDIA’s performance claims are certainly plausible. We’ll be able to test for ourselves soon; GeForce 6600 series cards are scheduled to ship out in mid to late September. Boards could show up in major retail outlets like Best Buy a little later than that, depending on when retailers schedule changes to their floor stock, but that shouldn’t affect online vendors or smaller brick-and-mortar outlets. I’m not sure how rabid a retail market will exist for mid-range PCI Express graphics cards in the next couple of months, anyway.

And therein lies the problem. The fact that the initial cards will be PCI Express-only is perhaps the only chink in the GeForce 6600 series’ otherwise gleaming armor. The only desktop PCI Express graphics platforms currently on the market are Intel’s 900-series chipsets, which require space-heater Prescott Pentium 4 processors that are hardly popular among gamers and enthusiasts. Major OEMs offering Prescott- and Celeron D-based systems may be eager to offer GeForce 6600 series cards with their Intel PCI Express systems, but that doesn’t do much for gamers and enthusiasts looking for a $200 graphics upgrade. We’ll probably need to see a PCI Express graphics platform for the Athlon 64 or an AGP implementation of NV43 before the GeForce 6600 series stands a chance of taking off among gamers and enthusiasts.

In the end, the GeForce 6600 series is exactly the kind of trickle down I like to see in the graphics industry. The 6600s bring Shader Model 3.0, 32-bit internal precision, and 64-bit blending to lower price points and more mainstream products, hopefully encouraging developers to take advantage of these features in future games and applications. When NVIDIA launched the GeForce 6800 series graphics processors, they pledged to offer a top-to-bottom line of graphics products based on GeForce 6 technology. The GeForce 6600 series brings them two steps closer to fulfilling that promise. 

