ATI’s Radeon X1650 XT graphics card

DON’T LET THE Radeon X1650 XT’s name fool you. Although the amalgamation of letters and numbers behind “Radeon” might lead you to believe this card is a direct heir of the notoriously poky Radeon X1600 XT, this puppy is much more potent than its predecessor. In fact, its GPU is more like two X1600 XTs fused together, with roughly twice the graphics processing power in nearly every meaningful sense. The X1650 XT has 24 pixel shader processors instead of 12; it has eight texturing units rather than four; and it can draw a healthy ocho pixels per clock, not just an anemic cuatro like the X1600 XT before it.

Those numbers may be the recipe for success for the Radeon X1650 XT, making it a worthy rival of the GeForce 7600 GT at around $149. If so, this product arrives not a second too soon. It seems like ATI hasn’t had a credible offering in this segment of the market since hooded flannel shirts were all the rage. Can the Radeon X1650 XT break the red team’s mid-range curse? Let’s have a look.


The Radeon X1650 XT

Meet the wild child
The Radeon X1650 XT’s unassuming appearance conceals its true personality. Under that pedestrian single-slot cooler lies a wildly transgressive graphics card, driven by a GPU that refuses to honor the boundaries of class or convention. The X1650 XT is part of the Radeon X1600 series, yet its graphics processor is not the RV530 silicon that has traditionally powered cards in that product line. The intrigue deepens when you examine this mysterious GPU, code-named RV560. Truth be told, this is actually the same graphics chip behind the Radeon X1950 Pro that we reviewed a couple of weeks ago, the RV570. For the X1650 XT, though, ATI has disabled portions of the chip and assigned it a new code name. If I recall correctly, this is the first time ATI has given two code names to the same piece of silicon. So basically, ATI has chucked the conventions for both video card names and GPU code names in recent weeks, and the Radeon X1650 XT is the result.

Not that there’s anything wrong with that.

In fact, the X1650 XT benefits from its upper-middle-class heritage. Its RV570 GPU (sorry, but I’m not calling it RV560) has had a portion of its on-chip resources deactivated, either because of faults in some parts of the chip or simply for the sake of product segmentation. This hobbled GPU can still take on the GeForce 7600 GT with its one good arm, though, thanks to 24 working pixel shaders, eight vertex shaders, and eight texture units/render back-ends. These rendering bits and pieces run at a GPU core clock of 575MHz. To keep card costs down, the Radeon X1650 XT has only a 128-bit path to memory (like the GeForce 7600 GT) and not 256 bits (like its big brother, the X1950 Pro). The X1650 XT’s 256MB of GDDR3 RAM runs at 675MHz.


Dual dual-link DVI ports flank the X1650 XT’s TV-out port

The X1650 XT’s cluster of ports befits a brand-new graphics card. The two dual-link DVI ports come with full support for HDCP, so they can participate in the copy-protection schemes used by the latest high-def displays.

If you’re driving a big display at high res with an X1650 XT, you may want to give it some help in the form of additional X1650 XT cards that run alongside it. That’s a distinct possibility thanks to the pair of internal CrossFire connectors on the top edge of the card. We’ve tested the X1650 XT in a dual-card CrossFire config, and ATI has confirmed for us that they plan to enable support for more than two cards in CrossFire using staggered connectors at some point in the future, although we don’t know much more than that.

That’s pretty much it for the Radeon X1650 XT’s basic specifications. Of course, it’s based on very familiar Radeon X1000-series GPU technology, with features and image quality that match everything up to the Radeon X1950 XTX. ATI says to expect cards at online retailers the week of November 13. The big remaining question is performance.

Test notes
We did run into a few snags in our testing, although none of them affected the Radeon X1650 XT. Most notably, when we tried to run a pair of GeForce 7600 GT cards in SLI, we encountered some odd image artifacts that we couldn’t eliminate. The artifacts didn’t appear to affect performance, so we’ve included results for the GeForce 7600 GT in SLI. If we find a resolution for the problem and performance changes, we’ll update the scores in this article.

Also, the 3DMark06 test results for the Radeon X1950 XTX CrossFire system were obtained using an Asus P5W DH motherboard, for reasons explained here. Otherwise, we used the test systems as described below.

Our testing methods
As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and the results were averaged.

Our test systems were configured like so (each row lists values for the Asus P5N32-SLI Deluxe, Intel D975XBX, and Asus P5W DH systems, in that order):

Processor Core 2 Extreme X6800 2.93GHz Core 2 Extreme X6800 2.93GHz Core 2 Extreme X6800 2.93GHz
System bus 1066MHz (266MHz quad-pumped) 1066MHz (266MHz quad-pumped) 1066MHz (266MHz quad-pumped)
Motherboard Asus P5N32-SLI Deluxe Intel D975XBX Asus P5W DH
BIOS revision 0204 BX97510J.86A.1073.2006.0427.1210 0801
0305
North bridge nForce4 SLI X16 Intel Edition 975X MCH 975X MCH
South bridge nForce4 MCP ICH7R ICH7R
Chipset drivers ForceWare 6.86 INF Update 7.2.2.1007
Intel Matrix Storage Manager 5.5.0.1035
INF Update 7.2.2.1007
Intel Matrix Storage Manager 5.5.0.1035
Memory size 2GB (2 DIMMs) 2GB (2 DIMMs) 2GB (2 DIMMs)
Memory type Corsair TWIN2X2048-8500C5 DDR2 SDRAM at 800MHz Corsair TWIN2X2048-8500C5 DDR2 SDRAM at 800MHz Corsair TWIN2X2048-8500C5 DDR2 SDRAM at 800MHz
CAS latency (CL) 4 4 4
RAS to CAS delay (tRCD) 4 4 4
RAS precharge (tRP) 4 4 4
Cycle time (tRAS) 15 15 15
Hard drive Maxtor DiamondMax 10 250GB SATA 150 Maxtor DiamondMax 10 250GB SATA 150 Maxtor DiamondMax 10 250GB SATA 150
Audio Integrated nForce4/ALC850 with Realtek 5.10.0.6150 drivers Integrated ICH7R/STAC9221D5 with SigmaTel 5.10.5143.0 drivers Integrated ICH7R/ALC882M with Realtek 5.10.00.5247 drivers
Graphics Radeon X1650 XT 256MB PCIe
with Catalyst 8.301-060926a-036790E-ATI drivers
Radeon X1900 XTX 512MB PCIe + Radeon X1900 CrossFire
with Catalyst 8.282-060802a-035515C-ATI drivers
Dual Radeon X1650 XT 256MB PCIe
with Catalyst 8.301-060926a-036790E-ATI drivers
Radeon X1800 GTO 256MB PCIe
with Catalyst 8.282-060802a-035722C-ATI drivers
Radeon X1950 XTX 512MB PCIe + Radeon X1950 CrossFire
with Catalyst 8.282-060802a-03584E-ATI drivers
Dual Radeon X1950 Pro 256MB PCIe
with Catalyst 8.291-060822a-036024C-ATI drivers
Radeon X1900 GT 256MB PCIe
with Catalyst 8.282-060802a-035722C-ATI drivers
Radeon X1900 XT 256MB PCIe + Radeon X1900 CrossFire
with Catalyst 8.282-060802a-035515C-ATI drivers
Radeon X1900 XT 256MB PCIe
with Catalyst 8.282-060802a-03584E-ATI drivers
Radeon X1950 Pro 256MB PCIe
with Catalyst 8.291-060822a-036024C-ATI drivers
Radeon X1900 XTX 512MB PCIe
with Catalyst 8.282-060802a-03584E-ATI drivers
Radeon X1950 XTX 512MB PCIe
with Catalyst 8.282-060802a-03584E-ATI drivers
BFG GeForce 7600 GT OC 256MB PCIe
with ForceWare 91.47 drivers
Dual BFG GeForce 7600 GT OC 256MB PCIe
with ForceWare 91.47 drivers
XFX GeForce 7900 GS 480M Extreme 256MB PCIe
with ForceWare 91.47 drivers
Dual XFX GeForce 7900 GS 480M Extreme 256MB PCIe
with ForceWare 91.47 drivers
GeForce 7900 GT 256MB PCIe
with ForceWare 91.31 drivers
Dual GeForce 7900 GT 256MB PCIe
with ForceWare 91.31 drivers
XFX GeForce 7950 GT 570M Extreme 512MB PCIe
with ForceWare 91.47 drivers
Dual XFX GeForce 7950 GT 570M Extreme 512MB PCIe
with ForceWare 91.47 drivers
GeForce 7900 GTX 512MB PCIe
with ForceWare 91.31 drivers
Dual GeForce 7900 GTX 512MB PCIe
with ForceWare 91.31 drivers
GeForce 7950 GX2 1GB PCIe
with ForceWare 91.31 drivers
Dual GeForce 7950 GX2 1GB PCIe
with ForceWare 91.47 drivers
OS Windows XP Professional (32-bit)
OS updates Service Pack 2, DirectX 9.0c update (August 2006)

Thanks to Corsair for providing us with memory for our testing. Their quality, service, and support are easily superior to no-name DIMMs.

Our test systems were powered by OCZ GameXStream 700W power supply units. Thanks to OCZ for providing these units for our use in testing.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults.

The test systems’ Windows desktops were set at 1280×960 in 32-bit color at an 85Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests.

We used the following versions of our test applications:

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Pixel-filling power
I’m inclined to chuck our fill rate table out of the window soon, because it’s becoming less relevant as programmable pixel shaders become more prevalent. Still, we’ll have a quick look at it here, since it helps illustrate what the Radeon X1650 XT has over the rest of the Radeon X1600 lineup.

In the table below, each card is followed by its core clock (MHz), pixels per clock, peak pixel fill rate (Mpixels/s), textures per clock, peak texel fill rate (Mtexels/s), effective memory clock (MHz), memory bus width (bits), and peak memory bandwidth (GB/s), in that order:
Radeon X1650 XT 575 8 4600 8 4600 1350 128 21.6
Radeon X1650 Pro 600 4 2400 4 2400 1400 128 22.4
GeForce 7600 GT 560 8 4480 12 6720 1400 128 22.4
All-In-Wonder X1900 500 16 8000 16 8000 960 256 30.7
Radeon X1800 GTO 500 12 6000 12 6000 1000 256 32.0
GeForce 7800 GT 400 16 6400 20 8000 1000 256 32.0
Radeon X1800 XL 500 16 8000 16 8000 1000 256 32.0
GeForce 7800 GTX 430 16 6880 24 10320 1200 256 38.4
Radeon X1900 GT 575 12 6900 12 6900 1200 256 38.4
GeForce 7900 GS 450 16 7200 20 9000 1320 256 42.2
GeForce 7900 GT 450 16 7200 24 10800 1320 256 42.2
Radeon X1950 Pro 575 12 6900 12 6900 1380 256 44.2
XFX GeForce 7900 GS 480M 480 16 7680 20 9600 1400 256 44.8
GeForce 7950 GT 550 16 8800 24 13200 1400 256 44.8
Radeon X1900 XT 625 16 10000 16 10000 1450 256 46.4
XFX GeForce 7950 GT 570M 570 16 9120 24 13680 1460 256 46.7
Radeon X1800 XT 625 16 10000 16 10000 1500 256 48.0
Radeon X1900 XTX 650 16 10400 16 10400 1550 256 49.6
GeForce 7900 GTX 650 16 10400 24 15600 1600 256 51.2
GeForce 7800 GTX 512 550 16 8800 24 13200 1700 256 54.4
Radeon X1950 XTX 650 16 10400 16 10400 2000 256 64.0
GeForce 7950 GX2 2 * 500 32 16000 48 24000 1200 2 * 256 76.8

Although it has nearly the same memory bandwidth as its little brother, the Radeon X1650 Pro, the X1650 XT can paint nearly twice as many pixels with its GPU each second. In fact, the X1650 XT’s pixel fill rate surpasses that of the GeForce 7600 GT. The GeForce, however, has more peak texturing capacity.
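The arithmetic behind those peak figures is simple multiplication; here's a quick sketch using the numbers from the table above:

```python
# Peak-rate arithmetic behind the fill rate table (a back-of-envelope sketch).

def peak_fill_rate(core_mhz, units_per_clock):
    """Peak fill rate in Mpixels/s (or Mtexels/s): core clock times units per clock."""
    return core_mhz * units_per_clock

def peak_bandwidth_mbs(effective_mem_mhz, bus_width_bits):
    """Peak memory bandwidth in MB/s: effective memory clock times bus width in bytes."""
    return effective_mem_mhz * bus_width_bits // 8

# Radeon X1650 XT: 575MHz core, 8 pixels and 8 textures per clock, 1350MHz memory, 128-bit bus
assert peak_fill_rate(575, 8) == 4600          # Mpixels/s and Mtexels/s
assert peak_bandwidth_mbs(1350, 128) == 21600  # MB/s, i.e. 21.6 GB/s

# GeForce 7600 GT: 560MHz core, 8 pixels but 12 textures per clock
assert peak_fill_rate(560, 8) == 4480   # trails the X1650 XT in pixel fill
assert peak_fill_rate(560, 12) == 6720  # leads it in texel fill
```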

All of this tech talk doesn’t mean much if the cards can’t deliver on their theoretical promise, however. This quick synthetic fill rate test checks to see how close they can get.

The X1650 XT can’t quite match the GeForce 7600 GT in these tests, but it does get very close to its own peak theoretical throughput when applying multiple textures. For its part, the BFG Tech 7600 GT OC card slightly outperforms the theoretical peak numbers we have in the table above because it’s running at higher-than-stock clock speeds. These things are merely academic at the end of the day, though, so let’s move on to real games.

Quake 4
In order to make sure we pushed the video cards as hard as possible, we enabled Quake 4’s multiprocessor support before testing.

There are an awful lot of LCD displays out there these days with 1280×1024 resolution. Although it’s tempting to try given the numbers, I don’t think I’d attempt to play through Quake 4 on a Radeon X1650 XT at these quality settings and that resolution. Based on my experience, frame rates can get rather choppy. Dropping back to 2X AA seems to yield more consistently acceptable performance, although that’s a compromise GeForce 7600 GT owners likely won’t have to make in this game. The Radeon X1650 XT CrossFire rig does quite well up to 1600×1200 resolution, but it appears to hit a wall at 2048×1536. The GeForce 7600 GT in SLI handles this crazy-high res more gracefully.

F.E.A.R.
We’ve used FRAPS to play through a sequence in F.E.A.R. in the past, but this time around, we’re using the game’s built-in “test settings” benchmark for a quick, repeatable comparison.

Ok, so F.E.A.R. at its max quality settings is admittedly a bit much for any single graphics card in this price range. The X1650 XT does match up well to the GeForce 7600 GT in single-card performance, but it suffers from the same odd issues that other Radeons do in CrossFire mode in this game.

Half-Life 2: Episode One
The Source game engine uses an integer data format for its high-dynamic-range rendering, which allows all of these cards to combine HDR rendering with 4X antialiasing.

This is a good-looking game, but it doesn’t have the most complex lighting, so it runs reasonably well on a mid-range graphics card. Here, the X1650 XT has a bit of an edge over the 7600 GT. I played this game some with a single Radeon X1650 XT, and it runs smoothly enough at 1280×1024 with 4X AA and HDR. Using those settings at 1600×1200 is pushing it, though.

The Elder Scrolls IV: Oblivion
We tested Oblivion by manually playing through a specific point in the game five times while recording frame rates using the FRAPS utility. Each gameplay sequence lasted 60 seconds. This method has the advantage of simulating real gameplay quite closely, but it comes at the expense of precise repeatability. We believe five sample sessions are sufficient to get reasonably consistent and trustworthy results. In addition to average frame rates, we’ve included low frame rates, because those tend to reflect the user experience in performance-critical situations. In order to diminish the effect of outliers, we’ve reported the median of the five low frame rates we encountered. We set Oblivion’s graphical quality settings to “Ultra High.” The screen resolution was set to 1600×1200, with HDR lighting enabled. 16X anisotropic filtering was forced on via the cards’ driver control panels.
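The method above boils down to averaging the averages and taking the median of the lows. A minimal sketch, with hypothetical per-run numbers (not our actual results):

```python
# Sketch of the Oblivion methodology: five 60-second FRAPS runs, reporting the
# average frame rate plus the median of the per-run lows.
# The (average, low) pairs below are made up for illustration.
from statistics import mean, median

runs = [(38.2, 21.0), (40.1, 19.5), (37.8, 24.3), (39.6, 20.2), (38.9, 18.7)]

avg_fps = mean(avg for avg, _ in runs)
low_fps = median(low for _, low in runs)  # median damps a single outlier run

print(round(avg_fps, 1), low_fps)  # 38.9 20.2
```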

The Radeon X1650 XT just barely edges out the GeForce 7600 GT in Oblivion. The performance numbers are awfully close, but you will appreciate the Radeon’s superior texture filtering in this game. This is another instance where our chosen comparative test resolution is a bit out of a single X1650’s league. I did try playing some Oblivion on the Radeon X1650 XT with these same quality settings at 1280×1024, and it ran quite well. I was impressed.

Ghost Recon Advanced Warfighter
We tested GRAW with FRAPS, as well. We cranked up all of the quality settings for this game, with the exception of antialiasing. However, GRAW doesn’t allow cards with 256MB of memory to run with its highest texture quality setting, so those cards were all running at the game’s “Medium” texture quality.

This one is really too close to call. Although the GeForce 7600 GT’s average frame rates in single and dual-card configurations are higher, its median low frame rates are very similar to the Radeon X1650 XT’s.

3DMark06

The 3DMark scores simply serve to confirm what we’ve already seen: the Radeon X1650 XT and GeForce 7600 GT are extremely closely matched in performance.

The 7600 GT takes two out of 3DMark’s three synthetic shader tests, but the X1650’s eight vertex shaders give it the edge in the complex vertex test.

Power consumption
We measured total system power consumption at the wall socket using an Extech power analyzer model 380803. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. We tested all of the video cards using the Asus P5N32-SLI SE Deluxe motherboard, save for the CrossFire system, which required a different chipset. For that system, we used an Intel D975XBX motherboard.

The idle measurements were taken at the Windows desktop. The cards were tested under load running Oblivion using the game’s Ultra Quality setting at 1600×1200 resolution with 16X anisotropic filtering.

The performance race between the Radeon X1650 XT and the GeForce 7600 GT is practically too close to call, but here we see some real differences. The X1650 XT draws about 10W more than the 7600 GT at idle and about 30W more under load. That’s really not surprising considering that the Nvidia G73 GPU is about 125 mm², while the RV570 is 230 mm². Oddly enough, the X1650 XT seems to draw a little more power than the higher-end Radeon X1950 Pro based on the same GPU.

Noise levels and cooling
We measured noise levels on our test systems, sitting on an open test bench, using an Extech model 407727 digital sound level meter. The meter was mounted on a tripod approximately 14″ from the test system at a height even with the top of the video card. The meter was aimed at the very center of the test systems’ motherboards, so that no airflow from the CPU or video card coolers passed directly over the meter’s microphone. We used the OSHA-standard weighting and speed for these measurements.

You can think of these noise level measurements much like our system power consumption tests, because the entire systems’ noise levels were measured, including CPU and chipset fans. We had temperature-based fan speed controls enabled on the motherboard, just as we would in a working system. We think that’s a fair method of measuring, since (to give one example) running a pair of cards in SLI may cause the motherboard’s coolers to work harder. The motherboard we used for all single-card and SLI configurations was the Asus P5N32-SLI SE Deluxe, which on our open test bench required an auxiliary chipset cooler. The Asus P5W DH Deluxe motherboard we used for CrossFire testing didn’t require a chipset cooler, so those systems were inherently a little bit quieter. In all cases, we used a Zalman CNPS9500 LED to cool the CPU.

Of course, noise levels will vary greatly in the real world along with the acoustic properties of the PC enclosure used, whether the enclosure provides adequate cooling to avoid a card’s highest fan speeds, placement of the enclosure in the room, and a whole range of other variables. These results should give a reasonably good picture of comparative fan noise, though.

We measured the coolers at idle on the Windows desktop and under load while playing back our Quake 4 nettimedemo. The cards were given plenty of opportunity to heat up while playing back the demo multiple times. Still, in some cases, the coolers did not ramp up to their very highest speeds under load. The Radeon X1800 GTO and Radeon X1900 cards, for instance, could have been louder had they needed to crank up their blowers to top speed. Fortunately, that wasn’t necessary in this case, even after running a game for an extended period of time.

You’ll see two sets of numbers for the GeForce 7950 GT below, one for the XFX cards with their passive cooling and another for the BFG Tech cards, which use the stock Nvidia active cooler. I measured them both for an obvious reason: they were going to produce very different results.

As we’ve noted before, small coolers tend to be noisy, and the X1650 XT can’t entirely escape that fact. This card really isn’t a terrible offender overall. It’s a little louder at idle than most, but not horrible. While running a game, the X1650 XT’s cooler doesn’t crank up to the kind of high-pitched whine you get from the GeForce 7600 GT, thank goodness. The difference shows on our decibel meter. CrossFire, on the other hand, is another story—especially under load. I’m not sure if there was some kind of odd harmonic effect between the two X1650 cards’ coolers or what. Readings on the decibel meter seemed to cycle up and down through a much broader range than we’ve seen from other coolers. I tried to pick a reasonable center of that range to include in our results, and it turned out to be louder than anything else we’ve recorded. I wouldn’t worry too much about that, though. Subjectively, the X1650 XT isn’t an especially loud card.

Overclocking
I had some trouble using ATI’s auto-overclocking utility to find the optimal clock speed in our Radeon X1950 Pro review, but I ran into no such issues with the X1650 XT. The utility settled on 621MHz core and 763MHz memory clocks for both single-card and CrossFire configs—pretty much maxing out the peak values for the manual sliders in the control panel—and both were stable at those speeds.
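Those settled clocks work out to modest gains over the stock 575MHz core and 675MHz memory speeds; a quick back-of-envelope check:

```python
# Overclocking headroom from the auto-tuner's settled clocks vs. stock speeds
# (stock and overclocked figures are from our testing above).
stock_core, stock_mem = 575, 675
oc_core, oc_mem = 621, 763

core_gain_pct = (oc_core - stock_core) / stock_core * 100
mem_gain_pct = (oc_mem - stock_mem) / stock_mem * 100

print(round(core_gain_pct), round(mem_gain_pct))  # 8 13
```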

Unfortunately, overclocking this thing to the apparent max in ATI’s control panel doesn’t get you much additional performance. The headroom does seem to be there, though, to take it up quite a ways.

Conclusions
The Radeon X1650 XT performs almost exactly on par with the GeForce 7600 GT overall. At long last, it establishes a performance balance between ATI’s Radeon X1000-series products and Nvidia’s GeForce 7-series products at $149. This same performance balance has existed for a while at other price points, and things are tighter than ever after the fall round of graphics card refreshes. In fact, I’m a little surprised to see ATI simply meeting Nvidia with a product that offers nearly equivalent performance to the GeForce 7600 GT. Typically, you’d expect some one-upmanship to kick in here. After all, the GeForce 7600 GT was a well-defined target; it’s been selling since March. Had ATI pushed the GPU or memory clocks a little further, they could have grabbed a decisive performance lead.

The respective strengths of these two competing graphics technologies remain more or less the same at $149 as elsewhere. GeForce 7 chips tend to be smaller and more power efficient, while the Radeon X1000 series sports a handful of additional features that can lead to higher image quality. I think those picky image quality differences tend to matter more in high-end graphics cards where advanced texture filtering and antialiasing modes will get more use. In this case, I wouldn’t give the Radeon X1650 XT a decisive edge over the GeForce 7600 GT on that basis. ATI may have the advantage, though, simply because the Radeon X1650 XT is a newer product with dual-link DVI outputs and support for HDCP.

The question of dual-card capability is an intriguing one. No CrossFire config we’ve tested, including the X1650 XT, seems to handle F.E.A.R. well. Overall, Nvidia’s SLI seems to be the more mature dual-GPU solution. On the other hand, we can’t really recommend SLI for the GeForce 7600 GT given the “green fringing” problem we’ve encountered. We haven’t seen a fix for this problem in the two months since we first reported it. I suspect not many folks are running GeForce 7600 GT cards in SLI, or we’d have a fix by now. That’s a shame, because whether you’re talking ATI or Nvidia, running a pair of $149 graphics cards in a dual-GPU config actually looks like a pretty decent value for the money.

Speaking of value, the X1650 XT’s value proposition is a tricky subject. If you happen to have a monitor whose optimal resolution is 1280×1024 or (shudder) lower, you really don’t need more graphics card than the Radeon X1650 XT in order to run today’s games at some fairly high quality settings. Dropping $149 for one of these cards will give you a monstrous improvement over the awful built-in graphics in most cheaper PCs, and you may not need to spend any more.

Then again, there are much better graphics cards to be had for just 50 bucks more, including the Radeon X1950 Pro and the GeForce 7900 GS, both of which feature 256-bit memory interfaces for roughly twice the bandwidth of the X1650 XT. That means they’ll run games with more antialiasing, higher quality settings, and better texture filtering without slowing down. The additional $50 will probably be money well spent for all but the most casual of gamers, especially over the long run as newer, more demanding games arrive.

Comments closed
    • madgun
    • 13 years ago

    might have been some yielding issues in the new 80nm process that limited the core clocks for the x1650xt

    • DrDillyBar
    • 13 years ago

    hmm, x1950 and newfangled CrossFire or x1900 XT for giggles?

    • flip-mode
    • 13 years ago

    Not to be an ass but…

    Those graphs are pretty congested and… is there any reason that we need to see uber high end cards and uber high end cards in SLI in this review? IMO, if you show the 7900GS, x1950-pro, 7600GT, x1650XT, and perhaps the x1600XT, and then each of those in dual-card config, then you’ve shown everything you need to. Who out there is thinking, “Hmm, it’s a toss up between the x1950xtx in crossfire or the x1650xt.”

    I wouldn’t say anything if the graphs weren’t so dang crowded.

      • mboza
      • 13 years ago

      What makes it worse is that this card is bottom or second bottom in almost every single graph, so there is no sense of what performance a step down would give.

      If you are going to show all the high end cards, it would have been nice to see some lesser cards, like the 7300GS and the X1600XT to fill out the lower end, or even last gen cards like a 6600 or 6800 for all of us wondering how much of a difference an upgrade might make.

    • MadManOriginal
    • 13 years ago

    Totally OT, but how come TR is the only website that consistently loses my login details?

    • droopy1592
    • 13 years ago

    Give me a passively cooled single slot midrange card and I’m all in.

      • dragmor
      • 13 years ago

      Your talking about the 7600GS

      • Flying Fox
      • 13 years ago

      I’m sure Asus and Gigabyte are thinking about it.

      • Damage
      • 13 years ago

      Why don’t you edit your post and take out your zip code and sort type? Would make the page look right and wouldn’t give away your location so much.

        • Dposcorp
        • 13 years ago

        Sorry Damage, done.
        #41, this may be the last AIW, so maybe a collectors item.

      • l33t-g4m3r
      • 13 years ago

      i bought the aiw for 299…. i think.
      but yeah, it is a better deal.

      • stmok
      • 13 years ago

      Eh, isn’t one of the consequences of the ATI buy-out, is the killing the AIW line? (That’s what I hear what AMD plans to do)

        • rgreen83
        • 13 years ago

        Actually i heard the x1900 AIW was the last AIW before even rumors of the AMD merger.

    • DreadCthulhu
    • 13 years ago

    At $150, this card is a poor deal – Newegg has a couple 7600GT in the $115-$120 range after MIR; even before rebates they are under $150.

      • rgreen83
      • 13 years ago

      I doubt this card will be at $149 for long, because as you said, it would be silly. The x1950 pro far outperforms it and can be had for ~$170 shipped, $20 for 50% graphics power and %100 more mem bandwidth, no contest.

    • Mr Bill
    • 13 years ago

    ten characters

    • Proesterchen
    • 13 years ago

    Personally, I have to wonder what the point of launching this card today was, given that they already had to postpone the launch by a couple of weeks before, and cards are only supposed to show up in another 2 weeks. Why not postpone the launch until you have the volume in the channel? (it’s not like this competes with a certain nVidia chipset poised to be launched in the meantime)

    Another thing, if RV560 == RV570 (for at least a sizeable part of the product run), one has to wonder about the state of the 80nm process their using. If they chose to have a down-marked card for salvage, and have huge chunks of the original die deactivated for it, errors might be all over the place and/or redundancy lacking severely.

    • adisor19
    • 13 years ago

    Umm where are the pr0n pics of the naked GPU ? 🙁

    Adi

      • Damage
      • 13 years ago

      In the X1950 Pro review.

    • unmake
    • 13 years ago

    When were hooded flannel shirts all the rage?

    • Peldor
    • 13 years ago

    New math on the power comparison?

    209W – 179W = “about 20W”?

      • BoBzeBuilder
      • 13 years ago

      Coffee doesn’t come cheap buddy.

      • Damage
      • 13 years ago

      Doh. Fixed.

    • king_kilr
    • 13 years ago

    #17, Not really ever since the X800/6800 series Nvidia has released better products first. ATi has been playing ketchup for a while.

      • Krogoth
      • 13 years ago

      Let me clearify, they both have winners and losers for this generation.

      The losers on the Nvidia front are: Geforce 73xx, Geforce 7900GTX (cost-wise), Geforce 7900GX2 and quad SLI.

      Nvidia’s winners are: Geforce 7950GX2, Geforce 7900GT, Geforce 7900GTO, Geforce 7600GT

      ATI front losers are: X1600, X1800XT (intially), X1900GT, X1900XTX and older CF.

      ATi’s winners are: X13xx, 256MB X1900XT, X1800GTO, X1800XT (later) and X1950XT.

    • flip-mode
    • 13 years ago

    Meh, I’m not impressed. It should have had a better cooler. It should have gone 600/700 to at least edge-out the 7600GT. Hell, they should have just put a full-fledged R570 on the card and let the 128-bit memory handle the market segmentation.

    I’d certainly pony up another $50 for the X1950-Pro.

    Thanks for the review Wasson.

    • MadManOriginal
    • 13 years ago

    Hmph, a bit underwhelming performance-wise. It’s on par with the 7600GT which is good but coming out this late in the cycle it should have a noticable advantage. Too bad they didn’t do a 256-bit memory bus, a little strange really since the same core on the x1950pro has it and I don’t think this is a drop-in replacement for previous x1600s on an identical PCB, that could have kicked it up a notch for sure. Now…if it was unlockable then we’d be talking.

    It looks like a good candidate for HTPCs in any case.

      • crabjokeman
      • 13 years ago

      y[

        • MadManOriginal
        • 13 years ago

        HDCP with some kind of performance. There are cheaper HDCP cards but they have worse performance.

    • R2P2
    • 13 years ago

    q[

      • d2brothe
      • 13 years ago

      Yes but they don’t have teh glitz of a dual card config :P…. plus u can’t buy half a more expensive card and get the other half later…

    • chakhay2000
    • 13 years ago

    Don’t you just hate it how ATI’s been playing catch up to Nvidia’s titans for an annoyingly long time now? Sigh. I miss the R300 days 🙁

      • Krogoth
      • 13 years ago

      Nvidia and ATI (DAMMIT) were on the same level for this generation, nether platform was clearly better then the other.

        • Flying Fox
        • 13 years ago

        Performance wise they may be neck and neck, but the 7600GT still wins in more of the quantitative tests (the ones that deal with numbers). But once power consumption is added into the consideration the 7600GT simply beats it in the performance per watt department. Good job on the new fan though as it beats the 7600GT, so that’s good news. However, I would have liked it to get a passive version. I am definitely surprised that the wattage at load is more than the full RV570. :whoa:

        Gigabyte and Asus, let’s get to work.

        PS. Now, how does this thing fold, if at all?

    • MadManOriginal
    • 13 years ago

    No link…it’s the ultimate paper launch because it’s only referenced electronically not even on paper!

    I’m sure it will be fixed but in the meantime click the ‘Tech Report’ graphic to the right or the link in post #4

    • Spotpuff
    • 13 years ago

    Maybe it’s just me but I can’t find a link to the article.

      • flip-mode
      • 13 years ago

      Yeah, me neither. This is cruel! Thanks to Spuppy below!

    • sigher
    • 13 years ago

    I like this line:
    “Thanks to Corsair for providing us with memory for our testing. Their quality, service, and support are easily superior to no-name DIMMs.”

    Such accolades, are they better than chewing glass too? and better than a punch in the face? easily superior than no-name, haha

    • Nullvoid
    • 13 years ago

    It’s funny how even after all their attempts they still can’t produce a card better than the 7600gt at the same price point :/

      • A_Pickle
      • 13 years ago

      It really depends on what your needs are. Videophiles will find the ATI cards to be far superior to Nvidia’s offerings, and certain gamers find ATI image quality to be far superior. I notice the in-game differences, my X800 did anti-aliasing better at 4x than my current Geforce Go 7900 GS does at 4x. And Half-Life 2 remembered my graphics settings with my X800, but it doesn’t with my 7900. And I can’t do high dynamic range alongside anti-aliasing.

      I wish I could’ve gotten a mobility X1800… but hey. :/

        • Nullvoid
        • 13 years ago

        I’ve always felt that if video playback is your main concern you might aswell just get an x1300 class card and save some pennies. But I guess you’re right.

      • swaaye
      • 13 years ago

      It’s not that they can’t. They just don’t want to create a card that costs them more. But, there are better values than this out there anyway. This is just another hundred gallons of water thrown into the vast, flooded video card market. They probably really like the idea of being able to sell those defective chips and want to end RV530 production. X1600 was sure pitiful.

    • Jigar
    • 13 years ago

    Good Article “Damage” Thanks.

    • kvndoom
    • 13 years ago

    Dude, where’s my link???

      • A_Pickle
      • 13 years ago

      “_[

        • Nullvoid
        • 13 years ago

        oh they’re taunting us 🙁

      • sigher
      • 13 years ago

      Click the “tech report” pic next to it.
