A first look at the Radeon HD 4850

The battle between AMD and Nvidia for the hearts, minds, and disposable income of PC enthusiasts is starting to get scrappy. First, AMD scheduled a press event to distract folks from Nvidia’s GeForce GTX 200 series launch, which Nvidia subsequently pulled forward. Perhaps in retaliation, the green team then divulged plans to unleash a faster version of its GeForce 9800 GTX. This GeForce 9800 GTX+ will sell for only $229, dropping the vanilla 9800 GTX to $199—conveniently stepping on the price point of AMD’s next-gen Radeon HD 4850 graphics card. Lest it be outmaneuvered, and because cards are already available for sale, AMD has decided to lift the curtain on the 4850 a little early. Keep reading for our first look at AMD’s new mid-range Radeon.

The Radeon HD 4850 revealed, sort of

While we’re limited in what we can say about the Radeon HD 4850 and its shiny new graphics processor, the card itself is fair game. And here she is:

This particular card comes from Sapphire, which provides a sticker for the reference design’s cooling shroud. As you can see, the card is a single-slot design, just like the Radeon HD 3850 that came before it. Thanks to its svelte cooler, the 4850 won’t cannibalize adjacent expansion slots. The slim design should also make it easier for users to assemble three- and four-way CrossFire configurations.

The Radeon HD 4850 has only a single six-pin PCI Express power connector, which bodes well for the card’s power consumption. For what it’s worth, the GeForce 9800 GTX and GTX+ each have a pair of PCIe power plugs.

The 4850 runs its core at 625MHz and comes equipped with 512MB of GDDR3 memory clocked at 993MHz, just shy of 1GHz. AMD’s Catalyst Control Center software reports 59GB/s of memory bandwidth, and a little reverse math from those two figures suggests the 4850’s path to memory is 256 bits wide.
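
For the curious, here is a quick sketch of that reverse math. It assumes double-data-rate signaling on the reported 993MHz memory clock; the figures and rounding are ours, back-of-the-envelope only.

```python
# Back-of-the-envelope check of the memory bus width, assuming
# double-data-rate (DDR) signaling on the reported 993MHz memory clock.
reported_bandwidth_gbs = 59.0   # GB/s, as reported by Catalyst Control Center
memory_clock_mhz = 993          # MHz, as reported

transfers_per_sec = memory_clock_mhz * 1e6 * 2            # two transfers per clock with DDR
bytes_per_transfer = reported_bandwidth_gbs * 1e9 / transfers_per_sec
bus_width_bits = bytes_per_transfer * 8
print(f"Implied bus width: ~{bus_width_bits:.0f} bits")   # ~238 bits, i.e. a 256-bit bus in practice
```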

Our testing methods

As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and the results were averaged.

Our test systems were configured like so:

Processor: Core 2 Extreme QX9650 3.0GHz
System bus: 1333MHz (333MHz quad-pumped)
Motherboard: Gigabyte GA-X38-DQ6
BIOS revision: F9a
North bridge: X38 MCH
South bridge: ICH9R
Chipset drivers: INF update 8.3.1.1009, Matrix Storage Manager 7.8
Memory size: 4GB (4 DIMMs)
Memory type: 2 x Corsair TWIN2X20488500C5D DDR2 SDRAM at 800MHz
CAS latency (CL): 5
RAS to CAS delay (tRCD): 5
RAS precharge (tRP): 5
Cycle time (tRAS): 18
Command rate: 2T
Audio: Integrated ICH9R/ALC889A with RealTek 6.0.1.5618 drivers
Graphics:
Radeon HD 2900 XT 512MB PCIe with Catalyst 8.5 drivers
Asus Radeon HD 3870 512MB PCIe with Catalyst 8.5 drivers
Radeon HD 3870 X2 1GB PCIe with Catalyst 8.5 drivers
Radeon HD 4850 512MB PCIe with Catalyst 8.501.1-080612a-064906E-ATI drivers
MSI GeForce 8800 GTX 768MB PCIe with ForceWare 175.16 drivers
XFX GeForce 9800 GTX 512MB PCIe with ForceWare 175.16 drivers
XFX GeForce 9800 GX2 1GB PCIe with ForceWare 175.16 drivers
GeForce GTX 260 896MB PCIe with ForceWare 177.34 drivers
GeForce GTX 280 1GB PCIe with ForceWare 177.34 drivers
Hard drive: WD Caviar SE16 320GB SATA
OS: Windows Vista Ultimate x64 Edition
OS updates: Service Pack 1, DirectX March 2008 update

Thanks to Corsair for providing us with memory for our testing. Their quality, service, and support are easily superior to those of no-name DIMMs.

Our test systems were powered by PC Power & Cooling Silencer 750W power supply units. The Silencer 750W was a runaway Editor’s Choice winner in our epic 11-way power supply roundup, so it seemed like a fitting choice for our test rigs. Thanks to OCZ for providing these units for our use in testing.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.

We used the following versions of our test applications:

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Synthetic tests

3DMark Vantage’s color fill test is typically limited by memory bandwidth, but the Radeon HD 4850 still manages to push more pixels per second than the GeForce 9800 GTX, which has over 70GB/s of peak memory bandwidth to the Radeon’s 59GB/s.
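
As a rough sanity check on why this test is bandwidth-bound, here is a sketch of the fill-rate ceilings those bandwidth figures imply. It assumes a 32-bit (4-byte) render target and a single write per pixel; the numbers are illustrative ceilings, not measured results.

```python
# Rough ceiling for a bandwidth-limited color fill test, assuming a 32-bit
# (4-byte) render target and one write per pixel. Tests that blend must also
# read the target, which roughly halves these figures.
def fill_ceiling_gpixels(bandwidth_gb_per_sec, bytes_per_pixel=4):
    return bandwidth_gb_per_sec / bytes_per_pixel

print(fill_ceiling_gpixels(59))  # Radeon HD 4850:   ~14.8 Gpixels/s
print(fill_ceiling_gpixels(70))  # GeForce 9800 GTX: ~17.5 Gpixels/s
```

Note that the ceiling actually favors the GeForce, which is what makes the Radeon's higher measured result worth pointing out.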

Moving to RightMark’s fill rate test, which uses 8-bit integer texture formats, the Radeon falls behind the 9800 GTX once multiple textures are applied.

Vantage’s texture fill rate test uses FP16 textures, and in it, the 4850 really shines. Not only does the card push significantly more texels per second than the 9800 GTX, it just edges out Nvidia’s new flagship GeForce GTX 280.

The 4850’s filtering performance scales predictably here. Sure, the Radeon may trail the 9800 GTX with each filtering type, but its filtering performance is vastly improved over the Radeon HD 3870.

Through all but one of 3DMark’s synthetic shader tests, the Radeon HD 4850 fares extremely well. Only the GPU cloth test seems to give the 4850 trouble, and even then, it’s still significantly faster than the other Radeons. Otherwise, the Radeon HD 4850 is faster than the GeForce 9800 GTX, particularly in the parallax occlusion mapping and Perlin noise tests. In the latter, the Radeon even outguns the GeForce GTX 260.

Call of Duty 4: Modern Warfare

We tested Call of Duty 4 by recording a custom demo of a multiplayer gaming session and playing it back using the game’s timedemo capability. Since these are high-end graphics configs we’re testing, we enabled 4X antialiasing and 16X anisotropic filtering and turned up the game’s texture and image quality settings to their limits.

We’ve chosen to test at 1680×1050, 1920×1200, and 2560×1600—roughly 1.8, 2.3, and 4.1 megapixels, respectively—to see how performance scales.
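
Those megapixel figures come straight from the resolutions themselves; the arithmetic is trivial if you want to check it:

```python
# Pixel counts for the tested display resolutions.
resolutions = [(1680, 1050), (1920, 1200), (2560, 1600)]
for width, height in resolutions:
    print(f"{width}x{height}: {width * height / 1e6:.1f} megapixels")
# 1680x1050: 1.8 MP, 1920x1200: 2.3 MP, 2560x1600: 4.1 MP
```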

The Radeon HD 4850 is an absolute monster in Call of Duty 4, matching the performance of the dual-GPU 3870 X2. That puts the 4850 comfortably ahead of the GeForce 9800 GTX at each resolution we tested. Heck, the Radeon even manages to hang with the GeForce GTX 260 until we hit 2560×1600.

Half-Life 2: Episode Two

We used a custom-recorded timedemo for this game, as well. We tested Episode Two with the in-game image quality options cranked, plus 4X antialiasing and 16X anisotropic filtering. HDR lighting and motion blur were both enabled.

In Episode Two, the Radeon HD 4850 maintains its lead over the GeForce 9800 GTX. Note the huge jump in performance over AMD’s last mid-range offering, the Radeon HD 3870.

Enemy Territory: Quake Wars

We tested this game with 4X antialiasing and 16X anisotropic filtering enabled, along with “high” settings for all of the game’s quality options except “Shader level,” which was set to “Ultra.” We left the diffuse, bump, and specular texture quality settings at their default levels, though. Shadows and smooth foliage were enabled, but soft particles were disabled. Again, we used a custom timedemo recorded for use in this review.

Quake Wars allows the 4850 to extend its lead over the GeForce 9800 GTX. The Radeon delivers an impressive 50 frames per second at 2560×1600—twice the 9800 GTX’s rate. More impressive, however, is the fact that the 4850 is within striking distance of the GeForce GTX 260, which costs twice as much.

Crysis

Rather than use a timedemo, we tested Crysis by playing the game and using FRAPS to record frame rates. Because this way of doing things can introduce a lot of variation from one run to the next, we tested each card in five 60-second gameplay sessions.
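
Because FRAPS runs vary, each card's numbers come from multiple sessions. The sketch below shows one way per-session data like this can be reduced to an average frame rate and a median low; the sample values are placeholders, and the script is an illustration rather than the exact tooling behind this review.

```python
# Reducing several FRAPS gameplay sessions to summary figures.
# The per-second FPS samples below are illustrative placeholders;
# in practice each card gets five 60-second sessions.
from statistics import mean, median

sessions = [
    [41, 38, 44, 36, 40],  # session 1 (per-second FPS samples, truncated)
    [39, 42, 37, 43, 41],  # session 2
    [40, 45, 38, 42, 39],  # session 3
]

average_fps = mean(fps for session in sessions for fps in session)
median_low = median(min(session) for session in sessions)  # median of each session's lowest FPS
print(f"Average: {average_fps:.1f} FPS, median low: {median_low:.1f} FPS")
```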

Also, we’ve chosen a new area for testing Crysis. This time, we’re on a hillside in the Recovery level, having a firefight with six or seven of the bad guys. As before, we tested at two different settings: once with the game’s “High” quality presets and again with its “Very high” ones.

Crysis gives us our first look at AMD’s newest CrossFire couplet, whose performance flirts with that of the GeForce GTX 280. Running a single card apiece, the Radeon HD 4850 and GeForce 9800 GTX look evenly matched: the GeForce is quicker with Crysis’ “High” quality preset, while the Radeon takes the lead if you crank the eye candy all the way up.

Assassin’s Creed

There has been some controversy surrounding the PC version of Assassin’s Creed, but we couldn’t resist testing it, in part because it’s such a gorgeous, well-produced game. And, hey, we were curious to see the performance picture for ourselves. The originally shipped version of the game can take advantage of the Radeon HD 3870 GPU’s DirectX 10.1 capabilities to get a performance boost with antialiasing, and as you may have heard, Ubisoft chose to remove the DX10.1 path in a subsequent update. We chose to test without this patch, leaving DX10.1 support intact.

We used our standard FRAPS procedure here, five sessions of 60 seconds each, while free-running across the rooftops in Damascus. All of the game’s quality options were maxed out, and we had to edit a config file manually in order to enable 4X AA at this resolution. Eh, it worked.

The Radeon HD 4850 slots in between the GeForce GTX 280 and 260 in Assassin’s Creed, putting it well ahead of the 9800 GTX. Note that the Radeon has the same median low frame rate as Nvidia’s latest high-end behemoth.

Race Driver GRID

We tested this absolutely gorgeous-looking game with FRAPS, as well, and in order to keep things simple, we decided to capture frame rates over a single, longer session as we raced around the track. This approach has the advantage of letting us report second-by-second frame-rate results.
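
For readers curious how a single long FRAPS log turns into a second-by-second plot, here is a minimal sketch. It assumes the log has been exported as a list of per-frame durations in milliseconds; the function name and example output are ours, purely for illustration.

```python
# Turning one long FRAPS log of per-frame durations (in milliseconds) into
# a second-by-second frame-rate series. Illustrative sketch only.
from collections import Counter

def fps_per_second(frame_times_ms):
    """Count how many frames complete within each elapsed second of the run."""
    counts = Counter()
    elapsed_ms = 0.0
    for frame_time in frame_times_ms:
        elapsed_ms += frame_time
        counts[int(elapsed_ms // 1000)] += 1
    return [counts[second] for second in sorted(counts)]

# Returns something like [62, 58, 61, 45, ...]: one frame count per elapsed second.
```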

GRID finds the 4850 between Nvidia’s GeForce 200 series cards yet again. Its minimum frame rate may be a little lower than that of the GTX 260, but the Radeon still has a healthy cushion over the GeForce 9800 GTX.

3DMark Vantage

And finally, we have 3DMark Vantage’s overall index. Note that we used the “High” presets for the benchmark rather than “Extreme,” which is what everyone else seems to be using. Somehow, we thought frame rates in the fives were low enough.

The GeForce 9800 GTX is simply no match for the Radeon HD 4850 in 3DMark Vantage. Here, the Radeon delivers nearly the same performance as AMD’s CrossFire-on-a-stick 3870 X2.

Power consumption

We measured total system power consumption at the wall socket using an Extech power analyzer model 380803. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. The cards were plugged into a motherboard on an open test bench.

The idle measurements were taken at the Windows Vista desktop with the Aero theme enabled. The cards were tested under load running Half-Life 2 Episode Two at 2560×1600 resolution, using the same settings we did for performance testing.
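
Since these are total-system readings at the wall, card-to-card comparisons are best read as deltas between configurations on the same test rig. The sketch below shows that arithmetic in its simplest form; the PSU efficiency figure and the wattages are assumptions for illustration, not measurements from this review.

```python
# Rough card-to-card comparison from wall-socket readings on the same rig.
# The PSU efficiency and wattages below are illustrative assumptions,
# not measurements from this review.
def card_level_delta_watts(system_a_watts, system_b_watts, psu_efficiency=0.80):
    """Approximate difference in card power draw implied by two wall readings."""
    return (system_a_watts - system_b_watts) * psu_efficiency

print(card_level_delta_watts(285, 260))  # -> 20.0 W (hypothetical readings)
```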

The Radeon HD 4850’s power consumption is lower than that of the GeForce 9800 GTX both at idle and under load. Interestingly, though, the card actually consumes more power than the GTX 200 series at idle.

GPU temperatures

Per your requests, we’ve added GPU temperature readings to our results. We captured these using AMD’s Catalyst Control Center and Nvidia’s nTune Monitor, so we’re basically relying on the cards to report their temperatures properly. In the case of multi-GPU configs, we only got one number out of CCC. We used the highest of the numbers from the Nvidia monitoring app. These temperatures were recorded while running the “rthdribl” demo in a window. Windowed apps only seem to use one GPU, so it’s possible the dual-GPU cards could get hotter with both GPUs in action. Hard to get a temperature reading if you can’t see the monitoring app, though.

AMD’s use of a single-slot cooler for the 4850 yields the highest load GPU temperatures of the lot. The GeForce 9800 GTX runs more than 25 degrees cooler.

Conclusions

It’s been a long time since a Radeon was the graphics card of choice at the all-important $199 price point, but the HD 4850 looks like it might have the title locked up. The current GeForce 9800 GTX is simply no match for AMD’s latest mid-range offering, and Nvidia’s surprise, the GeForce 9800 GTX+, has quite a bit of ground to make up if it hopes to be competitive. We’ll have a full work-up of the GTX+ soon, of course, but the cards only just arrived this morning.

While it would be hasty to draw too many conclusions before we have a better grasp of the GeForce 9800 GTX+’s performance, and before AMD fully reveals its Radeon HD 4000 series, one thing is certain: the graphics war looks more competitive now than it’s been in a very long time. That’s ultimately a good thing for consumers, especially since AMD looks keen to take the fight aggressively to mid-range products that most consumers can afford.

Comments closed
    • SNM
    • 11 years ago

    So apparently the NDA has been lifted on the 4870 and the architectural details (or at least AnandTech has put their review up). Do we get more TR goodness?

    • Bensam123
    • 11 years ago

    Good review, as always I’m digging the common lingo and lame jokes only techies can get. >_<

    No pictures without the heatsink on?

    This might be a selfish request as well, but reading the Geforce 280 review and seeing the card without the heatsink, it would be nice to see the heatsink flipped over next to it to see what it all covers. It’d also be nice if the kind of material the heatsink was made out of was mentioned. Then again I like replacing a lot of my heatsinks with 3rd party ones if the originals are crappy.

    Also, what’s the heatsink covering towards the edge of the card with the power connector (guessing voltage regulators or something) and is it vital to be covered?

      • ChronoReverse
      • 11 years ago

      They’re VRMs and it’s part of the single HSF assembly.

      Cooling them isn’t absolutely essential and frankly, the current stock cooler is actually heating them IMO when I examine my own card.

    • fpsduck
    • 11 years ago

    The Red Strikes Back!

    I’m about to buy HD 3650
    but I think I’ll wait for a new HD 4650 (if it does exist).

      • moritzgedig
      • 11 years ago

      good for you

    • Voldenuit
    • 11 years ago

    I’d like to see independent thermal measurements, not just readings reported off (uncalibrated) silicon. Maybe a laser thermometer would work (but how to get consistent readings? I’ll leave that up to TR to decide).

    Another useful bit of information would be the effect of GPUs on case, system and CPU temps. With a standardised test rig, readers would then be able to judge how much a GPU affects their system temps. Simple temperature readings don’t tell us enough, because a 110W GPU at, say 85C might be putting out less heat than a 150W GPU at 70C.

      • Bensam123
      • 11 years ago

      I’d also like to see if dipping the card in liquid nitrogen increases frame rates, oh and also some voltage mods.

      Do temps really matter that much? If they were testing custom heatsinks then I’d agree, but, unfortunately, they don’t do many heatsink reviews around here. 🙁

      • Saber Cherry
      • 11 years ago

      A 110W card puts out 110W of heat and a 150W card puts out 150W of heat, so there is no reason to do such tests except to verify that thermodynamics is locally working as expected. If the card exhausts out of the case, it will not add that heat to the case; otherwise, it will. The GPU temperature is irrelevant to anything except stability, life expectancy, and overclocking headroom of the card.

      I agree about external verification of GPU temperature, as the internal temperature sensors are not known to be very accurate. But it’s not actually possible to measure the peak temperature (near the middle of the GPU area), as this is one of those cases where taking the measurement (of the silicon, or ceramic chip package, or whatever) would interfere a lot with the heatsink and thus render the results invalid.

        • ssidbroadcast
        • 11 years ago

        Can’t you just point one of those laser-point thermometers to the heatsink? Mechanics use them in autoshops, and they’re quite accurate until you point them at the open sky.

          • Saber Cherry
          • 11 years ago

          Infrared ranged thermometers are very accurate at what they measure, but the heatsink temperature does not really tell you anything. The silicon is much hotter than the heatsink, and the heatsink base is much hotter than the heatsink surface. The temperature of the heatsink surface is related to heatsink geometry, ambient temperature, airflow, and so many other variables that it tells you nothing about the temperature of the core (except giving a lower bound).

          If you have a big copper heatsink with a fan and a small aluminum heatsink without a fan, the small heatsink will be far hotter even if the two cores were the same temperature.

          • Voldenuit
          • 11 years ago

          I agree, laser thermometer is the way to go.

          I know some sites that measure the exhaust temp of gpus (limited to dual slot exhausting coolers). Quite a few also measure the rear of the pcb under the GPU socket.

          Another thing that was missing from the 4850 review were noise measurements. Understandable, given the rush when the NDA was moved forward, but it would be nice to revisit this now.

            • Forge
            • 11 years ago

            I can tell you from firsthand experience that 1. it won’t be in any way meaningful till ATI updates the drivers to include Powerplay and thus fan profiles and also 2. idle will be lost far below DamageLabs’ noise floor, and possibly load also.

        • Voldenuit
        • 11 years ago

        "A 110W card puts out 110W of heat and a 150W card puts out 150W of heat"

        Except that the TDP between two manufacturers is not comparable (witness Intel vs AMD). And even if they are accurate, there is no guarantee that the maximum TDP is met (or exceeded) at a given workload. Running F@H on a 4850 vs a 9800GTX will probably give different temperatures than running Crysis on the same two cards.

        You also hit the nail on the head when you mention dual slot and exhaust coolers. A different proportion of heat will be dumped into the case with these coolers, and it would be useful to be able to measure that objectively.

          • Saber Cherry
          • 11 years ago

          l[

            • Voldenuit
            • 11 years ago

            Then you would have seen that the power draw measured by xbitlabs does not always correspond to the quoted manufacturer “TDP”*, either.

            * More often (maximum) power draw is quoted, more to give PSU/motherboard requirements than to provide quantitative information on heat output. And these are usually conservative estimates.

    • michael_d
    • 11 years ago

    I would like to see test results with CrossFireX utilizing 3 or 4 GPUs on Phenom 9850 machine.

    • Forge
    • 11 years ago

    Oh, and something I haven’t really seen addressed anywhere: the idle and load temps on 4850 being not bad, and in fact quite good.

    The 4850’s load temp IME is around 80C. This is not unreasonable, and is in fact lower than most of the other current cards. My 8800GT had a higher load temp, even with a dual-slot cooler.

    The 4850’s idle temps are not right yet. Powerplay isn’t working yet. The 4850 is putting out way more heat at idle than it will be in a month or two. AMD rushed the 4850 a bit to try and stink up the GT200 launches, and I don’t mind, since it got me the 4850 a few weeks earlier. It’s running hot, and performance isn’t 100%, but a lot of that should be fixed in the next week when the 4870 rolls out, as that’s when both 4850+4870 were planned to launch.

    What’s actually nice about the 4850 thermals is that the fan *is* spinning up at load, and load temps are only a few C above idle temps. Once the idle temps get fixed, this is going to be a very very sweet little card.

      • Voldenuit
      • 11 years ago

      So your idle and load temps are both around 80C? I’ve seen reviews that stated idle temps in the 50-60C range. What version Catalyst drivers are you using?

      Also, have you tried remounting your heatsink? There’ve been many reports that this has been helpful *cough*Macbook*cough.

      I’m waiting for mine to arrive in the mail -_-.

        • Forge
        • 11 years ago

        Yep. Running the 4850 hotfix drivers only (despite the naming, they are a full and complete driver release, CCC and all). I was just running F@H GPU and watching a movie, I shut both down when I read your comment. As the GPU utilization dropped, so did the fan speed. From about 29%/2000rpm to 14%/1100rpm (not doing the math, just reading off from GPU-Z). Temps have settled slowly from 82C to 75C, at around the same speed the fan speed dropped. Yesterday the delta was much smaller, but it’s hot and humid so I have the AC running today.

          • Meadows
          • 11 years ago

          2000 RPM sounds good (acoustics-wise) so the limitation might make sense there. My custom GPU cooling can’t run faster than about 82 percent (it’s the same between 82 and 100) and that’s somewhere around 1550 RPM. Still cooler than this, though, by a healthy margin of 20 degrees. And it’s summer here and all, and my case would be warm no matter the season.

        • Forge
        • 11 years ago

        Oh yes, and I did also remount the HSF. I found quite nice Shin Etsu-looking paste on the core and rubbery white pads on the ram. Replaced the core goop with AS5 and reattached the whole lot. I think the loosening and then retightening of the screws on the core bracket makes more difference than anything else. The goop on the core was good stuff and in proper amount, but the four spring-loaded screws on the bracket were a bit loose.

      • zgirl
      • 11 years ago

      Really, 80C? I cooked an 8800GTS with a dual-slot cooler because it would heat up to the low 80s when the fan wouldn’t spin up under load. With RivaTuner and a new card I got it down to 55ish idle, mid to upper 60s under load. No issues with the card since.

      To me 80C seems too hot. Plus it doesn’t help that I have everything in a Shuttle SN25P.

      The only issue I’ve seen with the card is that it runs a bit hot.

      • boing
      • 11 years ago

      Does a fully functioning PowerPlay require a newer card or just newer drivers?

    • SecretMaster
    • 11 years ago

    As much as the 4850 holds promise, I think I’ll wait for a card with a dual-slot cooler (as people said earlier). Those temperature readings are waaaay too high for my tastes. Especially when I’ve been eyeing the Palit 9600GT, which doesn’t even break 55 at load.

      • sigher
      • 11 years ago

      There are reports the temps are high because the drivers don’t yet have the right code to manage the fan, but the question is whether that’s bad or good news. It might be pretty bad in that it means more noise down the line, and that ATI first has to go through 8 months of horrible driver bugs to get it right.

    • Lans
    • 11 years ago

    #211, of course it does but what you said later is true. As long as not too much of the heat from GPU stays in case (either due to overall case air/heat flow design or design of GPU cooler) then all is well.

    Although, I would like to have more margin should GPU fan slow (normal wear and tear) or die plus added heat means more work to get rid of it.

    I personally just like to build very quiet to silent (if possible) systems so that is why it is a larger issue to me. 🙂

      • Voldenuit
      • 11 years ago

      http://www.techpowerup.com/reviews/MSI/HD_4850/22.html

    • crazybus
    • 11 years ago

    I’ll make a point that some people seem to be missing.

    *[

      • ssidbroadcast
      • 11 years ago

      crazybus is on point.

    • Lans
    • 11 years ago

    The things holding me back from getting a HD 4850 right now (seen it on newegg.com) is: GDDR5 (maybe only for HD 4870?) and the load temperature. I think AIB will come up with something soon and replace that stock cooling solution. 🙂

    I personally don’t mind dual-slot cooling solutions as I’ll almost never go for multi card solutions and haven’t run into space problem. But I suppose why go for dual slot when single slot is okay? Anyways, if HD 3870 was any indication, we should see better designs soon. 🙂

    Also, theinquirer.net pit a 9600 GT against an HD 4850 and there was no comparison (to me). I suppose that is useful given how popular the 8800 GT / 9600 GT was / is. 🙂 I would wait for an HD 4650 (guessing based on HD 3xxx naming) or to see Nvidia’s newer mid- to low-range parts before considering getting an 8800 / 9600, though.

    • ssidbroadcast
    • 11 years ago

    This is the first time in a long while I’ve seen an article break the 200-comment barrier. What a fiery issue!

      • eitje
      • 11 years ago

      or, perhaps a lot of people are making some good money drumming up conversation. 😉

        • ssidbroadcast
        • 11 years ago

        Mm? More like a /[

          • eitje
          • 11 years ago

          i was talking about suspected shills in our midst, but hey – as long as someone’s making money!

            • willyolio
            • 11 years ago

            well, most of the posts do seem to be actually discussing the card/review…

      • Krogoth
      • 11 years ago

      I suspect it is because the 4850 is the first GPU to generate considerable interest since 8800GT. In addition, it is from AMD who hasn’t made anything compelling since X19xx family.

    • Clint Torres
    • 11 years ago

    What is wrong with you people?? Why does everyone have to take sides? These are products, not religions. Gimme a damn break!

    BTW, Tide kicks a$$ and anyone who uses Cheer is an idiot!

      • ew
      • 11 years ago

      Ya, but Cheer has a better price/performance ratio. As a hardcore gamer I just don’t need the sheer cleaning power that is Tide.

        • bthylafh
        • 11 years ago

        Purple! No, Green!

      • Meadows
      • 11 years ago

      Bad analogy. You may take products seriously but you mustn’t take religion seriously.

        • Grigory
        • 11 years ago

        Haha, nice one! At least products are about things that exist! 🙂

      • ludi
      • 11 years ago

      Ah, so you’re lacking in “Cheer”. That explains plenty.

    • DRAKLOR 4000
    • 11 years ago

    gotta give it to those guys at AMD. good job!

    • 0g1
    • 11 years ago

    Cant wait for 4870 reviews next week and availability around 8th July.

      • 0g1
      • 11 years ago

      And X2 in 2 months :D.

    • Krogoth
    • 11 years ago

    It looks like AMD *cough* ATI *cough* embarrasses Nvidia for this round. 4850 is a genuine 9600GT and 8800GT killer.

    Their strategy of shifting R&D resources toward smaller chips and multi-chip solutions is paying off. Remember how, long ago, the infamous FUDzilla said the Rxxx family would be multi-chip?

    It is all the more ironic that AMD’s CPU division tried to pull off the same single-monolith philosophy (Phenom versus Q6xxx) that Nvidia tried this round, and ran into problems.

      • PRIME1
      • 11 years ago

      Except that the 8800GT costs $130 less and has been out for a long time. Maybe it’s more embarrassing that it took ATI this long to catch up.

      Also their moment in the sun did not even last a day.
      http://www.firingsquad.com/hardware/amd_radeon_hd_4850_geforce_9800_gtx+/default.asp

        • Krogoth
        • 11 years ago

        Just remove those green-shaded glasses. It is making you look really silly.

        $130 less? WTF are you smoking? 8800GTs still hover around $149, and the HD 4850 soundly defeats it with lower power requirements. IMO, easily worth the $30-50 difference. The 9600GT also goes for roughly the same amount as the 8800GT, and it has slightly inferior performance to the 8800GT.

        The time from the 8800GT’s official debut to the 4850 is only six freaking months. That is not bad considering that the GPU market has been in a slump for a while. Nvidia should be the one most embarrassed when their latest and greatest is only 20-30% faster than a design that is almost two years old (8800GTX), for $449 to $649 USD.

        The 9800GTX+ is just damage control. It is nothing more than a more aggressively clocked 9800GTX (same damn die, just manufacturing refinements that permit higher clocks). Your link only shows it soundly defeating the 4850 in pro-Nvidia applications (Lost Planet and Crysis).

        The bottom line is that AMD is making Nvidia sweat a little, and that is ultimately good for the customer. It is called competition. People can afford better performance without donating an organ.

          • Meadows
          • 11 years ago

          g{

            • Krogoth
            • 11 years ago

            I am hardly the ATI* fan you claim. I have owned several Nvidia products in the past, because they were the superior choice at the time of purchase.

            I want the superior product to win in the end. Nvidia simply does not have it this time around. They got a little complacent and cocky. ATI* seized the opportunity to get back into the spotlight. Nvidia is doing damage control ATM and is probably making plans for something that will make R7xx and GT2xx look like FXs. 😉

            Let me break it down

            GT2xx = Nvidia’s HD “2900XT” (fast, yet underwhelming performance for what it could do on paper, hot, pricey)

            HD 4xxx = ATI’s “8800GT” (impressive performance/$$$ ratio, same performance as previous generation high-end product, cool, easy on power compared to competition)

            9800GTX+ = HD 2900Pro (last ditch effort to fight off competition)

            ATI* = let’s be honest, it is only AMD in name.

            • Meadows
            • 11 years ago

            Bad analogy in my opinion.
            No drivers in the world have been capable to make the 2900 XT work like it was supposed to. On the flip side, we can see nVidia’s success record with their cards and that shows they’ve learnt how to drive those cards and bring out more of the “on paper” statistics, even if some people complain about said drivers. This is the reason why I’m hopeful that they’ll come up with some magic sauce binaries that will make the new cards run faster. It may be the 177.39 release (debut of PhysX support on GPU, real soon), it may be something else – but they’d better bring out more of what they have, and do it quick.

            If I was given money and a choice, I’d naturally go for the HD 4850 (unless the 4870 requires the same power connectors, in which case I’d wait for that one) – I’m lazy and I want to keep my old PSU. 🙂

            • Krogoth
            • 11 years ago

            Newer drivers for the HD 2900XT actually made it faster, but not enough to make it more desirable than the 8800GT and its siblings.

            I doubt GT2xx is going to get massive gains from driver improvements. It will certainly gain some, but like the HD 2900XT, its other faults will remain (hot, power-hungry).

            • Meadows
            • 11 years ago

            It’s neither. Power consumption is very nice compared to the performance you get (with idle consumption being astoundingly good) and it’s not hot either (in sheer contrast with the HD 4850 which is a sandwich toaster – however, the slim form factor might be a positive point for some people).

            • ssidbroadcast
            • 11 years ago

            As much as it pains me, Krogoth, I gotta side with Meadows on this one. The GT280, by TR’s account, does have pretty compelling performance when it comes to heat and energy consumption. Its only downside is the premium price, really.

            • Krogoth
            • 11 years ago

            Ahem, the GT280 is a freaking power hog when loaded. The only reason it has lower temperatures than the 4870 is that it has a huge stock cooler versus the 4870’s thin single-slot solution.

            • Meadows
            • 11 years ago

            Somebody didn’t check the power charts in TR’s review.

            • Krogoth
            • 11 years ago

            The GT280 only looks good for idle consumption.

            When it is loaded, it consumes somewhere between a 2900XT and a 3870 X2. Not exactly a winner by any means. It performs a bit better than the 3870 X2 in some cases, but in others it is barely faster. I’d hardly call that more energy efficient. At least it is not a lot worse than the 9800GTX it replaces.

            The 4850, on the other hand, yields near-3870 X2 performance for a little more juice than a 3850. Now that is impressive.

            Sorry, Nvidia is not the winner in energy efficiency for this round.

            • Meadows
            • 11 years ago

            It IS a winner. Maybe you’re deranged, but you’ve got to notice that it performs exponentially better than an 2900 XT OR 3870 X2 while drawing _[

            • Krogoth
            • 11 years ago

            Wow, a 20-40% gain is not what I would call exponential. It is certainly faster, though.

            Its power consumption is good compared to the 3870 X2 and 8800GTX, but the 4xxx family is a lot better. 100W+ is still 100W regardless of its performance delta over the competition.

            • ChronoReverse
            • 11 years ago

            Just to be clear, the 4870 has a dual-slot cooler. Only the 4850 is single slot.

            Furthermore, the current tests supposedly don’t have Powerplay working properly yet so idle should drop a bit.

            With that said, while the GTX280 sucks a ton of power on load, it also sips it when idle.

            • Krogoth
            • 11 years ago

            My bad, I had meant 4850.

            My point still holds.

            • eitje
            • 11 years ago

            it comes from the phrase “through rose-colored glasses”. when someone sees things through rose-colored glasses, they’re seeing them as better than they really are.

            i know it’s a song title, and my guess would be that the root of this particular meme is that hippies used to wear rose-colored glasses (still do, i bet).

        • Convert
        • 11 years ago

        I thought you were in the states PRIME1? I am not sure what the $130 less means exactly since currently, in the US anyways, the difference between the cards is nowhere near that.

        The 8800GT is badly beaten, especially when you take into account the difference in price. At least at TR.

        The article you link to really does show a different story though, the 4850 is rather underwhelming if those results are anything to go off of.

        • Fighterpilot
        • 11 years ago

        LOL..take a look at Firing Squad’s Bioshock results compared with Anand and other sites. (1900×1200)
        Your link shows the 4850 getting beat by just about every Nvidia card but virtually every other site has the 4850 easily beating all but the GX2.
        http://www.anandtech.com/video/showdoc.aspx?i=3338&p=12

        Those green tinted glasses he’s wearing are to hide the tears in his eyes :)

          • Chaos-Storm
          • 11 years ago

          Firing Squad uses beta drivers, while Anand actually uses 8.6, AFAIK

        • danny e.
        • 11 years ago

        math?

    • lycium
    • 11 years ago

    i’d really like to get one, if they could do something about those crazy temperatures! forget sli / crossfire, i just want a single large and cool-running gpu block…

      • Meadows
      • 11 years ago

      Anything up to and including 100 degrees is perfect operating temperature for a GPU – so long as the RAM chips are not covered by the same heatsink.

      • Lord.Blue
      • 11 years ago

      supposedly you can from what I’ve seen around the net…have to find you a link…(rumages through the URLs….)

    • Jigar
    • 11 years ago

    WOW, AMD’s small monster 4850 (the big monster is on its way) just killed the 9800GTX… That was a really good show.

    • Pax-UX
    • 11 years ago

    This is very interesting, great review!

    • YeuEmMaiMai
    • 11 years ago

    waits patiently for the 4870 to hit the market…….

    • blerb
    • 11 years ago

    Yep, this is definitely out at newegg.

    • Kurotetsu
    • 11 years ago

    I really would like to see how a Crossfire of 2 of these performs, particularly on a P45 board. You’d FINALLY have a platform that offers both top performance AND stability (no, nForce + SLI does not offer that already).

      • elmopuddy
      • 11 years ago

      Guru3d has some CF results.. pretty nice too

    • PetMiceRnice
    • 11 years ago

    See, competition is good for the marketplace.

    • paulWTAMU
    • 11 years ago

    I’m really impressed by this card–it’s getting amazing performance at a very reasonable price point. I’m debating upgrading within the next year, looks like ATI may be the GPU of choice this time, for the first time in recent memory 🙂

    • cynan
    • 11 years ago

    I wonder why there is a discrepancy in performance across the various reviews?

    For example, TR and Guru3d show similar favorable performance for the HD 4850. However, PC Perspective, for example, shows performance to be on par with an 8800gt…

    Could the new catalyst 8.6 drivers really make that much of a difference? (I don’t think PC perspective used these..)

    Anandtech’s review seems to peg performance somewhere in the middle – and it is using catalyst 8.5 drivers… Interesting

    • Thresher
    • 11 years ago

    Something that I think gets a bit of short shrift in these things is image quality. Unfortunately, it’s damned difficult to quantify. I have an 8800GTS (640) in my gaming rig and an X1900XT in my Mac. While the video card in the Mac is definitely the slower of the two, the image quality is loads better. I cannot tell you why with any certainty, but I can definitely tell a difference.

    I remember thinking the same thing back when I got my 9800, which was my first ATI card ever.

      • Mourmain
      • 11 years ago

      I’ve seen reviews showing screenshot comparisons… let’s hope one appears for the new generation of cards.

        • sigher
        • 11 years ago

        That’s a bit iffy: screenshots of an ATI card shown on an Nvidia-powered system, what are you really seeing? Plus, some stuff affecting the experience is only visible in motion.

          • Chryx
          • 11 years ago

          Are you really suggesting that an nvidia card will misinterpret jpeg (or ideally tiff/png) screenshots? because that’s kinda what you’re saying there.

            • Meadows
            • 11 years ago

            Nonsense. G80 and above has virtually equivalent image quality to competing Radeons because of new DirectX architecture and precision enforcements (not talking about Dx10.1 here, since that one only adds some trivialities to the list).

            Quite actually, that 8800 GTS probably has _[

            • boing
            • 11 years ago

            Could also be the old issue of low-standard DACs used in the early GeForce 1/2/3/(4?) cards. I still remember when the Matrox G400 gave far superior image quality. Nowadays, when almost everyone is using digital DVI, I don’t think that’s an issue anymore.

            • Forge
            • 11 years ago

            Well, when I switched out my 8800GT for my 4850 yesterday, I immediately noticed that BIOS text and Windows Safe Mode were OMFG clearer. I even put the GT back in to test my response. Text modes and low res were very very very clearly different. I’m not sure if it’s GPU vs. panel scaling, but the Nvidia output was very round and slightly fuzzy, looked almost antialiased. The 4850 output is razor sharp.

            When I put the 4850 back in, installed drivers, and returned to 1600*1200, it was far less pronounced, but I believe it’s a tiny bit clearer there too.

            The low-res output’s perceived quality is not even comparable.

            • boing
            • 11 years ago

            Are you using VGA or DVI-output?

            • Meadows
            • 11 years ago

            Take a wild guess. I suppose it’s VGA.

      • willyolio
      • 11 years ago

      some people have tried, mostly subjective tests, and i think ATI usually wins out over nVidia.

      http://www.maximumpc.com/article/videocard_image_quality_shootout
      that’s the latest one i’ve seen.

    • dragmor
    • 11 years ago

    1) Powerplay for the 4850 cards is not yet enabled, hence the high idle power draw.
    2) Members at OCAU are getting a 20C reduction in temps by remounting the stock cooler using AS5.
    3) Do not buy the 4850 until you have seen the results for the 4870. The 4870 is the card you really want (and won’t cost much more): double the memory bandwidth of the 4850, a much better cooler, and it beats the GTX 260 in pretty much everything.

      • 0g1
      • 11 years ago

      Thanks for the info. I’m really anticipating the 4870 because of its almost-double memory bandwidth and 20% faster GPU. It should kick some serious ass. Although it won’t beat the GTX 280 very often, it will get very close for a very big price difference.

      • mako
      • 11 years ago

      If idle power gets lower, then yeah, I’m upgrading.

      • Flying Fox
      • 11 years ago

      So the heatsink mounting issue: can this be a first-batch thing? Maybe if we wait a little bit they will mount it properly? By then we will have proper Powerplay profiles?

      Or maybe we need to wait for the bigger board makers to slap on their own coolers?

    • elmopuddy
    • 11 years ago

    Great timing! My old 7900GTX died, so I will move my 9600GT to that machine, and get me a 4850 or 4870 to go with my new 24″ monitor.

    yummy, gotta love competition!

    • PRIME1
    • 11 years ago

    FYI it looks like the 9800GTX+ will be 55nm so it’s more than just a clock bump, it’s a new chip.

    http://www.pcper.com/comments.php?nid=5817

    • ish718
    • 11 years ago

    1+ for AMD
    🙂

    -1 for Nvidia and their overpriced product 😮

    • PrincipalSkinner
    • 11 years ago

    When Nvidia launches its next-gen cards, we get a TR review that spans 16 pages and goes into details about the architecture.
    When ATI launches, we get a review that spans 11 pages with no details at all. Why is that? Is it because it’s just ‘a first look’? I’m not being cynical, I’m just asking.

      • MadManOriginal
      • 11 years ago

      Read again, it’s because AMD was forced to move up the launch date/performance NDA because lots of resellers were already selling it. They did not lift the technology NDA that will let us know about the architectural details though. That will come on the 25th which was the original launch date.

      • ssidbroadcast
      • 11 years ago

      Time constraints, methinks. Geoff got the carpet pulled out from underneath him with the unexpected early-lift of the press embargo.

    • FubbHead
    • 11 years ago

    What…. Didn’t I just buy a 8800GT…. like 6 months ago? There’s already TWO model series above that? You gotta be friggin’ kidding me….

    Oh well, good to see it might just be ATI…AMD… whatever…. next time.

      • Thresher
      • 11 years ago

      Not really. The 9xxx series is the same architecture, just a die shrink.

      This is a major problem for nVidia, their naming schemes tend to be overly complex. They said they were doing away with all that crap, then they go ahead and make a 9800GTX+. Go figure.

    • leor
    • 11 years ago

    this is the first card i’d consider swapping my 8800GTX for, if for no other reason than to free up a slot in my case and cut down on some of the heat!

    • Thanato
    • 11 years ago

    Can the 4000s CrossFire with the 3000s? Please tell me it will work, cuz then I don’t need to discard my 3870s for a 4000 series; I can just add them all together and save a ton of cash.

    • gerryg
    • 11 years ago

    Don’t know if anybody said it already, but I really appreciate the new way TR is doing the performance graphs. Much better!

    And the fact that you guys got this review together with all the appropriate comparisons as quickly as anybody and definitely better than most was definitely impressive.

    Kudos all around to Team TR!

      • FubbHead
      • 11 years ago

      Personally, I would really like the product(s) reviewed to stand out a bit more in the graphs, so they get easy to spot.

        • Anomymous Gerbil
        • 11 years ago

        Yes, I can never understand why they don’t colour-code their graphs like that.

          • Damage
          • 11 years ago

          This is some kind of joke, right?

            • cynan
            • 11 years ago

            As in #105 is giving a sarcastic reply to #94.

            Unfortunately it would seem that #94 is genuine

            Some people are just never satisfied…

            *[

            • gerryg
            • 11 years ago

            Eh? *Unfortunately* I was genuine? I complimented TR on making the graphs clearer. The recent Radeon 3870 1GB review was old-style. The GTX 280 article was mostly newer, in that it ID’d the green team from the red team. This one on the 4850 did red/green again but also highlighted the specific product being reviewed. So over the past three articles there’s been a significant improvement, IMO. I have no idea what you or #107 or #105 are talking about.

            Did anyone else think the graphs were honestly an improvement? +/-1 your answer.

            • Voldenuit
            • 11 years ago

            The GTX280/260 review was bad because it was hard to find the focus of the review in the mass of bars.

            The 4850 review was an improvement in that regard, but I still think it could be cleaned up to look more aesthetically pleasing (not to mention easier to digest). A faded green/red for the other products would have made the 4850 bars stand out more imo.

            • BobbinThreadbare
            • 11 years ago

            I really liked the new graph colors. Easy to tell which cards were being reviewed, easy to tell what cards were from which company, I liked the shade of green and red they picked. It was awesome.

            • JustAnEngineer
            • 11 years ago

            Hooray for excellent visual communication! I assume that Dissonance has read Tufte’s book:
            http://www.amazon.com/Visual-Display-Quantitative-Information-2nd/dp/0961392142/

            • indeego
            • 11 years ago

            They could go a few steps more:
            1. Define the colors. Yes we are geeks and know what brand is red and green, but you should still define the elements for those who aren’t aware, or just those that see a mass of colors.

            2. Define the reviewed card color. Optionally, keep that reviewed card color the same throughout all reviews, so you can tell at a glance from all graphs what is what.

            Oh and I find it somewhat ironic that the cover of that book is fairly hard to read. 🙂

        • A_Pickle
        • 11 years ago

        WE SHUD MAYK THEM STAKS UF GRAPIKS KAWRDZ DUHPENDING ON HOW FAST TEHY AHR. LULZ.

      • DrDillyBar
      • 11 years ago

      yes… Breath Damage, Breath…

      • Bensam123
      • 11 years ago

      Call me silly, but I liked the way they were colored before. Now it’s starting to turn into a multi-color showroom. I’ve seen posts like this in like every hardware review for a few months and I’m personally waiting for it to escalate into asking for neon colors, perhaps animated bar graphs? -_-;

      If there was a better coloring scheme, I would say it would be coloring all the cards that compete with each other the same color so people can easily compare them (either by model or by price). But that just makes too much sense.

      The GeForce or Radeon name at the left side of the graph tells you which brand they are easily enough. Perhaps I’m not seeing the reasoning for coloring them green or red because I sit in neither camp.

    • Saber Cherry
    • 11 years ago

    What’s the difference between the 4850 and 4870? Is this public information?

    Also:

    Do ATI’s drivers still use over 100 megs of ram? I love AMD, but I will never install such garbage code on my computer…

      • MadManOriginal
      • 11 years ago

      Afaik stock clock speeds, GDDR5 on 4870 although there are supposed to be GDDR5 4850 variants, and probably double-slot cooling on 4870. I really want to know if there are PCB differences that might make the 4870 oc better.

      • ChronoReverse
      • 11 years ago

      The CCC and the drivers are separate. You can install the drivers without the CCC

        • Saber Cherry
        • 11 years ago

        I’ll take that as a ‘yes’. Last I checked, it was impossible to control the drivers without CCC – things like anisotropic filtering levels, TV out, color response curves, optimizations, and so forth.

          • ew
          • 11 years ago

          CCC for the 8.6 drivers is running in the system tray and task manager has it at 6,604k of memory. When I bring it to the foreground it goes up to 22,284k.

            • Saber Cherry
            • 11 years ago

            Thanks. That’s not so bad at all (but it should still be about 2 MB).

            • BobbinThreadbare
            • 11 years ago

            It used the .NET library, so complain to Microsoft about memory usage. I think it’s much better on Vista because Vista loads the .NET library on startup.

            • Saber Cherry
            • 11 years ago

            It was their choice to use .net instead of sticking with C. There was no problem with the functionality in the old control panel, they just wanted to make it ‘slicker’ or something, lost a lot of customers in the process, and refused to bring back the old one despite popular demand. Writing any part of drivers in a high-level super-bloated virtual-machine-based language is asinine, even though the actual hardware interface component of the drivers is probably not written using .net. Not that .net is technically a language, it’s just a bloat system underneath a language.

            • willyolio
            • 11 years ago

            /[

            • Saber Cherry
            • 11 years ago

            Oh really?

            It’s easy, isn’t it? Just program some garbage that takes 100 MB of ram, and say “Ignore the numbers, they’re wrong”. You can say the same about framerates.
            “Q: Why is the framerate so low in benchmarks?
            A: In actual gameplay, the framerate will be much higher.”

            So, what is this imaginary memory used for, and why is it used if it is not needed? Guess what, I’m a programmer. In fact, I’m an exceptionally good programmer, whether or not you want to believe the word of a random person on the internet. And good programmers don’t take more memory than they need.

            • Meadows
            • 11 years ago

            Saber Cherry is on point.

            Besides, I used to program too but that was a hobby and it died out of me.

            • ew
            • 11 years ago

            Actually he is not on point. He has completely missed that this behavior is a byproduct of garbage collection type memory management. Garbage collection has exactly the behavior described by AMD’s FAQ and it is a perfectly good way to program a user space tool like CCC. Garbage collectors do have a small performance penalty but performance is not necessary for CCC. CCC is not the driver as Saber Cherry seems to think. It won’t affect game performance.

            • ew
            • 11 years ago

            If you were really a good programmer then you’d know that .NET uses a garbage collector for memory management and in that case the claims AMD is making are reasonable.

            Again, it just sounds like you know nothing about what you’re talking about.

            • Saber Cherry
            • 11 years ago

            I mainly write in Java, which has garbage collection and far less memory overhead than .net. And you don’t seem to understand what garbage collection is. It only works on released memory (objects that can no longer be referenced by the program); it doesn’t just magically free up memory at will that is still accessible. The only possible way to automatically free up large amounts of memory that a program is still using, on demand (as ATI claims), is to swap it to the page file*. This is something that all programs do automatically, because it is handled by the operating system. If you are using 100 MB of excess memory, that swap will take (on my computer) about 5 seconds /[

            • ew
            • 11 years ago

            Are you even reading what other people write? My first post in this thread stated that CCC from the 8.6 driver distribution uses 6MB according to task manager when it is in the system tray. So are you just arguing for the sake of it now or what?

            • Saber Cherry
            • 11 years ago

            No, someone wrote 6MB / 22MB. But then someone ELSE wrote that in Vista, .net is loaded at startup. Therefore, assume the usage will be much higher in XP. My friend uses an ATI card and has (on XP) a fresh boot RAM usage of 150MB, on a clean nlited slimstreamed XPSP3 install… and it went up from 80 (right after the clean install of Windows) to 150 after installing ATI’s driver package. Mine, with an nVidia card and a 5+-year-old XP install, boots at 93MB right now. I configured all of his services so that they would be like mine, except that he does have a printer and a some handheld thing he docks with his computer, both of which use drivers.

            Nobody in this thread has clearly posted the RAM usage of ATI’s driver package in XP, so get off your high horse.

            • ew
            • 11 years ago

            If there is no information then why do you keep harping on about them being poorly written? What evidence suggests that to you. BTW, the 6MB/22MB numbers were for XP.

            • reactorfuel
            • 11 years ago

            Windows’ process memory usage numbers are misleading, though – to continue your analogy, it’s like interpreting a synthetic benchmark versus real-world results.

            The “mem usage” column doesn’t accurately report the amount of memory actually used by a process. As a supergenius programmer, I’m sure you understand how this can’t really be distilled down to a single number: you have shared libraries and resources accessed by more than one process at a time. Which process “owns” something like comctl32.dll? Even if you discount stuff that’s shared between processes, it’s still hard to boil things down to a single number. You have to poke around and see what’s actually taking up memory, what’s just memory-mapped I/O, and so on. For a rough-and-ready number to throw into Task Manager once every couple of seconds, it’s far easier to just look at how much address space the program is using.

            If you’re interested in reading more, there’s a pretty good guide available here:
            http://shsc.info/WindowsMemoryManagement

          • crazybus
          • 11 years ago

          You could try ATI Tray Tools.

            • A_Pickle
            • 11 years ago

            Absolutely not. This way, we can continue debating about memory usage, good programming practice, how .net sucks, how free memory actually ISN’T wasted memory, etcetera.

            Be off with your “compromise” or problem-solving blather.

      • cegras
      • 11 years ago

      Uh, why do you want ‘free memory’?

      The fact that the RAM is being used is a good thing. Vista doesn’t ‘hog’ ram, it’s preloading things to memory so that it can be loaded faster.

      What is it with people and being worried about running out of ram? The OS can intelligently allocate resources to and from programs (or at least I hope). The way you talk makes it seem like you want everything to be streamed off the hard drive.

      • shank15217
      • 11 years ago

      8GB of ram will cost you $140 on newegg.. seriously this is nitpicking

        • Meadows
        • 11 years ago

        The score is yours.

        • Saber Cherry
        • 11 years ago

        XP can’t typically handle more than 3.25 GB; my current motherboard only supports DDR1; I only have 1 GB of RAM; and loading time/responsiveness of a program is proportional to RAM usage. If a program is small enough to fit in a modern L2 cache, it will fly.

        You bring up a good point, though. As long as a resource is inexpensive, it should be wasted profligately. That’s why I love flash ads – because I love dedicating my computer resources to other people’s garbage.

        And by the way~

        Just because you don’t have a use for something, does not mean that nobody does. For me, every last byte of RAM is useful.

          • Meadows
          • 11 years ago

          Which is why people use Vista with SuperFetch.

    • wingless
    • 11 years ago

    “HD 2900XT, you have failed me for the last time!” (“Vader chokes” video card to death)

    Folks, this isn’t going to be a hard choice for me. I like Nvidia, but why not get next-gen hardware for last-gen pricing? The 9800GTX+ will probably be spiffy as hell, but it’s last-gen as far as I’m concerned. 2×4850 in CF simply OWN. The 4870 may as well be called a high-end card at this point too…

      • A_Pickle
      • 11 years ago

      Enjoy the multi-monitor support and monthly driver updates. 😀

    • Sargent Duck
    • 11 years ago

    Actually, I was wrong. Tigerdirect up here in Canada has them in stock. $243 though…

      • DrDillyBar
      • 11 years ago

      my local store has them listed at $199.99

        • DrDillyBar
        • 11 years ago

        anyone know if I can CF a 3870 and 4850 haa

          • Deli
          • 11 years ago

          Yes, you can Crossfire 3xxx and 4xxx. Go to fudzilla.com; they have an article of someone doing it, with pics and a 3DMark06 run.

            • reactorfuel
            • 11 years ago

            Not an article, just a link to a German forum post. With one screenshot. Oh, and a low-resolution picture of the cards physically fitting in a system. ATI might have something in the works, but official or widespread confirmation would probably be a good idea before anybody puts down money based on supposed cross-generation multi-GPU support.

      • ChronoReverse
      • 11 years ago

      NCIX now has them for $209 (the Powercolor one is $199 after the instant rebate).

    • Chrispy_
    • 11 years ago

    \o/
    Market competition I care about!

    • Dent
    • 11 years ago

    Any chance Tech Report might try out Age of Conan MMO, and report on performance of some of the better price/performance cards?

    Right now, it tends to perform poorly on many reasonable cards. Radeon cards seem to be suffering particularly (yes, the game has the Nvidia “The Way It’s Meant to Be Played” logo/video on startup).

      • djlex
      • 11 years ago

      /signed.

      AoC is sort of the Crysis of MMORPGs (poorly coded or high-tech graphics – take your pick).

      Yes please, if possible, work AoC benchies into your mix. While the video card world is historically dominated by FPS players, a lot of us play MMOs as well. AoC picked up over 400,000 subscribers in its first week alone, making it one of (if not THE) biggest MMORPG releases in recent history.

      As Dent stated, Age of Conan has been particularly taxing on GPUs, and unfortunately performance hasn’t exactly been uniform from card to card. Near as I can tell, the only sites currently benching AoC are [H] (and pcghx, in weird German), and they’ve presented some interesting info on this shader-intensive game. According to [H], even the GTX 280 can’t play at 2560×1600 with max eye candy.

      I’m currently designing a system to play AoC on a Matrox Triple Head 2 Go (TH2G) which requires 3840×1024. My head has been swimming with all these options, and it’s seeming more and more like I’ll need to find that ONE particular setup that happens to like this game.

      Incidentally, this is my first post here. Sorry for the book. Thanks for all the work y’all do.

      • ish718
      • 11 years ago

      Reviewers rarely test MMORPGs on video cards

        • Mystic-G
        • 11 years ago

        Indeed, but if more start straining video cards then that could change.

        • djlex
        • 11 years ago

        True. I’d guess that’s due to one or more of the following:

        – MMOs are difficult to uniformly bench due to dynamic in-game conditions and the lack of built-in “time trial” tools.

        – The most widely subscribed MMO is very easy on hardware. Pop in a 7xxx or even a 6xxx and run WoW maxed out on your mom’s Dell…

        – Aside from very hardcore PvP, framerate hasn’t been all THAT critical in MMOs. It’s not like you’ll be able to zomgwtfpwn with 10 extra frames/sec as in some shooters. Less competitive gameplay linked to system performance leads to less hardware tweaking.

        Things are changing now that MMOs are updating their game engines. There are even games on the horizon (albeit not very good ones) using the Crysis CryEngine2.

        • Pettytheft
        • 11 years ago

        They tested Guild Wars for quite a while here.

          • d0g_p00p
          • 11 years ago

          Guild Wars is not a “true” MMORPG. You can pretty much replicate the same settings.

    • Lord.Blue
    • 11 years ago
    • ssidbroadcast
    • 11 years ago

    It bothers me that nVidia stooped to basically overclocking their pre-existing 9800 and threw in a -[

      • GTVic
      • 11 years ago

      I support that idea.

        • ssidbroadcast
        • 11 years ago

        Hey, thanks buddy.

      • deepthought86
      • 11 years ago

      Look at how that pathetic Anandtech handled Nvidia’s last-minute, desperate GTX+ announcements. What a sad little shill.

        • MadManOriginal
        • 11 years ago

        And aside from BioShock, their review numbers don’t put the 4850 in as good a light as other reviews do. Hmm :-/

      • thecoldanddarkone
      • 11 years ago

      It’s not just an overclock, it’s a shrink and overclock.

        • ssidbroadcast
        • 11 years ago

        No it’s just an overclock.

          • Mystic-G
          • 11 years ago

          You guys are /[

            • MadManOriginal
            • 11 years ago

            It’s both, but from a practical standpoint the die shrink doesn’t change the architecture, it just allows faster clock speeds, so calling it an overclock is sufficient.

            • ssidbroadcast
            • 11 years ago

            Where’s the article or link that confirms it’s also a shrink?

            • thecoldanddarkone
            • 11 years ago

            post 117, they even have pictures…

    • GTVic
    • 11 years ago

    Good move for ATI to ignore the ultra-high-end, way-too-expensive graphics segment. That means more effort is spent on the cards the majority of us purchase.

    This looks like a good replacement for my 7600GT which I believe was the 2nd most popular card in the last Steam hardware survey.

    http://www.steampowered.com/status/survey.html

    We need a way to punish companies that optimize games solely for nVidia hardware.

      • l33t-g4m3r
      • 11 years ago

      just don’t buy their games. works for me.

    • ludi
    • 11 years ago

    “These temperatures were recorded while running the “rthdribl” demo in a window. Windowed apps only seem to use one GPU, so it’s possible the dual-GPU cards could get hotter with both GPUs in action. Hard to get a temperature reading if you can’t see the monitoring app, though.”

    Dual monitors?

    • glacius555
    • 11 years ago

    I am shocked… Hehe, I think I’ll give my 9600GT away VERY soon!!! BTW, it was not necessary to include the 8800GT in the equation today, now that nVidia will sell the 9800GTX for $199 (and that one definitely kills the 8800GT many times over).

      • UberGerbil
      • 11 years ago

      I like having older cards in the charts. Not everyone upgrades every six months, especially at the mainstream price points, so if you’re looking at replacing a card that’s a year or two old you’d want to know how much delta you’d really see. I would’ve liked to see the 9600 in the chart as well.

    • Fighterpilot
    • 11 years ago

    Great card for $200.
    Power draw under load is amazingly low. The load temps look awfully high, though… wonder what’s up with that?
    Very nice improvements over the last-gen cards… +1 to the Red team.

    • Price0331
    • 11 years ago

    This is one of the reasons PC gaming is far from dead.

    • Ricardo Dawkins
    • 11 years ago

    Good job, ATI…AMD

    • Thanato
    • 11 years ago

    I’d really like to see benchmarks with an all AMD platform.

    • DrDillyBar
    • 11 years ago

    Very cool.

    • Mystic-G
    • 11 years ago

    r[

      • VILLAIN_xx
      • 11 years ago

      Haha… you’re probably right! BUT I have other intentions for the ATI brand.
      Modding reasons. I hear these new designs will be moddable to FireStream.

      Plus, I’m not gonna pay anything over $199 to get similar performance to a 9800GTX.

      PS. TWIMTBP = more reason for me not to endorse Nvidia for fishy practices. It’s moral territory now. I couldn’t care less if their TWIMTBP brand meanz they getz an extra 10 fps’z!

    • Sargent Duck
    • 11 years ago

    Wow. This thing beat my wildest expectations. I knew I was gonna be buying one, but I’ll definitely be getting one now. Just gotta wait for some to get up to Canada…

    • cappa84
    • 11 years ago

    Good job AMD and ATi!! Congrats!!

    • StashTheVampede
    • 11 years ago

    Here’s hoping that 64-bit EFI ROMs are out there and this card can be flashed!

    • SNM
    • 11 years ago

    Wow, that’s a monster.

    But I have to ask, why did this review (and the GTX 280 one) leave out the highly popular 8800GT while including something like the 2900XT?

    • oldDummy
    • 11 years ago

    A quick look at price trends implies a choice between a GX2 and a 4850 CF setup at about the same price point.

    Both good choices….

      • ChronoReverse
      • 11 years ago

      ATM the 4850 would have better AA performance and DX10.1 though. It would come down to whether one has a CF-mobo or not.

        • oldDummy
        • 11 years ago

        The power draw and temp edge seem to favor the GX2.

        Implied anyway.

          • Lord.Blue
          • 11 years ago

          seems like you might be able to make it better if better thermal compounds are used (say AC5)

          • ChronoReverse
          • 11 years ago

          Power draw is much better for the 4850 but temperatures are poor because of the crappy cooling solution

      • ssidbroadcast
      • 11 years ago

      Keep in mind CF drivers are more plentiful and offer better features (like dragging windowed 3D between two monitors, or for that matter, multi-monitor support, period).

    • bthylafh
    • 11 years ago

    Too bad ATI’s Linux drivers suck. That little fact means I’m stuck with Nvidia.

      • ChronoReverse
      • 11 years ago

      Have you tried the current Catalysts for Linux? From what I hear, they work fairly decently, and a new set is released every month with more improvements.

        • cobalt
        • 11 years ago

        I tried them just a couple of months ago and still ran into bugs. They’re definitely improving, but I think NVIDIA still has the upper hand. . . for now.

          • glynor
          • 11 years ago

          A couple of months is a /[

            • Lord.Blue
            • 11 years ago

            The 8.6 batch was supposed to be a large improvement, even in Linux.

            • BlackStar
            • 11 years ago

            This is true. I just installed 8.6 and they are markedly improved over 8.5.

            Since Catalyst 7.10, fglrx has followed an awesome upward spiral, to the point that most problems are resolved now (unless you are using certain cards with PCI-E <-> AGP bridges).

            I don’t know how, but AMD/Ati no longer suffer from the driver issues of yesteryear. Indeed, my Nvidia 6800 and 7600 have many more driver problems than my Ati 9600 and X1950, especially on Vista.

            • cobalt
            • 11 years ago

            I understand that a lot can happen in a couple of months, but compared to the past 10 years during which I’ve been unable to use an ATI card, it doesn’t feel like very long at all. I keep revisiting the situation periodically.

            I sincerely hope their latest drivers really are an improvement — there’s been no competition in the high-end visualization market for a long, long time. I’ll take a gander when I have some time.

      • l33t-g4m3r
      • 11 years ago

      Bzzzt! Wrong.
      Phoronix just did an article on this, and they pretty much say ATI is ahead of Nvidia for Linux support.
      http://www.phoronix.com/scan.php?page=article&item=amd_evolution&num=1

        • Pettytheft
        • 11 years ago

          Why is it that every time people see something that says ATI, all the Linux nuts come chiming in? This is not 5 years ago; things have changed. Try an ATI card before regurgitating old information.

          • DrDillyBar
          • 11 years ago

          I totally couldn’t get my Rage128 working…

          • bthylafh
          • 11 years ago

          I had the worst time getting my 9600XT working well in Linux, right up until I replaced it a year ago. That was an older card that should have been well supported by early 2007. Instead I got piss-poor framerates and so-so stability.

          Maybe they really are good now, but I’ll let someone else be the guinea pig, kthx.

            • Lord.Blue
            • 11 years ago

            Ever since AMD bought ATi, their drivers have improved by leaps and bounds. Even the dreaded CCC works now.

            • ludi
            • 11 years ago

            Dude! The 9600XT was released in, what, 2004? Old is as old does; cards that old are lucky if they even get updated Windows support at this point.

            • A_Pickle
            • 11 years ago
            • bthylafh
            • 11 years ago

            Have you a point? The card had never been supported well through its entire life. Yet with the GF2MX that preceded it and the GF7900GS I’ve got now, the Linux drivers were nearly as good as the Windows versions from day 1.

            • Lord.Blue
            • 11 years ago

            I have a 9600 and it is supported even in the new drivers.

            • ludi
            • 11 years ago

            Nvidia has had good Linux support from way back, even before ATi had any Linux support at all, so that’s not too surprising.

            What was /[

        • cobalt
        • 11 years ago

        That buzzer sound seems to imply that you believe ATI being ahead of NVIDIA for Linux support is obvious. In my experience, if true this would be the first time ever.

        Thanks for the link. Timely article; I’ll have to read when I get a chance (and give their latest drivers a try, too, of course).

    • oldDummy
    • 11 years ago

    Good Job getting this out on short notice.

      • Damage
      • 11 years ago

      Yeah, but why didn’t we test with the 8800 GT in Mass Effect?!!

        • glynor
        • 11 years ago

        Nice work to Geoff and the whole Tech Report team. Keep it up.

        I honestly think it is sites like this one that are forcing both of the two companies to stay honest and not just keep re-labeling old tech with new naming schemes and calling them new cards.

        Well… /[

        • danny e.
        • 11 years ago

        or throw the 9600GT in … also some of the older games.. where the heck is Quake III in the round up. Not all of us can afford all the “fancy new games”!!!!!$%$

        • astraelraen
        • 11 years ago

        Actually, my comment on Mass Effect was more of a suggestion.

        My comment about the 8800GT was real, though, as it is clearly a very popular video card that’s worth comparing to. You made a comparison to the 2900XT and 8800GTX, which probably have a smaller user base than the 8800GT. I assume you probably did not have a spare 8800GT lying around.

        I realize the hastiness with which this review was put together, and I respect that. Techreport is awesome. 🙂

          • UberGerbil
          • 11 years ago

          It wasn’t put together with hastiness. It was put together with /[

        • eitje
        • 11 years ago

        and you didn’t tell us how well it folds. 😛

    • Voldenuit
    • 11 years ago

    Nice. I’m now feeling validated in placing an order for a 4850 this morning.

    Interestingly, the price on the Sapphire card jumped from $185 to $197 20 minutes after I placed my order… :p

    • Usacomp2k3
    • 11 years ago

    Wow. Color me impressed. If they have enough supply, this card is going to be a huge seller.

      • eloj
      • 11 years ago

      Now, if someone could explain why a graphics card that isn’t doing _anything_ is drawing 100W and hitting 89°C…

      I mean, they DO clock these suckers down, right? And there are no shader units, texturing units, etc. working when the framebuffer is basically just sitting there.

      Insane.

        • Voldenuit
        • 11 years ago

        Remember, the *system* is drawing 100W, not just the GPU. Idle power draw on the 4850 is typical of most GPUs; it’s rather that the GT200 series has some very advanced and aggressive power-saving schemes that make it look bad.

        And the temps reported were under load; I don’t see idle temps reported.

        • elty
        • 11 years ago

        1) The card does not draw 100W at idle. The whole system draws 127W at idle.
        2) The temperature is for full load, not idle.

        • Helmore
        • 11 years ago

          PowerPlay, ATI’s version of Cool’n’Quiet, is not yet enabled in the current drivers for these cards, and that’s why idle power consumption is higher than it’s supposed to be. Idle power consumption on these cards is supposed to be even lower than the 3870’s.

          • l33t-g4m3r
          • 11 years ago

          good to know.

    • desertfox84
    • 11 years ago

    Really? 59 r[

    • BlackStar
    • 11 years ago

    This looks like my next card, jumping from X1950Pro. DX10.1, too, compared to Nvidia’s 200 series.

    A great value, too, at $199 – absolutely agree with #3’s comment.

    • Spotpuff
    • 11 years ago

    Good news for all consumers and it’s nice to see AMD back in the mix 🙂

    I am, however, disappointed with the lack of MMA Donkey news.

    • asdsa
    • 11 years ago

    “We chose to test the game without this patch, leaving DX10.1 support intact.” An excellent touch :). And the GTX+ is just a pathetic attempt from Nvidia to ruin the 4850 launch, but it just won’t work. ATI rocks!!!

    • astraelraen
    • 11 years ago

    Why no 8800GT in the comparison? I’m sure it’s a card A LOT of people own, and it would be good to compare with this card.

    Also, you guys should pick up Mass Effect for testing; it seems like a pretty taxing game in some areas. Also, it’s really pretty 🙂

      • eloj
      • 11 years ago

      I’d suggest not using it because of DRM issues.

        • JustAnEngineer
        • 11 years ago

        Exactly. You get three Mass Effect installations, total, for however many years you own the game. TR testing would permanently use up a dozen of the overly restrictive SecuROM licenses that EA uses for Mass Effect.

    • elty
    • 11 years ago

    I am not sure why AMD chose to sell this for $199. They could sell it for $249 or even $299, and it would still provide better value than the 9800.

    For $199 you get a card that can destroy the 9800 GTX and is sometimes faster than the GTX 260 – a $399 part that does not even have DX 10.1. The only downside is you can’t enlarge your epeen with 3DMark Vintage, if that really matters.

      • ChronoReverse
      • 11 years ago

      Who cares, it’s finally time somebody has stepped up to the plate and stopped burning us with the high prices (I’m starting to mix metaphors in my excitement).

      • VILLAIN_xx
      • 11 years ago

      Hey, for 199 bucks… it makes people think, without a doubt, that this is their next card.

      I know that when I’m forced to go down the DX10 route, when Battlefield 3 comes out, this will be my card of choice. Hell, the price might be even more competitive in the next few months.

      • Liquidus
      • 11 years ago

      Aren’t you kind of exaggerating? This card isn’t THAT different from a 9800 GTX. If it were $250, I might get the $200 9800 or an 8800 GTS instead.

      • Liquidus
      • 11 years ago

      Aren’t you kind of exaggerating? This card isn’t THAT different from a 9800 GTX. If it were 250 or 300, I might get the $200 9800 or an 8800 GTS.

        • Lord.Blue
        • 11 years ago

        Why? Especially when the driver support is so much better on the AMD side?

      • lex-ington
      • 11 years ago

      Think about it . . .

      Once they say it’s “Next Generation”, a lot of people will just buy it at $199 without checking any data to see where it falls.

      For the price of the GTX260, you can go crossfire
      For the price of the GTX280, you can do a triple crossfire setup.

      Crossfire sales, whether on AMD or Intel platforms, mean more money for AMD, whether through direct sales of their own stuff or licensing fees from Intel.

      It also shows that their process is stable and probably pumping out a good bit of usable wafers (less waste = more profit).

      and the card is still single slot.

      • Thanato
      • 11 years ago

      Probably… cheap production costs keep the price down for consumers and profits up for AMD.

      • Suspenders
      • 11 years ago

      Regain market share. They need as many of these puppies out there as possible, to avoid future shenanigans like the Assassin’s Creed DX10.1 fiasco. More of your cards out there, more reason for developers not to screw you over…

        • SinisterCanuck
        • 11 years ago

        Newegg has the limit on these cards set at 99 per customer. Yeah, I think AMD went with an all-out blitz as far as stock is concerned.

          • ew
          • 11 years ago

          They are also giving quantity discounts!

      • donkeycrock
      • 11 years ago

      They need to win customers back.

    • A_Pickle
    • 11 years ago

    FINALLY.

    Good performance at a good price point, without the stupidity of Nvidia drivers. My day has come!

    • MadManOriginal
    • 11 years ago

    Sweet, thanks for a -[

      • Helmore
      • 11 years ago

      Well, AA is mostly done in the SPs, and there are loads more SPs on this card than on the 38x0 – 800, to be precise. There are also some sources saying that they beefed up the RBEs (render back-ends, or ROPs in NVIDIA terms), which are now capable of doing 4 Z-checks per clock per RBE, instead of the 2 that the 2000/3000 series did.
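
      For a rough sense of what doubling the per-RBE Z rate means, here’s a back-of-the-envelope sketch using the numbers above (the 16-RBE count for the 4850 is my assumption for illustration, not something confirmed in this thread; 625MHz is the card’s stock core clock):

      # Back-of-the-envelope Z-only throughput: RBEs x Z-checks/clock x core clock.
      # 16 RBEs is an assumed figure for illustration; 625MHz is the 4850's stock core clock.
      rbes = 16
      core_clock_hz = 625e6
      for z_per_clock in (2, 4):  # 2 = HD 2000/3000-style RBE, 4 = HD 4850-style RBE
          gsamples = rbes * z_per_clock * core_clock_hz / 1e9
          print(f"{z_per_clock} Z-checks/clock: ~{gsamples:.0f} Gsamples/s")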

        • MadManOriginal
        • 11 years ago

        Yeah, I figured the additional SPs would help with the AA, but it seems like there has to be something else going on here. It could have just been a matter of requiring more SPs, but I suspect there’s more to it than just more of the same. The kind of nitty-gritty detail Scott usually puts into full GPU articles, like what you said in your second sentence, is what I want to know about.

        • marvelous
        • 11 years ago

        No, AMD moved away from SP AA.
