AMD’s Radeon HD 4830 graphics processor

Ok, folks, this is gonna be a quick one. AMD has a brand-new Radeon to unveil today, and it’s certainly worthy of our attention. However, Damage Labs is humming away with the sound of a great many things being tested right about now, so our time to devote to this new graphics card is limited. We’ll be in and out of our look at the Radeon HD 4830 in no time, faster and cleaner than a celebrity marriage.

Little bro steps out

Yep, this new card is indeed called the Radeon HD 4830. The name tells you almost everything you need to know about this product, which would appear to be the last piece of AMD’s 4000-series Radeon lineup to fall into place. Those of you who read our recent review of affordable graphics cards may recall that AMD didn’t have much to offer between the (sensational for its price) Radeon HD 4670 at 80 bucks and the all-world Radeon HD 4850 at about $180. Well, that’s where the 4830 comes in.

This new model is, like the 4850 and 4870, based on RV770 silicon, but in its tamest form yet. Yes, folks, the great product segmentation game continues with yet another chip having perfectly good—or possibly totally flawed—bits and pieces deactivated to maintain a neat separation between models. On the 4830, two of the RV770’s 10 SIMD units have been disabled, reducing shader power (and likely performance) somewhat. Since those SIMD units are tied to texture mapping units, the GPU’s TMU count has dropped proportionately. The end result: the Radeon HD 4830 has a total of 128 shader execution units—or 640 stream processors, in AMD parlance—and can filter up to 32 textures per clock.

That’s it for the neutering, though. The 4830 keeps all four of the RV770’s render back-ends and associated memory controllers intact, leaving it with an aggregate 256-bit memory interface. The card’s GPU core runs at 575MHz, and it comes with 512MB of GDDR3 memory clocked at 900MHz (or 1800MT/s, for those of you keeping score at home).
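
If you enjoy checking the math, the peak rates you’ll see in our specs table later in this article fall straight out of unit counts multiplied by clock speeds. Here’s a quick back-of-the-envelope sketch in Python; the per-SIMD layout (80 stream processors and four texture units per SIMD, 16 pixels per clock across the four render back-ends) reflects our understanding of the RV770’s organization rather than anything AMD publishes on the box.

```python
# Back-of-the-envelope peak rates for the Radeon HD 4830, using the figures
# quoted above. The per-SIMD layout (80 SPs, 4 TMUs) and the 16-ROP count
# are our assumptions about the RV770's organization.

core_ghz = 0.575           # 575MHz GPU core
mem_mtps = 1800            # 900MHz GDDR3, double data rate
bus_bits = 256             # aggregate memory interface width

sps  = 8 * 80              # eight active SIMDs -> 640 stream processors
tmus = 8 * 4               # four texture units per SIMD -> 32 TMUs
rops = 16                  # four render back-ends, four pixels/clock each

gflops    = sps * 2 * core_ghz              # one multiply-add = 2 FLOPS/clock
texels    = tmus * core_ghz                 # Gtexels/s, bilinear
pixels    = rops * core_ghz                 # Gpixels/s
bandwidth = bus_bits / 8 * mem_mtps / 1000  # GB/s

print(f"{gflops:.0f} GFLOPS, {texels:.1f} Gtexels/s, "
      f"{pixels:.1f} Gpixels/s, {bandwidth:.1f} GB/s")
# -> 736 GFLOPS, 18.4 Gtexels/s, 9.2 Gpixels/s, 57.6 GB/s
```

Those four outputs match the 4830’s row in the specs table below.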

All of those numbers, acronyms, and GPU gobbledygook should add up to a pretty competitive product at its price. But, to give your brain a rest, here’s a nice picture.

Ahhh…

Our sample card, which came from AMD, is based on the same basic board design as the Radeon HD 4850. The two are practically visually indistinguishable, right down to the single-slot cooler. However, AMD says board vendors will likely ship Radeon HD 4830 cards that use custom board designs and custom coolers, some of which will likely be of the dual-slot variety.

Somewhat unexpectedly, the 4830 also shares the 4850’s board power rating of 110W, even with its lower clock speeds and shader-ectomy. Why? AMD says it’s because 4830 cards may include chips that didn’t quite pass muster for use in the Radeon HD 4850 or 4870. Those GPUs may need a little extra juice (that is, voltage) in order to do their thing. However, as you’ll see, at least our copy of the 4830 didn’t draw nearly as much power as our 4850.

The final bit of information you need to know about the 4830 before we move on to our performance testing? Pricing, of course. AMD’s suggested “e-tail” price is $129, smack-dab in the middle of the hole in the Radeon HD 4000 lineup. That puts the 4830 almost directly opposite the GeForce 9800 GT, and AMD identifies that card as the 4830’s most direct competitor.

Such things are never entirely straightforward these days, though. Nvidia points not to the regular ol’ 9800 GT but to higher-clocked variants like this MSI card for $119.99 at Newegg as the 4830’s truest competition. The MSI’s core clock runs at 680MHz, well above the 9800 GT’s 600MHz baseline speed. Right now, that card packs a $20 rebate, as well, potentially improving its value proposition—if you’re willing to risk seeing your 20 bucks ground up in the teeth of a bureaucracy designed to minimize redemption rates because you didn’t write neatly enough on the little form.

Ah, I love rebates.

AMD can play this game, too, of course. Right here, you’ll find a 4830 card from Sapphire for $129.99 with a $10 mail-in rebate attached. So, at the time of publication, the 4830’s net price is a little higher than a hot-clocked 9800 GT, but that could change overnight. Prices could drop or rise, and rebates could expand or evaporate. Radeon board makers could intro 4830 variants with higher clock speeds, as well, or a killer new S3 Graphics product could turn the market on its ear. So who knows? Let’s just look at how the Radeon HD 4830 performs, and I’ll leave the fine-tuned deal-mongering to you.

Before we go on, though, I should mention up front that the GeForce 9800 GT card we’ve tested in the following pages is not a higher-clocked card. Instead, it’s a Palit card with a bone-stock clock speed and 1GB of memory. That extra memory isn’t likely to do much of anything for the 9800 GT at the resolutions we’ve tested, but higher clock speeds surely would—something to keep in mind.

Our testing methods

As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and the results were averaged.

Our test systems were configured like so:

Processor: Core 2 Extreme QX9650 3.0GHz
System bus: 1333MHz (333MHz quad-pumped)
Motherboard: Gigabyte GA-X38-DQ6
BIOS revision: F9a
North bridge: X38 MCH
South bridge: ICH9R
Chipset drivers: INF update 8.3.1.1009, Matrix Storage Manager 7.8
Memory size: 2GB (4 DIMMs)
Memory type: Corsair TWIN2X40966400C4DHX DDR2 SDRAM at 800MHz
CAS latency (CL): 4
RAS to CAS delay (tRCD): 4
RAS precharge (tRP): 4
Cycle time (tRAS): 12
Command rate: 2T
Audio: Integrated ICH9R/ALC889A with Realtek 6.0.1.5618 drivers
Graphics:
  Radeon HD 4670 512MB GDDR3 PCIe with Catalyst 8.53-080805a-067874E-ATI drivers
  Diamond Radeon HD 3850 512MB PCIe with Catalyst 8.8 drivers
  Radeon HD 4830 512MB PCIe with 8.542-081003a-070362E-ATI drivers
  Asus Radeon HD 4850 512MB PCIe with Catalyst 8.8 drivers
  Diamond Radeon HD 4870 512MB PCIe with Catalyst 8.9 drivers
  Radeon HD 4870 1GB PCIe with Catalyst 8.9 drivers
  Palit Radeon HD 4870 X2 2GB PCIe with Catalyst 8.9 drivers
  Zotac GeForce 9500 GT ZONE 512MB GDDR3 PCIe with ForceWare 177.92 drivers
  EVGA GeForce 9600 GSO 512MB PCIe with ForceWare 177.92 drivers
  BFG GeForce 9600 GT OCX 512MB PCIe with ForceWare 177.92 drivers
  Palit GeForce 9800 GT 1GB PCIe with ForceWare 177.92 drivers
  GeForce 9800 GTX+ 512MB PCIe with ForceWare 177.92 drivers
  Palit GeForce GTX 260 896MB PCIe with ForceWare 178.13 drivers
  Zotac GeForce GTX 260 (216 SPs) AMP²! Edition 896MB PCIe with ForceWare 178.13 drivers
  XFX GeForce GTX 280 1GB PCIe with ForceWare 178.13 drivers
Hard drive: WD Caviar SE16 320GB SATA
OS: Windows Vista Ultimate x64 Edition
OS updates: Service Pack 1, DirectX March 2008 update

Thanks to Corsair for providing us with memory for our testing. Their quality, service, and support are easily superior to no-name DIMMs.

Our test systems were powered by PC Power & Cooling Silencer 750W power supply units. The Silencer 750W was a runaway Editor’s Choice winner in our epic 11-way power supply roundup, so it seemed like a fitting choice for our test rigs.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.

We used the following versions of our test applications:

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Specs and synthetics

We’ll get our customary start with a look at specs and some synthetic benchmarks designed to test them. The thing you need to know about the numbers below is that they come from the actual cards we tested. In some cases, those cards ran at clock speeds somewhat higher or lower than the official reference clocks established by the GPU makers, which may explain why our numbers sometimes vary from the official specifications.

                            Peak pixel   Peak bilinear    Peak bilinear    Peak memory   Peak shader arithmetic (GFLOPS)
                            fill rate    texel filtering  FP16 filtering   bandwidth     Single-issue   Dual-issue
                            (Gpixels/s)  (Gtexels/s)      (Gtexels/s)      (GB/s)
GeForce 9500 GT             4.4          8.8              4.4              25.6          90             134
GeForce 9600 GSO            6.7          26.6             13.3             38.5          259            389
GeForce 9600 GT             11.6         23.2             11.6             62.2          237            355
GeForce 9800 GT             9.6          33.6             16.8             57.6          339            508
GeForce 9800 GTX+           11.8         47.2             23.6             70.4          470            705
GeForce 9800 GX2            19.2         76.8             38.4             128.0         768            1152
GeForce GTX 260             16.1         36.9             18.4             111.9         477            715
GeForce GTX 260 216 SPs     18.1         46.7             23.3             117.9         607            910
GeForce GTX 280             19.3         48.2             24.1             141.7         622            933
Radeon HD 4650              4.8          19.2             9.6              16.0          384
Radeon HD 4670              6.0          24.0             12.0             32.0          480
Radeon HD 3850              11.6         11.6             11.6             57.6          464
Radeon HD 4830              9.2          18.4             9.2              57.6          736
Radeon HD 4850              10.0         25.0             12.5             63.6          1000
Radeon HD 4870              12.0         30.0             15.0             115.2         1200
Radeon HD 4870 X2           24.0         60.0             30.0             230.4         2400

On paper, the 4830 matches up well against the GeForce 9800 GT. The two share the exact same memory bus width and clock speed, so they have the same peak theoretical memory bandwidth. The 9800 GT would appear to have quite a bit more texturing power, and the 4830 seems to have a pronounced edge in shader arithmetic rates.

In practice, the picture is a little different. The Radeon HD 4830 takes both the color fill and texture fill rate tests and three of the four shader tests, although its margins of victory in the shader tests aren’t as resounding as the gigaflops numbers might suggest.

Call of Duty 4: Modern Warfare

We tested Call of Duty 4 by recording a custom demo of a multiplayer gaming session and playing it back using the game’s timedemo capability. We chose to test at display resolutions of 1280×1024, 1680×1050, and 1920×1200, which were the three most popular resolutions in our hardware survey, and we generally enabled image quality enhancements like 4X antialiasing and 16X anisotropic filtering. For the very fastest cards, we added 2560×1600 to the list in order to really stress them, and for the slower cards, we also tested at 1280×1024 with antialiasing disabled.

This is a close one, but the Radeon HD 4830 is slightly faster than the 9800 GT, especially at 1280×1024 where the GPU is less of a performance constraint, possibly because AMD’s graphics driver executes a little more quickly.

The big thing to take away from this test is simple: either card will run this game very acceptably at up to 1680×1050 resolution. At 1920×1200, you may have to cut back on image quality options like antialiasing a bit, especially in multiplayer.

Half-Life 2: Episode Two

We used a custom-recorded timedemo for this game, as well. We tested with most of Episode Two’s in-game image quality options turned up, including HDR lighting. Reflections were set to “reflect world,” and motion blur was disabled.

This is a clear win for the Radeon HD 4830. Then again, even the Radeon HD 4670 can run this game reasonably well at 1920×1200 with these image quality settings (which are quite good).

Enemy Territory: Quake Wars

We tested this game with “high” settings for all of the game’s quality options except “Shader level,” which was set to “Ultra.” Shadows and smooth foliage were enabled, but soft particles were disabled. Again, we used a custom timedemo recorded for use in this review.

Chalk up another one for the 4830. We’re still talking about frame rate averages of over 60 FPS at 1920×1200, but the 4830 practically shadows the GeForce 9800 GTX+ here, leaving the 9800 GT in its dust.

Crysis Warhead

Rather than use a timedemo, I tested Crysis Warhead by playing the game and using FRAPS to record frame rates. Because this way of doing things can introduce a lot of variation from one run to the next, I tested each card in five 60-second gameplay sessions. The benefit of testing in this way is that we get more info about exactly how the cards performed, including low frame rate numbers and frame-by-frame performance data. The frame-by-frame info for each card was taken from a single, hopefully representative play-testing session.

We used Warhead’s “Mainstream” quality level for testing, the second rung on a ladder with four steps. The “Gamer” and “Enthusiast” settings are both higher quality levels.
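
Turning those FRAPS sessions into the average and low frame rates in our graphs is just a bit of arithmetic over per-frame timestamps. Here’s a minimal sketch of the idea in Python; the toy data and log format are ours for illustration, not FRAPS’ actual output files.

```python
# Reduce a list of frame timestamps (in milliseconds) from one play session
# to an average FPS and the instantaneous FPS of the slowest frame.

def fps_stats(timestamps_ms):
    frame_times = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    duration_s = (timestamps_ms[-1] - timestamps_ms[0]) / 1000.0
    avg_fps = len(frame_times) / duration_s
    min_fps = 1000.0 / max(frame_times)  # slowest single frame
    return avg_fps, min_fps

# Toy data: five timestamps = four frames rendered over 100 ms.
avg_fps, min_fps = fps_stats([0.0, 25.0, 55.0, 75.0, 100.0])
print(f"avg {avg_fps:.1f} FPS, min {min_fps:.1f} FPS")  # avg 40.0, min 33.3
```

Averaging those per-session numbers across the five sessions for each card gives the figures we report.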

Obviously, this game puts quite a bit more strain on the GPU. The 4830 handles it with composure, basically matching the GeForce 9800 GT. With frame rate minimums around 25-28 FPS, both cards run Warhead quite nicely at these settings. I enjoyed blowing stuff up with each.

Power consumption

We measured total system power consumption at the wall socket using an Extech power analyzer model 380803. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. The cards were plugged into a motherboard on an open test bench.

The idle measurements were taken at the Windows Vista desktop with the Aero theme enabled. The cards were tested under load running Half-Life 2: Episode Two at 1680×1050 resolution, using the same settings we did for performance testing.

The 4830’s power consumption is admirably low given this card’s performance. The 9800 GT draws more power at idle and when running Episode Two. This is one area where a higher-clocked version of the GeForce 9800 GT would not help even the score, either.

Noise levels

We measured noise levels on our test systems, sitting on an open test bench, using an Extech model 407727 digital sound level meter. The meter was mounted on a tripod approximately 12″ from the test system at a height even with the top of the video card. We used the OSHA-standard weighting and speed for these measurements.

You can think of these noise level measurements much like our system power consumption tests, because the entire system’s noise level was measured. Of course, noise levels will vary greatly in the real world depending on the acoustic properties of the PC enclosure used, whether the enclosure provides adequate cooling to avoid a card’s highest fan speeds, the enclosure’s placement in the room, and a whole range of other variables. These results should give a reasonably good picture of comparative fan noise, though.

Well, OK, the 4830 and many of the other cards were quiet enough not to register on our sound level meter, which doesn’t go below ~40 dB. What this result tells you is that the 4830—and many of these other cards—were nice and quiet for us. In other news, I have a nice, new sound level meter sitting here that’s capable of measuring noise levels down to ~26 dB. We’ll have to try it out soon.

GPU temperatures

Per your requests, I’ve added GPU temperature readings to our results. I captured these using AMD’s Catalyst Control Center and Nvidia’s nTune Monitor, so we’re basically relying on the cards to report their temperatures properly. These temperatures were recorded while running the “rthdribl” demo in a window.

85°C seems to be par for the course for the Radeon HD 4800 series. AMD insists these temperatures are well within its expectations, and I can live with that. Just don’t try to pull a hot card out of your system without letting it sit for a while, or your fingers are in for some pain.

Conclusions

AMD’s Radeon HD 4000 series GPU architecture has now made it into nearly every corner of the market, from the bargain bin to graphics cards costing over $500, and it’s a winner at nearly every price point, including this one. The Radeon HD 4830 nicely slides into the last real gap remaining in AMD’s lineup and more than gives the GeForce 9800 GT a run for its money.

Generally speaking, the 4830 proved to be a bit faster than the 9800 GT in our quick round of tests. In terms of overall performance, the contest between the two cards is close enough that, yes, going to a higher-clocked variant of the 9800 GT could tip the balance in Nvidia’s direction—perhaps. But there are cases like Quake Wars, where the 4830 matched the more expensive GeForce 9800 GTX+, in which even a generous clock speed boost wouldn’t allow the 9800 GT to keep up.

Still, these cards are matched closely enough, and perform well enough in many of today’s games, that I wouldn’t choose between them based solely on performance. Image quality is also something of a wash, with the two major GPU makers’ DirectX 10-class chips producing roughly comparable output. AMD’s slightly newer GPU architecture does have some potential advantages, including very strong performance with 8X multisampled AA and DirectX 10.1 support, but Nvidia has credible alternatives in the form of its 8X CSAA mode and the promise of PhysX acceleration in future games.

So I dunno. Take your pick. You can’t go wrong with either. Gut feeling: I’d go with the Radeon HD 4830 on the strength of its combination of speed and power efficiency—depending, of course, on what kind of deal I could get on it.

Update – 10/27/08: The Radeon HD 4830 card that AMD sent to us had only seven of its eight SIMD clusters (and their associated texture units) enabled, which reduced its performance slightly. As a result, the numbers in the original version of this review did not reflect the 4830’s true performance. We have now updated the BIOS on our card, tested the 4830 again with the proper number of units enabled, and updated the scores in this review. The 4830 is marginally faster, and we have modified some of our commentary to reflect this fact. Our overall assessment of the product, however, is unchanged.

Comments closed
    • Jacu
    • 11 years ago

    I guess it is time to let the good ol’ X1900XT go. I ordered a Sapphire 4830 which will be perfect (decent cooler, aggressive underclock when idle) considering my Antec Sonata case, and it should be a good match for my 3800×2 proc. Actually I will first see what kind of difference this card will make in Fallout 3 and think about changing that proc to a 6000×2. Speed increase should be about 50%. Do you think I am bottlenecked with this processor?

    To my mind this card is surprisingly cheap! Competition is a good thing, I must say.

    Anyone have an idea how much faster 4830 is compared to X1900XT?

    • MadManOriginal
    • 11 years ago

    Say, whatever happened to including GPU and memory clock speeds in the fillrate etc. chart?

    • ish718
    • 11 years ago

    I didn’t realize the price/performance value dropped so much when you go from HD4850 to HD4870 instead of when you go from HD4830 to HD4850.

    Going from HD4830 to HD4850 is like a 25% increase in price and around a 5%-15% increase in performance.

    But going from HD4850 to HD4870 is like a 50% increase in price and around a 10%-20% increase in performance, with 20% being the rarer case.

      • MadManOriginal
      • 11 years ago

      Percents are usually screwy when you’re looking at things below $150. $25 is 16% of $150 so it’s important to keep absolute price in mind as well.

    • Damage
    • 11 years ago

    I’ve updated the review with new scores for the Radeon HD 4830 with all eight of its SIMD arrays and TMUs enabled. See the note at the end of the review for more info. Our overall take on the product isn’t changed by these minor performance gains, though.

      • matnath1
      • 11 years ago

      Hey Scott:

      Woulda been nice to see the old numbers along with the revised ones to see the performance differential… Just for kicks.

    • Tamale
    • 11 years ago

    whoops – reply to 40 – a 4850 for $140 after rebate is still even better, according to that argument

    • Damage
    • 11 years ago

    AMD has indeed confirmed that it shipped reviewers Radeon HD 4830 sample cards with too many units disabled, hurting performance versus the shipping products. We have already flashed our card with a new BIOS. Before the flash, our card showed up in GPU-Z as having 560 SPs. Afterwards, it reads 640 SPs. We’ve confirmed minor performance gains with the new BIOS.

    I’ll see about doing an update, but I’m not sure I’ll be able to swing it right away due to other very pressing obligations I have.

      • Forge
      • 11 years ago

      I wouldn’t bother. Just tack a note onto the last page noting that 3-6% performance gains over the reviewed card are to be expected, and make sure to add an updated/fixed/third-party 4830 into the next GPU-related review you do.

      No sense delaying or rushing another review to get an update/correction to this one. A note and updated numbers in another review is as good or better.

      • Meadows
      • 11 years ago

      We already know the card kicks ass, and it just became a touch better. I think people can rest knowing that much until you have the proper time.

      Core i7 (p)review soon? 😉

      • MadManOriginal
      • 11 years ago

      I have a feeling you won’t be able to leave improper results up 😉 Whenever you can update them, that’s great; please be sure to update power draw too, some places didn’t do that.

      • flip-mode
      • 11 years ago

      Don’t matter, Prime1 will always use the gimped results.

    • Flying Fox
    • 11 years ago

    Based on current WUs, if Folding is more important, then the 9800GT is still the better choice.

    Things can change in a hurry once Stanford releases more AMD-friendly WU types.

    • Rza79
    • 11 years ago

    Damage, are you going to retest with the new BIOS?

    http://forums.techpowerup.com/showthread.php?t=74650

    • ihira
    • 11 years ago

    I think you have the 9600GT and 9800GT backwards on the Enemy Territory: QW 0AA/0AF 1280×1024 graph

    https://techreport.com/articles.x/15752/6

    • swaaye
    • 11 years ago

    Well we have yet another lower, upper performance-midrange-gamer-value graphics card now. 😉 Have we reached the “magical” +/-$5-per-product-line point yet?

    That 9800GT isn’t very representative of what you see at the store these days though. Many of them seem to run >600 MHz core these days and those run cheaper than a 4830. That would decisively smack up a 9600GT. I doubt it could rival that 4830 yet though.

    I do like the 4670’s idle power use and performance level. That’s good stuff; it’s awfully close to the power demands of an IGP, but you get way, way more performance in every way. It’s too bad that, with the ability to go +$10 and get quite a bit more game card, it isn’t really worth buying. And if you’re not into games and just need video decoding, a $20 3450 or 780G is basically its equal (unless you’re very picky).

    IMO, anyone who plays games should just jump on a 9800GTX or 4850 and be done with it. They are so cheap that the difference between anything lower is just splitting hairs and thinking too hard.

    • Thanato
    • 11 years ago

    So uh, are there any reviews of the 4830 in a crossfire config? 2-way, 3-way? How well does the card crossfire with a 4850?

    Is this question a waste of time? I’m just curious.

      • HurgyMcGurgyGurg
      • 11 years ago

      Well, there is a suitable reason as to why they did not test Crossfire for it.

      First off, as the article mentions, this is a quick review; they probably did not have time to do Crossfire, and generally Crossfire is meant for enthusiasts, and a $130 card is definitely not in that category. You might argue, well, when the 8800 GT (9800 GT) was reviewed they did an SLI test soon after, but you have to remember that card for a good while sold well over its marked price, at $200 or $250, almost twice the price of this card. Also, Crossfire is usually meant for playing games well at really high resolutions, and as the trend set by Grid shows, 512MB just doesn’t cut it anymore. In short, it’s probably too handicapped to scale well in the area that people buying Crossfire need.

      After all, if you’re spending at least $500 on a monitor to play at 2560×1600, I think you would spend more than $130 per graphics card.

      However, it does seem that this card could be quite a good value in Crossfire despite the above, and I would be interested to see it included just for the sake of it.

        • BoBzeBuilder
        • 11 years ago

        Dude, this card and the 8800GT are perfect cards for Crossfire and SLI. 2 × $130 = $260 to give you similar or better performance than a GTX 280? Yes please.

    • UberGerbil
    • 11 years ago

    Fine review. This card looks like a winner for the fat upper-middle of the market.

    One question: since this GPU is an RV770 only cut down by two SIMD units, and may just be a binned version of chips otherwise destined for the higher 48xx boards, I assume it still has working dp fp units (unlike the RV730)? Not that it really matters for its intended market; just curious.

    You know, Scott, now that you’ve got that fancy new sound meter itching to be tried, here’s something to think about once you get certain other significant looming reviews out of the way….

    How about putting together a fully water-cooled (i.e., silent) test rig, strictly for testing nothing other than video cards’ sound levels? It wouldn’t have to be fancy — just a bare motherboard hooked to a water cooling rig (you have an old Reserator kicking around, IIRC) — and you wouldn’t have to update the CPU or the motherboard at least until the next PCIe revision. You’d just pop the card in, install the drivers (you could keep a couple of HDs around with an OS image with drivers from the red and green camps, so all you’d have to do is a quick driver update), do an idle sound reading, then launch a benchmark that pegs the GPU and do another sound reading. You could use an IR gun to do a “finger-friendliness” external temp reading while you’re at it.

      • flip-mode
      • 11 years ago

      If all you care about is noise, you don’t need water. Prolly just underclock the CPU and use a Ninja fanless, at least it seems possible. You’d need a near-silent PSU too.

      Also, the configuration of the room could make a big difference. An 8′ x 8′ gyp board room with a hard floor will likely bounce sound around enough to register compared to a large room with carpet and couches and lots of alcoves and turns and stuff, and that’s even when taking a reading from just 3 feet or maybe even less.

        • MadManOriginal
        • 11 years ago

        Yea, for true silence or near-silence WC isn’t the way to go; WC is for getting great cooling performance without resorting to really high-speed fans. I’m confident that with a well thought out case airflow scheme with low-noise fans you could run a 45nm CPU and a good graphics card with a large passive or low-speed fan heatsink.

          • UberGerbil
          • 11 years ago

          Well I was assuming you’d stick the PSU and water cooler radiator and pumps in a box or a closet or something. This only has to run for 5 minutes or less, just long enough to be able to take a couple of isolated readings of the GPU’s noise output.

            • MadManOriginal
            • 11 years ago

            Oops, I didn’t quite realize what was meant by the original post. Something like that could be done, or simply a passively-cooled system for the sole purpose of noise testing the graphics cards.

    • PRIME1
    • 11 years ago

    TR should have tested the overclocked MSI; it’s only $99 after rebate:
    http://www.newegg.com/Product/Product.aspx?Item=N82E16814127381

      • Usacomp2k3
      • 11 years ago

      I think they test with whatever they have kicking around.

        • PRIME1
        • 11 years ago

        I know and I’m fine with that. It’s just funny what a ruckus that caused last time. A lot of QQ

    • Fighterpilot
    • 11 years ago

    I think there have been good points made for either case here.
    How I see the problem is this:
    No one ever asks “Is the overclocked 216 made by (whichever company) faster than a stock 4870?”
    What they always hear is “GTX216 beats HD4870” with no caveats on which model is being tested.
    We need to get the point across that the particular card “A” being tested is faster than this model “B” made by the other maker.
    It’s the broad generalizations that cause much of the fanboy wars, when in fact it is the particulars of each card and how it performs…

      • BoBzeBuilder
      • 11 years ago

      I gotta agree with that statement. These are good times. I remember well the $699 high-end cards, when having decent performance was just fantasy for most people.
      Today, you can build a nice potent system for that price.
      Today, you can build a nice potent system for that price.

    • flip-mode
    • 11 years ago

    post edited to remove excessive wrongness. dag nabbit.

      • moloch
      • 11 years ago

      Check your eyes

        • flip-mode
        • 11 years ago

        Yep, you’re right. It’s a 4670, not a 4870.

    • Fighterpilot
    • 11 years ago

    I can’t help thinking that all the protests about using a higher-clocked card in the HD 4870 1GB vs. GTX 260 216 test a few weeks ago are somewhat responsible for TR avoiding the “higher clocked” MSI 9800GT mentioned early in the article…
    Props to TR for listening to the valid criticisms made on that article; perhaps the people who were so vocal in shouting them down can take a lesson from this.
    The stock 9800GT puts up a pretty good fight against it. I wonder how understanding the NVidia fans here would have been if a higher-clocked 4830 had been tested instead and showed a much bigger winning margin than today’s results?

      • MadManOriginal
      • 11 years ago

      Yea, I noticed that mention too and it made me smirk. This article isn’t pre-overclocked-card free, but in the future stock-speed testing or stock+OC will keep the rants down.

      • flip-mode
      • 11 years ago

      Lame pettiness. If you can buy it off the shelf with a lifetime warranty, then crybaby fanboys need to get over it. Hopefully, TR didn’t bow to such ridiculous complaints. I suppose that if the 9800GT had come out on top in this review, fanboys would be screaming over the fact that the 9800GT had 1GB of RAM – yet another unforgivable deviation from the “reference design”.

      It is price and performance off the shelf that matters, not some arbitrary “reference design”. The only teeth that argument has is that people in some locales only have access to reference spec’d cards.

        • Fighterpilot
        • 11 years ago

        With respect, Flip, I think you are out of line with that remark about “ridiculous complaints”.
        In this comparison with the MSI overclocked version against the HD4830, they give the nod to the NVidia card:
        http://www.neoseeker.com/Articles/Hardware/Reviews/msi9800gt/14.html
        In effect, that would have altered the TR conclusion quite considerably had they chosen to use that model.

          • flip-mode
          • 11 years ago

          I know you think I’m wrong, but I think you’re wrong about me being wrong. Keeping the FOC’d (factory overclocked) card out essentially means that TR is not showing the best available choices. How can a card that is available off the shelf at a factory set speed, a competitive price, and a lifetime warranty not qualify for testing? I understand the rationale for each argument, but in the end the rationale that wins out is the one that gives me the best product for my money. It is irrelevant whether or not the supplier of a card followed a *reference* design. Or else, if that really is relevant, then we should be true to that – no products that don’t exactly fit the reference design in all aspects: color, video outputs, hsf design, sticker on the hsf, clock speed, RAM size, RAM speed, bundled items, etc. And only suggested retail prices should be quoted – we should only be interested in the product as suggested by Nvidia and ATI, not the resellers, not real life prices and products. Real life stuff is not what we should be looking at. Wait, that would be stupid.

            • JustAnEngineer
            • 11 years ago

            I consider it acceptable to include overclocked cards, but only if they appear as a separate line from the factory-clocked cards, which means twice as much testing time in the underground sweatshop and results in busier and more confusing charts.

            The problem is that these reviews are a snapshot on the day of release of the new card. When we look back at the review in a month trying to make a decision, that super-duper factory overclocked card could be more than twice as expensive as the stock cards due to cut-throat competition.

            We need the information on the stock-clocked products. Adding more testing for special overclocked versions is okay if Damage has the time and the inclination, but we cannot do without the stock cards.

    • Meadows
    • 11 years ago

    …

      • MadManOriginal
      • 11 years ago

      Yeah, I think we’ll see further MIRs :/ or slight drops in the 9800GT, although with MIRs in a few cases it’s nearly there already. Anand’s review went in-depth about MIRs and pricing for a whole page, heh. There just isn’t much more room to squeeze prices closer in NV’s lineup; as if the naming convention isn’t bad enough, now the pricing is getting all out of whack.

      • moloch
      • 11 years ago

      Indeed, I was really impressed with this card – lower power draw and equal or faster performance (in some cases a lot faster) than a 9800GT – seems a win-win situation. I was surprised it didn’t win Editor’s Choice or whatever, actually :/

        • Meadows
        • 11 years ago

        The only thing that surprised me was the idle power consumption. You really can’t beat that unless you use an nForce board with HybridPower, I suppose (or if you have a low-power profile in RivaTuner like I do, but I think that wouldn’t bring them close enough).

    • ace24
    • 11 years ago

    double post…

    • MaxTheLimit
    • 11 years ago

    I wonder how long it will be before we see this card dip below the $130 mark. Most of the other AMD cards have dropped below the suggested price almost immediately.

      • MadManOriginal
      • 11 years ago

      I say through the weekend they’ll hold up then crack. The only card that was sorta hard to find for a while was the 4870.

    • MadManOriginal
    • 11 years ago

    This is something important to check, Scott: http://www.techpowerup.com/articles/other/155

    Nice review, nice card. I’m not sure what to make of power draw readings resulting in quite different ranks among various sites. One thing I noticed in this review, even though it wasn’t the featured card (and I may risk getting flamed by silly people for this given other recent remarks), is how well the 9600GT (note: overclocked version in this review) has held up over time. *Well, thinking about it, the 9600GT hasn’t been out that long, but still…

      • pmonti80
      • 11 years ago

      About the link, very interesting indeed. It would be interesting if TechReport would ask AMD what’s happening here.
      About the Geforce 9600GT, I was nearly banned last time so I won’t say any more thing.

      • mczak
      • 11 years ago

      By the looks of it (Perlin noise score), Tech Report also got a 7-SIMD card. Talk about special review hardware 🙂

      • eitje
      • 11 years ago

      …

        • Damage
        • 11 years ago

        I checked with AMD. They told me that they believe this utility is reading the card’s registers wrong and that there is no performance difference between cards that show 560 and 640 SPs in the utility. They told me they’re following up with their engineering folks, the utility author, and the publication to get this sorted out. We’ll keep an eye on it, but it appears there’s no real issue with the cards and their performance here, at least right now.

          • eitje
          • 11 years ago

          Yeah, that’s the first thing I thought we’d hear from them. 🙂

          • MadManOriginal
          • 11 years ago

          Their reply is a nice denial, although at least they say they’ll check. But there’s no denying the performance difference in the link in post #7.

          • MadManOriginal
          • 11 years ago

          Anand’s card was short some activated SPs, turns out it’s a BIOS issue and retesting shows moderate gains but nothing huge. I’m more interested in being sure the power draw numbers are correct. Have you checked with GPU-Z yet?

          • rpsgc
          • 11 years ago

          Oh, there IS a difference. Anandtech’s updated review shows that, and Techpowerup’s reference AMD card (560 SPs) vs. PowerColor (640 SPs) reviews show that.

    • A_Pickle
    • 11 years ago

    …

      • willyolio
      • 11 years ago

      a strong fanboy would still try to argue that the 9600GT is better.

        • marvelous
        • 11 years ago

        It’s not better, it’s just cheaper. Like 50% cheaper if you get a good deal with rebates.

          • Meadows
          • 11 years ago

          So its price-performance ratio might actually be up to snuff, but with the HD 4830 being still so cheap in an enthusiast’s terms, there’s no arguing which to pick.

      • ew
      • 11 years ago

      It’s just like when John McCain was all “the fundamentals of our economy are strong”.

        • VILLAIN_xx
        • 11 years ago

        McCain has some skills though.

        http://www.youtube.com/watch?v=UMPNWT6NxMY

        • SubSeven
        • 11 years ago

        They certainly are quite a bit stronger than elsewhere… but my question here is: why would you bring politics into this, a video card review? I have no love for either candidate, but I really dislike it when people use economic circumstances to propagate their agenda (which in most cases they know very little about and are in fact merely regurgitating what they heard on TV).

          • ew
          • 11 years ago

          First, “strong” and “stronger” are two different things. One is absolute and the other is relative.

          You said yourself you dislike it when people make ignorant comments about the economy to propagate their agenda. Well, I have a similar dislike for ignorant comments like PRIME1’s. I was just trying to point out the similarity. I didn’t mean to make this seem political.

            • ludi
            • 11 years ago

            Argument by analogy is dangerous enough on its own merits. Flaming other people’s political views or religious beliefs at the same time, in a context that has nothing to do with those political views or religious beliefs, pretty much guarantees that you will make enemies of both your person and argument…

            • ew
            • 11 years ago

            You’re probably right, but I couldn’t resist.

      • PRIME1
      • 11 years ago

      Considering how long the 9600GT has been out and how much less it costs compared to the 4830, you get what, 10% more FPS? I bet the 9600GT outperforms it in folding and physics by 100%.

      Things to consider.

        • Meadows
        • 11 years ago

        The 9600 GT lost, live with it. The 9800 GT fared even worse.
        An nVidia fan said that.

          • PRIME1
          • 11 years ago

          The 9800GT fared even worse than the 9600GT…. OK. :rollseyes:

            • Meadows
            • 11 years ago

            Don’t tell me that…

            • PRIME1
            • 11 years ago

            In what language? Clearly not English. You said the 9800GT was worse than the 9600GT.

            Even though the TR review (and most others) considers the older, less expensive 9800GT to be the same as the new 4830.

            • Meadows
            • 11 years ago

            It’s not the same – currently, it’s a very bad value proposition. That’s why it fares even worse than the 9600 GT, which is at least nice for what it costs.

            • poulpy
            • 11 years ago

            And that’s not the same as what you wrote earlier, so I suggest he teaches “you write” or some courtesy, maybe.

            • SubSeven
            • 11 years ago

            LOL. Sorry Meadows, I just had to laugh at that. Seriously though, guys, let’s keep it civil. No point in going crazy over someone else’s opinion. Yes, everyone has one (and sometimes people have opinions that are not of this planet), and I think we can agree that for the most part, they are entitled to it. If you happen to disagree, it’s more than fine to question and poke holes in another’s opinions… but I’m pretty sure doing so with personal attacks will not be conducive to your cause. I may be wrong, but that’s my two cents.

            • MadManOriginal
            • 11 years ago

            I think we can safely say that opinions are like a*holes…and so are some people 😉

            • Forge
            • 11 years ago

            Please don’t tell me that I will have to teach you how to read, I think you meant.

            • cegras
            • 11 years ago

            Successful troll’s attempt to divert discussion to semantics away from the main point is successful.

        • SubSeven
        • 11 years ago

        Of what relevance is the duration in market of the 9600GT? I thought what mattered is the price/performance ratio of whatever product, in the current environment, regardless of duration. Secondly, I’m not sure where you see 10%, but most benchmarks indicate gains of 20% or more. Oh, and by the way, I personally love the 9600GT; it is probably my favorite card in the NV lineup at the moment in price/performance (being able to get one for as low as $75 after rebates is pretty darn awesome).

          • MaxTheLimit
          • 11 years ago

          I wonder if we are going to see another GTX 2xx in response to this one. Maybe a GTX 240? Or maybe GTS 260? GT 240? I love the nvidia naming scheme. But I don’t doubt there will be a new offering in response to this.

          • willyolio
          • 11 years ago

          Well, if it’s been out for a long time, it means that drivers are pretty much optimized and you’re unlikely to see any more performance gains from future releases…

          I guess PRIME1 considers this a good thing.

            • MadManOriginal
            • 11 years ago

            NV seems to be able to find improvements for the architecture in game engines, not all of which are brand new (HL2:Ep2):
            http://www.techreport.com/discussions.x/15753

    • ReAp3r-G
    • 11 years ago

    I like the numbers on that one… but the most impressive one was the X2 card… lol, it didn’t flinch until 2560 reso rolled around.

    Amazing card… but yeah, the 4830 is definitely a worthwhile buy at such an affordable price… I guess where AMD fails (somewhat) at CPU making/marketing, they have ATI to thank for keeping them in the game, so to speak 🙂

    • ssidbroadcast
    • 11 years ago

    For some reason, it feels instinctively so very…

      • Creamsteak
      • 11 years ago

      I thought the 9800GT was the 8800 GT renamed?

        • ssidbroadcast
        • 11 years ago

        Yes, actually. Come to think of it, that’s sort of a problem for nVidia from a consumer-confusion standpoint. Their 9800 GT, by name, implies it’s only one generation behind – and technically, that’s half true – but really it’s “two” generations behind.

        It’s reasonable for a mid-high Product A to be outperformed by a mid-low Product B, but only if Product B is two generations newer than A.

        For example, an 8600 GT should handily outperform a 6800 Ultra.

          • marvelous
          • 11 years ago

          What, how? It was ATI who were behind in tech. Now they’ve caught up in a big way with their efficient design.

          Nvidia should have never named the G92 chips 8800GT and GTS. They should have called it 9800GT in the first place. This is where all the confusion comes from.

          GT200 doesn’t totally fail, but its design was definitely flawed for how much performance it gives, because they were pushing CUDA and PhysX.

      • DrDillyBar
      • 11 years ago

      Agreed. It did better than I thought it would.

      • ace24
      • 11 years ago

      But it’s the 48xx part that matters. This is actually a low-high card, vs. the 9800gt (8800gt if you will), which I’d consider a high-mid as well.

        • ssidbroadcast
        • 11 years ago

        Er.. low-high? Dude…

    • SecretMaster
    • 11 years ago

    Damn, that was fast. Still a great review, and even more awesome that the review is released the day that Cyril made the announcement on the news page.
