GeForce GT 430 brings Fermi to $79

Say hello to Nvidia’s cheapest DirectX 11, Fermi-based graphics processor yet: the GeForce GT 430. While Nvidia stressed during its presentation last week that partners will set their own prices, it did let slip a $79 target figure. At that price, the newcomer is headed for a confrontation with AMD’s Radeon HD 5570, which has been delivering bargain DX11 graphics around that same price point since February.

Before getting into any competitive comparisons, let’s take a quick look at what makes up the GeForce GT 430. This card is Nvidia’s first desktop product to feature the GF108 graphics processor, a 585-million-transistor piece of silicon built on the same 40-nm TSMC fab process as the rest of the Fermi series. My measurements peg the GF108’s die area at approximately 115.5 mm², a little less than half the size of the GF106 chip inside the GeForce GTS 450.


A block diagram of the GF108. Source: Nvidia.

Fittingly, the GF108 has half the number of shader multiprocessor blocks as the GF106—just two, each with 48 stream processors and one texture block. That means the GF108 has 96 SPs and is capable of filtering and sampling 16 texels per clock cycle. Nvidia has also included a single ROP partition, which can output four pixels per clock, as well as two 64-bit memory interfaces, for a total interface width of 128 bits. (For reference, the GeForce GTS 450 can churn out 16 pixels per clock and features a 128-bit memory interface, although it doesn’t utilize the GF106 GPU to its full potential.)

Nvidia quotes standard clock speeds of 700MHz core, 1400MHz shader, and 900MHz memory for the GeForce GT 430. The memory speed applies to the card’s 1GB of DDR3 RAM, so it makes for a transfer rate of 1.8GT/s and total bandwidth of 28.8GB/s for the whole card. By the way, the card should have a maximum power draw of about 49W—again, roughly half the GTS 450’s 106W. The low power consumption might be why Nvidia’s reference design has a half-height circuit board.
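For those who like to check the math, the theoretical peaks follow directly from these unit counts and clocks. Here’s a quick sketch; the factors of two assume double-data-rate memory and one fused multiply-add per stream processor per clock, which is Nvidia’s usual accounting:

```python
# Back-of-the-envelope check of the GT 430's theoretical peaks,
# using the unit counts and clock speeds quoted above.

core_clk_ghz   = 0.700   # core clock, GHz
shader_clk_ghz = 1.400   # shader clock, GHz
mem_clk_ghz    = 0.900   # memory clock, GHz (DDR3 is double data rate)

rops           = 4       # pixels per clock (one ROP partition)
texels_per_clk = 16      # bilinear INT8 texels per clock
stream_procs   = 96      # 2 SMs x 48 SPs
bus_width_bits = 128     # two 64-bit memory interfaces

pixel_fill = rops * core_clk_ghz                     # Gpixels/s
texel_rate = texels_per_clk * core_clk_ghz           # Gtexels/s
bandwidth  = mem_clk_ghz * 2 * (bus_width_bits / 8)  # GB/s
gflops     = stream_procs * 2 * shader_clk_ghz       # FMA = 2 flops/SP/clock

print(f"{pixel_fill:.1f} Gpixels/s, {texel_rate:.1f} Gtexels/s, "
      f"{bandwidth:.1f} GB/s, {gflops:.0f} GFLOPS")
# → 2.8 Gpixels/s, 11.2 Gtexels/s, 28.8 GB/s, 269 GFLOPS
```

Those figures match the GT 430 row in the comparison table that follows.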

Here’s Asus’ take on the GeForce GT 430, the ENGT430, touting the aforementioned half-height design. This isn’t a reference card, though; Asus has thrown on a custom cooler, whose fan purportedly boasts a higher-than-average resistance to dust. This more tightly sealed fan should, the firm claims, increase the life span of the card by 25%. Asus also says it used particularly efficient and durable electrical components.

So, how does the GT 430 stack up against its most direct competitor, the Radeon HD 5570, and the next rung up Nvidia’s product ladder, the GeForce GTS 450?

                        Peak pixel   Peak bilinear    Peak       Peak shader
                        fill rate    INT8 texel       memory     arithmetic
                        (Gpixels/s)  filtering rate*  bandwidth  (GFLOPS)
                                     (Gtexels/s)      (GB/s)

GeForce GT 430          2.8          11.2             28.8       269
GeForce GTS 450         12.5         25.1             57.7       601
Radeon HD 5570 (DDR3)   5.2          13.0             28.8       520
Radeon HD 5570 (GDDR5)  5.2          13.0             57.6-64    520

*FP16 filtering happens at half the INT8 rate.

This theoretical comparison isn’t all that flattering to the GT 430. The DDR3-based Radeon HD 5570 posts higher theoretical numbers pretty much across the board, save for memory bandwidth, where the two cards are tied at 28.8GB/s. (AMD has a GDDR5-powered 5570 that offers a sizeable memory bandwidth increase, although I can’t seem to find anyone selling it right now.) The GT 430’s pixel fill rate looks particularly anemic next to the AMD offerings. Beyond direct match-ups, one thing to note is that these ~$80 cards are all considerably weaker than the GeForce GTS 450, which retails for as little as $122.99 at Newegg. Anyone even remotely serious about gaming would do well to spend the extra 40 bucks, I think.
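To put that price argument in rough numbers, here’s a quick sketch of theoretical shader throughput per dollar, using the $79 target price and the $122.99 Newegg price cited above. Street prices obviously vary, and peak GFLOPS is only a crude proxy for gaming performance:

```python
# Crude value comparison: peak shader arithmetic per dollar,
# using the prices and GFLOPS figures cited in the article.

cards = {
    "GeForce GT 430":  (79.00, 269),   # (price in USD, peak GFLOPS)
    "GeForce GTS 450": (122.99, 601),
}

for name, (price, gflops) in cards.items():
    print(f"{name}: {gflops / price:.2f} GFLOPS per dollar")
# → GeForce GT 430: 3.41 GFLOPS per dollar
# → GeForce GTS 450: 4.89 GFLOPS per dollar
```

Even on a per-dollar basis, the pricier card comes out ahead in theoretical shader throughput.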

Ultimately, as I pointed out earlier this month, the GeForce GT 430 may be part of a dying breed. The next generation of desktop processors will feature much-improved integrated graphics, which may be potent enough to discourage folks from spending $80 on a relatively underpowered discrete GPU. Perhaps we’ll see quicker $80 GPUs by then, but I wouldn’t be surprised to see an increasing number of cash-strapped gamers gravitate toward slightly more expensive (and much more powerful) solutions in 2011.

Comments closed
    • Dagwood
    • 9 years ago

    What is this TR knocking a graphics card even before they have done a review?!?

    If you think Intel’s on-die graphics is going to replace this, then I have some suggested reading for you… It is from a trusted source that I had a high opinion of until recently.

    https://techreport.com/articles.x/18539/6 or you could just go to one of those other [H]ardware review sites that have a review with test results already!

    • thermistor
    • 9 years ago

    #66…I have been concurrently building/supporting PC’s for da kiddos…and I really don’t want anything expensive that grape kool-aid can take out;-)

    • lex-ington
    • 9 years ago

    Well, I hope that 3D TV’s sell well for the sake of this card and future generations of this 3D stuff moving up the line of cards – and I don’t even use nVidia cards.

    I for one CANNOT, for all its worth, sit on my couch to watch something for 2 hours with a heavy pair of ugly glasses on my face . . . . and pay $2500 for the “privilege” (and that’s just the TV and the glasses).

    • TaBoVilla
    • 9 years ago

    the only way this could have “rained” on AMD’s HD6k parade was if nVidia had found a way for GF108s to actually generate DC power back to the system and handed them out for free.

    • thermistor
    • 9 years ago

    49,50, 51…No, with onboard graphics, I drop frames in watching DVD’s, get lots of extended pauses, etc. both with old discrete cards as well as the integrated graphics as described, when running Vista aero + WMC. Granted I’m running Cel4xx series CPU’s, but I’ve gone thru a lot of permutations and only with a 3450 or better, or 8400GS, do I get good playback. Only by disabling aero was I able to make WMC free of annoying pauses, dropped frames, etc.

    Yes, possibly a better CPU, even just an entry-level dual core, might help. But then again, I’d rather put a few $$$ into the GFX cuz the system will game in a pinch.

    The only reason I’m familiar with the subject is because I’m cheap, have bought a lot of cheap graphics cards, have experience with Intel/nV onboard graphics courtesy of several cheap motherboards, but am passionate about a decent HTPC that is cheap, pulls little power, but performs well for live cable TV via USB adapter/Netflix/Hulu/WMC/DVD playback.

    I stand by my observations about any onboard gfx (both nForce 610i and Intel prior to X4500), as well as older, lower-end discrete cards. Perhaps with the caveat that I’m running a single-core CPU.

      • UberGerbil
      • 9 years ago

      Actually, just upgrading to Win7 might help with the dropped frames and/or not requiring you to disable Aero. There were some improvements under the hood in that regard.

      • MadManOriginal
      • 9 years ago

      A single Core CPU huh…have you ever actually, you know, looked at CPU utilization when you got the poor playback and other issues? Not that I’m disagreeing with you that for minimal power draw when using such a CPU that a low-end discrete card isn’t beneficial, it’s just sort of funny how you were ‘hiding’ the fairly slow single core CPU info and then sprung it on us 😉

      • Skrying
      • 9 years ago

      You’re cheap but waste your money on multiple low-end motherboards instead of a single decent one? That makes no sense. Your problem isn’t the graphics chips. It’s the CPU.

    • Mystic-G
    • 9 years ago

    I guess for those who just wanna watch videos in 1920×1080 resolution on their computer.

    Or you can spend $20 more to buy a GTS250 and actually play games on pretty good settings.

    • thermistor
    • 9 years ago

    I can say without a doubt that *any* Intel integrated graphics south of the X4500 cannot run Vista (7?) Aero + WMC. And I’m not totally sure about the X4500. Additionally, neither can the nForce 610i. Nope with the FX5200. If you disable Aero, then DVDs and streaming Netflix do OK, but who wants to do that?

    However, I know an AMD 3450, 8400GS, 4550 will.

    This card seems to be too many $$$ for app’s like a TVPC and a dual monitor doing productivity apps like Word/PPT, and not good enough for killer gaming. But I’d say the same thing about the 4650, 9500GT and other $50-75 cards. But then I wouldn’t drop more than about $30 for HTPC/dual monitor productivity cards.

      • Skrying
      • 9 years ago

      The X3100 inside my aging Dell XPS M1330 handles Aero in Windows 7 just fine as well as Media Center.

        • srg86
        • 9 years ago

        My Atom net top’s GMA950 also handles Aero in Windows 7 just fine as well. There is a lot of FUD around Intel graphics.

      • Usacomp2k3
      • 9 years ago

      The GMA950 works fine, what are you talking about?

        • UberGerbil
        • 9 years ago

        He’s talking about Vista. The rest of you are using Win7. I know there doesn’t seem to be much difference between the two on the surface, but in this particular area (requiring Aero to be disabled to get smooth video playback, etc) the changes in the DWM between the two OSes are significant.

          • indeego
          • 9 years ago

          Q35 chipset IGP [and everything below] sucks on both Vista and 7 Aero as far as I’m concerned. Jerky all around.

          • jackbomb
          • 9 years ago

          I didn’t have a problem playing 14mbps 1080p AVC files under Vista with Aero enabled. GMA950+C2D T5600.

    • travbrad
    • 9 years ago

    Wow this looks to be significantly slower than my 4830 (which was $85 almost 2 years ago…). Admittedly it will probably have good power consumption, but what good is that if games aren’t playable?

    • swaaye
    • 9 years ago

    I have a hard time seeing a reason to buy a low-end card like this instead of just sticking with an IGP. It certainly isn’t a great game card and I would spend the extra $30 or whatever to get double the performance if I want to play games. On the other hand, a modern IGP can do everything but games more than adequately and it will use less power than a add-in card.

    And get rid of that fan already. There’s no reason to have a bottom rung card like this equipped with a fan, well aside from it probably being cheaper to manufacture than more metal in the cooler.

      • MadManOriginal
      • 9 years ago

      I’d even say that modern IGPs aren’t half-bad as long as you aren’t stupid and expect them to play newly released games with AA and high settings at a high resolution. I really do wish sites would have some common sense when testing IGPs and either use older games or do an ‘equivalency’ test showing which 3-4 year old card IGPs are on par with, after all older games can still be great fun and with Steam there are always some great older titles for cheap. How does a 785G/890G/Core i5 IGP compare to a 7600GT, 8400GS, 9400GT, HD4350 etc, and how well can they play things like CS:S or the ‘hot shooter’ or strategy game from 3-4 years ago? Review sites sorely let me down in this area…doing extrapolations to the 4th degree is tedious and inaccurate.

        • Chrispy_
        • 9 years ago

        I hate to condone its use, but this is where 3DMark is actually valid.

        If you view the O.R.B. database you can basically get a rough idea of pretty much anything and it’s all done with a consistent test.

        Whilst GTX 480’s will be benchmarked by review settings at 2560×1024 with 8x and 16x Aniso focussing on the latest DX11 games and benchmarks, several nutjobs have done the same thing for 3DMark 2003/5/6/vantage

        And yes, you can easily quantify just how much better that GTX460 is than your old 6600GT which vanished from any benchmark 3 years ago. It may not be a real-world test but it’s a whole lot more reliable than extrapolating through several generations thanks to driver updates, game patches etc.

    • phileasfogg
    • 9 years ago

    The speed grade on the Hynix DDR3 chips says -12C. That means 800MHz (1600Mbps). So how come Nvidia/Asus can get this to run at 900MHz (1800Mbps)? Are they running it at 1.80V instead of 1.50V to eke out the extra 100MHz? So that would amount to ‘overclocking’ the memory, correct?

      • Game_boy
      • 9 years ago

      If you look at the GF100, they’re doing the opposite – running high-binned memory at much lower speeds because their memory controller is broken. So this is confusing.

      • MadManOriginal
      • 9 years ago

      Most silicon has a ‘free’ 10% speed bump available statistically speaking – not every single part will be able to do it but overall it’s no issue getting it. That goes doubly so for mature tech on processes that have been tweaked a fair amount for yields.

        • phileasfogg
        • 9 years ago

        Are you a practicing hardware engineer or are you just generalising? Do you know that the Hynix 54nm -BFR DDR3 1Gbit die are not specified for 900Mhz? You call the 54nm process a “mature” process. Oh really? And how would you know? Have you measured the voltage on the DDR3 components on this GT430 card? Can you say for sure if it’s 1.5V or 1.8V? Don’t spout off meaningless garbage unless you know of what you speak – I’ve been in the semi business for 24 years, and I’m guessing you haven’t spent even half that time in the field.

          • MadManOriginal
          • 9 years ago

          I’m Gordon Moore, smart guy. And you?

      • mczak
      • 9 years ago

      900MHz is just the reference memory clock. I bet almost all cards you can buy (except probably the OC versions) will use 800MHz, which is what all other cards featuring DDR3 are using too…

    • Wintermane
    • 9 years ago

    God I love you guys… This card is more than powerful enough to play most of the games out there on smaller monitors with low/no AA, and that’s exactly how people who buy $80 cards play games…

    And in 2 months it will be 60 bucks…

    As for on-die chips.. remember BANDWIDTH… this thing has 28.8 gigs a second… at most an on-die chip is gonna have what.. a share of 12 or so? And a power budget of what.. 20 watts?

      • MadManOriginal
      • 9 years ago

      Nothing exists in a vacuum so comparisons to what else is available are valid. Comparing it to integrated is one thing and I don’t think you’ll find anyone saying…

        • sweatshopking
        • 9 years ago

        and according to anandtech, the image quality is inferior to the ATI cards as well…

    • UberGerbil
    • 9 years ago

    Wait, where’s the AGP version?

    😉

      • paulWTAMU
      • 9 years ago

      hasn’t AGP been shot, buried, mourned and forgot by now? Kinda like ISA?

        • UberGerbil
        • 9 years ago

        You just couldn’t see the emoticon?

        (Also, PCI refuses to die, and AGP is really just a pimped version of that.)

    • OneArmedScissor
    • 9 years ago

    What does it matter if it’s “Fermi” if it’s a bastardized GTS 240, which already blew? I think we need to put up a sign. “Don’t feed the Nvidia marketing machine.”

    It’s funny how the original 8800s with 96 SPs had 20 ROPs…at 90nm, and that wasn’t even the full chip. This tiny little thing isn’t even running the clock speed of the 9800GTX+/GTS 250. They talk up Fermi so much, but something still isn’t right.

    • Chrispy_
    • 9 years ago

    I hear the logic behind “dying breed”, “last gasp” etc

    However, until Intel’s graphics drivers match AMD/Nvidia’s, the important feature at the low end is compatibility and image quality, not performance.

    I’m talking about custom resolutions, ratio correction, accuracy of 3D rendering, and full, accurate feature-list support. Many people don’t care that it’s slow, but they do care that it looks correct. Intel fails hard here (Microstation, AutoCAD, 3DS Max, etc.)

    For mainstream gaming, then yes – these things are rapidly reaching obsolescence.

      • UberGerbil
      • 9 years ago


        • cygnus1
        • 9 years ago


        • Chrispy_
        • 9 years ago

        Heh, your antialiased flaming spear made me chuckle. I can just imagine that being put into the next zero-punctuation review 🙂


    • MadManOriginal
    • 9 years ago

    This might be vaguely interesting if it were really half a GTS 450 – 8 pixels/clock and GDDR5 for some more memory bandwidth. It’s just too cut down as it is. If it had the few additional bits I mentioned it would be a worthy successor to the great 9600GT and wash the bitter taste of the GT 240 away.

      • derFunkenstein
      • 9 years ago

      Yeah, 1 ROP partition isn’t enough. Even the GT240 has 2 for 8 pixels/clock. And GDDR5. At $80 it’d beat a Radeon 5670, and you might see some movement at the low end. Right now it’s just meh.

    • Voldenuit
    • 9 years ago

    This is probably the last gasp for low end discrete graphics. All future mainstream CPUs from both camps have on-die graphics, and nobody’s going to be pairing a $79 GPU with a high end CPU (at least, not near enough people for the volume numbers to work out for this type of product).

    ‘Course, once low end GPUs die out, this will probably leave intel users at the mercy of intel’s driver team, so this is probably not great news for people in terms of choice. What are the HTPC provisions like for SB and the H6x series chipsets? For a hugely successful company, intel doesn’t get (or care) about its enthusiasts (be they gamers, HTPC buffs or folders) very much. While raising the minimum bar for discrete graphics has its upside, silence and HTPC enthusiasts who like passively cooled GPUs might not rejoice at the shape (or rather, sound) of things to come.

      • MadManOriginal
      • 9 years ago

      You managed to turn an article about an upper low-end NV card in to a rant bashing Intel. Well done!

        • travbrad
        • 9 years ago

        Hey someone had to do it.

    • flip-mode
    • 9 years ago

    Could make for a good Linux card.

    I wonder how good Sandy Bridge graphics will be on Linux. Will it play Quake 4?

      • sweatshopking
      • 9 years ago

      probably would handle quake 4. question is, why would you want it to?

      • stmok
      • 9 years ago

      By the time “Sandy Bridge” is released, Linux support is limited to the 6 execution units (eu) model. Intel folks are behind in adding support for the 12 eu model. (This latter one was previewed to rival budget discrete video cards in Anandtech’s article.)

      Looking through git, there’s entries to identify SB’s IGP in the Linux Kernel.

      • maroon1
      • 9 years ago

      Of course it can run Quack 4. I’m surprised that you even ask this question. I could run Quack 4 on my ancient geforce 6600 at 1024×768, with most settings on high.

      According to anandtech review, SB can run more demanding and modern games like Batman AA and Dragon Age

      And remember that was an early review which uses early drivers, and anandtech was not even sure whether the GPU turbo was enabled or not.

        • flip-mode
        • 9 years ago

        I didn’t know Batman or DA ran on Linux – that’s why I picked Quake 4.

      • swaaye
      • 9 years ago

      Oh come now. Don’t you have some old PCIe card lying around? Free as in beer? 😀

        • flip-mode
        • 9 years ago

          Heh, funny you should mention it. I have some low profile cards that don’t fit my case, and I have an ATI Fire GL 3100 that – believe it or not – is completely unstable in Linux. I thought Fire GL cards were supposed to be good choices for Linux. Luckily, that card cost me nothing. Unfortunately, the 690G mobo I have is crummy for Linux too – if that board was slow in Windows, it’s even slower in Linux. The good news is that it’s just as cheap to upgrade to a new 785G mobo as it is to get a low-end video card like a GT 430 or HD5670 – and getting the new mobo would help performance in other areas where my 690G mobo sucks the tea bag too.

          • Deanjo
          • 9 years ago

          Any ATI card pretty much sucks in linux.

            • swaaye
            • 9 years ago

            Oh I dunno. That somewhat recent totally open-source “radeon” driver that supports everything from the original Radeon on up is pretty damn nice.

            I’m sure it sucks for gaming but I don’t need that with Linux. It seems to work great for everything else. I ran it on a Radeon 9600 and a 7000 for a few months for a simple web kiosk.

            NVIDIA dropped support for their legacy driver so I had to replace my trusty TNT2 M64 lol. Hey it did web browsing well enough when it had a real driver, but that “nv” driver is no such thing. 😀

    • Meadows
    • 9 years ago

    Actually seems perfect for the power-usage-conscious MMO gamer, but save for those people with budget screens, it’s probably next to useless for modern gaming.

      • FuturePastNow
      • 9 years ago

      Yeah, but there are lots of people still at 1280×1024 or 1440×900.

        • Meadows
        • 9 years ago

        For which this card is too weak in new titles.

          • flip-mode
          • 9 years ago

          Make sure to point out that you were definitely right, and repeat yourself if necessary – or even if unnecessary, make it seem necessary. And, correct others’ grammar whenever possible.

            • Meadows
            • 9 years ago

            Silly, I’m doing that already. And if people got things the first time, I’d never have to repeat myself, so it’s necessary as far as I’m concerned.

            • mph_Ragnarok
            • 9 years ago

            dudes chill,
            i love meadows

            the secret is to say vacuous things. if there is no content or opinion in your post, then there’s nothing for which meadows can bite you in the ass.

            • sweatshopking
            • 9 years ago

            hey, I think the sky is a pretty color today. I also like the canadian coin! Nice quarter! i have 2 of those from my wife!

            On a related note, I got a layoff notice today boys 🙁 No more POS company for me. we’re going tits up. Wish me luck, it’s tough finding work these days.

            • MadManOriginal
            • 9 years ago

            Good luck bro. These days you might just have to suck it up and take a crappy job you don’t really like at all at some point. You certainly wouldn’t be the only one doing that.

            • sweatshopking
            • 9 years ago

            thanks man, Can I move in with you? Prostitution is legal in canada, I can make SOME money.

            • MadManOriginal
            • 9 years ago

            Unfortunately I don’t live in Canada nor in Nevada so no, I can’t be your pimp just your bff. Maybe you can find someone to make an indecent proposal for your wife?

            • eitje
            • 9 years ago

            I thought that’s what the comment about getting two Canadian quarters from his wife was aboot…

            • flip-mode
            • 9 years ago

            I didn’t know street corners went out of business, but I thought they were always going “tits up”.

            • sweatshopking
            • 9 years ago

            They did. They shut mine down. Too racy, I think. Our pitch was pretty rude.

            • SomeOtherGeek
            • 9 years ago

            Hey man, sorry about that. but hang in there. All you have to do is be cool and you’ll get hired on the spot!

            • sweatshopking
            • 9 years ago

            also, you know meadows is a 22 year old female model, don’t you?

            • Meadows
            • 9 years ago

            Some people wish, but I’m neither that age nor that gender.

            • sweatshopking
            • 9 years ago

            listen, veronica, you can’t fool me! I know a hot babe when i read their posts!
