Nvidia sets GF100 launch for March 26

Well, this is as official as it gets. Nvidia has put up an agenda page for its appearance at the Penny Arcade Expo (a.k.a. PAX East) in Boston, Massachusetts next month, and it turns out the company will host a special, "must see" presentation on March 26 at 6:00 PM local time. The teaser paragraph doesn’t leave much to the imagination:

Come see NVIDIA unveil the next generation of PC gaming. Want to see what’s hot and what’s next? If you’re even vaguely a fan of PC games and miss this special event, you’ll likely be spending the next few months kicking yourself. Line up early as seating is limited. ‘Nuff said.

Nvidia has promised to introduce its next-gen GF100 graphics processor in the first quarter, and this looks to be the launch venue. As if the teaser weren’t clear enough, the company goes on to say attendees will get to try its "highly-anticipated, next-generation GPU" at the company’s booth in Exhibit Hall D, and some folks "may even be able to buy one before anyone else."

The GF100, Nvidia’s first DirectX 11 GPU, will power upcoming GeForce GTX 480 and GeForce GTX 470 graphics cards. Those products should go up against AMD’s current Radeon HD 5000-series graphics products—presumably higher-end ones, since evidence suggests the GF100 will be a top-of-the-line chip. Next quarter, Nvidia also plans to take the GF100’s Fermi architecture into Tesla products geared for GPU computing tasks.

In a somewhat unfortunate twist, however, Nvidia has chosen to hype its attendance at PAX East using the exact same tagline that foretold the arrival of the NV30 GPU in late 2002:

Top: the PAX East teaser image. Bottom: the NV30 teaser image. Source: Nvidia.

The NV30 and the infamous, Dustbuster-toting GeForce FX 5800 Ultra that carried it both turned out to be disappointments. Certain media pundits have compared the GF100 to the NV30 due to the delays surrounding both products and, according to some particularly alarmist reports, the GF100’s lackluster performance and excessive power consumption. Let’s hope the GF100 winds up having a better launch.

Comments closed
    • Voldenuit
    • 10 years ago

    No news post on AMD’s paper launch of the 5830?

    A card that costs more than a 4890, but performs like a 4870. It has only *[

    • anotherengineer
    • 10 years ago

    YES POST 200

    THAT

    TWO HUNDRED

    kthx bedtime 🙂

    and another hijack
    http://techpowerup.com/downloads/1763/ATI_Catalyst_10.3_Beta_Preview_for_Vista7_32-bit64-bit.html

    • d0g_p00p
    • 10 years ago

    BTW people: the 3X series is OEM-only (rebadged 8X series), and that is why the new series is 4X.

    • axeman
    • 10 years ago

    Yes, I am ready to be underwhelmed.

    • green
    • 10 years ago

    i hope both nvidia and amd/ati fail
    so there’ll be less of these damn posts

    i liked it better when it was quiet(er) around here…

    having skimmed through a bunch of this it’s amazing how much there is with practically nothing of value having been said

    • Pax-UX
    • 10 years ago

    OMG I won’t be able to sleep with the excitement!!!!

    • kravo
    • 10 years ago

    I wonder what that image is supposed to represent… or foreshadow. Are the apparently medieval barbarian fighters meant to be the NV fan-masses, trampling each other down while trying to get their hands on the holy grail of GPU computing (ok, that's sort of a catachresis)?
    Or does the guy with the cloak depict J.H.H., sitting fearlessly in the middle of the bestial, angry mob of users who, insanely mad about the repeated stonewalling, are attacking the gates of PAX East?
    Or are they ATI-fanboys….?

    You can say I'm trolling here, but I never really got the whole point of pretty imagery in a seemingly useless place like this, or on graphics card, mobo, etc. packaging… it just doesn't matter.

    The former image, with the Japanese gate and the Eiffel Tower, didn't make any sense, but it was better that way.

      • Voldenuit
      • 10 years ago

      JHH is Darth Sidious?

        • JustAnEngineer
        • 10 years ago

        {waves hand}
        “These are not the GPUs you’re looking for.”

        • kravo
        • 10 years ago

        rofl! It was a good laugh, thank you!

      • tejas84
      • 10 years ago

      JHH is Dear Leader! aka Kim Jong il

      • MrBojangles
      • 10 years ago

      I think he's meant to be a semi-insulting representation of your average gaming nerd: someone so into his character that he sits at his desk in costume while playing his latest fantasy game, but who now somehow truly holds the power to be in the game and overcome his enemies with ease, since he bought an Nvidia card. Kinda like how all D&D nerds are characterized as walking around in costume, only talking as their in-game characters. Even though I was a D&D nerd in high school, I never once saw a fabled "costumed player." On a side note, though, is the fantasy stereotype even relevant anymore, seeing how medieval fantasy games haven't commanded the lion's share of releases for a long time now?

    • Fighterpilot
    • 10 years ago

    It's all going to come down to price and availability.
    If Nvidia can debut a $300 Fermi card that's pretty even with or better than an HD5850, then they will sell heaps of them.
    A $400-odd version that is a bit quicker than the HD5870 will sell pretty well too.
    They still won't beat Hemlock 5970… it owns the open class like a Bugatti Veyron 🙂

    • internetsandman
    • 10 years ago

    I’m probably more interested in that green LED HAF 922 than I am in Fermi

    • AMDisDEAD
    • 10 years ago

    Exciting.
    The world needs a new CPU company to push Intel forward.

    • ub3r
    • 10 years ago

    PRIME1 is a fagot NVIDIA -[

      • SomeOtherGeek
      • 10 years ago

      Neither, maybe a midlife crisis, but that is about it. S/he will calm down after March 26th.

      • ssidbroadcast
      • 10 years ago

      dude, you just used a bigoted word. Not cool to use here, even on fanboys.

        • MadManOriginal
        • 10 years ago

        Yeah, using the word fagot to describe a fanboi is f-ing retarded.

      • reactorfuel
      • 10 years ago

      Not cool, man. Gays and transsexuals don’t deserve to be lumped in with /[

      • imtheunknown176
      • 10 years ago

      Comments need a report button.

        • ub3r
        • 10 years ago

        Comments also need a wanker button.

      • Rakhmaninov3
      • 10 years ago

      Nukeworthy.

        • tejas84
        • 10 years ago

        PRIME1 is a fanboi yes.

        The fagot thing is a bit unwarranted but I see what you mean!

      • steelcity_ballin
      • 10 years ago

      This coming from a front page troll with a 3 for an ‘e’ in his name. You’re so cool, bro!

        • axeman
        • 10 years ago

        You know, if everyone didn’t respond his post wouldn’t stay at the top…. uh… whoops.

    • jdaven
    • 10 years ago

    Two of my posts were deleted. One about what PRIME1 thinks about Fermi performance, and one about the funny use of the words "hot" and "limited" in the PR release. First time I've seen that. Oh well.

      • MadManOriginal
      • 10 years ago

      Posts get deleted if you don’t pay the Apple Tax!! ooooh (lol, sorry)

    • geekl33tgamer
    • 10 years ago

    Gosh, we're knee-deep in PRIME1-grade BS again. Finding quality posts this past week has been difficult amid your spewed crap/spam.

    You really are a ****. Do you read anything you type before posting it, or do you just string together random crap from Wikipedia articles to make yourself sound -[

      • flip-mode
      • 10 years ago

      Well, one way to tamp down demand to keep it more in line with low volumes of product is to price it in the exosphere… course, if the rumored heat of the chip holds true, it may be stuck in the thermosphere.

        • geekl33tgamer
        • 10 years ago

        Hmm, true. I know over here, when the 8800 GTX came out, it was almost £650 + tax. Nvidia in the past was not afraid of high prices to limit supply, but for me, even a price anywhere near that these days (when AMD's competing product is half the price) would be suicide.

        Forget exosphere pricing – it would be more like surface-of-Mars pricing. Just like the red planet, it's unobtainable, and so will their cards be to the average PC owner.

        The heat thing might be an issue until a die shrink is carried out – I won't bother trying to prove otherwise; it just doesn't work out. We know what fab process it's built on, and we know the rough size of the chip. It won't take a genius to figure out that a hot-clocked chip as big as this will be a PCIe x16 toaster. At the high end these days, most cards under load have dust-buster noise levels and no one complains like they used to. Maybe that will be OK, and the chip can properly stand the heat…

        This chip might be better not at the top end, but in the mid-range, like the 5770/5850 area? The previous-gen 8800GT was a pretty peppy little card – the first ones ran hot and made quite a noise, but no one was bothered too much because it performed well and cost a lot less than the 15-25% faster 9800GTX+ at the time.

        I'll still keep an open mind and see how this new hotness (literally) benchmarks before going to either ATI or Nvidia.

          • SomeOtherGeek
          • 10 years ago

          Taking your info into consideration, I would say that the 8800, at the time it came out, was exactly what we can expect from this "Fermi".

          The 8800 was hot and expensive as hell, but it was fast, and people bought them up like hotcakes on that fact alone. Of course, ATi didn't have anything even close to competing with the 8800.

          Now it is a different story. ATi has some really fast cards, so it is going to be hard for nVidia to blow them out of the water, but then we could be surprised.

          It all has been nothing but speculation, so until someone, anyone, can get their hands on one and do some testing, it will continue to be speculation. So, anyone’s thoughts are wrong until proven otherwise.

            • geekl33tgamer
            • 10 years ago

            Completely agree – hence the fact I'm still in the middle ground. A few retailers already have cards on their sites at $700 for pre-order. Straight away, I can hear ATI laughing if they do go on sale at those prices.

            Just like the 8800 GTX, I expect this to be hot, faster than the competition, and also really expensive. Like you said, though, ATI/AMD have a good card out this time round at half the price.

            Nvidia got away with £600+ before, as no one could challenge it. It would be suicide for them to do the same again if the card offers a 0-10% boost over a 5870 but carries a 40-50% price premium.

            I don't want to sound fanboy-ish (I've had ATI and Nvidia cards in the past – I buy what's best for the

            • SomeOtherGeek
            • 10 years ago

            Yep, I’m with you there. Not being fanboy-ish either, but I’m rooting for nVidia only to keep the competition healthy.

            • MadManOriginal
            • 10 years ago

            The other big difference, looking purely at the performance and no external factors like ‘the economy,’ is that when the 8800GTX launched it was vastly more powerful than anything at the time AND it was i[

            • SomeOtherGeek
            • 10 years ago

            Totally, right on the spot. I just don't see anything more powerful than the HD5k that would be useful, like you said. I have all the older cards – 4870, GTX260 and 8800 – and they are all plenty for everything that I throw at them. Oh, there are hiccups here or there, but nothing that can't be overcome by lowering the eye candy.

            So, I must appreciate your comment about the usefulness of such a card, maybe down the road.

            • Applecrusher
            • 10 years ago

            Have to agree also. I have a 4870 512MB in my own rig, and at work I get to play with all the new cards that come out (PC store FTW), and I have little to no reason to justify an upgrade.

            Heck, I can barely convince our customers to go above a 5770 unless I've been able to get them to try Eyefinity. And even then, it's only the customers who like to try new things and have huge money to burn. So what's the point of these? If they're Nvidia-loving, I'll just get them a 285 and be done with it, because, honestly, it will play anything they throw at it without trying.

          • mcnabney
          • 10 years ago

          Die shrink is a long way off. Think 2011.

          Also, Nvidia hasn't even taped out lesser versions of Fermi. Everything new they will be selling until late summer at the earliest is going to be a Fermi chip in various degrees of crippling. Those chips cost a ton to make, and I am hearing horrible stories about yields. ATI has numerous chips in sizes ranging from big and hot to small and cool. More interesting is that the next generation from ATI, Many Islands, will be out later this year – hopefully on 32nm, but there is a chance of 28nm in a GlobalFoundries fab.

    • MadManOriginal
    • 10 years ago

    Hey, I don't know if it's been said yet, but at least NV is being consistent here – rebranding a whole ad campaign this time!

    • WaltC
    • 10 years ago

    “Are you ready?” smacks of desperation. If nVidia were ready, it would be launching the product right now. The real question is, “Is nVidia ready?” I don’t think so.

    nV30 went through a similar and ridiculous “count down” to release that masked the fact that behind the scenes nVidia was working furiously to squash all the nV30 bugs before showtime–a feat which proved impossible. This “Are you ready?” teaser campaign is unfortunately yet another stalling tactic. Come on, nVidia–if you’ve got something, then release it, for crying out loud! Sheesh!

    Next up: the “It’s coming soon!” campaign, and then we’ll see the “It’s almost here!” campaign, and then the “It’s right around the corner” campaign, and then the “Hold on to your hats – it’s almost here!” campaign, and so on and so on, ad infinitum. And by the time it actually does get here, at long last, it probably won’t matter a whole lot…;)

    • slaimus
    • 10 years ago

    It seems like they were unable to make the original CeBIT launch date that most expected.

    • clhensle
    • 10 years ago

    Can't wait to see some real benchmarks on the final silicon. It's going to be a win/win if it either blows away the 5xxx or at least makes ATI drop their prices enough to entice me to buy my first discrete ATI card in a few years.

    I have never been a "fanboy" of any CPU or GPU. I got either what was given to me or a great deal at the time, or, if I actually had to pay (retail) for it, whatever was the best performance/dollar or what I could afford at the time. I got stuck with some P4s simply because I picked them up for free/cheap, normally full of malware: formatted, fixed, sold.
    I've owned (desktop):
    -Whatever my 486 IBM PS/2 had on the board (Doom II's last level struggled).
    PII 400 (built for myself/parents; I still have it as a Win98 retro game machine)
    -Diamond Stealth II G460 (intel i740) + Voodoo II (free)

    P4 1.5 (Willamette, very ugh)
    -GF2
    -GF3 TI (free)
    -GF5200 (we all make mistakes, I needed a card for really cheap that week to sell the computer, then the deal fell through, card still sits in a box somewhere).
    -ATI 9600xt (birthday present from my g/f)

    AMD T-bird 1.0 (free sony vaio pc)
    -GF4 mx440

    P4 3.4 prescott (Was great for HL2, then sold it for my x2 build)
    -GF 6600gt

    Athlonxp 2800 @ 2.25 (is now my home server)
    -ATI9800pro (Sold, needed cash for my X2 build)
    -GF7600gt (was the best value AGP at the time, friend was upgrading to PCIe and picked it up for cheap)

    Athlon64 X2 3800 @ 2.4, then x2 4800 @ 2.75
    -GF 7800gt

    Xeon 3350 (q9450) @ 3.4-3.52
    -GF 9600gso (EVGA B stock, after coupon paid $12)
    -GF 9800gx2 (roommate who had quad sli, sold it to me for $100 last year, sold my GSO for $40, lol). Will be replaced by a HD5850 or GTX470 this summer.

    Phenom II 3.2 (my game server)
    -Onboard ATI

    I guess my point is: what's with all the "ohhh suck it NV, Intel, AMD, ATI" comments on this site? I love this site for its news, but some people are too defensive/obsessive on here. Who cares what the latest rumors are? Let's see what happens and make an objective decision when/if real numbers/data ever come out, not argue about which company pwns the most… facepalm.

      • tejas84
      • 10 years ago

      geez we don’t want your life history kthanxbai!

    • duffy
    • 10 years ago

    ‘Nuff said? Are Sgt. Fury and the Howling Commandos going to be at the launch?

    • flip-mode
    • 10 years ago

    Some PRIME1 fuzzy math for y’all (he studied with Dubya)

    Since the launch of the 5850, AMD's market value has increased more than twice as fast as Nvidia's (28.3% vs. 13.8%)

    PRIME1 would tell us that, obviously, this means Nvidia is on the precipice of bankruptcy.
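    For what it's worth, the "more than twice as fast" arithmetic does hold on the quoted figures. A quick sanity check (a sketch; the 13.8%/28.3% numbers are the comment's own, not independently verified):

```python
# Growth-rate comparison using the figures quoted in the comment above
# (13.8% for Nvidia, 28.3% for AMD -- the comment's numbers, not verified here).
nvidia_growth = 0.138
amd_growth = 0.283

ratio = amd_growth / nvidia_growth
print(round(ratio, 2))  # about 2.05, i.e. "more than twice as fast"
```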

      • PRIME1
      • 10 years ago

      10% of nothing is still nothing. Why not list that NVIDIA had nearly 3 times the revenue of ATI? Or that the 5zzz series actually lost ATI market share.

      If you are going to call me out at least have a leg to stand on.

        • SubSeven
        • 10 years ago

        The 5zzz series lost market share for ATI? What freaking planet do you live on? All those years of getting cheap highs from sticking your head in the toilet after defecation are showing their ugly head en masse at last… buddy, you need to seriously re-evaluate your reality, because from where I'm standing, your reality doesn't even classify as extreme science fiction. Stop thinking about which part of your body should get its next Nvidia tattoo and start thinking about getting some serious psychological help on the subject matter.

          • PRIME1
          • 10 years ago

          ATI fans type with such prose.

          http://investorvillage.com/smbd.asp?mb=476&mn=167141&pt=msg&mid=8483074

          Yet they still have trouble with simple things like facts.

            • SubSeven
            • 10 years ago

            I'm not a fan of either… I go for the best bang, period. I don't devote my life to nor dream of either, unlike certain people. I can read facts quite well, thank you. Also, unlike certain people, I can read them, interpret them, and try to understand what the figures are saying, rather than make absurd derivations to support intrinsic biases. I assumed that the data was factual and legitimate. With that assumption, I'll ask: where in that data do you see, or can you infer in any way, that the 5zzz series lost market share for ATI? Secondly, assuming that it was the 5zzz series, why do you attribute the loss of market share to ATI and not to shortage problems resulting from yield issues at TSMC? The data is based on units… revenue-based data would tell yet another story. So I say again: you see facts the way you want them to be seen (to suit your stupid biases and irrationality). With that said, I'll reiterate my comment from my previous post: GO SEEK SERIOUS PSYCHOLOGICAL HELP, YOU ARE DELUSIONAL.

            • MrBojangles
            • 10 years ago

            I keep asking myself the same thing. I just can't comprehend how he can keep claiming that the 5xxx series launch was a fail, when the cards pretty much stayed sold out of their limited supply through the entire launch. God knows how well they would have done if there had been no shortage, allowing a constant supply of in-stock cards at their launch prices. I mean, you would have to be delusional to claim that a sold-out launch was a failed launch.

            • willyolio
            • 10 years ago

            It's all about misinterpretation. It's how flat-earthers can travel by plane and still vehemently deny that the earth is spherical.

            The 5xxx series sold out, people can't get the cards, and retailers are raising prices because demand is staying high. PRIME1 decides to reinterpret this as AMD failing to get any yields, people NOT buying AMD cards due to these failures (thus losing market share), and AMD being the one raising prices because it was losing money and thus needed to increase margins to remain profitable.

            • flip-mode
            • 10 years ago

            That's just the first turn on the trip he's about to take us on. At the 5870 launch and in the subsequent weeks, he was lobbying at every opportunity to have the launch labeled a paper launch, so it'll be interesting to see what happens with this March 26 "launch". If it's not a hard launch with plenty of retail availability and cards staying in stock, then by his own measure it's a paper launch. Who here wants to wager that PRIME1 will ever call it such if it goes that way… I'm all in favor of a return to competition and of Nvidia getting some competitive cards out there, but I'd also love to see PRIME1 squirm away from acknowledging a paper launch…

            It doesn't matter much… the 4850 is setting a record for the longest I've ever owned a card, and at 16×12, with me not caring much about anti-aliasing, I suspect it will last me a good bit longer… I imagine by the time I'm ready to replace it, Nvidia will have some cards worth buying.

            • MadManOriginal
            • 10 years ago

            Hey, just FYI, TROLL1 calls it the ‘5zzz’ to imply that it’s boring, like ‘zzz=sleep.’ Non-trolls do what nerds have always done by substituting ‘x’ and call it ‘5xxx.’ So don’t call it 5zzz just because he does 🙂

            • SubSeven
            • 10 years ago

            Hah, MadMan, I actually didn't catch that analogy. The use of 5zzz to imply that it was a boring launch just flew way over my head. His humor is much too clever for me :). I just went along with his format to make sure we were on the same page, because you give this guy oranges and he screams pineapples and papayas. At any rate, the last few days have been insane. Any article related to Nvidia resulted in a 150+ bashathon where PRIME1 is the clown and the rest of TR is running around with bats trying to hit him on the head. Can't say he didn't ask for it, but still. I'm not proud to say that I too took the bait and stooped to that level and joined the bashing mob. I give up; there's no point arguing with this guy (heck, I'm not even sure he is a guy – probably some 15-year-old whose dad works for Nvidia or something). If he wants to claim Nvidia is the greatest… who am I to refute that?

            • NeelyCam
            • 10 years ago

            Beautiful. Trap was set, bait was in… and SCORE!

            Well done.

        • flip-mode
        • 10 years ago

        Facts are facts. What I said is factually correct.

    • Madman
    • 10 years ago

    I don't get why everyone is so against Nvidia… Fermi seems to be a very powerful general-purpose architecture that is capable not only of graphics processing, but also of serving as an add-in compute board.

    The 5800U was a failure from a gamer's standpoint, but it was a good step towards later architectures. The ATI 9700 of that time had a ton of stupid limitations that the 5800 didn't have. And most of the technologies invented in the 5800's day were used as a foundation for the technologies available in today's cards.

    Even if Nvidia fails to deliver a killer gaming card, I still applaud them for trying to innovate, like they've always done before.

    I think the only gamers who want Nvidia to fail are the ones who have no clue.

      • swaaye
      • 10 years ago

      The GFFX 5800 had too much legacy hardware and required an incredibly good shader compiler, but it also simply didn't have enough ALU hardware to actually use its SM2.x features to any reasonable degree. That was the problem.

      Radeon 9700 was shockingly more efficient. The only disappointment was its FP24 precision, but I’m not sure if it was ever possible to see the missing precision in a game.

      http://alt.3dcenter.org/artikel/cinefx/index_e.php

      It is wrong to compare Fermi to NV30, though. NV30 really was messed up. Fermi looks like it's just too big for manufacturing, but it's likely a beast otherwise.

        • Madman
        • 10 years ago

        Still, it was the first generation of cards for which you could just write high-level shader code without worrying about texture fetch counts and other limitations on that front. R9700 F-buffer, anyone? *rollseyes*

        The pure math performance sucked, and that's why Cg was down-compiling to 16-bit half-precision FP wherever it could. But I remember how easy it was to develop shaders on that thing when you were not going for 60+ fps, throughout the 5200 – 5800 range.

        Using precalculated lighting lookup textures was a bit annoying if you wanted performance, and the fact that you could do math all the time on the 9700 was cool. But when I remember the 5800, I still remember the advances it brought to 3D programming.

        Even if Fermi fails right now, we can expect the next generation to be way faster, and all those goodies that you can start experimenting with now, like HW tessellation, will be here to stay – at least I hope.

        And Cg is now a backbone for HLSL and GLSL; it came here with the 5800U.

        • Sahrin
        • 10 years ago

        Well, you can see a lot of the same bets being made. nVidia is wagering a lot on geometry power and on developers' willingness to support a radically changed execution model (basically eliminating fixed-function – something that, while progressive, has also been met with a lot of backlash in the past – not to mention the performance hit from emulating the same for all existing stock). They are bold moves that seem more at home in HPC than in GPUs (look at Itanium, SPARC, POWER – those architectures continually make the same kinds of bets, but they are also niche).

        For an unreleased product for which the only source of information appears to be Charlie – I think it's way too early to call the pot "NV30". That said, there are a lot of parallels even now (die size, jumping processes, the overall lateness of the GPU) that give credence to the Prophets of Doom.

        I guess my thing is, there's not a truly integral feature set that nVidia has introduced to gaming. ATI introduced (forgive me if this list is totally bogus, I'm going off of memory here) Video acceleration, DX9, HDR, Eyefinity (jury's still out on that one, I admit), Unified Shaders, and compute shaders (via CAL and CTM and F@H). nVidia's contributions have largely been in the form of features that were ultimately abandoned and/or ignored by the gaming community (PhysX, CUDA, 3D Stereo, SLI).

        ATI's got some "me too's", but, as a gamer, when I look at the company that has done the most for the gamer in the last 7 years, I don't think of nVidia. I think nVidia's done a lot to help their business; ATI's done a lot to help their customers. Not everything ATI's done is golden (not by a *long* shot), but ATI appears to be trying to take care of the GPU customer – which, right now, is the gamer. nVidia seems to be doing everything it can to sell GPUs to more customers.

          • PRIME1
          • 10 years ago

          "Video acceleration, DX9, HDR, Eyefinity (jury's still out on that one, I admit), Unified Shaders, compute shaders (via CAL and CTM and F@H)"

          Funny how many of those actually were by NVIDIA: HDR, Purevideo, unified architecture. Not to mention DX10. The Eyefinity thing was supported on NVIDIA's Quadro line, and by Matrox, long before ATI followed. Really, ATI rarely if ever brings anything to the table. Other than a misinformed fanbase.

            • Sahrin
            • 10 years ago

            Insert name of popular DX10 game here:

            Video acceleration goes back to the Rage 128 days – nVidia had no competitors. HDR was first supported by ATI on the X1900XT, either in Oblivion or Lost Coast; ATI and nVidia were fighting back and forth over who would ultimately release it first, but nVidia's drivers held them back (I think it was literally weeks, so a pretty close margin).

            • PRIME1
            • 10 years ago

            Boy you are so full of wrong.

            Purevideo was true hardware-based acceleration and came out a generation before ATI's (whatever marketing name they called it).

            HDR came out on the 6xxx series while ATI was still struggling with the epic failure that was the X1800.

            As for your DX10 line…. DX11 is doing worse so far.

            Keep spinning.

            • swaaye
            • 10 years ago

            ATI was very aggressive with their video support. Some of the ancient Rage II chips do MPEG2 acceleration. The Rage Pro, in its crazy-popular Mobility variant, can get you full DXVA support for DVD with 9x, 2k and XP. I'm pretty sure that they were ahead of everyone else on that front for a while.

            Purevideo was a partial NVIDIA screw-up. Remember how it was broken in the 6800? 😀 But they had DXVA for DVD back to the RIVA TNT, I think…

            Real non-bloom HDR was made possible with Shader Model 3, so yup, NV was first on that, because R4x0 was SM2.0b. The Far Cry patches, with their horrible beta-level HDR implementation for the then-new NV 6xxx, come to mind.

            • Voldenuit
            • 10 years ago

            Don’t forget, in the old days, nv made you *pay* for Purevideo. Dicks.

            • YeuEmMaiMai
            • 10 years ago

            Lay off the crack pipe, man. ATI has been doing video acceleration since at least the Rage 128 days. Nvidia didn't innovate anything in that area, as S3 was one of the first (if not the first) to have hardware-assisted video decode for DVD (MPEG-2)… in the ANCIENT Savage 3D…

          • coldpower27
          • 10 years ago

          Yes, the 9700 introduced an HDR format, but since only ATI hardware used it, not many companies developed for it. ATi tends to be really weak at getting developers to use their specific features.

          It wasn't till the 6800s that nVidia released an HDR format that was adopted by the industry, as it was required in SM3.0 vs. optional in SM2.x.

      • StashTheVampede
      • 10 years ago

      It's not that we're against Nvidia or their new chip – I very much want them to ship their next-gen chip. It's more that I strongly suspect the product won't perform as expected, will be high-cost, and will be a paper launch.

      I'm a skeptic by nature, and there isn't (yet) a ton of proof showing this card will be in OEMs' hands for Q1 2010.

        • PRIME1
        • 10 years ago

        l[

          • swaaye
          • 10 years ago

          The arrival of GF100 performance numbers, price and availability will be the first time that anyone can judge 5xxx.

          Well, aside from grey screen and megacursor, anyway. lol.

          • [+Duracell-]
          • 10 years ago

          Most stock HD 5870 cards are at or a bit below MSRP, which is much better than what it was a few weeks after launch.

      • Sahrin
      • 10 years ago

      I think you're confusing "marketing bullet points" with "innovation." It's one thing to implement a new feature – it's another entirely to say you did but deliver no substantive change to the user. CineFX is a perfect example of that (this was probably the biggest bullet point for the FX5800U). Ultimately, it didn't deliver the visuals nVidia promised – which was just as well, because performance in those features sucked. I'm sure you remember the notorious series of nVidia benchmarks which were hacked to run on ATI cards (first R300, then RV370, then R420, then R480) – and actually ran *better* on the ATI cards than on nVidia's.

      ATI, on the other hand, didn't spend a bunch of time pissing about with fancy marketing names. They built a card that had the right balance of hardware power to support DX8.1 – as well as enough power to support future DX9 games. My R300 card (AiW 9700 Pro) lasted until 2006 (fully three years) and ran every game I could throw at it like a dream.

      Performance > Features. Features > Features that aren’t actually implemented.

      R300/9700 Pro had performance first of all. Then it had actual features (DX9) that gamers cared about.

      And finally, it all came in a package that was *vastly* superior to what nVidia was selling (Dustbuster two slot v. ATI’s slim single-slot design?). nVidia tries to use clock rates and Texture Units to overcome the pixel shader deficiency – which in the end, cost them dearly because the market was moving all the faster to embrace pixel shaders.

        • PRIME1
        • 10 years ago

        Actually all ATI did was buy a company called ArtX and rebadged the GPU they were working on as the 9700. It is sad though that ATI fans have to go back that far to talk about anything impressive that ATI did.

          • swaaye
          • 10 years ago

          Huh?

          ArtX’s CEO got to be ATI’s COO, and eventually CEO. Some other ArtX guys got put up top too. Seemed like a nice merger to me. 😀

          Oh, and that was over two years prior to R300’s launch:
          http://ati.amd.com/companyinfo/ir/2000/2000.html

          • Sahrin
          • 10 years ago

          …wow, dude. Just wow.

          So really what you’re saying is, ArtX > ATI > nVidia. OK. I can agree with that.

          • flip-mode
          • 10 years ago

          I’ll let you respond to yourself:

          “There is no …”
          http://www.techreport.com/discussions.x/18515#jazz

        • Madman
        • 10 years ago

        It didn’t bring actual performance to the end user, but it was the foundation on which the later 6x00/7x00/8x00 architectures were built. When you code for any of the later cards, it’s still the good old 5800U: streamlined, but the idea is there. The technology is there: FP textures, shader compilers, unlimited-length shaders, looping and so on. It was a breeze to test and code for the 5200 and 5800U.

        Concerning benchmark cheating, both parties were cheating at the time; Nvidia just did it harder. I hope those times are not going to return.

          • Sahrin
          • 10 years ago

          “When Half-Life 2 was released a year later, Valve opted to make all GeForce FX hardware default to using the game’s DirectX 8 shader code in order to extract adequate performance from the Nvidia cards.”

          (I remember when this happened).

          I get that there were a lot of changes introduced in FX5800U; that may have made it a great dev kit. But it was a lousy graphics card.

            • Entroper
            • 10 years ago

            I think I agree with both you and Madman. My first impression of GF100 was that it was another FX — a number of good steps forward in technology, but too ahead of its time to be a good consumer product. And I’m fine with that — I don’t have to buy a graphics card in the next 6 months, anyway. This is how nVidia tends to operate; TechReport has called it the tick-tock method. Make a big leap in technology — see GeForce 256, GeForce 3, FX, GF100. Then refine the design and make a followup with excellent performance — see GeForce 2, GeForce 4X00 series, 6000 series, and I expect the followup to GF100 at the end of this year. Hence why my graphics cards always tend to start with an even number. 🙂

      • MrBojangles
      • 10 years ago

      No one’s “against” Nvidia. A lot of us are just getting tired of seeing fanbois (not going to list names, but you know who you are) post all kinds of inaccurate, nonsensical BS every time a news post, review, or guide comes up that, “GASP”, just might suggest that their favored company isn’t the top dog in all fields of competition atm.

        • Madman
        • 10 years ago

        I just hope that Fermi gets out, that Ati releases new Radeons to compete with it, prices go down and innovations come in.

          • Welch
          • 10 years ago

          I second that…. I don’t give a damn either way. I just want Fermi to release so that prices get lowered and I, the consumer, benefit. If Fermi releases and it’s not a better card for GRAPHICS (that is what I’m buying a GRAPHICS card for…), then I’ll buy an ATI card. I don’t give a damn about its ability to run other software; I’ll have a 6- or 8-core CPU for that.

          Who cares whether one company did something before the other 5 years ago… it’s all a matter of where they are NOW…. We live in the present; get the hell out of the past, people.

      • Umbragen
      • 10 years ago

      Why? Derek Perez. Yes, I know he no longer works for Nvidia, but they still have a Crave Case of karmic shit-sandwiches left on their plate; one for every time marketing won out over substance.

    • setzer
    • 10 years ago

    Sadly, what this teaser very craftily doesn’t state, I should note, is when general availability of said cards will occur. Nothing in that press release promises that anyone can actually buy one; it only says “you may even be able to buy one before anyone else,” which could mean you can pre-order now and get it delivered next quarter 😛
    well, time will tell

    • indeego
    • 10 years ago

    This is the single dumbest thing you all argue about.

    Thanks <.<

      • RickyTick
      • 10 years ago

      When I saw this Discussion Topic go up yesterday, I knew it was chum in the water. Some folks really get worked up over this.

    • crsh1976
    • 10 years ago

    “Are you ready?”

    We were ready for it last year; now we’ve got ATI cards and don’t care about your vaporware.

      • JustAnEngineer
      • 10 years ago

      Sad, but true. We were ready for NVidia’s DirectX 11 graphics cards last year.

      GF100 should have been available when Windows 7 launched.

      GF100 should have been available before Thanksgiving.

      GF100 should have been available before Christmas.

      GF100 should have been available months ago.

      Fermi might have been an astoundingly good product six months ago. Today, it’s a symbol of how far NVidia’s engineering has fallen behind their evil marketing geniuses.

    • SoulSlave
    • 10 years ago

    Stupid comment of the day (or is it?):

    Has anyone noticed that the GF in GF100 are the initials for Global Foundries?

    Makes no sense, I know, still it’s kind of ironic the same letters that might mean salvation for one company, might also mean damnation to the other…

    Only if it were 2012 already…

    Nostradamus mode: OFF

      • jdaven
      • 10 years ago

      I think GF stands for GeForce which has been around as a brand much longer than Global Foundries.

        • SoulSlave
        • 10 years ago

        Now, that does make sense.

        • flip-mode
        • 10 years ago

        No, it means Girl Friend, and I’m sure many of the Nvidians like it that way.

          • SoulSlave
          • 10 years ago

          Hehehe…

          couldn’t help but to read a bunch of the other posts it sure looks funny from the outside, hehehe…

          But seriously now, I really hope NVidia does at least ok, 5xxx need some competition, don’t get me wrong I have 4850 hard at work right now, but I also had a 9600GT a 7600 GT and a 6600 GT before that, and I loved the value on those cards (not so much on the 9600 but it was ok), as I love the value on my 4850…

          So I was hoping I could get that kind of value back you know?

      • yogibbear
      • 10 years ago

      Get F*(ked

        • SoulSlave
        • 10 years ago

        Wow, sensing some tension around here…

          • MrBojangles
          • 10 years ago

          Sad part is I don’t even understand what was said in post #76 to even warrant that. I mean, unless he is just really passionate about initial comparisons and Nostradamus.

            • RickyTick
            • 10 years ago

            I don’t think he was telling anyone to “Get F***ed”, he was just saying that GF was the initials for “Get F****ed”.

            • SoulSlave
            • 10 years ago

            Oh, I see…

            Even funnier!

        • Nitrodist
        • 10 years ago

        Yeah! Go fork!

    • anotherengineer
    • 10 years ago

    smelly cat ……………smelly cat…………..what are they feeding you?

      • tejas84
      • 10 years ago

      I like smelly cats!

    • Krogoth
    • 10 years ago

    The fanboys are shedding delicious tears.

      • BoBzeBuilder
      • 10 years ago

      Tears are salty, but tasty.

    • Fragnificent
    • 10 years ago

    I have purchased tickets to this as I live nearby, and will be reporting what I see and hear to Damage.

      • SomeOtherGeek
      • 10 years ago

      Awesome! Thanks and looking forward to the report.

    • phez
    • 10 years ago

    Not quite sure about all the nvidia hate, considering none of you have seen a benchmark review of the card?

    If the card fails hard, AMD will surely have a price drop ready. If the card is a winner, AMD will surely have a price drop ready. And then nvidia will have to stay competitively priced either way.

    So … win-win for everyone?

      • SomeOtherGeek
      • 10 years ago

      nVidia hate? No, more like nVidia silliness.

      But you do have a point. Still, I have to disagree with the nVidia fail/AMD price drop part. If nVidia fails, I think the prices will stay as they are, cuz AMD will know people will flock to them.

      So, let’s just hope that nVidia does not fail.

    • SomeOtherGeek
    • 10 years ago

    <conspiracy theory>

    4xx card branding means that 3’s must be used somewhere:

    3 slot card with a Dustbuster fan
    3 power plugs
    3x the heat
    3x the price
    3x more bull***tting to keep the fans happy

    </conspiracy theory>

    • ClickClick5
    • 10 years ago

    “Are You Ready?”…. for heat, massive power draw, shoddy performance and a massive wad of cash to leave your bank account?!

    HELL YES!!!!!!!!

    (note irony)

    • Ryu Connor
    • 10 years ago

    Interesting difference in tone between Fuad and Charlie’s writing.

    • anotherengineer
    • 10 years ago

    “Top: the PAX East teaser image. Bottom: the NV30 teaser image. Source: Nvidia.

    pic
    “are you ready”

    lmao Prime1 has been ready for over a year now along with all the other nvidia fanboys 😀

    • Mat3
    • 10 years ago

    Are You Ready? tagline being the same as the NV30… I’m thinking it’s gotta be a ruse.

    They wouldn’t really use the same tagline from their last infamous failure if they knew Fermi was going to be a lemon too….. right??

    • PRIME1
    • 10 years ago
      • MadManOriginal
      • 10 years ago

      Opinions are like you, except not everyone has one of you.

      • Jigar
      • 10 years ago

      Do you realize that one day later there was another news item saying that Trillian is just an upgraded HD 5870 with six-monitor support?

        • Game_boy
        • 10 years ago

        And those of us reading better websites knew that months ago?

        (That Trillian was the codename for the “Eyefinity Six” 5870)

    • PRIME1
    • 10 years ago

    My guess is a single 480 will beat the duct taped 5970

      • swaaye
      • 10 years ago

      You’re just playing with the TR folks, aren’t ya. 😀

        • spigzone
        • 10 years ago

        ‘playing’ implies imaginatively sly wit and irony.

        not Prime1’s strong suit.

      • anotherengineer
      • 10 years ago

      Well, for the price it should be at least equal, and since it’s a new architecture it should be at least a 1.5x increase on top of that. However, we will have to wait for some real official pricing and benchies first, I suppose.

      edit: duct tape is good stuff; isn’t that what Intel used on its Core 2 Quads??

      • ish718
      • 10 years ago


        • PRIME1
        • 10 years ago

        Fermi is a new architecture. 5zzz is not.

          • poulpy
          • 10 years ago

          PRIME1 vs (logic & the world): are you ready?

            • SomeOtherGeek
            • 10 years ago

            Got my couch and popcorn! Care to join?

          • swaaye
          • 10 years ago

          Just what is a new architecture? Nothing is new anymore, it’s all evolutionary.
          http://forum.beyond3d.com/showthread.php?p=1396025#post1396025

          • Sahrin
          • 10 years ago

          This is true – many people don’t give ATI credit for this, but they’ve doubled performance without dramatically changing the layout and design of their ALUs. That’s what makes the whole thing so impressive.

          RV970 (Northern Islands) will be all-new; so we’ll see what happens there. It’ll be interesting to see a mature (and hopefully palatable) Fermi refresh against ATI’s all-new architecture. ATI has really struggled with some “all-news” (R400, R500, R600, sort of) and had great success with others (R300, R700, R800).

            • swaaye
            • 10 years ago

            Careful with those codenames. 😀

            R400 was an interesting unreleased chip. R420 was clearly a R300 derivative.

            R500 is/was known as what is used in Xbox 360. R520 is X1800 and yeah it was fairly fresh stuff and blew up in their faces basically.

            • Sahrin
            • 10 years ago

            The R400 point is well taken; but R500/C1/Xenos (there were multiple codenames – I don’t recall if any one in particular was ever settled upon publicly) was the technological base for the R5xx series.

            • swaaye
            • 10 years ago

            Actually Xenos is somewhere in between R600 and R580. It is a unified shader design like R600, but the shader architecture is vastly different than that of R600 and later desktop hardware.

          • cegras
          • 10 years ago

          That’s not the point.

          I’m making a large and prodigious assumption that you know what the point is. But you’re probably too scared to admit it.

        • Grape Flavor
        • 10 years ago

        Actually, to be fair, under most circumstances GTX 280/285 outperform 4870 X2 in Crysis. At least according to bit-tech.

        It’s always frustrated me how crappy ATI performance is in Crysis relative to other games. Crysis is still top of the heap graphically and whether it’s a dealbreaker for you or not, it’s a shame that even their high-end cards suddenly look positively midrange when you run the Crysis benchmark. Yeah, 5970 brute forces its way through, but the rest require some significant toning down of settings.

          • Meadows
          • 10 years ago

          Sure, 1 game is all-powerfully relevant.

            • Grape Flavor
            • 10 years ago

            I never claimed it was. But it’s still the graphics king, and some people, myself included, might be looking forward to playing Crysis maxed out with a new high-end card.

            And for those people, the fact that you have to take a step backward Crysis-performance-wise, to get a DX11 card (HD5xxx series), is pretty disappointing.

            As I said, how big of a deal this is depends on the individual. But to dismiss it as totally irrelevant, when the game is still one of the top games with which you could justify high-end hardware, is wrong.

          • Voldenuit
          • 10 years ago

          xbitlabs found otherwise.

          http://www.xbitlabs.com/articles/video/display/radeon-hd5870_9.html#sect2

          In their tests, the 5870 and 4870 X2 were generally faster than the 295 and 285 at every resolution except 2560x1600, where the 295 powers through. Not only that, in their 5850 review, the 5850 edged ahead of the 285 at every resolution. And this was with the initial release drivers.

          http://www.xbitlabs.com/articles/video/display/radeon-hd5850_6.html#sect0

          I’ve been bitching about the price of the 58xx cards, but you won’t hear me complaining about performance.

      • yogibbear
      • 10 years ago

      Agree. /10char 🙂

    • darryl
    • 10 years ago

    well, whichever way the Nvidia winds blow, I’ve been patiently waiting for the new GF100 series to officially launch. I’ve owned both ATi and Nvidia cards in the past, and enjoyed them both. The thing was to buy the card that suited my desires at the time (bang-for-buck being a major deciding factor, considering what sims I was using at the time). This time may be no different. Will a GTX470 out-perform an ATi 5850 by sufficient margin and still be price-competitive enough to be worthy of consideration? Maybe if I could get my hands on some Canadian bud I’d say screw the price and go for the GTX480! 😀 (j/K Mr. FBI man)

    • rUmX
    • 10 years ago

    Here’s a piece of paper with the words “Fermi” written on it. Please… take one and wait in line.

    • SoulSlave
    • 10 years ago

    As much as I like AMD/ATI (and I really do), I really don’t want NVidia to fail, 4850/70 days were great! Let’s hope NVidia does reasonably well and ATI drops the price on 5850…

    • PRIME1
    • 10 years ago

    Charlie will probably spend all day in bed eating cookie dough and crying softly to himself.

      • Meadows
      • 10 years ago

      Why, for being right?

        • HighTech4US
        • 10 years ago

        Charlie has stated absolutely that the GTX480/470 is un-manufacturable.

        So how can nVidia have a launch of them – unless (ding Ding DING) CHARLIE IS WRONG.

          • swaaye
          • 10 years ago

          He said that they’ll make a few thousand of them and then drop it essentially until they can refresh the chip in 8 months or whatever.

          http://www.semiaccurate.com/2010/02/20/semiaccurate-gets-some-gtx480-scores/

          That would not be a profitable run or a very successful manufacturing situation. I’m waiting to see how NVIDIA plays this launch. We’ll see who gets cards first. It might be wise to be suspicious of things.

          • ub3r
          • 10 years ago

          They are launching with the cards they managed to salvage from the 2% yield rate.

          Even if they do launch a working product, prepare to wait yonks, and prostitute yourself for it.

          • mcnabney
          • 10 years ago

          Fermi is designed as a GPGPU, not as a video card.
          Although Nvidia will be happy to sell them at grossly inflated prices to suckers… ergh… fanbois… ergh… customers, they did not design Fermi to game.
          Fermi is designed to be put into high-performance computers as a GPGPU and those cards will sell for THOUSANDS of dollars each. So Nvidia might lose money on the gaming side, but make it on the compute side.

            • swaaye
            • 10 years ago

            Someone brought up a good point the other day and that’s that DX11 requires a lot of compute hardware in the GPU. So that hardware is useful for gaming.

          • Game_boy
          • 10 years ago

          Charlie has also said they paid for risk wafers in 2009 and since they have them they might as well sell them, though still for less than they paid. They won’t be manufacturing (many?) new ones.

          • StashTheVampede
          • 10 years ago

          The proof is in the pudding, right?

          Let’s see how long AFTER that demo it takes for cards to be in stock at Newegg.com. If the card is indeed profitable, Newegg will have it, and it’ll be sold at whatever prices the market can handle.

          If ONLY the high-end OEMs get this card in their shipping computers, then we know it’s a paper launch.

          • designerfx
          • 10 years ago

          considering that we’ve seen 0 of them in the public, guess what? you, my friend, are wrong.

      • can-a-tuna
      • 10 years ago

      Tears of joy perhaps?

      • spigzone
      • 10 years ago

      giggling softly to the point of tears …

      • blubje
      • 10 years ago

      Judging by the …

      • 2x4
      • 10 years ago

      charlie – thank you come again

    • SecretMaster
    • 10 years ago

    I like how they try to be all hip/cool with the last line of the paragraph.

    “Line up early as seating is limited. ‘Nuff said. ”

    :rolls:

    • jdaven
    • 10 years ago

    Semiaccurate has links to early postings of Fermi on SabrePC.com.

    http://www.semiaccurate.com/2010/02/22/first-gtx-480470-retail-listing-appears/

    Reading the “comments” from people who have “bought” the cards is hilarious. A must-read (unless you are an nvidia fanboy).

    http://www.sabrepc.com/p-184-pny-nvidia-geforce-gtx-480-2gb-gddr5-pci-express-x16-retail.aspx
    http://www.sabrepc.com/search.aspx?SearchTerm=GTX+470

    • Helmore
    • 10 years ago

    Just read the reviews of this card on this site here: http://www.sabrepc.com/p-174-xfx-geforce-gtx-480-2gb-gddr5-pci-express-x16-retail.aspx

    Simply epic :D.

      • derFunkenstein
      • 10 years ago

      it’s been pulled. Product not found.

        • Meadows
        • 10 years ago

        The product is there – the reviews get pulled.

      • Vaughn
      • 10 years ago

      LMAO, I’ve been reading this all day since I saw the link….

      those comments are the funniest I’ve seen in quite some time.

      • flip-mode
      • 10 years ago

      700 lolz

    • anotherengineer
    • 10 years ago

    March 26, I will be reading the catalyst 10-3 improvements 🙂

      • Meadows
      • 10 years ago

      On how it breaks your cursor again?

        • ew
        • 10 years ago

        You’re not ready to handle a cursor as large as the one the 10.3 drivers will give you!

          • flip-mode
          • 10 years ago

          He he… it’s a feature, not a bug!

        • anotherengineer
        • 10 years ago

        lolz

        actually, I never had the mouse cursor problem that people talked about, either on XP or W7. Maybe I’m just blessed, I guess.

    • jdaven
    • 10 years ago

    This comment is for PRIME1 in response to all of his/her comments in the story entitled:

    “Nvidia on GF100 derivatives: Our current-gen GPUs are fabulous”

    Almost all of your comments are about financial reports and past performance. The topic was whether or not Fermi will be another FX chip: late, hot, expensive, and underperforming, not to mention poorly designed for good yields, which will hurt availability and lower-cost variants. The story above is also talking about Fermi and relating it to the FX chip launch.

    Would you care to comment on Nvidia’s execution of Fermi and how you think it will perform? Ignore ATI/AMD at the moment and just concentrate your thoughts on Fermi. I’m sure many of us here are dying to know what you think about the Fermi controversy and not 2005/2006 financial reports.

    • elty
    • 10 years ago

    As if the teaser weren’t clear enough, the company goes on to say attendees will get to try its “highly-anticipated, next-generation GPU” at the company’s booth in Exhibit Hall D, and some folks “may even be able to buy one before anyone else.”

    This means they will have a handful of demo card available, with no stock on the market, but a few “selected people” (most likely nVidiot) will be able to buy one.

    Paper launch confirmed?

    • Jigar
    • 10 years ago

    Are you ready ? – I was about to buy HD5850 … World is just not fair 🙁

      • Meadows
      • 10 years ago

      Please proceed to buying it.

      • jdaven
      • 10 years ago

      Go right ahead. You wouldn’t be able to afford a Fermi chip anyways unless you have $600+ to drop on a GPU not to mention you will have to wait until June/July to buy one.

      • Gerbil Jedidiah
      • 10 years ago

      World is not fair? Have you seen how well the 5850 performs? Life is good right now my friend. 😉

      • odizzido
      • 10 years ago

      If you wait for computer parts you will be waiting forever. I remember when people were holding off buying a 58xx to see what nvidia would release many months ago….and nothing.

      • BobbinThreadbare
      • 10 years ago

      Either go ahead or wait a couple months and get a better or cheaper card.

      Pretty much the same rule as always applies to technology.

    • BoBzeBuilder
    • 10 years ago

    Are you ready?

    ARE YOU READY???

    FAIL.

    Jen-Hsun frantically dancing to keep the crowd entertained.

    • Hattig
    • 10 years ago

    “Are you ready?”

    Hint to Nvidia: the starting gun was fired five months ago, and ATI has lapped you several times (Juniper, Redwood, Cedar) since. Maybe you should stop posturing on the starting blocks?

    • ssidbroadcast
    • 10 years ago

    Okay, I can’t hold my tongue on this one. The hubris, btw, is overwhelming.

    I’m slightly confused: “GTX 480”?? Did nVidia decide that they were too far behind on marketing bull__t numbers and skipped straight to the 400 series?

      • Kulith
      • 10 years ago

      So what? They skipped GTX 214 and GTX 267; isn’t that the same thing? It’s just the way they want to market their cards. They skipped the 1xx series entirely, so it makes sense to go with even numbers. Or maybe they are going to introduce more midrange 3xx cards AFTER the high-end cards, like they have been doing for a while now…

      Either way, who cares. Also, I doubt any [PC user] is dumb enough to pick a Radeon 5xxx over a GTX 4xx just because it’s a bigger number with an extra digit….

        • can-a-tuna
        • 10 years ago

        It would be stupid if ATI skipped ahead to the “HD 7000” series, wouldn’t it?

          • Kulith
          • 10 years ago

          Yea… damn, there goes my argument.

      • designerfx
      • 10 years ago

      I thought you liked nvidia? I keed I keed. Anyway, it’s probably just so they can have a GF400 as their top end and then GT3xx products for cheaper/other products.

      Don’t get me wrong, I hate the branding as it exists currently.

        • ssidbroadcast
        • 10 years ago

        There are things to like about NVIDIA and things to dislike about NVIDIA. Their marketing falls in the latter.

      • PRIME1
      • 10 years ago

      300 series is their dx10.1 line (I think)

        • Meadows
        • 10 years ago

        GT 240 does Dx10.1. Learn your own mojo, boy.

          • PRIME1
          • 10 years ago

          What part of 300 don’t you understand?

            • spigzone
            • 10 years ago

            The second zero *sigh* … please help.

            • Meadows
            • 10 years ago

            Where the guy is kicked into the pit?

            • LiamC
            • 10 years ago

            I literally LOLed. Thanks. That hasn’t happened in a while

            • poulpy
            • 10 years ago

            The instant spray on abs?

            • Krogoth
            • 10 years ago

            TONIGHT! WE DINE ON FANBOYS!

      • crabjokeman
      • 10 years ago

      The 300-series model #’s are reserved for rebadged older products.

      • MadManOriginal
      • 10 years ago

      This launch will be known in the history books as the culmination of a long standing movement in interGPU relations with the realworld application of the ‘Huang doctrine.’ To put it simply the Huang doctrine involves preemptive renaming of a product against GPUs that present a perceived threat even if said GPUs don’t exist. In this case they have preemptively renamed the GTX 300s as GTX 400s.

    • mdk77777
    • 10 years ago


    • Game_boy
    • 10 years ago

    “If there is a second respin, then you might have a hard time hitting Q1 of 2010.” – Charlie; last May

    I think March 26th counts as a ‘hard time’.

      • jdaven
      • 10 years ago

      They made it in just under the Q1 deadline. However, with wording like

      “special, “must see” presentation”

      and

      “may even be able to buy one before anyone else”

      I’m still very much expecting this to be just another paper launch with a lot of powerpointery and demos. Those “samples” that you “may” be able to buy might be the only availability until deep into Q2.

    • Kent_dieGo
    • 10 years ago

    Forget the FX 5800, if the idle power consumption figures are correct this will be nVidia’s Voodoo 5 5500.

      • TheEmrys
      • 10 years ago

      I loved my Voodoo5 5500. Still have it somewhere. And nVidia is far too strong right now to go under, unlike 3dfx.

        • Kent_dieGo
        • 10 years ago

        I should have said Voodoo 5 6000. I liked my 5500 as well. Paid a fortune for it but that did not seem to help keep 3dfx from the dumpster.

          • JustAnEngineer
          • 10 years ago

          I had two Voodoo5-5500s. One was AGP.

            • Kent_dieGo
            • 10 years ago

            Having a “State of the art” PCI card for a dual monitor setup must have been sweet. That card should have been useful for many years.

            • Thrashdog
            • 10 years ago

            Hell, I’ve still got a Voodoo3 that gets dusted off from time to time for third-monitor duties. So what if they weren’t that good? I still got to play Red Baron 3D in Glide mode…

    • thill9
    • 10 years ago

    It’s ironic that they’ve chosen the “Penny” Arcade Expo to make the card’s debut, considering that it’ll probably cost over 60,000 cents.

      • Meadows
      • 10 years ago

      $700 to be precise. It’s a joke.

    • jdaven
    • 10 years ago

    “what’s hot”

    “seating is limited”

    Yep this is the launch of Fermi alright. Lol!
