Rumor: Nvidia tapes out next-gen GPU

From deep in the bowels of the rumor mill, SemiAccurate’s Charlie Demerjian reports that Nvidia has taped out its next-generation graphics processor. In semiconductor industry jargon, tape-out means the blueprints have made their way to the foundry—TSMC, in this case—for the first round of manufacturing.

What does this mean for the GT300’s release schedule? Going on word from anonymous sources, Demerjian says Nvidia will have to wait roughly 10 weeks for the first prototype cards using the new silicon. From that point, "if everything goes perfectly, not so much as a hiccup," the first production parts could make their way into stores in mid-December.

As is often the case, though, Demerjian isn’t optimistic about Nvidia’s prospects. He says the GT300 will be a big GPU—530mm², or slightly bigger than the 55nm GT200—and it should be Nvidia’s first DirectX 11 design. (The article also claims the GT300 will be Nvidia’s first GDDR5 part, but as far as we can tell, that’s not accurate.) Since Nvidia has yet to deliver smaller 40nm parts in volume, the firm may spend a while debugging and fine-tuning the GT300 before it’s ready for production.

We may therefore not see a launch until next year—perhaps a few months in. By contrast, AMD showed wafers of its next-gen DX11 GPU in June, and our sources suggest a launch is coming this fall.

Comments closed
    • Silus
    • 13 years ago

    And why would they use it ? It makes no sense for them to use a third party chip. It makes more sense for them to create a similar SoC and “include” its use in multi-GPU setups.

    • RagingDragon
    • 13 years ago

    Maybe Nvidia or AMD will use the hydra chips (or something similiar) in future single card multi-GPU products?

    • Silus
    • 13 years ago

    AFAIK, no, but the scaling that was shown was better and more consistent across the board. The concept is interesting, since this SoC is doing what software (drivers) does right now. But I don’t see this ever catching on. Maybe for some companies that really need almost perfect scaling, for whatever reason, but I doubt it will catch on for regular consumers.

    • SPOOFE
    • 13 years ago

    Tell me, were there graphically unimpressive games on the PC before consoles became as huge as they currently are?

    • Silus
    • 13 years ago

    Wow, the “missing the point? there was no point” line…amazing…it’s almost as good as “you’re wrong, because I say so”!

    Great argument there!

    • MadManOriginal
    • 13 years ago

    Were there ever any actual independent reviews of Lucid hardware, or is it just a theoretical ‘it goes deeper into multithreading the GPU load, therefore it’s got to be better’?

    • SomeOtherGeek
    • 13 years ago

    Missing the point? There was a point? I definitely missed it cuz there was no point!

    Feeding the troll, yea, I have to stop talking to you.

    • Silus
    • 13 years ago

    Thanks for missing the point and do keep feeding the troll!

    • SomeOtherGeek
    • 13 years ago

    I was going to say that it was “semi-funny”, but I didn’t know people were so sensitive about people writing BS.

    So, you are saying that “Onion” news is bad journalism? Do you believe everything you read? Do you watch TV? I would say that all of the above is about 90% BS. Hence, I come to TR for sanity’s sake. TR is a good site and you have to give them credit for their hard work and stop the nitpicking; that kind of negative energy is a waste of life.

    So, I would just kinda relax and just enjoy life as it is cuz a lot of it is BS. But we are tough and can survive it.

    • TardOnPC
    • 13 years ago

    lol @ Prime1. That bike comment was terrible but so funny. Nice.

    • Pax-UX
    • 13 years ago

    Welcome to my world!

    • Silus
    • 13 years ago

    Heh, good one. Of course it doesn’t exist 🙂

    • Silus
    • 13 years ago

    I forgot to comment on this one.

    As for your guessing that NVIDIA needed to show anything at Computex: if GT300 was ready, why should they? Because AMD did? It’s been a long time since they showed any of their new chips at Computex. They didn’t show G80 and they didn’t show GT200, so why should they show GT300?

    • Silus
    • 13 years ago

    I’m sorry, but what kind of journalism do you have in your country ?

    How can you give credit to someone who made up so much about the chip failures? He went on to say that the “problem” affected ALL NVIDIA chips, not just the 8400M and 8600M chips, and I’m sure even you know that’s not true.

    Most of the people who give him credit for that are either AMD fanboys, simply anti-NVIDIA, or don’t know how to read.
    Check the dates and see when it all started. He wrote his first article AFTER NVIDIA had already confirmed the problem with the mobile chips and said they would set aside $200 million to cover warranty services and whatnot. Then the spree of lies began, with countless articles about failures (that never existed) on every other chip. IIRC he even included the GT200 cards in the mix too.

    And you give this guy credit ? Why ? Again I wonder what kind of journalism you have in your country…

    Also, where is the bit of truth in any of his “articles” concerning NVIDIA ?

    • Silus
    • 13 years ago

    Read #37 and #38 please. You are too much “fail” for me to repeat myself.

    • Silus
    • 13 years ago

    Oh really ?

    So let me try and understand how your logic works:

    “Charlie, who writes FUD about NVIDIA every chance he gets, and creates rumors and lies to do so (let’s not forget the lengths he went to with the 8400M and 8600M chip failures, going as far as saying the problem affected EVERY NVIDIA chip, which…it doesn’t), and who is an undeniable AMD fanboy”

    or

    “Digitimes, which also seems to have many rumors, but just writes the rumor”

    And you compare the two as equally reliable? Am I missing something here? Are you Charlie’s relative or something?

    Sure, both are rumors, but the obvious bias of one of the sources should be enough to put the other source above it, no?

    • Silus
    • 13 years ago

    A rumor spread by one of the biggest morons of what some seem to call “journalism” ?

    Should it even be linked to ? This was the point I made in #2, but you obviously didn’t “read” it.

    • blubje
    • 13 years ago

    “what is fake and what is real”

    You seem to have some paranoia about what is actually the truth and what isn’t. Maybe you should relax, stop seeing things as absolutes, and either enjoy rumors or ignore them.

    My complaint with this article is that it doesn’t even add useful speculation other than die size (who cares, if it idles well?).

    • HighTech4US
    • 13 years ago

    nVidia sued by insurance company; Charlie jizzes in own pants

    Oh man thanks for the link, I still have tears from laughing so hard. Thank goodness I wasn’t drinking when I got to this headline or I would still be cleaning the monitor.

    • Saribro
    • 13 years ago

    Trying to discredit someone by referring to Digitimes… fail of the month to you.

    • d0g_p00p
    • 13 years ago

    I guess you did not read the blog title. Maybe if you “read” the title, you would “see” that this is not “fact” but a “rumor” read on a “different” tech site.

    • Lans
    • 13 years ago

    First of all, I must admit I give Charlie lots of credit for the bad bump issue, but sure, theinquirer.net and “SemiAccurate” are not exactly reliable sources. My bias is that Charlie usually has some bit of truth in what he writes, and, for me at least, trying to figure out what that bit is is the fun of it.

    Why not just not click on the link? TR clearly marked it as a rumor.

    • Wintermane
    • 13 years ago

    I don’t even know who the heck he is or what the site is, and when I read it I went… OK… NEXT!

    About as useful as a Vatican-brand condom.

    • Kallstar
    • 13 years ago

    Did Prime miss the point or did you miss his? We all know what Charlie Brown does… he just shits on Nvidia. Just because you happen to stumble upon one or two things amidst his torrent of misinformation doesn’t change the uselessness of his drivel.

    The fact that someone would even attempt to sift through his garbage and dig out a nugget of accurate info doesn’t validate it in the least. Even a broken clock is right twice a day, but even in those two moments where it’s accurate, that doesn’t mean it’s working. So enjoy your semi-accurate timepiece from the bowels of Charlie… I for one need more objectivity/accuracy.

    • Maddog
    • 13 years ago

    Second your comments. I still check TR daily but it is down on the list of sites to check.

    • cegras
    • 13 years ago

    Sometimes there are gems within the dross?

    • Meadows
    • 13 years ago

    No, I meant it, and the Hydra chip from Lucid is the only thing that makes Crysis enjoyable even for picky geeks at maximum detail, because it makes load balancing tap deeper and get far more specific. It’s two leagues above conventional multi-GPU methods because you know as well as I do that those didn’t help the performance of Crysis much.

    • MadManOriginal
    • 13 years ago

    Lucid Hydra is that ‘any multi-GPU on any chipset’ bridge chip right? I don’t think you meant to be quite that specific, perhaps you should do an edit.

    • PRIME1
    • 13 years ago

    Impossible, as the argument was pointless.

    • flip-mode
    • 13 years ago

    Yep, you did miss the point.

    • Meadows
    • 13 years ago

    Because the shaders and the rendering methods were screwed up, almost like a really rushed layer of polish. The game itself needed something like a Lucid Hydra chip to push the horrible performance back to nice levels, after which however, it really stood as testament to the power of the PC.

    Too bad the average person has no access to that sort of thing, which kind of devalues it completely.

    • PRIME1
    • 13 years ago

    His point was that he likes Charlie’s BS.

    If you can’t go to his site without knowing what is fake and what is real, why bother going? Unless you are the type that likes reading the Daily Globe and actually believes it when they tell you Elvis has returned from outer space.

    • eitje
    • 13 years ago

    I believe Silus’ point on the amount of rumor & postulation, regardless of Charlie’s stance on many things, stands.

    What were we reading 5 years ago?

    http://www.techreport.com/archive.x?date=2004-7-30

    • ironoutsider
    • 13 years ago

    New name for TR, The Tech Globe News. Yeah, more factual news please. Rumors are for tabloids.

    • MadManOriginal
    • 13 years ago

    Yes, for PC gaming in the near future, GPUs don’t have much meaning. The next gen of consoles (ugh) will push us again, although some games at higher res and settings do benefit from higher-end cards atm.

    #7 has it nailed down though – faster GPUs are as much about GPGPU as they are about graphics performance. As Jen-Hsun said, it’s a ‘fight for the soul of the PC’ between a general-purpose CPU and the GPU. The more GPUs can do, the less a powerful CPU is needed (to a point – Atom + GF9300 still choked on Flash video). I’d be just as happy with a good dual-core CPU for my general and gaming tasks, if it’s sufficient, along with a powerful GPU for, say, media encoding, as I would with a quad-core CPU and a lesser GPU. It will be interesting to see how things settle out in the next few years.

    • Silus
    • 13 years ago

    Oh, and one other thing: he wrote for The Inquirer, FYI. I’m sure you know that, but you dismissed that tidbit when you used his site name as a sort of defense for his FUD, which was laughable at best.

    • no51
    • 13 years ago

    Crytek brought us Crysis, but people bitched and moaned how their current setups can’t run it full blast.

    • Silus
    • 13 years ago

    I sometimes wonder if people like you have ever heard of sensationalism.

    I’m guessing not, because it seems you give this guy credit for…what, exactly? What are his uses? It’s probably just what you said yourself…his articles are “funny”…but apart from that, what are his uses? To spread disinformation? How many things did he get right? Do enlighten us on Charlie’s spree of “good information”. Also, do try to include any other site that has sources to back his info, since he only points to his own articles as sources.

    And as you questioned me, I’ll have to question you on your bias as well. How can someone believe anything this guy says about NVIDIA, given his incredible bias against NVIDIA?

    • flip-mode
    • 13 years ago

    Couldn’t have said it better myself. Thanks for saving me the time.

    • cegras
    • 13 years ago

    You missed his point and then smacked him with a terrible, irrelevant analogy.

    • PRIME1
    • 13 years ago

    That’s a ridiculous reply. Sure, maybe we should not dismiss him because he writes bullsh** all the time.

    Instead of bringing a salt shaker to his site, don’t go.

    It’s like if you have free access to 10 bikes, but you keep choosing the one without a seat.

    • Shinare
    • 13 years ago

    One word.. “GPU2”

    Can’t wait to see the PPD for this thing.

    🙂

    • flip-mode
    • 13 years ago

    Maybe you should not dismiss him just because he has bad relations with Nvidia. Is his stuff always completely accurate and free of bias – no – but does he get a lot of stuff right and even manage to break a lot of news before others – yes. Exactly how much of a bone can you manage to pick with the guy when he calls his own site “semi-accurate”? So while I do bring a salt shaker with me when visiting his site, his articles are fun to read and they just might be full of good info. Perhaps your own Nvidia bias is blinding you from the fact that the guy has his uses?

    • Namlous
    • 13 years ago

    Soon buying a new GPU will be pointless; all the PC seems to get is console ports, which offer no new graphics innovation since they’re limited by the PS3/360 GPUs. Nvidia/AMD need to think of a way to reverse the Microsoft-induced flow of PC developers to the 360 platform, or else there will be very few reasons to purchase their top-brass offerings.

    • PRIME1
    • 13 years ago

    Charlie and bowels in the same sentence, how appropriate.

    • mongoosesRawesome
    • 13 years ago

    maybe this is the second tape out?

    • Silus
    • 13 years ago

    I’m sorry TR, but I have to say it. As a long-time TR reader, I’m losing respect for some of TR’s coverage of the “outside news and rumors”.

    How can someone like “Charlie” have his “articles” (if you can even call them that) be spread on a well known and what usually is a trustworthy tech site like TR ?

    That’s what everyone usually calls “feeding the troll”. The more his “articles” are posted, the more he will “write”. And when it comes to NVIDIA, all hell breaks loose. Even the most biased “red” reader sees the obvious bias of this moron toward NVIDIA. How does that define any sort of credibility that allows his “articles” to be posted on this front page?

    Also, digitimes had already mentioned GT300 tape out in May:

    http://digitimes.com/news/a20090520PD211/nvidia_to_increase_40nm_orders_with_tsmc_for_2009.html
