Well, it finally happened: after what seems like forever, the GeForce 3k series has been announced, and don’t worry, we did get a good look at the leather jacket. In a relatively brief announcement (pay attention, Apple), Jen-Hsun Huang presented the 3090, 3080, and 3070 from his kitchen.
The presentation began with a host of new applications and technologies, including the ability of the GPU to access an SSD directly (which will be covered in a separate post), and some further details on Ampere, including significant changes to its three different “cores”: the programmable shader core, the RT core, and the Tensor core. Nvidia claims massive increases over the previous generation, as high as 2.7x the raw power of the 2k series, and potentially more than 2x the relative performance.
Jen-Hsun claimed that this is the largest generational performance increase in Nvidia’s history, and throughout the presentation he repeatedly told Pascal gamers it is finally “safe to upgrade.” Clearly Nvidia sees Pascal gamers, a group that includes me, as a large market that was not tempted by the 2k series.
The first new GPU he introduced was the 3080, Nvidia’s new high-end card. It runs 8704 CUDA cores at up to 1.71GHz, with 10GB of 320-bit GDDR6X, a new class of super-fast memory Nvidia says it invented in conjunction with Micron. Rocking a new reversed-fan cooling system, Nvidia claims it can dissipate an additional 90W of heat, keeping similar cards 20 degrees cooler than Turing’s system. Jen-Hsun also stated the new system is 2x quieter than Turing’s, but I am not an expert on sound, and I’m not exactly sure what 2x quieter means. I would assume it means half as loud, but it is an odd way to phrase it. Clarify for me, sound nerds, please.
Nvidia says this 320W card (which draws power from two 8-pin connectors, via an adapter to the new 12-pin plug on the Founders Edition) is 2x the speed of the 2080, is coming in at the same $699 price, and will be available Sept 17th. Nvidia considers this card the new flagship, even though it is the second-fastest card Nvidia will be selling to gamers.
The new GPU performance crown goes not to a Titan, but to the already-leaked 3090. Please understand this card is LARGE. I know what you are imagining, but it is bigger than that. Jen-Hsun seemed to almost struggle picking it up, and quietly called it numerous synonyms for huge as he did so. This 350W card is a beast by any metric: 313mm in length, 138mm in width, and it takes up a full 3 slots in your tower. If SLI is on your wish list, you’ll need a big motherboard to accommodate it. 24GB of GDDR6X comes aboard this monster riding a 384-bit bus, alongside 10496 CUDA cores at 1.70GHz. Jen-Hsun claims it is capable of 8K gaming at 60FPS if DLSS 2.0 is enabled. It also includes the new fan setup announced with the 3080.
At $1499, launching Sept 24th, this is the card I would like to have, and perhaps if I were unmarried and less responsible I might. However, I like being married.
The mainstream 3070 was also announced, and Jen-Hsun stated the *70 tier is an extremely popular level of performance. Purchasers of this card may be cautiously optimistic, as Nvidia claims it’s faster than a 2080 Ti. Not too shabby, if true. This more traditional-looking card has 5888 CUDA cores at 1.73GHz on tap, though it doesn’t get the fancy new GDDR6X RAM: on a 256-bit bus, it carries a plebeian 8GB of GDDR6. It requires 220W of power, meaning a single 8-pin connector will suffice. The 3070 will be available in October.
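Just for fun, those spec sheets can be sanity-checked with some back-of-envelope math: peak FP32 throughput is cores × 2 (one fused multiply-add per clock) × clock speed, and memory bandwidth is bus width × per-pin data rate. Note that the per-pin data rates below were not in the announcement; they are my assumptions based on typical GDDR6X/GDDR6 speeds, so treat the bandwidth numbers as estimates:

```python
# Back-of-envelope peak numbers from the announced specs.
cards = {
    #           CUDA cores, boost GHz, bus bits, Gbps/pin (assumed)
    "RTX 3090": (10496, 1.70, 384, 19.5),
    "RTX 3080": (8704,  1.71, 320, 19.0),
    "RTX 3070": (5888,  1.73, 256, 14.0),
}

for name, (cores, ghz, bus, gbps) in cards.items():
    tflops = cores * 2 * ghz / 1000   # 2 FP32 ops (one FMA) per core per clock
    bandwidth = bus / 8 * gbps        # GB/s: bytes per transfer * transfer rate
    print(f"{name}: {tflops:.1f} FP32 TFLOPS, {bandwidth:.0f} GB/s")
```

Those peak-TFLOPS figures land in the mid-30s, high-20s, and low-20s respectively, which lines up with the kind of generational jump Nvidia is claiming over Turing.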
Each of these cards comes with a host of new hardware upgrades, including support for HDMI 2.1, HDCP 2.3, and PCIe 4.0. Hopefully we can put together a full review of these new cards and see whether the proof is in the pudding. I’d love to have a review to make you all proud. Stay posted.
Update to fix spelling
Mmh looks tasty.
Now if only Intel would get their act together I could start making plans for a system upgrade.
I don’t think plopping one of these puppies into my 8.5yr old Ivy system makes much sense.
I’m on Ivy too. You would get raytracing, and at higher resolutions you are GPU limited anyway. Try it. At this point, with DDR5 only about a year away, upgrading now would mean saddling yourself with a DDR4 platform near its end of life.
Yup, I thought so as well. Though I’d probably need a new power supply.
And a new case if these are bigger than an Asus 1070 OC Strix 😀
You married the wrong wife, sweatshopking… Mine is buying the RTX 3090 for me as a birthday present 😛
As usual, the super high end is a terrible value: double the price at $1500 for probably a 15% increase in real-world performance…
But bragging rights!
10496 CUDA cores
Mother of God! Is there some technicality why all the pre-announcement rumours settled on 5248 cores as the likely number – exactly half? Or are these entirely equivalent cores to previous generations, and this number is genuinely as high as it sounds?
Either way, for $1500, I’m really hoping I get some work in soon so I can afford one for the likely-colossal increase in GPU-accelerated renderer performance over my somewhat-long-in-the-tooth 1080ti. Titans suddenly got borderline-affordable 🙂
The cores are not comparable in terms of raw count.
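For what it’s worth, here’s the likely arithmetic behind that figure, assuming the rumored GA102 configuration (the SM count was not confirmed in the announcement): Ampere doubles the FP32 lanes per SM versus Turing, but half of them share a datapath with INT32 work, so the headline “CUDA core” count doubles while per-core throughput in mixed workloads doesn’t:

```python
# Rumored/assumed GA102 config for the 3090 (not from the announcement):
sms = 82            # streaming multiprocessors
fp32_per_sm = 128   # 64 dedicated FP32 lanes + 64 shared FP32/INT32 lanes
                    # (Turing exposed only the 64 dedicated lanes per SM)

print(sms * fp32_per_sm)        # the advertised core count
print(sms * 64)                 # the Turing-style count the rumors quoted
```

That would explain why the pre-launch rumors settled on exactly half: they were counting SMs × 64 the Turing way, and Nvidia now counts both sets of lanes.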
The potential here is that these GPUs start a new frontier of gaming, with similar but more advanced hardware than what’s in the PS5 with that dedicated SSD IO.
Hopefully cross-platform games can carry over that same great tech into the PC realm in lieu of PC-exclusive developers coding games that can take advantage of it – something we may not see for several years still.
And because it is probably several years away, then as a family man myself I will just stick to the PS5 which will retail for about the price of the cheapest Nvidia card announced.
Amphere? WTF.
SPELLING IS HARD
Amphere?
If Jensen sells more Amphere, he can buy more Teshla
I was writing this with a four year old climbing on me, but point taken. Will triple check next time
Didn’t mean to poke. Spellcheck gets us all. 🙂
Aww. So adorable. By the way, it might as well be AM-fear for AMD.
I had a fascinating business meeting during which my 5 year old was bonking me over the head with a pool noodle while I was making a presentation. Thanks Rona.
Fortunately school has started.
I’ve home schooled my kids for the last few years for a number of reasons, but this year my two boys are going. Daughter might just start uni online. I can’t take care of the 4 year old and study law full time. I’ve been studying political economy for the last few years, but it isn’t as heavy as this new program and I was just doing it online when I was able. I’ve been home with my kids since 2013 and idk how it’ll go to have them gone all day. I will be a sad lonely dad who…
Best of luck! What kind of lawyer do you hope to be?
Ideally, a researcher. I don’t think our current system works that well, and I wanna look at improvements to it, particularly around international “law”.
Be ready for an uphill battle. Us humans don’t take kindly to people who tell us to change our error ridden ways, even if it’s what’s best for us.
Here’s another take:
Be ready for a vertical uphill battle. Them patent trolls don’t take kindly to people who tell them to change their error ridden ways, even if it’s what’s best for us.
Decades later, when you’ve retired as a (hopefully) rich international lawyer, will there be more legal documents written in all caps, or fewer?
Inshallah, more all caps. My wife can make money as a doctor though, I just wanna be productive and true to my values. Doing that won’t make me much money, but it’ll make me happy.
The reveal show was pretty good. After it was over, I saw lots of comments like “RIP, AMD”. Intel and AMD have their work cut out for them.
Don’t worry, we’ll cancel our discrete GPUs on-schedule and under-budget!
It is too premature to say that AMD RTG is out of the game. Intel’s GPU division will most likely get by through datacenter GPGPUs and improving their iGPUs on their future CPUs. They never had a realistic chance with a discrete gaming GPU. They simply don’t have the IP and engineering talent pool to create something that can rival AMD RTG and Nvidia at this stage of the game.
Pretty funny if Scott Wasson is gaming at 8K on a 3090 that AMD’s spies nabbed for him and his job is to find the crack that lets AMD marketing downplay nVidia’s hard work.
Let’s get real here SSK: You really meant to say: I like being alive.
Biggest surprise out of all of this is that the prices weren’t as astronomical as expected outside of the 3090. That may be related to the choice of fab since Samsung is not overrun like TSMC and is willing to cut deals on wafer prices.
Nvidia is launching a pre-emptive price war: it wants to avoid another “HD 4870/HD 4850” moment, while being confident enough that AMD RTG cannot touch the 3090.
I am surprised that Nvidia was first to market with this SSD IO technology. Now I wonder how long the technology has been conceptually around at Nvidia and what actually came about first.
It is just a gimmick that will ultimately only matter to graphical professionals.
System memory is much faster and 32GiB of UDIMM DDR4 is rather affordable and obtainable on current platforms.
Given that it’s been done in conjunction with Microsoft and implemented by windows it wouldn’t blow my mind if it was supported by amd too.
My wife isn’t the killing kind. Once killed the event would be over and she wouldn’t get to enjoy it anymore.
Immediately an image of you being played with by a CAT entered my mind. I’m not sure if I should be amused or terrified for you. Hopefully, you have quite a few tricks up your sleeve to tame her!
She’s a lovely woman, and we’ve got our 17th anniversary coming up in Nov. I jest, but in reality she’s very gentle and patient.
17 is a really cool prime number. Better plan something special!
Lol she would be very confused