Soon, AMD will complement its Radeon HD 5000 lineup with $100-200 cards based on the Juniper graphics processor, and Nvidia will find itself up against a whole range of relatively small, 40-nm, DirectX 11 GPUs. What does that mean for mid-range and high-end 55-nm GeForces? The rumor mill has some ideas, but Nvidia says those products aren't going anywhere just now.
Let's back up a little first. Two days ago, DigiTimes reported that supply of 55-nm graphics processors had run tight. More interestingly, word was that Nvidia didn't plan to do anything about it, even though its 40-nm graphics processor based on the Fermi architecture probably won't be out until next year. Charlie over at SemiAccurate had an explanation: Nvidia was faking the shortages to clear out inventory before a massive price cut, using online publications to spread the false rumor.
Yesterday, Charlie updated his prediction, saying the shortages are actually genuine and herald a complete pullout from the mid-range and high-end desktop graphics markets by Nvidia. Quoting sources "deep in the bowels" of the company, Charlie said the GeForce GTX 260, GTX 275, and GTX 285 have all reached or will soon reach end-of-life status, with the dual-GPU GeForce GTX 295 "almost assured to follow." The GTX 285 has already bitten the dust, he asserted, while the GTX 275 will follow within a couple of weeks, and the GTX 260 will be discontinued by the end of the year.
Independently, we've also heard that the GeForce GTX 275 may be reaching the end of its run soon.
Why would Nvidia kill off these cards? Charlie believes the firm "can compete [with AMD] on performance, but not at a profit," so it could choose to focus on lower-end, cheaper-to-produce parts until its next-generation DirectX 11 GPU, based on Fermi, shows up. That does sound pretty plausible, especially considering AMD's Radeon HD 4000 lineup already forced Nvidia to cut prices across the GTX 200 series last year. Nvidia could simply wait for Fermi before re-entering the mid-range and high-end markets, or it could wait even longer for a smaller, Fermi-derived GPU.
We asked Nvidia to weigh in this morning. Is the GTX 200 series really doomed? Nvidia spokesman Ken Brown responded quickly and told us flat out there is "no truth" to the discontinuation rumor. Ostensibly, then, the company intends to keep using current-gen products to compete with AMD's DirectX 11 parts until its own DX11 GPUs arrive.
While we'd take all of the speculation with a grain of salt, we wouldn't be so quick to discount Charlie entirely—despite his history of badmouthing Nvidia. SemiAccurate got a fairly good estimate of Fermi's release schedule way back in July, and the die-size estimate wasn't too far off from our own. We'll wait and see.