Rumor: GeForce GTX 560 due on May 17

The rumor mill already foretold the arrival of the GeForce GTX 560, a lower-specced sibling of the GTX 560 Ti, earlier this month. Now, the folks at Expreview claim to know exactly when Nvidia will set this newcomer loose upon the unsuspecting masses: May 17.

Expreview refers to that day as a launch date, so assuming there’s any truth at all to the report, the card’s actual retail debut might be a little bit farther off. That’s about it for new information, unfortunately. The site goes on to reiterate the specifications it quoted a few weeks ago, leaving details like pricing undisclosed.

For what it’s worth, the GeForce GTX 560 is rumored to feature an 800MHz or higher core clock speed, 336 stream processors, and enough texture units to filter 56 texels per clock, otherwise mirroring the specifications of the quicker GeForce GTX 560 Ti. Nvidia could introduce the GTX 560 as a replacement for the venerable GeForce GTX 460—or at least for some variants of it. Keep in mind that e-tail stocks are still rife with the GTX 460 SE, GTX 460 768MB, and GTX 460 1GB, and that many versions of the 1GB model come with higher clock speeds.
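For the curious, those rumored figures imply some easy back-of-envelope throughput numbers. The sketch below assumes (as was typical for Fermi-based parts, though not stated in the report) that the shader "hot" clock runs at twice the core clock and that each stream processor retires one fused multiply-add, counted as two FLOPs, per shader cycle:

```python
# Back-of-envelope throughput for the rumored GTX 560 specs.
# Assumptions (not confirmed by the report): Fermi's shader clock
# is 2x the core clock, and each stream processor does one
# fused multiply-add (2 FLOPs) per shader cycle.

core_clock_ghz = 0.8       # rumored 800MHz base clock
texture_units = 56         # 56 texels filtered per clock
stream_processors = 336

texel_rate = core_clock_ghz * texture_units             # Gtexels/s
shader_clock_ghz = 2 * core_clock_ghz                   # Fermi "hot" clock
peak_gflops = 2 * shader_clock_ghz * stream_processors  # FMA = 2 FLOPs

print(f"Peak texture fill rate: {texel_rate:.1f} Gtexels/s")
print(f"Peak shader throughput: {peak_gflops:.0f} GFLOPS")
```

At the rumored 800MHz, that works out to roughly 44.8 Gtexels/s of filtering and a bit over a teraflop of peak shader math, under the stated assumptions.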

Comments closed
    • Triple Zero
    • 11 years ago

    Hmmm… Perchance TSMC’s yields on the GF114 are such that there are a significant number of chips with one bad cluster. It’s not like NVIDIA hasn’t done something like this before (9800 GTX vs 9800 GT) so this isn’t without precedent.

    • Flying Fox
    • 11 years ago

    You’d be surprised how many people don’t know. Take a look at how many people still think low-end GPUs with 1GiB of memory are good cards.

    • FuturePastNow
    • 11 years ago

    Gimme that old time price war, please.

    • pot
    • 11 years ago

    People who buy GPUs won’t know this?

    • Duck
    • 11 years ago

    nvidia only wins out if you are using the graphics card for CUDA etc. Otherwise it has compromised efficiency noticeably compared to the current radeon crop.

    You can see the results for yourself…

    https://techreport.com/r.x/xfx6950-vs-zotac560/power-idle.gif
    https://techreport.com/r.x/xfx6950-vs-zotac560/power-load.gif

    • UberGerbil
    • 11 years ago

    Noise may matter more in the mid-range than it does in the high range — at the high end, people are willing to put up with noise to get maximum performance, and/or are willing (and have the budget) to chase other solutions like aftermarket coolers, water cooling, etc. The mid-range is still powerful enough to require significant cooling that may vary quite a bit in noise profile from card to card — perhaps more than performance does. I know noise data would be among the first things I would compare once I’d decided on buying a graphics card in roughly this performance band.

    • PRIME1
    • 11 years ago

    Your crazy rant aside: if your Corvette sat idling for most of the day, then yeah, you might be concerned about its idle fuel consumption. But just to stick with your completely ridiculous car analogy: I bet people who buy Corvettes and Porsches don’t give two rat turds about fuel consumption. Just as people who buy gaming systems are none too concerned about some extra watts.

    However, your claim about how “power hungry” Fermi is… well, it’s full of poop. Yeah, under full load it does use a few more watts (depending on the card), but at idle it uses less. Most computers are at or near idle more often than at full load.

    Now go back to the blackboard and write that you will not spread FUD….. 1000 times.

    • derFunkenstein
    • 11 years ago

    The name is what can make the performance confusing. Missing two little letters – “Ti” – can make a huge difference in performance. It also makes (most likely) a difference in price.

    • pot
    • 11 years ago

    People complain about names way too much. I don’t care about the name, I care about the performance/power consumption.

    • dpaus
    • 11 years ago

    What an idiotic comparison! Why don’t we also compare the fuel consumption of the Corvette and the Porsche while at idle? Because they’re not *used* that way, are they?!? But, if you insist: https://techreport.com/articles.x/20537/11

    Ouch, who’s hurtin’ now? Well, bluster, bluster, let’s look at their consumption under load, then: second graph. Ouch. Again. Like I said above, Fermi is an impressive architecture, just too power-hungry.

    Now, put your knuckles on the desk beside SSK’s; that way the teacher only needs to swing the ruler down once, which lowers her power consumption while at idle. Meanwhile, I’ll be at the blackboard: I WILL NOT FEED THE TROLL I WILL NOT FEED THE TROLL I WILL NOT FEED THE TROLL I WILL NOT FEED THE TROLL I WILL NOT FEED THE TROLL… (only 95 to go….)

    • PRIME1
    • 11 years ago

    https://techreport.com/r.x/radeon-hd-6800/power-idle.gif

    Ouch, that seemed to hurt your argument a lot.

    • PRIME1
    • 11 years ago

    Was he feeding you again?

    • dpaus
    • 11 years ago

    SSK, hold your knuckles flat on the teacher’s desk, while the ruler ‘splains it to you’ about feeding trolls

    • sweatshopking
    • 11 years ago

    seems you and duck disagree. He thinks the 6850 is the one to beat. you guys should fight about it.

    • sweatshopking
    • 11 years ago

    no. there’s the 8800 GT; the GTS 320, GTS 640, and the fastest of the GTSes, the GTS 512; there was also the 8800 Ultra and the 8800 GTX on the highest end. The 8800 GS was the slowest of the 8800s.

    • Ushio01
    • 11 years ago

    There’s more than one 8800GT?

    • Farting Bob
    • 11 years ago

    Which version of the 8800GT would that be?

    • ap70
    • 11 years ago

    NVIDIA, you should relax and stop throwing out a new card every week. Take a few months and give us a “big jump”.

    • PRIME1
    • 11 years ago

    The 460 is still the best bang for the buck out there. So if this ends up replacing it (faster for the same price), we could have a new champion.

    A shame AMD has been pushing out nothing but low-end crap lately.

    • Mystic-G
    • 11 years ago

    Whoever comes up with these names should have been shot since the 8800GT.

    • dpaus
    • 11 years ago

    Of course it’s ‘competitive’, and more than adequate for any and all users out there. Just like the Corvette ZL-1 is competitive with the Porsche 911 GT3, and either are more than enough car for any driver out there. But comparisons between them are made and ‘winners’ are declared around tenths or even hundredths of second on racetracks not even open to the public. So it is, increasingly, with technology products (I guess one could argue that both of those two cars are ‘technology products’ too).

    • swaaye
    • 11 years ago

    I think the 560 Ti is a very competitive card in that midrange performance area, or whatever it’s called. It seems to be competitive with even the 6950. But the 550 is not that great, and while the 570 and 580 are pretty great, they are probably too expensive to sell well.

    I don’t think power usage comes into the equation though unless it’s really nuts, like with the 480. I’m also not sure how much noise matters. I really think that by far the most important aspect is price/performance. That can even overcome the stupid brand loyalty aspect.

    • d0g_p00p
    • 11 years ago

    Another worthless card. This is really getting out of control.

    • Meadows
    • 11 years ago

    Don’t vote him down, he’s right.

    • dpaus
    • 11 years ago

    The hell you say! Well, the devil’s in the details, but it’ll be a cold day in hades when it out-runs my trusty 9800GT

    • Neutronbeam
    • 11 years ago

    Well, the GTX 666.66 is supposed to be one damned fast card running at hellacious speeds.

    • sweatshopking
    • 11 years ago

    the trick, my friend, is to write a short, quick post to be sure you get the first post, and then come back and edit!

    • yogibbear
    • 11 years ago

    Is Nvidia still going on with this naming BS? 560 Ti & 560….. hmm….. which one’s faster? The one with titanium implants? Surely…

    • dpaus
    • 11 years ago

    The Fermi architecture is just too damn power-hungry. For all that I admire their efforts to ‘massage’ this generation, they’re going to have to massively re-engineer it if they want to have a real shot at competing in the next round of products.

    The circular irony is hilarious: Nvidia can’t compete with AMD on power management on the current generation of GPUs, but AMD can’t compete with Intel on power management on the current generation of CPUs, and Intel can’t compete with Nvidia on performance on the current generation of GPUs, and….

    • Ryhadar
    • 11 years ago

    nVidia is going to have to do better than that in my mind, but I’m very skewed. I got an XFX 6870 for $175 after MIR which was only $5 more than most reference 6850s at the time — and $15 less than the factory OC 6850 from HIS I was originally looking at. I even got a check (not one of those dumb pre-paid cards) back in less than a month’s time.

    • dpaus
    • 11 years ago

    That’ll teach you to put so many extraneous characters in product names. Now if only the vendors would learn that lesson….

    • JMccovery
    • 11 years ago

    C’mon Nvidia, I’m still waiting for the GTX 666.66 TI Ultra Extreme with 16GB GDDR10 (10x faster per clock than GDDR5).

    If you don’t, AMD’s HD7777XTPE will have my money!

    Post that is First!

    Edit: Dagnabit, foiled in my posting of firstness!

    • Duck
    • 11 years ago

    The 6850 looks like the one to beat in terms of performance for money. I don’t think nvidia is going to win this round.
