I'm not sure which reference you're using regarding 41.5 average FPS, but when I used to run an HD 5850 at 1080p, some tweaking without losing the eyecandy would generate 50 FPS average. Needless to say, the 660 destroys my faithful HD 5850.
"Some tweaking without losing the eyecandy?" Is that like "A Big Mac without the fat?" or "Club sandwich, hold the turkey, lettuce, tomato, and bacon?" You can't "tweak" without losing the eyecandy; max settings is max settings! I will slap you.
kumori wrote: (see below)
And to think a PS3/XBox maxes out at 30fps.
Only an idiot actually uses benchmark "ultra" settings IRL. Even if I had a GTX Titan SLI setup, I'm pretty sure I'd drop down some of the postprocessing.
You just called me an idiot! (╯°□°）╯︵ ┻━┻ The first one who says idiot is the idiot! ヽ(≧Д≦)ノ
Besides, running Titan SLI you'd still drop settings? I think YOU'RE...
For everyone else, you can normally double the average frame rate just by dialing back a couple of settings like shadow filtering and AA.
Similarly, SSAA is very GPU-intensive, and when scenery is whizzing past at 60 frames a second, it's quite okay to only have 4x MSAA turned on (oh no, the horror! How will you cope?!?!!11)
I realize you're being facetious here, but I'm totally serious when I say that this is wholly your opinion and not applicable to other people. I can absolutely tell the difference between SSAA and MSAA in any given game scene regardless of what's happening.
What this translates to in actual gameplay is that the shadow of that tree starts to blur 80 game-yards away from the camera, instead of 90 game-yards, and the roof of that hut looks slightly more jagged if you printscreen and fire up Photoshop to compare side-by-side samples of SSAA and MSAA with an XOR filter ;)
Or, you know, when you get killed because you were distracted by a nasty shimmer effect on said roof. Or because shadows popping in looked like another enemy. Or because you couldn't tell if that was a guy with a sniper rifle or a chicken. Et cetera.
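Quick aside on why SSAA gets singled out as the expensive option in the first place: 4x SSAA renders the scene at four times the sample count and runs the pixel shader for every one of those samples, while 4x MSAA only multisamples coverage/depth and still shades roughly once per pixel. Here's a rough back-of-envelope sketch in Python; the fragment counts are purely illustrative, not measured from any real GPU or game:

```python
# Rough illustration: approximate pixel-shader invocations per frame at 1920x1200.
# SSAA shades every sub-sample; MSAA multisamples coverage/depth but shades ~once per pixel.
WIDTH, HEIGHT = 1920, 1200
PIXELS = WIDTH * HEIGHT

def shaded_fragments(aa_mode: str, samples: int) -> int:
    """Approximate shader invocations per frame (illustrative only)."""
    if aa_mode == "SSAA":
        return PIXELS * samples   # full shading cost for every sub-sample
    if aa_mode == "MSAA":
        return PIXELS             # shading stays ~1x; extra samples are coverage only
    return PIXELS                 # no AA

print(f"No AA:   {shaded_fragments('none', 1):,} fragments")
print(f"4x MSAA: {shaded_fragments('MSAA', 4):,} fragments")
print(f"4x SSAA: {shaded_fragments('SSAA', 4):,} fragments (~4x the shading work)")
```

That ~4x shading (and bandwidth) multiplier is why dropping SSAA for MSAA buys back so much headroom, and why the visible difference is mostly the subtle edge/shimmer stuff being argued about above.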
Obviously framerate (or smoothness, since this is TR after all) is king, and I would never suggest someone suffer an awful framerate just for better visual settings, but if you can maintain fluidity with high visual settings...
I'm even fussier; I hate artificial motion blur and I'm not happy with SSAO in most implementations, so I actually think the game looks better with them off.
That's great for you! Nobody cares! ᶘ ᵒᴥᵒᶅ
It's been a while since I loaded Metro 2033 up but I'm pretty sure I ran it at max settings on my 560Ti.
Yah, probly -- at ~30fps average maybe. I guess if you're okay with that it's fine? ( ｀ー´)
What's amusing to me is how obsessed we are over eye candy when most of my time is spent *play*ing LoL, Torchlight II, and BL2. These games have excellent gameplay. I still play and greatly enjoy PS1 games. Gameplay!
LoL and TLII are pretty good games (BL2 I'm less fond of.) Most of my time is spent playing Blacklight: Retribution and my 19GB Skyrim install. Isn't it weird how people have different tastes? (•‿•)
My reference was Anandtech Bench, which shows 41.5 average FPS for the GTX 660 in Metro 2033 at 1920x1200, VHQ with AAA and 16x AF. Average FPS. That means on the low end it probably drops to 20 fps or possibly even less. There's no way I'd play a game like that.
(◞‸◟；) I play my games on PC for a reason!
That said, console games usually run with very little variance; they hover right around 30 fps most of the time (exceptions like Dark Souls notwithstanding). That's a lot more tolerable than "30 average fps" in a PC game, which usually means it runs at 60 fps sometimes and 15 fps other times. That's awful.
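To put the "average hides the dips" point in concrete terms, here's a quick sketch; the two frame-time traces are invented numbers purely for illustration, not data from any benchmark:

```python
# Illustrative only: two made-up frame-time traces (ms per frame).
# Both report ~30 fps average, but one spends a third of its frames at slideshow speed.
steady = [33.3] * 60                    # locked near 30 fps, console-style
spiky  = [16.7] * 40 + [66.6] * 20      # ~60 fps most of the time, ~15 fps the rest

def avg_fps(frame_times_ms):
    """Benchmark-style average FPS: frames rendered / total seconds."""
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

def worst_frame_fps(frame_times_ms):
    """Instantaneous FPS of the single slowest frame."""
    return 1000.0 / max(frame_times_ms)

for name, trace in [("steady", steady), ("spiky", spiky)]:
    print(f"{name}: avg {avg_fps(trace):.1f} fps, worst frame {worst_frame_fps(trace):.1f} fps")
```

Same ~30 fps average either way, but the second trace is exactly the "60 fps sometimes, 15 fps other times" case, and the slow frames are what you actually feel.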
The GTX 660 Ti manages a more respectable framerate, and that's probably okay. I did include it in my original remark (which was really facetious anyway), so I apologize for that. I really wouldn't want to play Metro on a GTX 660, tho.