Airmantharp wrote:BF4 is a 'bridge' game, not a fully next-gen game.
What will be the first next-gen game?
Airmantharp wrote:
End User wrote:What will be the first next-gen game?
Beats me- but if it can run on a PS3 or 360, it really isn't.
auxy wrote:Did you even read my post? That's more or less what I said, or, at least, the obvious recommendation based on what I said.
JohnC wrote:"The Witcher 3 pushing next gen consoles to their limits"
http://www.game-debate.com/news/?news=8 ... its&page=4
JohnC wrote:I don't know what it will "stress" (the dynamic LOD will most likely be manually adjustable on the PC version), but even the old (by now) Witcher 2 was pretty "stressful" at max settings (using SSAA/ubersampling), even when playing on my Titan (I've seen frame rates drop to around 40 FPS).
End User wrote:Question away!
I ... I just did. Is English not your first language?
End User wrote:Silly me. I thought we were talking about cars. And as far as cars go, there is a replacement for displacement.
We were talking about cars as a metaphor for computers. In the metaphor, a lighter load would be an easier-to-run game. And as far as cars go, there really isn't. There are things you can do to help the situation, but at the end of the day, torque makes acceleration, and if you want more torque, you need a bigger engine. Why do you think they put giant V8 engines in trucks? Why do they use >5 liter engines in really fast cars? Sure, lightness helps, but when you have a ten-ton load to move, saving 600lbs on the vehicle is really of little concern.
End User wrote:I think my SLI setup would do just fine against your Titan rig. :P
Sure! It's a lot faster by a lot of metrics. You have more cores, more memory bandwidth, and a higher memory clock. Your average and maximum FPS will usually be higher. If you're interested in pointless epenis rivalries, by all means, SLI is the way to go. You cannot argue, however, that a Titan will handle a heavier load more consistently than your SLI rig. That's just the nature of the beast.
End User wrote:I do own a car with a turbo. Maybe I should sell it and buy a 2014 Z28.
I imagine it will probably out-accelerate your current car, given the same weight. That's only one metric though, of course. My own car is light and reasonably quick with a small motor, because I don't have a need to carry big loads. Aside from my girlfriend, HA! (≧σ≦)
I suggest you read your own post again: (quotes from me snipped for brevity)
End User wrote:Edit: But seriously. You went on a crazy uninformed SLI attack and you expect me not to defend the technology? Sheesh. Perhaps it is you who has insecurities about your choice.
It's not crazy or uninformed! And it wasn't an "attack"! Jesus living christ, did EITHER of you even read my post? I know it's long, but surely you aren't so ADD that you can't sit through a few minutes of reading, right? I said SLI was good!
End User wrote:Oh stop! You're killing me.
If only.
Waco wrote:I did, and yes, while you come to the same conclusion, your reasons for doing so are totally wrong (both the car and GPU explanations).
No, they aren't, and screw you for saying so! щ(ಠ益ಠщ)
Waco wrote:Anyway - I wouldn't buy anything till the new AMD cards come out and prices fall into line. Who knows? Perhaps AMD will pull another 4870 miracle. I doubt it but they have done it before.
The 9700 was a "miracle" -- a better-than-expected part came out ahead of schedule. The 4870 wasn't even especially good -- power consumption and heat were off the charts -- it's just that Nvidia stumbled with the follow-up to the amazing G80/G92, and AMD's parts (which were competitive, unlike the 29xx/38xx parts) looked better by comparison. It's just like the old Athlon, which was a good part by any estimation, but only seemed as amazing as it did because Intel couldn't scale up the Pentium III to match, and then tried NetBurst, which was an unmitigated failure not at all unlike Bulldozer. (;¬_¬)
Airmantharp wrote:I propose a round of beers- auxy's back!
You can probably guess why I left! s(・`ヘ´・;)ゞ
Airmantharp wrote:
JohnC wrote:I don't know what it will "stress" (the dynamic LOD will most likely be manually adjustable on the PC version), but even the old (by now) Witcher 2 was pretty "stressful" at max settings (using SSAA/ubersampling), even when playing on my Titan (I've seen frame rates drop to around 40 FPS).
And it's not even that good-looking of a game, really.
End User wrote:
Airmantharp wrote:Beats me- but if it can run on a PS3 or 360, it really isn't.
BF4 is definitely not a game that requires a 6GB+ card. On Ultra settings with max AA @2560x1440 I saw just under 2.8GB of GPU memory used. The game looks great at lower settings and no AA. I got my memory usage down to 1.4GB and the game still looked fantastic.
CPU usage is another story.
Aphasia wrote:But if you would ever think of stepping up to either 4K or Eyefinity gaming, then you will need more memory, because that could easily come to 6 megapixels and beyond.
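For scale, the megapixel arithmetic behind that point can be sketched like this (the resolutions are standard display modes assumed for illustration, not figures quoted from the thread):

```python
# Pixel counts for some common gaming resolutions, in megapixels.
def megapixels(width, height):
    """Return a display mode's pixel count in megapixels."""
    return width * height / 1_000_000

print(megapixels(2560, 1440))  # single 1440p panel: ~3.7 MP
print(megapixels(5760, 1080))  # triple-1080p Eyefinity: ~6.2 MP
print(megapixels(3840, 2160))  # 4K UHD: ~8.3 MP
```

A triple-1080p Eyefinity wall is already past 6 megapixels, and 4K is well beyond it, so the framebuffer and AA memory costs scale accordingly.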
auxy wrote:I make an informative post to help people and then get shouted down by people who have NO IDEA what they are talking about because they can't even be bothered to read.
auxy wrote:And doubly for not even attempting to justify yourself! Your trollish behavior is noted! ಠ益ಠ)凸
Next time, instead of being a jerk and saying "No, you're wrong! This is how it is:" and then stating the same thing, you might try saying "yes, you're right" and agreeing. I cannot believe the hubris on display in your post. I am physically furious right now! You don't even know what you're talking about! Case in point:
Waco wrote:The 9700 was a "miracle" -- a better-than-expected part came out ahead of schedule. The 4870 wasn't even especially good -- power consumption and heat were off the charts -- it's just that Nvidia stumbled with the follow-up to the amazing G80/G92, and AMD's parts (which were competitive, unlike the 29xx/38xx parts) looked better by comparison. It's just like the old Athlon, which was a good part by any estimation, but only seemed as amazing as it did because Intel couldn't scale up the Pentium III to match, and then tried NetBurst, which was an unmitigated failure not at all unlike Bulldozer. (;¬_¬)
Really, Waco, you're a jerk, though. Seriously.
End User wrote:Edit: Sorry, this is too good to pass up on:
auxy wrote:You can probably guess why I left! s(・`ヘ´・;)ゞ
I know why you left:
Obviously, because I just told you... clearly, it was a joke...
End User wrote:So your personal time is too valuable to waste on TR but at work you can waste all the time you want posting. Nice!
I don't really get your point here; I have a lot less personal time, so obviously it's a lot more valuable to me. I don't see the problem with posting at work during downtime when there's nothing else to do. \(--)/
Waco wrote:Complain all you want. I'm not going to point out all the holes in your argument when it's plainly obvious you're wrong if you've ever read a single one of the frame-latency oriented articles here on TR. We get it, you have a Titan, everything else sucks (in your opinion). Just don't go making stuff up to justify your purchase and expect to not get called out on it.
I didn't buy my Titan, so I have no need to justify it. ヘ(´o`)ヘ I could never afford such a thing and I'm eternally grateful to my benefactor to have it, but even if I had the money, I absolutely would never have purchased it no matter how much I wanted it -- I can't afford that kind of price/performance disparity. Well, I might have bought it, but that's because I have poor impulse control.
Waco wrote:Next time when I'm posting from my phone I'll make sure to note it so you don't go all crazy on us again. That, and the personal insults are a nice touch. You should keep that up!
Yeah, I make personal insults, but anyone would when they're as furious as I was. I didn't "go crazy" -- come to Freenode ##hardware and ask them about me 'going crazy'. Yeah, I have a temper problem, but you didn't even approach the limits of that. Don't flatter yourself.
Waco wrote:You make claims that are clearly contradictory to the findings on this site (and many others) and get upset when people just plainly state "you're wrong".
What claims? What I said is clearly upheld by benchmarks on THIS SITE and others! It's goddamn common knowledge, what I said!
Waco wrote:Also, if you want to make a case for big powerful GPUs being better than SLI, linking to an article where the "weaker" cards are running out of video memory probably isn't the best strategy. :)
That's part of the deal, you know? You buy SLI, you have to contend with the fact that you're only getting half the RAM you paid for. It's not like that's somehow unrelated. Sure, you can buy GPUs with more video memory, but usually they don't even have the memory bandwidth to take advantage of it.
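The "half the RAM you paid for" point is just memory-mirroring arithmetic; here is a minimal sketch, assuming classic alternate-frame-rendering SLI where each card keeps a full copy of the working set (the function name is mine, not from any driver API):

```python
# Under alternate-frame-rendering SLI, each GPU mirrors the full working
# set, so the usable pool equals one card's memory, not the sum.
def usable_vram_gb(per_card_gb, num_cards, sli=True):
    """Effective VRAM pool: mirrored under SLI, summed for independent GPUs."""
    return per_card_gb if sli else per_card_gb * num_cards

print(usable_vram_gb(2, 2))             # two 2GB cards in SLI -> 2
print(usable_vram_gb(2, 2, sli=False))  # same cards used separately -> 4
```

So a pair of 2GB cards in SLI hits a VRAM wall at the same point a single 2GB card does, even though you paid for 4GB of silicon.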
auxy wrote:What claims? What I said is clearly upheld by benchmarks on THIS SITE and others! It's goddamn common knowledge, what I said!
auxy wrote:That's part of the deal, you know? You buy SLI, you have to contend with the fact that you're only getting half the RAM you paid for. It's not like that's somehow unrelated. Sure, you can buy GPUs with more video memory, but usually they don't even have the memory bandwidth to take advantage of it.
auxy wrote:There's a really nice, relatively recent article at Tom's that actually bears out completely what I said: under a heavy enough load, one big GPU is more consistent than two smaller ones. The problem with comparisons like End User's is that the GTX 770 is not "slower enough" to really show this without doing something relatively silly, like playing at 4K, because games just aren't that demanding right now.
auxy wrote:Sure, you can buy GPUs with more video memory, but usually they don't even have the memory bandwidth to take advantage of it.
End User wrote:My 4GB GTX 770s have a memory bandwidth of 256.4 GB/s each. That is only 32 GB/s less than a 6GB Titan.
Again with the pointless eDick-waving. This isn't about you or your purchase, you know? Why are you comparing the memory bandwidth on your GPUs to a Titan? That's not even slightly relevant to this discussion; you're just looking for something to nitpick now. Besides, Titan is a halo product with TERRIBLE price/performance; it should really never be included in any serious comparison of value.
auxy wrote:
End User wrote:My 4GB GTX 770s have a memory bandwidth of 256.4 GB/s each. That is only 32 GB/s less than a 6GB Titan.
Again with the pointless eDick-waving. This isn't about you or your purchase, you know? Why are you comparing the memory bandwidth on your GPUs to a Titan? That's not even slightly relevant to this discussion; you're just looking for something to nitpick now.
End User wrote:I'm sticking to facts. You make statements you can't back up and then deflect valid rebuttals with childish foot-stomping antics.
The OCZ Vector was the first mass-market SSD to use a custom-designed controller based on dual ARM processors. See, I can state irrelevant facts too! You don't get to take the moral high ground because you picked a fight you can't win. You might call my "antics" childish, but the reality is that I'm the one with the reasoned, measured, and factual point, while you're standing on the sidelines going NUH-UH! SEE BECAUSE NUMBERS! when you don't even know what the argument is in the first place. It's pathetic, really.
auxy wrote:
End User wrote:I'm sticking to facts. You make statements you can't back up and then deflect valid rebuttals with childish foot-stomping antics.
The OCZ Vector was the first mass-market SSD to use a custom-designed controller based on dual ARM processors. See, I can state irrelevant facts too!
auxy wrote:Sure, you can buy GPUs with more video memory, but usually they don't even have the memory bandwidth to take advantage of it.
End User wrote:Thanks for confirming what I just wrote.
Cute, but for you to be correct, one of you would have had to have posted a valid rebuttal, which I haven't actually seen yet.
End User wrote:Let's get back to your statement.
How about let's not? You're nitpicking at a sideways comment I made that has no relevance to the larger discussion, nor to the actual subject of the thread, which was whether or not a new build should include SLI. Remember? Well, probably not. That's why I kept telling you to PM me if you wanted to discuss this further, because this sad little vendetta you have against me adds nothing to the thread.
auxy wrote:
End User wrote:Thanks for confirming what I just wrote.
Cute, but for you to be correct, one of you would have had to have posted a valid rebuttal, which I haven't actually seen yet.
End User wrote:Let's get back to your statement.
How about let's not? You're nitpicking at a sideways comment I made that has no relevance to the larger discussion, nor to the actual subject of the thread, which was whether or not a new build should include SLI. Remember? Well, probably not. That's why I kept telling you to PM me if you wanted to discuss this further, because this sad little vendetta you have against me adds nothing to the thread.
auxy wrote:Cute, but for you to be correct, one of you would have had to have posted a valid rebuttal, which I haven't actually seen yet.
End User wrote:You still haven't answered my question. How much memory bandwidth does a 2GB card need? How much memory bandwidth does a 3GB card need? How much memory bandwidth does a 4GB card need?
I know you don't know. All you do is throw out FUD.
I'm not intimidated by your bullying tactics and I'm not impressed by your consistent insistence to stand in the face of my indifference. Get out of my face with your belligerence, because your deliberate ignorance and resistance to inference are causing me to investigate disparate means by which to deliver my persistently adamant refusal to capitulate. (´Д⊂ヽ
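For what it's worth, the peak-bandwidth numbers being thrown around are just data rate times bus width; a rough sketch with assumed reference-class specs (actual cards, especially factory-overclocked ones, will differ):

```python
# Peak GDDR5 bandwidth: per-pin data rate (Gb/s) times bus width (bits),
# divided by 8 to convert bits to bytes. Specs are assumed reference figures.
def bandwidth_gbs(data_rate_gbps, bus_width_bits):
    """Peak memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

print(bandwidth_gbs(7.0, 256))  # GTX 770-class: 224.0 GB/s
print(bandwidth_gbs(6.0, 384))  # Titan-class: 288.0 GB/s
```

Note that bandwidth depends on bus width and memory clock, not on capacity, which is why "how much bandwidth does a 2GB card need" has no fixed answer.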
Waco wrote:I'm still waiting for a link to show this "common knowledge" you were posting about earlier.
Lifehacker wrote:We can't tell you what will work for you, but I almost always try to go with a powerful single card rather than two cards in SLI or Crossfire. To me, it's worth the extra $50 (or whatever it is) to have a card that works without system tweaks, without the extra noise, and without the chance of any micro stutter issues.
I didn't provide any links because it really is common knowledge. You don't do a new build with two midrange cards; you buy one high-end card and SLI it later if need be. Or, if you have the money, like End User, you buy two high-end cards right off.
Captain Ned wrote:And at this point it's time for everyone to retreat to a neutral corner and take a few deep breaths before anyone says something I might regret.
Thanks for listening.