Etc.

Morning, folks. I have been hard at work on a review of the GeForce GTX 690, hoping to have it posted this morning, but it’s going to take a little bit longer to complete. Although I’ve put in long hours since receiving the card on Monday (and worked through the weekend in anticipation), there’s still work to be done. I’m going to take the time to do it properly. Stay with us, and we’ll have something worth reading when the time comes.

Comments closed
    • pack66
    • 8 years ago

    Take your time. Lots of sites run the same numbers. I read TR because you guys do things differently.

      • plasticplate
      • 8 years ago

      Yup. I think the number of frames that require more than 50 ms to render is more important than the same old FPS. At the resolution I game at (1080p), all the cards from the performance segment up through the enthusiast segment are going to give you upwards of 50 fps. It is more interesting to know which card offers the most fluid, jitter-free experience in that particular price segment.
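      (For concreteness, the metric being described is trivial to compute from a FRAPS-style dump of per-frame render times; here's a minimal Python sketch, with made-up numbers for illustration only:)

```python
# Count frames that took longer than 50 ms to render, given a list of
# per-frame render times in milliseconds (e.g. captured with FRAPS).
def frames_over_threshold(frame_times_ms, threshold_ms=50.0):
    return sum(1 for t in frame_times_ms if t > threshold_ms)

# Hypothetical capture: mostly 60 Hz-paced frames plus three long stalls.
times = [16.7] * 97 + [55.0, 72.0, 51.0]
print(frames_over_threshold(times))  # -> 3
```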

    • mcnabney
    • 8 years ago

    What resolution would you have to run to actually need this kind of horsepower? Since a 560 Ti runs every game I have at max settings, I'm not seeing the need for this card outside of someone with a 4K display.

      • modulusshift
      • 8 years ago

      Multi-display madness.

    • rhysl
    • 8 years ago

    Appears the 680 in SLI beats the 690 in most games... as usual...

      • derFunkenstein
      • 8 years ago

      No surprise, since they have to turn down the clocks on the single-card solution to get the power usage to a “reasonable” level.

      • BobbinThreadbare
      • 8 years ago

      Yet probably has higher micro-latency.

    • torquer
    • 8 years ago

    Prediction: It will run Crysis

      • flip-mode
      • 8 years ago

      That got a laugh.

      • UberGerbil
      • 8 years ago

      Hopefully that’s the epitaph on the headstone for that meme.

        • CuttinHobo
        • 8 years ago

        Not until phones are able to run Crysis (in emulation) acceptably.

          • plasticplate
          • 8 years ago

          OMG. here we go again …

    • codedivine
    • 8 years ago

    Your extra analysis of frame rendering times will be extra-helpful with a dual-GPU card like the 690, so take your time.

      • dragosmp
      • 8 years ago

      Logged in to +1. I’m waiting for the frame time analysis too.

    • kamikaziechameleon
    • 8 years ago

    You guys are always worth waiting for. Please tell me it’s complete overkill!

    • gbcrush
    • 8 years ago

    Scott, like so many others in the readership, I’m happy to say we’re good with this and we’re looking forward to your review. Thank you.

    In fact, I think if TR were to make a blanket announcement: “Guys, we’re sorry, but 98% of our GPU reviews will be a few days late from now on, and you’re going to get an awesome review out of it,” most of the gerbils around here would say, “Eh, we get that; thank you for taking the time to make it right.”

    Which is exactly why I want to thank you for not saying that. Thank you for taking the time to write the “etc.” posts, to let us know you’re working hard, and to remind us that our readership is indeed appreciated. It’s just one more reason I happily spend my time on this site.

    • Farting Bob
    • 8 years ago

    If I had $1000 to spend on making games look slightly prettier, I would totally be interested. NV really needs to release smaller, mid-range cards rather than try to outdo itself for the fastest single card when supply is already so limited.

      • LovermanOwens
      • 8 years ago

      Well then, you should be happy about this card because in about 4 years you should be able to afford it. Limited supply = people will pay higher prices to get it before the next person.

    • flip-mode
    • 8 years ago

    Prediction: the GTX 690 wins, costs so much that arguing whether or not it is “worth it” is completely beside the point, and is going to be incredibly scarce, initially at least.

    Speaking of scarce, there has not yet been an “availability check” of the GTX 680. Allow me <checks Newegg … all out of stock>. The GTX 680 continues to be unobtanium vaporware. “Winning”?

      • gbcrush
      • 8 years ago

      Not trying to invalidate your argument about unobtainability, but if you (or anyone else) are actually trying to get one, I got mine from Amazon (as sold through them, not through 3rd-party stores with jacked-up prices).

      Still unobtainable, just slightly less so. Basically, I had to search the store for “GTX 680” 3 or 4 times every morning, sometime around 7:00 AM Eastern. After a couple of days of this, I actually got results for a card. The buying window was about 10 minutes, tops, before it was out of stock again.

        • Duck
        • 8 years ago

        The GTX 680 is relatively plentiful over here. Each shop may have tens of cards in stock at any given time.

          • flip-mode
          • 8 years ago

          Got links?

            • Ryu Connor
            • 8 years ago

            I just randomly went to NewEgg last week and saw MSI 680’s in stock.

            They are around, but it is definitely a situation where you have to do gbcrush’s little daily dance of checking sites.

            EVGA is taking pre-orders for one of their 690s at the moment. Tempted, but there’s really very little reason to replace my 590 at this time.

            • Duck
            • 8 years ago

            The cheapest in stock here is 712 USD, thanks to 20% sales tax. I’m sure that helps them stay in stock…

            [url<]http://www.overclockers.co.uk/productlist.php?groupid=701&catid=1914&subid=2255&sortby=priceAsc[/url<]

            These are all in stock and cost $777 to $793...

            [url<]http://www.novatech.co.uk/products/components/nvidiageforcegraphicscards/nvidiagtx680keplerseries/?o=1[/url<]

            • flip-mode
            • 8 years ago

            Heh, crazy prices, but at least it’s available somewhere in the world.

      • rrr
      • 8 years ago

      Wins vs. what? The 680/7970? Obviously. The 6990? Certainly. The 7990? I dunno. I’d wager AMD will make use of the second-mover advantage, just like nVidia had with the 680 vs. the 7970, and tune the 7990 so that it beats the 690 by just a hair.

      Updating my post: the 6990/590 are indeed soundly beaten by the 690, by 50+% margins. In fact, the 690 pretty much equals 2×680 in SLI.

    • Perkest
    • 8 years ago

    Can’t wait to see your review.

    I have been a little disappointed in all the other reviews. I have seen nothing on the improved frame rate metering or on micro-stuttering.

      • Skids
      • 8 years ago

      Check hardocp… they claim they did not notice anything like that happening. Although I too am waiting for Scott’s take.

    • ALiLPinkMonster
    • 8 years ago

    [quote<]I'm going to take the time to do it properly.[/quote<]

    Which is why I love TR. Nothing but good, honest nerds giving us fellow nerds some of the best nerdy reviews around. 😉

    • crabjokeman
    • 8 years ago

    “It’s coming”

      • Duck
      • 8 years ago

      Ha! Well played 🙂

    • indeego
    • 8 years ago

    Please consider doing it using the 98th percentile instead.

      • Mourmain
      • 8 years ago

      Or just plain histograms. None of this percentile silliness.

      • cobalt
      • 8 years ago

      Histograms are good, but they may be a little too dense. As another reader suggested in the original discussion (https://techreport.com/discussions.x/22666 — you can see one of Scott’s attempts at capturing more info), a cumulative distribution function (http://en.wikipedia.org/wiki/Cumulative_distribution_function) might be the most readable. I’d love to see an attempt at that.

      (Also, if we decide frame latencies are a more appropriate metric than frames per second, I'd rather just switch to them and be done with it. I hate it when two adjacent graphs giving nearly identical information (50th percentile, or “average,” framerate vs. 99th percentile framerate) use different metrics for no good reason, preventing a direct comparison.)

        • sweatshopking
        • 8 years ago

        Personally, i’m a fan of crayon.

        • BobbinThreadbare
        • 8 years ago

        They can’t move over completely yet; if they did, you couldn’t compare the current gen to cards from one or two generations ago. Since 5000-series cards still offer perfectly acceptable performance, that would be bad.

        • Mourmain
        • 8 years ago

        I don’t think a cumulative distribution function is very readable. It *needs* explanation. A histogram, on the other hand, is pretty intuitive: here are the frame times, and here’s how many of each we got.

        The CDF is just a postprocessing of the histogram that hides the smaller features. If the histogram has two shallow peaks, for example, the CDF would make them less visible.

        The only problem with the histogram is that it might span a very wide range of values, so logarithmic scales might be needed on the vertical axis.
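        (A quick numeric sketch of the two-peaks point, with made-up frame times and NumPy; the 33.3 ms cluster stands in for frames that missed one vsync interval:)

```python
import numpy as np

# Hypothetical bimodal capture: a main peak at ~16.7 ms (60 Hz pace)
# and a smaller second peak at ~33.3 ms (frames that missed a vsync).
times = np.array([16.7] * 90 + [33.3] * 10)

# The histogram shows the two peaks directly as two non-empty bins...
counts, edges = np.histogram(times, bins=np.arange(0, 50, 5))
print(counts)

# ...while the empirical CDF of the same data is monotone: the second
# peak appears only as a small step near the top, not as a bump.
sorted_t = np.sort(times)
cdf = np.arange(1, sorted_t.size + 1) / sorted_t.size
```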

          • jensend
          • 8 years ago

          A CDF or quantile plot may require a little explanation, but even just a caption is enough. I don’t know that a histogram is necessarily much better in that regard.

          I do agree that a histogram (or rather a kernel density estimator, which is largely the same picture but helps mitigate the arbitrariness of histograms’ bin sizes and bin cutoffs) can give a very nice view of a single card’s frame time distribution. However, as Damage pointed out below, they already tried to find ways of making it so people could easily get useful information out of comparing different cards’ histograms but had little success. This is the readability problem we’ve been referring to; I don’t see a way around it.

          Also, it simply isn’t the case that the histogram is more fundamental and the CDF/quantile function is “just postprocessing.” The fundamental information is the frame times; if you graph that directly, it’s almost all zeros and then a few thousand spikes where it goes to 1 instead. (I suppose it’s possible that a few frames could have exactly the same frame time up to the resolution of the system timer and FRAPS, but that’s not much help.) Not readable by any means. Computing the CDF is a simple running summation and represents all of the data. Computing a histogram is more complicated, requires arbitrary choices of bin size and bin cutoffs, and discards all finer-grained information. Mathematically, it’s simplest to consider the CDF the fundamental representation of the distribution and think of histograms as difference quotients.
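          (To make the "no binning choices" point concrete, here's a minimal sketch with hypothetical frame times; the nearest-rank quantile convention is just one of several:)

```python
import math

# The empirical CDF needs no bin sizes or cutoffs at all: sort the raw
# frame times, and the CDF reaches i/n at the i-th sorted value.
def empirical_cdf(frame_times_ms):
    xs = sorted(frame_times_ms)
    n = len(xs)
    return [(x, (i + 1) / n) for i, x in enumerate(xs)]

# The quantile function is the inverse view of the same sorted data
# (nearest-rank convention; other interpolation rules exist).
def quantile(frame_times_ms, p):
    xs = sorted(frame_times_ms)
    k = max(1, math.ceil(p * len(xs)))
    return xs[k - 1]

# Hypothetical ten-frame capture with two bad stalls.
times = [16.7, 16.9, 17.1, 16.8, 55.0, 16.6, 17.0, 16.7, 16.9, 70.0]
print(quantile(times, 0.50))  # median looks fine
print(quantile(times, 0.99))  # the worst frames dominate the tail
```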

        • JustAnEngineer
        • 8 years ago

        I agree regarding frame time vs. frame rate. It’s pointless to invert the units.

        I believe that histograms would more clearly show the story that Scott would like to tell regarding the distribution of occasionally-slow frames.

        • jensend
        • 8 years ago

        Remember, the quantile function is the inverse CDF; graphically, just flipping the CDF across the line X=Y. So by doing quantile plots for the last couple reviews Damage is pretty much making the attempt you wish for. I really think it is a big improvement.

        I’m sure he’s busy and stuff, but I sure wish Damage would respond to the scaling suggestion which I made in the discussion you referenced, which could make the quantile plot more readable. [url=https://techreport.com/discussions.x/22835?post=631463<]When I brought it up again two weeks ago, linking to my original explanation[/url<], he didn't catch what I meant and asked me to explain again what I had in mind, but he didn't respond to my explanation (or the code I included showing one way to do it).

        I also suggested that when a single number is needed (bar charts, rankings, etc.) he try comparing mean square frame time rather than "time above x ms" or "# of frames above x ms." (Rationale [url=https://techreport.com/discussions.x/22192?post=605675<]here[/url<], [url=https://techreport.com/discussions.x/22666?post=623005<]here[/url<], and [url=https://techreport.com/discussions.x/22666?post=623169<]here[/url<].)

        I totally agree that we should "switch to [frame times] and be done with it"; it may be nice to provide at least one FPS chart to give people an easy way to compare TR's results to those of other sites which haven't picked up on the better metrics yet, but we shouldn't have to constantly switch back and forth between the two.

        • Damage
        • 8 years ago

        We tried histograms, and they proved not to be helpful. See here:

        [url<]https://techreport.com/articles.x/22151/2[/url<] I'm a little surprised folks keep suggesting them since 1) they turn out not to be useful and 2) we already went there. I had hoped folks participating in these discussions would, you know, have read. I guess histograms are just a familiar form of data visualization, and so the suggestion gets tossed out there, regardless. Kind of frustrating, nonetheless. Some of you want us to drop FPS entirely, while others want us to convert everything to FPS. Jensend suggests a middle course, keeping a single FPS number so folks can relate and leaving the rest in native frame times. No one, heh, seems to have observed that this course is precisely the one we've taken. FWIW, we abandoned "number of frames above X" for "time spent beyond X" some time ago. The rationale is also in the link above. Jensend, you complain a lot about me not responding, but you seem to be missing some subtle math. Release dates for GeForce GTX 680, Ivy Bridge, GTX 690, and (censored) followed by (censored), all packed tightly together. I've been working 12 hours days without weekend since April 10, and I can't always find the time to reply to each post. As we left it, IIRC, you wanted me to buy, learn, and convert to a new toolset so that I could implement your semilog scale suggestion. My silence was meant to imply my sentiments on that topic more gently that my words might have. Especially given my recent schedule. If I need to show the tail more clearly, I can easily change the scale of the axis in Excel without going semilog. As for your mean square frame time suggestion, Jensend, I'm not sure of its practical import. In selling it, you give a tip of the hat to Bensam's insistence that we report variance as a primary result. I've found that suggestion to be wholly unhelpful from a practical standpoint, since variances to the low side of the mean aren't in any way bad in a real-time frame production system. 
If what you're suggesting is similarly unsuited to the particularities of this problem, it may not be useful, either. If it is useful, well, you need to sell it to me in an email. And remember, if I buy it, I have to be able to sell it to a broader audience. Seems like sometimes you guys forget that. Also, part of your justification for your new number is that we need a reference to a real perceptive threshold. Right, yet somehow folks missed my use of vsync quantization points for a 60Hz display as notable thresholds, first shown and explained here: [url<]https://techreport.com/articles.x/22835/6[/url<] I think that's a substantial contribution. A little surprised it wasn't noted. Apologies if I'm coming across as grumpy, but seriously guys.... we need to convey useful info to a broad audience in an understandable way. I think we're largely making a go of it. Give us some credit for that. We'll keep refining things, but I think it's worth considering the ground we've already covered before making suggestions... ya know? 🙂
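        (As a concrete illustration of the "time spent beyond X" metric mentioned above; the frame times are made up, and 16.7 ms is one refresh interval on a 60 Hz display:)

```python
# "Time spent beyond X": rather than counting slow frames, sum the time
# each frame spends past the threshold. 16.7 ms is one refresh interval
# on a 60 Hz display, i.e. a vsync quantization point.
def time_beyond(frame_times_ms, threshold_ms):
    return sum(t - threshold_ms for t in frame_times_ms if t > threshold_ms)

# Hypothetical capture: one 90 ms stall outweighs several mildly slow
# frames, which is the point of weighting by excess time, not counts.
times = [16.0, 16.5, 40.0, 16.2, 90.0]
print(time_beyond(times, 50.0))           # only the 90 ms frame counts
print(round(time_beyond(times, 16.7), 1)) # both slow frames contribute
```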

          • Bensam123
          • 8 years ago

          I don’t nag for replies. :l

          That, and I still think variance or standard deviation is very important. You could compress the frame-time data, since you wouldn’t need to show as much of it, simply by listing the standard deviation along with the average FPS, or by using error bars. Error bars are pretty easy to understand.

          SPSS does a lot of this work for you, that’s why I suggested it. I stopped suggesting variance and standard deviation a bit ago as it seemed like you guys had no interest in it and you did give a reply on the topic of it some months ago.

          Now I just poke at you about having a best fit line in the price graphs. ^^

          [url<]http://en.wikipedia.org/wiki/Linear_regression[/url<]

          • jensend
          • 8 years ago

          I’ve acknowledged that you’re making a go of it, and I’ve given you credit for it; I’ve consistently lauded your move to looking at frame time distributions rather than avg FPS and your new quantile plots. They are fantastic. I’ve raved about them in other fora as well, trying to point people to TR so they can see what you’re doing. I think you too easily take suggestions as complaints. They’re not.

          When I say I’m hoping for a response, I’m not [i<]demanding[/i<] anything, and I’m not even [i<]asking[/i<] you to carefully read through and analyze it right away. I’m certainly not saying you need to make a decision about it and (if the decision is yes) implement it right away. It’s just that when you ask for an explanation and I try to provide one, a bare [i<]acknowledgement that you saw what I wrote[/i<] would be very nice to have, so I know there’s a chance I wasn’t totally wasting my breath. Discussions move off the front page into obscurity after a few days, so, unlike email, if you don’t say anything I have no idea whether you saw it, and if you don’t say anything after a week it seems likely you never will. A response like “Hm. Maybe I’ll think about this later” would be great. As I’ve said many times, I know you’re busy and I’m not trying to be a bother; I just hope to be of help.

          I never suggested you absolutely needed to get a new toolset. I did say things might be better/simpler in the long run with something else, but I’m sure it’s possible to do the stuff I mentioned in Excel; heck, there are people using Excel in the strangest settings as a full-blown general-purpose programming language. Nor would you have to buy anything if you did make a switch; R and Octave are some of the best analytic tools out there, and they’re free (open source). My main reason for mentioning the other tools is that I know how to do these things with them but not with Excel, and my guess is that they’re a little harder to do in Excel. Doing the scaled plot in Excel should be simple enough to figure out; I just have no idea how to do the axis-relabeling step. Maybe somebody with better Excel-fu can point us in the right direction here. (A quick search turns up [url=http://superuser.com/questions/51991/how-to-rename-the-values-on-y-axis-in-excel<]this superuser discussion[/url<]; I'm not sure how that'd work in this case.)

          When I mentioned variance in relation to mean square frame time, my main point was to note precisely the problem with variance you just described and to say that mean square frame time doesn't have that problem. Another way of putting the general idea: we don't really care about variation from the mean nearly as much as we care about variation from instantaneous response. Variance is avg((x-mean)^2), which means times below the mean are penalized just as much as times above it; perfectly consistent but very slow frame times would rate as perfect. That's not the case for mean square frame time.
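          (A tiny numeric sketch of that distinction, with made-up frame times: a rock-steady-but-slow card gets a flawless variance score, while mean square frame time still flags it as slow.)

```python
# Variance penalizes deviation from the mean in either direction, so a
# perfectly consistent but slow card scores a flawless 0. Mean square
# frame time measures distance from 0 ms instead, so slow-but-steady
# still rates as slow.
def variance(ts):
    m = sum(ts) / len(ts)
    return sum((t - m) ** 2 for t in ts) / len(ts)

def mean_square(ts):
    return sum(t * t for t in ts) / len(ts)

steady_slow = [100.0] * 10           # rock-steady 10 fps
mostly_fast = [10.0] * 9 + [100.0]   # 100 fps with one big hitch

print(variance(steady_slow), mean_square(steady_slow))  # 0.0 10000.0
print(variance(mostly_fast), mean_square(mostly_fast))  # 729.0 1090.0
```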

            • Damage
            • 8 years ago

            Noted. 🙂

          • CuttinHobo
          • 8 years ago

          [quote<]I'm a little surprised folks keep suggesting them since 1) they turn out not to be useful and 2) we already went there. I had hoped folks participating in these discussions would, you know, have read. I guess histograms are just a familiar form of data visualization, and so the suggestion gets tossed out there, regardless. Kind of frustrating, nonetheless.[/quote<]

          Clearly what's needed is to somehow create a histogram to show what changes have already been suggested.

          I propose that you stop testing on your Haswell-E sample, GTX GT MX 699 Ti, and Radeon 8675309 prototype. This is more important.

          • cobalt
          • 8 years ago

          Just to clarify a couple things: First, everything you’re doing is absolutely, completely, 100% appreciated. The simple fact is that you’re adding real value on top of what’s being done elsewhere, and you’re making it accessible. I know our suggestions may come across as complaints, but in my opinion the excellent reviews you’ve been giving us made it clear how much difference there really is between a cursory analysis and a thorough one, and it turns out I love having more info and would love to have even more. I don’t think we’re blind to the fact that this is a lot of work for you, but at least for my part, I can’t help but want more of a good thing.

          So, with that said:

          My apologies, as I haven’t had much time yet to read the 3770K review in detail, and I hadn’t noticed you’ve got the CDF/QF in there. It’s perfect; it’s brilliant. Keep it up! Definitely more readable than a histogram, in my opinion.

          Second, regarding the average frame rate chart (in FPS) right next to the 99th-percentile frame time (in msec): I hadn’t understood that the reason you put the average FPS chart right next to the 99th-percentile latencies was to give a point of reference in FPS. I can live with that, but in my opinion (and again, just my opinion) it doesn’t work as well, because the following chart is measuring something different, so you don’t get much benefit from having the FPS-for-reference next to the latency.

          I also stand by my opinion that the 50th-percentile (median) or mean measurements are qualitatively no different from the 99th-percentile measurements, so that alone is no reason to switch from one metric to another. I’d personally be fine with everything being in FPS instead of latency, but since others have convinced me latency is the better metric, I’m fine with going to latency overall.

          So, given the desire to use latency more: in my opinion, if you want to keep an FPS number in there for reference (to make it easier for a wider audience not yet accustomed to latency), I’d rather just add a third chart with information identical to the first. That is, have the mean frame rate in FPS, then the mean latency in msec, then the 99th-percentile latency in msec. Or simply have two charts, both in latency, and annotate one or both with the FPS numbers somehow. (Hard in Excel, I know.)

          In summary: love the CDF, keep up the amazing work, but I’d prefer a different solution to keep a reference FPS in there than having two adjacent charts with incomparable metrics.
