Nvidia cuts GTX 770, 780 prices, primes GTX 780 Ti for 11/7

Today’s a big day for Nvidia. Earlier this morning, the company announced a new Shield update as well as the addition of ShadowPlay to its GeForce Experience software. Now, Nvidia has followed up with some price cuts across its high-end GeForce GTX 700-series graphics cards. It’s also set a release date and a price for its upcoming GeForce GTX 780 Ti.

According to Nvidia, suggested e-tail prices for the GeForce GTX 780 and GeForce GTX 770 have dropped to $499 and $329, respectively, down from the current $649 and $399. The company expects these cuts to go “live in e-tail by 6:00 a.m. on Tuesday, October 29th.” Since no time zone is mentioned, I assume Nvidia means 6:00 AM, Pacific Time.

Those cuts will position Nvidia’s high-end cards much more competitively against AMD’s new Radeons. For reference, here’s how the performance-per-dollar scatter plots from our Radeon R9 290X review would look with tomorrow’s cuts in effect. (You can use the buttons below the graph to switch between post- and pre-cut pricing.)
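
For readers who want to check the value math behind those plots, here’s a minimal sketch of the calculation; the FPS figures in it are placeholders for illustration, not measurements from our review.

    # Performance per dollar, before and after the price cuts.
    # The "fps" values are illustrative placeholders only.
    cards = {
        "GTX 780": {"fps": 76, "old": 649, "new": 499},
        "GTX 770": {"fps": 62, "old": 399, "new": 329},
    }
    for name, c in cards.items():
        before = c["fps"] / c["old"]
        after = c["fps"] / c["new"]
        print(f"{name}: {before:.3f} -> {after:.3f} FPS/$ "
              f"(+{(after / before - 1) * 100:.0f}%)")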


As you can see, the GeForce GTX 780 will be an appealing alternative to the R9 290X despite its marginally lower performance. There will be much closer parity between the GTX 770 and the R9 280X, too. I should also note that the GTX 780 and 770 both ship with free copies of Batman: Arkham Origins, Splinter Cell: Blacklist, and Assassin’s Creed IV: Black Flag, and that you get $100 off when buying them as a combo with a Shield handheld at e-tailers like Newegg. AMD offers a version of the R9 290X with Battlefield 4 for $580, but the R9 280X doesn’t come with any freebies, as far as I’m aware.

In related news, Nvidia says the upcoming GeForce GTX 780 Ti, which it teased earlier this month, will debut on November 7 with a suggested e-tail price of $699. The card will be available with the same Shield combo and the same game bundle as the GTX 780 and GTX 770.

Comments closed
    • jihadjoe
    • 6 years ago

    Looks like 780Ti will have 2880 cores!

    From [H]:
    [url<]http://www.hardocp.com/image.html?image=MTM4MzMyODc5M1IxajdDV05aQkdfMV8xX2wuanBn[/url<]

    • NoKiddingBoss
    • 6 years ago

    [quote<]fanboyism: The collective outlook and behavior of a group of people concerning a subject (movies, games, [b<]hardware[/b<], comic book characters, etc.) which, when challenged, results in an antagonistic, passionate, and unreasoned response. Rampant fanboyism can be found on most internet message boards when such hot-button subjects are discussed. You know a fanboy when you see one.[/quote<]

    yep, i see a lot of them here. smudged in both red and green.

      • Modivated1
      • 6 years ago

      In that case I am not a Fanboy just a fan, and according to your definition I hope I never graduate!

        • Airmantharp
        • 6 years ago

        Yeah, there’s no way you fit that definition of fanboy; you’re neither antagonistic nor unreasoned, and it’s definitely okay to be passionate about things you enjoy :).

    • Modivated1
    • 6 years ago

    For those interested in seeing how an overclocked 780 would do against the R9 290x, here is a review that pits them against each other at 4K.

    [url<]http://www.legitreviews.com/amd-radeon-r9-290x-vs-nvidia-geforce-gtx-780-at-4k-ultra-hd_127129[/url<]

      • f0d
      • 6 years ago

      not bad but not exactly what I was looking for personally

      i was thinking more like a 1075/1100mhz 290x (which is what they can overclock to by the reviews i have seen) compared to a 1150/1200mhz 780 (again what i have seen them overclock to in reviews)

      factory overclocks have always been fairly conservative compared to what you can do yourself

      good review though it does make the 290x look more worthwhile than i originally thought

        • Airmantharp
        • 6 years ago

        I got two things out of the review-

        First, the R9 290X is incredibly competent at higher resolutions and higher game settings, bringing AMD up to par with the experience Nvidia’s over-priced GPUs have been providing, and the subsequent Nvidia price adjustments are definitely welcome.

        Second, the overall higher performance of the R9 290X over a mildly overclocked GTX780 (SuperClocked/SC is EVGA’s entry-level factory overclocked card) shows just what kind of legs that the AMD card has, as overclocked cards, especially those with custom coolers designed to outperform Nvidia’s already excellent reference blowers, typically go for more than Nvidia’s MSRP, quickly putting them very close to AMD’s aggressive pricing on their new top-end card. Now, custom-cooled AMD 290Xs, possibly overclocked by AMD’s partners, will probably also cost more, but judging by the performance of both cards in this article, it’s not hard to predict that custom-cooled 290Xs will be incredible values.

        I also noticed a couple of other things. In particular, the ‘freezing’ LR found of the GTX780 in Far Cry 3 is disturbing. They attribute the 290X’s lack of this performance issue to having another 1GB of VRAM and back that claim up with VRAM usage in GPU-Z, and I tend to agree with them. This is the first time I’ve seen 3GB of VRAM being a legitimate limitation for a current-generation game being run at a very high resolution at otherwise playable settings. I’d like to see if a GTX Titan has the same issue, or if there’s possibly something else at work such as game optimizations on the game or driver side, and I’d like to see a comparison against a 3GB HD7970/R9 280X and the very rare 6GB HD7970, to see if the issue still occurs.

        Next, the two games that the 780GTX does clearly show slightly better performance in, DX:HR and Metro:LL, are at opposite ends of the spectrum; DX has pretty low-quality graphics that should be easy for any competent GPU to drive, so it really isn’t a point for discussion, but Metro:LL is possibly one of the most demanding games out today. I’d posit that Nvidia might be able to tweak their drivers a bit for 4k to improve performance and framerate consistency for other demanding games, but I don’t really expect too much from Nvidia given the breadth of the 290X’s lead on several games and the relative maturity of Nvidia’s drivers.

        Last, note that this test is a bit older. Nvidia has since released their ‘game ready’ driver, a minor update for sure, but AMD has released three additional drivers with a litany of performance updates for a wide swath of current games and AMD cards, so the performance LR is seeing may not be as relevant- the R9 290X could be even further ahead. Also, it’s worth noting that LR’s test was performed in Windows 7 x64, and performance in Windows 8.1 x64 would likely be better for both cards, particularly for the AMD card which supports DirectX features only available in Windows 8.1 that should be able to provide a noticeable bump in performance.

        Overall, I can’t help but be proud of AMD for stepping up to the plate with the Hawaii GPU. It’s their first ‘big’ GPU since the HD2900-series, and it really gives the compute-heavy Nvidia GK110 a run for its money; AMD has taken the performance crown in gaming, and delivered a product that challenges Nvidia’s compute dominance in many consumer and professional workloads, at much more reasonable prices.

        Bravo!

          • Modivated1
          • 6 years ago

          WHY, CHAP! I didn’t think you had it in you! Yes, there’s certainly no mistaking it… why, I detect the ability to be objective! He pointed out the pros and cons without leaning to a side.

          I say we must give him a Gold Star for that one, what do you say Windel?

            • superjawes
            • 6 years ago

            For the record, the most common bias I see on this site is anti-fanboy. I’ve seen multiple users play both sides of the coin depending on which side is being most obnoxious at the time 😛

            • f0d
            • 6 years ago

            yeah i have a tendency to do that
            when mantle was announced i was saying how awesome it was while everyone was saying how bad it was, and now that the 290x is out and all the amd guys are saying its flat out better in every way than the equivalent nvidia card, i seem to be leaning towards defending nvidia

            they are both good and the chart shows that they have similar price/performance ratios and what should be making you decide which one to get really isnt the actual performance of them but the features of each one that you like

            want mantle / trueaudio / top notch performance without overclocking and dont mind a little noisier card? go amd
            want gsync / shadowplay / decent performance for the money and dont mind heavily overclocking to come close to the 290x? go nvidia

    • Bensam123
    • 6 years ago

    So… am I the only one who noticed that about half (literally) the responses on this article are Airmantharp trying to put AMD down or arguing with anyone who finds something positive to say about them?

      • Modivated1
      • 6 years ago

      NOPE! It’s like AMD shot his Mother or something. That dude is on a Mission!

        • Airmantharp
        • 6 years ago

        Well, I just finished penning my response to the LR article you posted. I wanted to spell out what I really think about these cards, while avoiding the discussion of coolers and noise levels, which the LR review did not mention. I typically don’t get very involved in article comments, and even I’d have a hard time summarizing my point of view based solely on the one-or-two-line back-and-forth jabs that dominate these discussions :).

          • Modivated1
          • 6 years ago

          I don’t find your opinions outrageous, but you have to admit your frequency of response on these subjects is off the charts! No matter what side of the topic you are on, or what the truth is, at the end of the day we are all talking about leisure products.

          If we were talking about the government, I could understand your degree of passion, and you have made some merited points. I just don’t think the discussion is that serious.

            • Airmantharp
            • 6 years ago

            Thanks- yeah, they are leisure products, and I did wind up being a little more ‘serious’ than usual. I know that I can attribute much of that to being disappointed in AMD’s cooler efforts, something many of us have been waiting for them to address. I could definitely have been more patient with those that were waiting for anything to come down the pipe that might throw some mud on Nvidia’s face for whatever reason, and I could have let certifiable ‘fanboys’ like clone just sit in the corner with their dunce hats instead of egging them on!

            • Bensam123
            • 6 years ago

            Yes, I can understand that not having a Titan-level cooler sucks… But I honestly don’t think it warrants close to 200 replies arguing with people who think it’s not that big of a deal, arguments which then devolve into browbeating.

    • Modivated1
    • 6 years ago

    To all those gamers who want to tout the Titan’s superior CUDA GPGPU abilities as added value, this is what I have to say:

    At the end of the day a chip is only worth the performance I get out of it, not what it could do. If I don’t have access to the benefit of what it can do, then that unused potential cannot be considered when assessing the value of the product. This point of view covers both points you are trying to make here.

    1.

    At 95°C I would get more performance out of an R9 290x at $550 than out of a $1000 Titan off the shelf. (Keep in mind, if this boiling-hot thing becomes an issue causing the card to malfunction, I have an AMD warranty that covers it as long as I leave it at stock settings. The Titan could possibly reach that performance if you overclocked it, but at your own risk; no warranty there.) Clearly a better value for performance if I am willing to deal with the noise and heat vices, which clearly seem worth the $450 savings to me.

    2.

    I and many other gamers could not care less about the CUDA features of the card, because they do not improve our games in any way. So gaming value is the only consideration for us, and that is why the Titan is subject to such scrutiny on these forums for costing $1000. I know the Titan is the best buy for those who use it for work and research purposes, but I couldn’t care less about that aspect of the card because it is of no benefit to me.

    That leaves me with my verdict: the Titan, though a prime performer, is edged out by the R9 290x (a lead that grows with every new driver release). When you consider performance and price from my point of view, and that of many other gamers, the Titan is an embarrassing card for whoever would shell out that much additional cash for less performance than the R9 290x. You could bring up heat and noise, but that would not justify the $450 premium. You could bring up the PhysX features, but most games don’t include PhysX, because it is Nvidia’s proprietary feature and would alienate half of the PC market, so no real gain there. From a gamer’s perspective, a person would look foolish to even consider buying a Titan just to game; it’s simply a rip-off.

      • Deanjo
      • 6 years ago

      And anyone buying a Quadro for MS Word, or a FireGL for an HTPC, would be a fool too.

      • f0d
      • 6 years ago

      “At the end of the day a chip is only worth the performance I get out of it not what it could do. If I don’t have access to the benefit of what it can do then that unused potential cannot be considered when assuming the value of the product. This point of view covers both points you are trying to make here.”

      it depends on what you use it for – you wouldnt buy a titan only for gaming that would be silly like buying a quadro for playing freecell

      the comparison should be the $500 780 vs the $550 290x which are both very close price/performance wise now after the price cuts – it just depends on if you have a preferred brand and/or certain features that you want (mantle/shadowplay/trueaudio/gsync etc etc)

      1 would you get more performance out of a 290x at 95 degrees or a heavily overclocked 780? not sure, as i dont own either and i dont think there have been any comparisons yet

      2 again you are comparing the 290x to something it wasnt meant to be compared to, most people admit (even tech report – “Not only is the Radeon R9 290X a beefy graphics card intended to compete with the likes of the GeForce GTX 780”) that the 290x’s competition is the 780, its like comparing the 780 to a firegl card which are 2 different classes of card and have 2 completely different markets and prices

        • Modivated1
        • 6 years ago

        I only commented on a comparison that has already been made and discussed several times on more than one thread. You have a point with the 780 (especially after the price drop) but everyone is talking about the Titan so when it comes to the topic of Titan vs R9 290x then this is the way I see it.

        Obviously I am not the only one who feels this way, because more people are arguing the comparison between the Titan and the R9 290x than the 780 and the R9 290x. No matter what the R9 290x was intended to compete with, the consensus of the argument shows that people see the Titan and the 290x as the rival contenders, and it’s also apparent that many people don’t think it’s fair to compare the 780 to the R9 290x, so they go with the Titan.

        People who are pro-Nvidia are the ones suggesting this matchup most. So when I bring up an all-gaming perspective, don’t hate on me because I won’t throw CUDA on the scale when considering what the Titan brings to the table. Why would I? I don’t need CUDA, so it’s a waste of money for me. There are people who need it, and for them it’s a steal. Great! But that’s not me, or any other person buying a GPU for gaming purposes. I don’t see how you can disagree with that, but anyone can do what they want, so if you want to buy the card with the 45% markup that doesn’t give you more of what YOU are looking to get out of the deal, then be my guest.

    • Modivated1
    • 6 years ago

    Damage, can you update the Performance vs Price graph to reflect the performance gains achieved by the new R9 290x drivers?

    • Questar
    • 6 years ago

    Wow, is this the most commented article ever?

      • Klimax
      • 6 years ago

      Not even remotely. IIRC we are at best one third there.

      • Krogoth
      • 6 years ago

      This guy is not even in the top ten.

      If you want to find the most commented articles, then behold this classic.

      [url<]https://techreport.com/news/2799/dr-evil-asks-gxp-problems[/url<]

      It used to have more comments, but it appears that some were lost during the recent web server move. When Damage used to run the Friday Night Topic on a frequent basis, any R&P-related material was almost guaranteed to generate 1000+ comments.

    • Meadows
    • 6 years ago

    What’s with all the zillion comments on TR lately?

      • Airmantharp
      • 6 years ago

      I’ve been keeping clone going for days for comedic relief. I have to admit, it’s been fun!

      • superjawes
      • 6 years ago

      [Major hardware player] introduced [product], which has both perks and flaws, and [competitor] actually engaged in competitive business practice.

      And the internet polarized as usual.

        • Meadows
        • 6 years ago

        Hardware companies have — gasp — released products before, but never before have we had so much stuff flying around in the comments.

          • superjawes
          • 6 years ago

          There’s always some, but it’s been a while since someone other than Nvidia made waves at the top of the spectrum.

          There’s also some extra stuff going on, like AMD in the XBOne and PS4, Mantle, G-Sync…

      • Krogoth
      • 6 years ago

      Fanboys being fanboys…..

      • Modivated1
      • 6 years ago

      Because this is fast becoming the HOTTEST Technology site on the web! Join the Crowd!!

    • CaptTomato
    • 6 years ago

    Don’t forget that most 7950/7970’s with decent coolers are overclocking champs.

    • D@ Br@b($)!
    • 6 years ago

    Newly introduced techniques from Formula 1 are already seeping through to hardware reviewing:

    AMD Radeon R9-290X – Graphics card Thermal Imaging Temperature measurements

    [url<]http://www.guru3d.com/articles_pages/radeon_r9_290x_review_benchmarks,12.html[/url<]

    And while writing about heat, some cool pictures:

    [url<]http://www.guru3d.com/news_story/aqua_computer_radeon_r9_290x_full_cover_block.html[/url<]

    I probably should have put this in the comments of the review (feel free to link), but the last serious comment there was 6 days ago. And no, I’m not an employee of G3D (nor a shareholder).

      • Klimax
      • 6 years ago

      Interesting.

    • D@ Br@b($)!
    • 6 years ago

    AMD Radeon R9 290 Launch Delayed by a week

    [url<]http://www.guru3d.com/news_story/amd_radeon_r9_290_launch_delayed_by_a_week.html[/url<]

    It seems the 290 isn’t up to competing with the 780 like some of U geeks 😉 thought. If the 290 turns out to be the ‘overheated dust-buster’ the X is, and the custom cards can’t improve on that, these two new cards from AMD are a real anticlimax, a deception, a bummer, a ‘cold’ shower, for me. It has been a long time since I’ve been leaning towards nVidia for the contemporary video cards.

    btw: Does this mean the review has been postponed as well, and U guys have to do the benching all over? TR?

    • kilkennycat
    • 6 years ago

    Let me make a guess..

    The mysterious GTX 780Ti is the dual-GPU version of the GTX780. It’s the only way to outperform the R9 290X with current-gen Kepler silicon. Probably running at the base GTX780 clock with little overclocking margin; it would still outperform the R9 290X.

      • Klimax
      • 6 years ago

      Wrong on all counts. Take a look at the 780, compare it to the Titan, and tell us again whether current Kepler silicon is at its limits. (unit- and characteristics-wise)

    • Chrispy_
    • 6 years ago

    [b<]A BIG “THANK YOU” (IN ALLCAPS) TO AMD FOR PROVIDING COMPETITION TO THE UPPER END OF THIS MARKET AGAIN[/b<]

    Seriously, Nvidia have just had to slice $150 of markup off their cards to remain relevant. Not better, just competitive with AMD.

    This is one of those moments when capitalism makes me happy. The R9 290 will really shake things up, since I’m expecting it to hit near-780 performance for around $400….

      • Airmantharp
      • 6 years ago

      Just make sure you thank Nvidia for forcing AMD to bring the HD7000-series to reasonable prices too!

        • Chrispy_
        • 6 years ago

        Not sure I understand….

        The 7970 launched at $550 with a whole load of supply problems due to TSMC’s 28nm issues.
        Nvidia launched their GTX680 a few months later at $500 but that pricing was justified by the gaming focused architecture (the 680 was the spiritual successor to the mid-range 560Ti, lacking a lot of compute power)

        Prices remained roughly competitive until TSMC yields improved in June, when AMD launched the 7970GHz to match/beat the 680 in games.

        Even at price parity, Tahiti is a far more capable GPU than Kepler in the overall picture, and still evenly matched when you look solely at gaming. I don’t see how Nvidia did anything to drive prices down, they seemed to do the bare minimum of price cutting to keep their products competitive with AMD’s own aggressive pricing strategy at the time.

        • BIF
        • 6 years ago

        Thank you Nvidia and AMD for being at each other’s throats like rabid hyenas!

        Fight!

    • HisDivineOrder
    • 6 years ago

    So now the big question is…

    How many SMXs did they cut for the 780 Ti? 0, 1, 2, or still 3? At a $200 price increase over a $500 plain 780, I’d say they must be clearing the market of 780s in the short term and popping out a 6 GB version of the 780 (a la a version of Titan with reduced DP). That’s a huge difference in price, though.
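
    For reference, the SMX math on GK110 (192 shader processors per SMX, 15 SMX on the full chip) is simple enough to sketch:

    [code<]
    # GK110 core counts by number of disabled SMX units
    CORES_PER_SMX = 192
    TOTAL_SMX = 15
    for disabled in range(4):
        print(disabled, (TOTAL_SMX - disabled) * CORES_PER_SMX)
    # 0 -> 2880 (full chip), 1 -> 2688 (Titan), 3 -> 2304 (GTX 780)
    [/code<]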

    If I had my choice, I’d have a $700 780 Ti include:

    Fully enabled GK110
    6 GB memory
    7 Gbps memory
    1k+ GPU boosts
    Playing the game of raising the temperature throttle by allowing the user to raise it to “only” 95 degrees if you are particularly reckless. Unchain the power subsystem.

    One thing rarely mentioned with the 780 or Titan is they are being limited by their power subsystems because nVidia didn’t want to have a GPU that ran 95 degrees all the time. If they stepped back, held their hands up, and said, “Have at it,” like AMD, their performance would improve to the same degree. After all, imagine taking an R9 290X and limiting it to 80 degrees instead of 95. Where would the performance land?

    Finally, some shakeup in the GPU industry. It’s about time. This is why AMD sitting on their hands for two years is a problem, people. Next time we say AMD ain’t doing what they should be doing, this is why. This is the problem. Imagine what Intel would do if AMD started competing in CPUs and actually, you know, succeeded a little.

    I really wish AMD had not bought ATI. They’d be dedicated to actually pushing the envelope in CPUs, and ATI would be dedicated to GPUs. Their focus is spread so thin, they’re serving reheated leftovers far too often between a few sparing moments of inspired thought littered with mediocre execution.

      • Antimatter
      • 6 years ago

      [quote<] After all, imagine taking an R9 290X and limiting it to 80 degrees instead of 95. Where would the performance land?[/quote<] A 290X with a custom cooler will probably have the same performance even if limited to a maximum of 80C.

        • Klimax
        • 6 years ago

        That’s rather unlikely, unless the current stock solution is absolutely terrible (dust everywhere and cheapcrastic thermal paste, that kind of thing)

          • Airmantharp
          • 6 years ago

          The current stock solution IS absolutely terrible, for whatever reason. But the custom coolers will likely keep the temperature target at 95c, while unlocking noticeably higher performance without the racket that the stock cooler produces.

          When those get here, we can actually talk about price/performance. Saying that a Chevy is faster than a Cadillac doesn’t make it suddenly not a Chevy :D.

            • Klimax
            • 6 years ago

            You are way more optimistic than me. I doubt we will see such clocks. (At least not without sky-high power consumption.)

            • Airmantharp
            • 6 years ago

            I believe in human ingenuity- and in this case we’re talking about the Taiwanese, who happen to be masters of the subject :).

            Expect AMD’s partners to be both disappointed with the reference cooler that they’re being shafted with, and to have moved to provide a solution for their customers with real value- many of them have established brands based on the idea of making AMD’s products quiet!

            • Klimax
            • 6 years ago

            Not sure how far they can get against physics, but we’ll see. When? IIRC December, is that right?

            • Airmantharp
            • 6 years ago

            Nobody’s talking- could be tomorrow, could be next year.

            But I’m not sure where you’re seeing a potential issue- card draws more power, generates more heat, make a cooler capable of dissipating said heat quietly. I know that can be done- what else is there to worry about concerning the card itself?

            • Klimax
            • 6 years ago

            The amount of heat generated is way too high, and the power consumption itself is damn high too. All IMHO of course. There are limits to how much you can push silicon at these levels. (See the Pentium 4 and the reason they abandoned it prematurely.)

      • Klimax
      • 6 years ago

      Actually Titan (aka GK110) can get to 1046MHz while staying under 70°C, so you don’t even need the 95°C idiocy to get there. (Just allow it more power and unlimit the fan.)

      Although my testing is based on a Gigabyte card, where it appears their OC suite scales much more aggressively, often ignoring stock Boost and going far above it. (At stock settings I saw 950MHz sustained in Einstein@Home.)

        • MFergus
        • 6 years ago

        What I’m really curious about is why the 290x draws a good deal more power while having a smaller die and lower clocks. I’m sure there are good reasons why, but those are the first things I would look at.

          • Klimax
          • 6 years ago

          Massive brute force. IIRC from a comparison I did, AMD needs way more units than Nvidia to match performance. Also, pure compute hardware (like scheduling) eats quite a lot of power. See the differences between Fermi and Kepler.

          • Benetanegia
          • 6 years ago

          [quote<]smaller die[/quote<] One of the reasons is that the above assertion is not as true as it may seem at first glance. GK110 does not physically have only 2880 SPs; it has much more, 33% more to be precise. It has 2880 32-bit-only SPs and 960 64-bit-only SPs (which sit idle in gaming conditions). This way, Nvidia sacrificed area efficiency in order to obtain power efficiency, and that means we could treat GK110 as if it were a smaller chip: adding them together means GK110 is a 550 mm^2 chip with a total of 3840 SPs, but only 2880 SPs are active at any given time (2688 in the Titan); 25% of the die area dedicated to compute units is always inactive.
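
          A quick sanity check of that 25% figure, using the unit counts above:

          [code<]
          # Fraction of GK110's physical SPs that sit idle in games
          sp_fp32 = 2880  # 32-bit (gaming) shader processors
          sp_fp64 = 960   # dedicated 64-bit units, idle while gaming
          total = sp_fp32 + sp_fp64  # 3840 physical SPs
          print(sp_fp64 / total)     # 0.25 -> 25% inactive when gaming
          [/code<]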

    • Modivated1
    • 6 years ago

    For the moment it looks like an even trade-off: cool and quiet with the GTX 780, or cutting-edge performance with the R9 290x; a matter of preference. There are three upcoming events that are bound to shake things up even more.

    First is the release of the R9 290: if it has performance equivalent to the 780 and busts the $400 price mark, then the 780 will be forced to take another price cut.

    The second is Nvidia’s release of the Ti edition of the 780. Should its performance take the flagship crown, then they can again demand a premium (although the performance advantage might be smaller than the gap between the r9 290x and 780 now, which would only allow a small price bump), a hefty one if they have a serious jump in performance.

    Finally, there are the aftermarket cooling solutions for the r9 290x (and regular editions), which may bring the r9 290x up to par with the new 780 Ti, or, if there is no real lead, allow it to barely pull ahead. Either way I am predicting neck-and-neck performance in the end, one just edging out the other, which will leave the war to be fought with price and added-value packaging (like additional games for appeal).

    Good times for Gamers everywhere!

      • Pwnstar
      • 6 years ago

      It is doubtful the $150 price premium over the 290x will be worth it, but I’d love to be wrong.

        • Airmantharp
        • 6 years ago

        Well, for those that need it, if it’s got 6GB of RAM, then the price of a good card with at least that much RAM just dropped $300. Not a good reason for most to drop $700 on a computer part, but hey…

    • Arclight
    • 6 years ago

    In Uber mode the R9 290x is overall faster than the GTX Titan, so the $50 premium over the inferior GTX 780 is pretty damn good. That said, even if i had the money i wouldn’t buy the stock version of the 290x, just because dual/tri-fan coolers are better. You haters love those blower fans, but the fact is (yes, it’s a fact) they are louder and vastly inferior in performance to custom designs.

      • Modivated1
      • 6 years ago

      While I would love to agree with you here, we have to admit that the 780 is not that far off the mark of the Titan in performance, and the R9 290x just edges out the Titan (without the new driver update). I would argue that the gap is worth the money, but at only $50 difference, and with a loud cooling solution that needs to be improved, it’s really a toss-up and a matter of preference.

      I myself wouldn’t buy this card without a custom cooler. What I can say is that I think AMD was able to launch this card at this price because they did not bother with the cooler; rather, they focused their investment dollars on performance, knowing the aftermarket solutions would do the rest. Why do you think there is no supply of the reference design?

      This current situation just builds anticipation. When a custom edition card comes out with higher numbers and a truly cool and quiet solution then these cards will be selling faster than popcorn can pop.

        • clone
        • 6 years ago

        I asked DaveBaumann why they allowed the acoustics of the cooler to enter the discussion but got no response.

        I suspect the issue regarding the cooler is far more complicated than just a simple “they are being lazy” or “they chose to be cheap”….. it’s not significantly expensive to equip a quieter more efficient cooler.

        even heatpipe coolers are comparatively peanuts given they come standard with the FX 8320 that retails for $170.00.

        1 the reference cooler is functional albeit audible, and more importantly is compact and meets certain OEM requirements…. this seems like a cop-out but it is true.

        2 add-in board makers will be able to distinguish themselves and/or raise margins by equipping different coolers, and I’m wondering if AMD wouldn’t face some pushback if they’d offered one that was …. too good?

      • Klimax
      • 6 years ago

      Put Titan in “Uber-mode” too and watch…

        • Arclight
        • 6 years ago

        Watch what? Beat 2 R9 290x bought for almost the same price as a single Titan? Yeah…riiiight

          • Airmantharp
          • 6 years ago

          I wrote this for clone, but you’ve earned it too!

          “The more you compare AMD’s brand-new card to an eight month old prosumer card, the more people realize just how little you really understand, and yes, it is very sad :D.”

            • clone
            • 6 years ago

            comparing the 2 high end cards available at the time isn’t sad.

            calling it anything but fair is.

            • Airmantharp
            • 6 years ago

            Why not compare it to a Quadro, or a Tesla then?

            Because a Titan is a whole lot closer to those than it is to a consumer GPU. Did you miss all of the marketing slides and reviews?

            • clone
            • 6 years ago

            did quadro or tesla offer superior gaming performance?

            • Airmantharp
            • 6 years ago

            Does the R9 290X offer superior GPGPU performance?

            • Arclight
            • 6 years ago

            Does 1 GTX Titan offer superior GPGPU performance compared to 2 290x?

            • Airmantharp
            • 6 years ago

            For which application?

            GCN’s compute performance is undeniable. Apple is putting two in their new Mac Pro. But compute isn’t like FPS- different GPUs perform wildly different for different applications.

            • Klimax
            • 6 years ago

            How about stable and very good drivers?

            • Arclight
            • 6 years ago

            Like the kind that burns your card or gives it TDRs?

            • Klimax
            • 6 years ago

            The first, IIRC, was with a private set of drivers; the second is universal across drivers, and you’d better not push that point too far, for obvious reasons.

            And Titan IIRC receives similar treatment as Quadro. (Some applications not present for rest of set)

            • Arclight
            • 6 years ago

            Private set of drivers? WHQL 196.75?

            Regarding TDRs, you are wrong, it’s not universal across all drivers, the issue is always present but it only manifests when using certain driver versions.

            • Klimax
            • 6 years ago

            Universal as in all GPUs have had a run-in with it.

            As for the overheating, you refer to this case (I remembered only problems reported with pre-release drivers from the 200 series or something like that). Looks like it failed with some cards, not all. (Incidentally, I think I had those drivers too, but my card didn’t fail…)

            Anyway, I don’t think you caught my primary point. It was more about the Quadro than the consumer level. And then, I don’t think AMD would complain if they had Nvidia’s rate of problems…

            • Deanjo
            • 6 years ago

            Wouldn’t be surprised if the Quadro K6000 did, especially with it being a fully enabled chip, having more RAM, and operating at a higher stock clock speed than the Titan.

            • clone
            • 6 years ago

            The K6000 scores 162 points lower in PassMark.

            was the only comparison I looked up.

            • Klimax
            • 6 years ago

            Which is completely irrelevant to the target market for any Quadro. (Also, driver-level optimization for these benchmarks is missing from Quadro drivers.)

            • clone
            • 6 years ago

            I know zilch about PassMark; I did a quick Google search using K6000 and Titan benchmarks and that’s what I got. Yeah, it was lazy, I know, but my interest in the K6000 is down around dinosaur bones.

            if you can find a readily available gaming bench using the same platform to compare the K6000, Titan and R9 290X, I’ll certainly check it out.

            or just Titan and K6000; the rest can be extrapolated.

            • Klimax
            • 6 years ago

            You don’t know much about the Quadro and its target market, otherwise you wouldn’t have posted something so ignorant.

            Hint: Quadro has different drivers than GeForce, yet the same chips. Can you guess why?

            • Deanjo
            • 6 years ago

            [quote<]Hint: Quadro has different drivers than GeForce, yet the same chips. Can you guess why?[/quote<] You can also use the standard drivers, just don’t expect glitch-free rendering in pro apps, btw.

            • clone
            • 6 years ago

            I know guys who’ve gamed on Quadros while waiting for their machines to finish processing.

            networked across the plant, all of them playing a first-person shooter. Quake Wars at the time.

            • clone
            • 6 years ago

            Klimax, you don’t understand what’s being discussed, and it’s growing notably weak over time….. at some point, ace, you’ve got to try a little… just a little, for heaven’s sake, to understand, so that you can at least formulate a criticism worth noticing.

            don’t be so damned empty.

            • Deanjo
            • 6 years ago

            You might also notice that the 290X scores lower than the 780, Titan, and K6000 in PassMark, so what does that say about the 290X, and what does that say about PassMark?

            Titan 8069
            780 7961
            K6000 7906
            290x 7485

            • clone
            • 6 years ago

            no idea; as mentioned, PassMark was the most readily available bench I found. didn’t notice the R9 290X (wasn’t looking for it.)

            • Deanjo
            • 6 years ago

            PassMark is a bit of a joke when it comes to benching graphics. Very basic DX9/DX10 tests, like bouncing balls.

            [url<]http://www.passmark.com/products/pt_adv3d.htm[/url<]

            • clone
            • 6 years ago

            I don’t doubt it…. I seriously put no effort at all into searching for a comparison; I threw in some keywords and PassMark showed up. I figured it’d be a perfect motivator for someone else to finish what I started.

            you mentioned the K6000’s hardware specs might make it a better gamer, but the K6000 is probably the last card that gets reviewed using gaming benches, if it gets reviewed online at all.

            • clone
            • 6 years ago

            I’ve seen PassMark’s CPU and gfx benches, and now that I’ve brought it up I’m a little curious.

            anyone know how its results are determined?

          • Klimax
          • 6 years ago

          You missed my point. I’d say at this point it is quite willful. Hint: the 780 Ti, and why the 290x won’t keep it. And why the Titan is worth much more than some 290x, and why it is in no danger from the 290x.

            • Arclight
            • 6 years ago

            We don’t know yet if the GTX 780 Ti’s chip is the same as the GTX Titan. Chances are it’s still a more crippled version like the current 780 but with higher clocks.

            Will the 780 Ti perform faster than the 290x? Highly likely and nvidia will make sure to give it a clock speed high enough to do so. That’s not to say AMD is left without options and who knows how a properly cooled 290x performs.

            • Klimax
            • 6 years ago

            All they need to do is kill DP, maybe kill another SMX, and crank the clocks high. (IIRC leaked specs show this: 1-2 killed SMXs)

            As for AMD’s options, that would depend on where on the curve their chip is and the cost of pushing further. (My favorite point of reference is of course the Pentium 4.)

            Most likely next stage would be 20nm.

        • Modivated1
        • 6 years ago

        Let me finish that sentence for you.

        watch…. the card melt like Wax!

        MU HU HEE HEE HA HA HA!!!

        You should always finish the sentence yourself rather than leave it open for others.

      • Airmantharp
      • 6 years ago

      Are you blind to the difference between a nice cooler and a crappy one, blower or open air?

      There have been crappy dual/triple-fan coolers too.

        • Arclight
        • 6 years ago

        At this price point shitty dual fan coolers are practically unheard of.

          • Airmantharp
          • 6 years ago

          Have you ever bought a GPU before?

          We had nearly a decade of crappy dual-fan coolers from half-assed AIBs before anyone really got serious about it. The ‘custom’ coolers used to be the cheap option!

          And the point stands- there are good blowers (Titan) and bad blowers (anything AMD aside from the HD7990), and good open-air coolers (MSI Windforce) and bad open-air coolers (just pick one from three or four years ago).

          The rebuttal was for your amazingly naive assertion that there are no good blowers- when the evidence to refute your claim has been repeated on this very site for years :).

            • Arclight
            • 6 years ago

            Yes, indeed i have bought video cards before, and yes, there were and still are bad custom coolers, but right now, at this price point, i dare you to give an example of a card with a custom dual-fan cooler that is inferior to the blower cooler on the stock model of the same card.

            Personally, even if i could afford it, I wouldn’t use the GTX Titan as a gaming card with the stock blower cooler, just like i wouldn’t use the stock model of the 290x. Sure, the Titan’s stock cooler might be better than older versions, but it’s loud and still inferior to custom designs, not to mention beefier aftermarket air coolers or water coolers. This conversation is going nowhere.

            • Airmantharp
            • 6 years ago

            You’re right, if you keep insisting that ‘every open-air cooler ever made is better than every blower ever made’, this conversation will continue to go nowhere.

            There are reasons to use blowers instead of open-air coolers, and yes, they can be better/quieter than open-air coolers if the enclosure they’re put in is properly set up.

            • NvidiaCorpDrone
            • 6 years ago

            Hello there loyal raging Nvidia Fan. I am Nvidia Marketing Drone #605778. We here at Nvidia have taken notice of your great loyalty that borders on psychopathic obsession to our company. We would like to offer you a brand new GTX 780ti for your personal use. You only need to send us your personal information along with your bank account number for processing. If you cannot comply, then you can just send us your current address. Our CEO Mr. Jen-Hsun Huang would like to personally slot in his “Titan” into your mother’s board. If you have a sister then it is much better. We’ll allow her to stroke our “GTX 780” while we plunge an “Nvidia Shield” right up your orifice. Thank and have good day. All hail the Reich… I mean Nvidia.

    • BoBzeBuilder
    • 6 years ago

    Thinking about upgrading from my GTX 560 Ti, but does anyone know if 3GB VRAM will be a limiting factor in future games at 1440p?

      • Airmantharp
      • 6 years ago

      The console versions of incoming games will have >4GB of memory to work with just for video. 6GB or 8GB would be preferred depending on the memory controller, with 12GB recommended for those cards with 384bit controllers that support memory in 3GB/6GB/12GB increments.

      In the near term and dealing with lower than max settings or more reasonable budgets, though, 4GB or 6GB makes a good compromise.

        • BoBzeBuilder
        • 6 years ago

        6-8GB just to be safe? I think I would have to wait until next year for that. Dang.

          • Airmantharp
          • 6 years ago

          If you intend to use your GPU next year 🙂

        • MadManOriginal
        • 6 years ago

        ‘compromise’…lol. While BF4 may be able to get away with >2GB as a requirement, many, many other games won’t unless they don’t want to sell on the PC at all. There is far too huge an installed base of cards with 1GB to ignore.

          • Airmantharp
          • 6 years ago

          Well, it doesn’t ‘require’ >2GB, but it does make good use of it- as does BF3, by the way, so that’s nothing new. The jump in fidelity gave that point away a long time ago.

          Just keep in mind that not everyone (say, on the Steam Hardware Survey) is up for playing the latest and greatest games, and on the flip-side, top-tier games like BF drive GPU sales. Vendors won’t ignore the current install base, rather, they’ll incentivize the purchase of better equipment, whether that’s through creative design or artificial asset targets.

        • MFergus
        • 6 years ago

        The consoles both have 8GB total, but both have 3GB of that reserved for the OS, so 5GB of usable RAM. I’m not sure how much will be used for video, but 4GB is probably the absolute max they could use. I doubt 3GB will ever be a limiting factor for 1080p PC games, though; but more would be nice, just in case.
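
        A sketch of that budget arithmetic; the reserve figures are the rumoured values, and the CPU-side share is purely an assumption:

        [code<]
        # Rough console memory budget (rumoured figures, not confirmed specs)
        total_gb = 8                        # XB1/PS4 total RAM
        os_reserve_gb = 3                   # rumoured OS reservation
        game_gb = total_gb - os_reserve_gb  # 5 GB usable by the game
        cpu_side_gb = 1                     # assumed game-logic share (illustrative)
        print(game_gb - cpu_side_gb)        # ~4 GB ceiling for video assets
        [/code<]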

          • Airmantharp
          • 6 years ago

          At 1080p, 4GB should suffice- over that, I’d want no less than 6GB.

          Note that the consoles are just a reference point. The PC versions will likely ship with even more assets, and will undoubtedly make use of more VRAM than the console releases- that’s never not happened :D.

            • Klimax
            • 6 years ago

            So in other words, everybody should want Titan…

            😀

            • Airmantharp
            • 6 years ago

            Or wait for vendors to provide cards with more memory. Memory’s cheap, relatively speaking; just have to wait for one company to cave, then the other follows suit, just like in the performance rankings :).

            • MFergus
            • 6 years ago

            GDDR5 memory isn’t so cheap though.

            • Airmantharp
            • 6 years ago

            Not compared to DDR3, no, but it’s still just a small fraction of the price of the card. It’s not like many wouldn’t pay another $50-$100 to get a card with double the memory :).

            • MFergus
            • 6 years ago

            Ya it’s not hugely expensive but just like with phones, they almost always charge a ton even for cheap memory increases. $50 to $100 is pretty good.

            • Airmantharp
            • 6 years ago

            Oh, I expect them to try to charge it, but that’s where competition comes in!

            • MFergus
            • 6 years ago

            Ya, you’re right; unlike with phones, there are multiple OEMs for every card.

            • Klimax
            • 6 years ago

            I don’t need to wait… 😀

            • Airmantharp
            • 6 years ago

            I can’t justify the purchase of a couple of Titans, but I sure could use a few… 😀

        • End User
        • 6 years ago

        If games start using 6GB+ of memory I don’t want to be using 2013/2014 era GPUs.

        AMD has gone with 4GB on their 290X and the 6GB Titan is not worth the money so we are stuck with 4GB cards for the foreseeable future.

        I know you’ve been waiting to buy a new GPU. What are you going to buy?

        Edit: Hmmm, the 780 Ti may ship with 6GB.

          • Airmantharp
          • 6 years ago

          That’s the one, for now-

          However, the R9 290X has a wide memory controller and takes slower RAM, so I wouldn’t rule out an 8GB version in the reasonably near future. If they get G-Sync on it, they’ve got a winner- well, that, and they need to fix the damn cooler :D.

        • Antimatter
        • 6 years ago

        Even though the consoles have 8GB of RAM, rumours indicate that around 3GB will be dedicated to the OS, with the remainder shared between the CPU and GPU.

          • Airmantharp
          • 6 years ago

          Yup- but if a game like BF4 can run with less than 512MB of RAM on a 360/PS3, and let’s say it takes all of 1GB on an XB1 or PS4, then it has 4GB for graphics- and that’s just the console version, for a game that was developed for both generations of consoles.

          Expect games releasing in the near future to be able to make use of more.

      • f0d
      • 6 years ago

      i dont really think 3GB (or even 2GB for that matter) will be a limiting factor for games as you dont ALWAYS have to run with the highest amount of AA and settings

      most games will do just fine with 3GB of VRAM even at max settings and the rare few games that need more at max settings you could just turn down the AA and a few settings to keep it within 3GB VRAM

      imo its not worth going out of your way to get a vidcard with 4 or 6 GB of VRAM just for one or 2 games that can use that amount at their highest possible settings, if a vidcard you want has a lot of VRAM then great but i wouldnt go out of my way just to get a special edition of a vidcard with extra ram

      we dont know exactly how much VRAM future games will use and by the time games start to use more than 2 or 3GB for the average game then you will probably be looking for a faster video card anyways

      • End User
      • 6 years ago

      I think the highest memory usage I have seen was with Crysis 3. It hit 2.8GB for me on my 4GB 770 at 1440 (max settings). If you turn off AA (don’t need it at 1440) and drop the settings down to high the memory usage drops below 2GB and the game still looks awesome.

    • sschaem
    • 6 years ago

    With everything said… the GTX 780 is in stock; the 290x is not.

    Peeking at some stores, it seems that if you are not in the queue, you won’t get a card until late November.

    I really hope for AMD’s sake that they won’t miss the buzz…
    Build all that momentum but have nothing to sell… I guess that’s just the old AMD we love and hate.

      • Airmantharp
      • 6 years ago

      I mentioned it in the other thread, but I hope that AMD’s partners are busy yanking off the reference cooler, and that that’s the reason initial stocks have been low.

      You might have to wait till sometime next month to get a reference cooler version, but you might be able to get a custom-cooled version in the next week :).

      • pixel_junkie
      • 6 years ago

      This basically happens with every new hardware cycle, doesn’t it? Thankfully my build isn’t scheduled until end of November so that gives me a couple of weeks to watch stocks fill, prices drop, tests run, aftermarket products emerge, etc…

      • MrJP
      • 6 years ago

      If that’s really the supply situation with the 290X, why would Nvidia cut prices so quickly? Perhaps it’s not the 290X they’re really concerned about?

    • wizpig64
    • 6 years ago

    Sheeit, I was really hoping for a 760 cut as well. SLIing with my current one would still be comparable to a 780 or better, but now trading up for a 780 actually seems doable… hrggg.

      • Airmantharp
      • 6 years ago

      Those cuts might come later, after both companies finish releasing their new high-end products- and two GTX760’s are still faster than any single GPU on the market :).

    • pixel_junkie
    • 6 years ago

    If I’m totally off on this then someone please correct me, but one thing I haven’t seen taken into consideration in this entire discussion of which is best (ergo bang/$) is overclockability, as these figures all appear to be based on standard clocks. A component’s capacity for being pushed beyond its stock config has, as far back as I can remember, always factored into its value. As far as I can make out, the 780 has much more headroom to work with, as the 290X is already working at its thermal limit (yes, you can crank the fan up even more, but it’s already quite loud at the 55% “uber” setting). From previous reviews, there’s easily another 5-10% to gain through OC on the 780, putting it at the 290X “uber” performance figures at a lower cost under the new pricing (I’m leaving game bundles out of the equation for the moment and looking at this from a purely price/performance perspective).
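
    To put rough numbers on that line of thinking (the FPS figures and the ~8% OC gain below are assumptions for illustration, not benchmark results):

    [code<]
    # Value comparison with overclocking headroom factored in
    gtx780_price, r290x_price = 499, 550
    gtx780_stock_fps = 74   # assumed stock average FPS
    r290x_uber_fps = 80     # assumed "uber" mode average FPS
    oc_gain = 0.08          # midpoint of the 5-10% headroom estimate

    gtx780_oc_fps = gtx780_stock_fps * (1 + oc_gain)
    print(f"OC'd 780:  {gtx780_oc_fps / gtx780_price:.4f} FPS/$")
    print(f"290X uber: {r290x_uber_fps / r290x_price:.4f} FPS/$")
    [/code<]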

    I’m in the market for one of these two cards in the coming weeks and would be curious to hear your thoughts on this.

      • Airmantharp
      • 6 years ago

      The Nvidia cards overclock well, the AMD card doesn’t.

      You can wait for custom coolers for the AMD cards, but they will most likely be of the open-air type that will take extra care to run in tandem. Price/performance/noise/overclocking, the GTX780 looks like the best bet, though you might want to see if they release a 6GB version at a reasonable price-point.

        • pixel_junkie
        • 6 years ago

        Both good points that I’m taking into consideration as well. At some point I may like to upgrade to a dual-GPU config if/when UHD monitors come down to an affordable price or should I decide to run a multi-monitor setup. The 290X performs exceptionally well in Xfire configs at high resolutions but as you point out the open fan design that will undoubtedly be found on the aftermarket coolers wouldn’t be my first choice for such setups. And yes, a 6GB 780 would indeed be nice.

      • CaptTomato
      • 6 years ago

      IMO, wait for the standard 290 to appear; also, we don’t know the 290x’s full overclocking potential, as its blower is a limiting factor.

      Recent driver updates have already cranked up the 290x’s performance, not sure whether the 780 will see as much of a boost as it’s been out a while, but they always seem to squeeze more out of them given enough time.

      • Farting Bob
      • 6 years ago

      If you intend to OC them, you’ll want a third-party version with its own heatsink design, and there are many different designs (and binned chips with factory OC or at stock); it’s unfeasible to do a comparison against all of them. GPUs have always run pretty close to their max speed by default anyway.

      • f0d
      • 6 years ago

      when considering overclockability i think we should wait for different coolers for the 290 and 290x before deciding because those standard amd coolers are horrible

      if i had to decide RIGHT NOW which one overclocks better i would easily have to say the 780, because it has a much better standard cooler – the 780 already has aftermarket coolers and more thermal headroom

      as a guess (we dont know, because there arent any aftermarket 290x coolers yet) i still think the 780 will overclock better than the 290x even when the 290x has an aftermarket cooler

      • sschaem
      • 6 years ago

      The other factor is: what is Mantle worth to you, and how much does TrueAudio add to the value of the 290x? And will the extra GB on the 290x be worth it in the long run?

      It’s also unclear to me if the GTX 780 fully implements tiled resources… and how well console titles will port to non-GCN hardware.

      Personally, without Mantle and TrueAudio I would go for the GTX 780 without a moment of hesitation
      (even though the 290x has 33% more memory).

      But Microsoft seems set to keep any DirectX optimization, and further development, out of Windows 7.
      So Mantle will be the only real choice for Windows 7 gamers who want a high-efficiency API.

      So the choice is not clear at this time…

      Windows 8 + GTX 780 should hold up for the next 2 years.
      Windows 7 + 290x seems more future-proof, and should actually scale up in performance over time.

      The worst combination would be Windows 7 + GTX 780… I would stay away from that combo.

        • pixel_junkie
        • 6 years ago

        Ah, interesting thought. I hadn’t considered the cornucopia of upcoming OS/GPU combos and their pros and cons. I really do not like Win8, but I have been mulling over the move to it for gaming purposes.

        I build a PC every couple of years and keep up with hardware on an interested but not in-depth level for the in-between periods, until it’s time to build, at which point it all of course becomes far more interesting to figure out where I’m really going to get the most for my money. But now Mantle and SteamOS and better in-game performance in Win8 (at least according to some articles I’ve read) have made the decisions, shall we say, not as cut and dried as they used to be.

          • Airmantharp
          • 6 years ago

          How would you feel if you had to use a Mac at work from now on?

          There’s a learning curve with Windows 8.1, but it’s not that bad. I upgraded everything to Windows 8 a year ago, and I got along- even taught my dad to use it. Annoying, but it’s fast, stable, and it works.

            • pixel_junkie
            • 6 years ago

            Haha, perhaps I am just getting old and should be more open to giving it a more in-depth look. But this just feels like the whole Vista thing all over again: I tried it for a little bit and just didn’t like it, so I stuck with XP until Win7 came out, which immediately just kinda felt right. Or maybe enough time had passed that I was finally ready for a new OS. Hmm… VMware.

            • Airmantharp
            • 6 years ago

            Vista was a bungle of sorts- Microsoft didn’t optimize it, and they didn’t get hardware and software vendors behind it, but it paved the way for Windows 7/8/8.1 to be as fast and as secure as they are today. By Service Pack 2, Vista was nearly indistinguishable from Windows 7, with the only truly glaring difference being Windows 7’s TRIM support for SSDs.

            8/8.1 were faster from the get-go, stable, more secure, and 8.1 addresses some of the interface complaints for those that didn’t want to run something like Start8 to get stuff back where they wanted it.

        • Airmantharp
        • 6 years ago

        I wouldn’t tie much to the operating system- this isn’t XP vs. Vista, everyone should be upgrading to 8.1. Not the most fun thing in the world to do, but that’s the wrap.

        Out of the feature disparity you mentioned, 3GB vs. 4GB will matter at higher resolutions with presently incoming games, but the difference will likely be inconsequential any further out- either you'll have 6GB+, or you'll be turning settings down a lot. Mantle is a big question mark, and it sounds rather over-promised to gamers; and if it makes the cards run any hotter, it may not make much of a difference. TrueAudio is rather intriguing, but we have next to no good info on it, and on the surface it doesn't look like it'll have a high adoption rate- we need developers announcing supported titles, and commentary on just how it works, what it does, and how difficult it is to implement, to evaluate its worth. G-Sync, which you didn't mention, is probably the most intriguing out of the swath of upcoming technologies, since we know its cost, effects, and availability already, and it is very, very nice. And AMD really needs to get in on that bandwagon, more than Nvidia needs to support Mantle or any other AMD initiative.

      • pixel_junkie
      • 6 years ago

      All good points. Yes, the reference AMD shroud coolers aren't the best in the biz (I don't quite understand why AMD hasn't addressed this issue yet, but whatever), and performance from aftermarket vendors should make things a bit more interesting in the coming weeks. That all being said, at stock the 780 runs within an 80C thermal envelope whereas the 290X just goes balls to the wall. I'm curious to see how this will shake out with the arrival of the 290 and the 780 Ti and various aftermarket variants of all of the above.

    • gmskking
    • 6 years ago

    Still waaaaay overpriced.

      • Airmantharp
      • 6 years ago

      For who? Compared to what?

        • Arclight
        • 6 years ago

        For regular gamers, and compared to prices of yesteryear. Remember the days of the HD 4000 or HD 5000 series? In the prime time of price wars I don't recall high-end single-GPU cards costing upwards of $500 for stock versions.

          • f0d
          • 6 years ago

          ahh yesteryear
          i remember paying $550 for an asus tnt2 ultra with 3d shutter glasses
          those were the good old days…

          • Airmantharp
          • 6 years ago

          Well, HD4000 was a bit of a fluke- but remember HD2000? Or HD7000? Or X1900XTX? :D

            • Arclight
            • 6 years ago

            I also remember the 8800 Ultra costing close to $1000, not to mention the recent GTX Titan…

            But that's beside the point; gmskking was referencing the good old times.

            • Airmantharp
            • 6 years ago

            Voodoo 5 5500? I bought one, then quickly exchanged it for a Geforce 2 GTS. I do remember those times… that was the time I gave up on 3Dfx :D.

    • ClickClick5
    • 6 years ago

    Oh Nvidia. Your price gouging antics make me laugh!

    HAHAHA!

      • Airmantharp
      • 6 years ago

      They make their shareholders smile though :D.

        • superjawes
        • 6 years ago

        [quote<]They make their shareholders smile though :D.[/quote<] Airman smiled! He must be an Nvidia shareholder! Caught you, you dirty rat >=)

          • Airmantharp
          • 6 years ago

          Well, if I had to pick whose shares to own, it wouldn't be AMD's!

          But I’d probably buy Intel stock. They have the very best reputation for executing on their products, and have the best products on the market.

          And no, I don’t own Nvidia stock, but that wouldn’t be that bad either :).

    • bogbox
    • 6 years ago

    Now we wait to get a triple double bundle from AMD to settle the score: Never Settle: Reloaded 2

      • Airmantharp
      • 6 years ago

      More game bundles are never a bad thing, I love picking up top-tier games for $10-$15 from people selling the codes!

        • f0d
        • 6 years ago

        i was about to say i'd rather the cards be cheaper (i still do prefer cheaper cards) but getting cheap codes off ebay is pretty awesome too, i got a load of the planetside 2 ones with boosters and camo and stuff for $5

          • Airmantharp
          • 6 years ago

          When you consider the market value of the games they’ve been including, though, that drop in price would probably only be about $20-$30. A lot on the low-end, not so much on the high-end :/.

            • f0d
            • 6 years ago

            while that is true i would still prefer the discount on the actual card, as i have no interest in most of the bundled games

            i dont like splinter cell and assassins creed games and i dont think i would like batman either

            also if there are any games i do like in a bundle i probably would have purchased them before i would have bought a new card

            another thing is these bundles dont apply to all countries and here in australia we usually dont get the bundles but we do get the discount if they reduce the price on the cards (i got zero games when i purchased an msi 670, then later i purchased 2 gigabyte windforce 670 cards and still got nothing)

            so overall when i see these bundles i kinda go “meh”

            its good for those people in the usa that like a wide variety of games though, and for us people that can purchase the things we want off ebay :)

    • jimbo75
    • 6 years ago
      • Airmantharp
      • 6 years ago

      Why don’t you ask Nvidia?

      They sold GTX 260s for ~$200. Not a direct comparison, but it does make <$400 for a GK110-based card sound reasonable. How low is AMD willing to go?

      *also note that die size is one factor in cost, among others- yields can quickly factor in, and are as much a function of die size as they are of ASIC design and validation requirements- meaning that a single Hawaiian Islands die could cost less than a single GK110, or it could cost more, if their yields are off

      • SCR250
      • 6 years ago

      A better question would be when if ever again will AMD show a profit on discrete GPU’s.

      As for your dumb comment about "negative margin", you could not be more wrong: Nvidia in the past had die sizes in the same range on an immature process and was selling those parts for $350-$499 while making a profit on every sale. On a very mature 28nm process, with the GK110 in production in quantity for well over a year, Nvidia will again make a profit on each sale. And with the price drop they could actually make more revenue, since sales volumes grow as the price comes down.

      • tviceman
      • 6 years ago

      Can we please ban this troll?

        • chuckula
        • 6 years ago

        Once again, while some people may have noted that I don’t always agree with jimbo75, I’d go against banning him for the same reason I’m against banning Unknown-Error and OU812 (who appears to have banned himself anyway).

          • tviceman
          • 6 years ago

          It’s not about agreeing or disagreeing, He is a blatant troll who is only commenting to incite negative comments in return. Have an opinion all you want; but when crossing the threshold of no respect and void of any logic those opinions are no longer opinions – they’re trolling flame baits good for nothing more than bringing discussions down to Nazi name calling.

            • SCR250
            • 6 years ago

            He loves to hang around the S|A forums all the time with others like himself who constantly kiss each other's backsides about how great AMD is and how evil Nvidia is. And with Charlie's constant brainwashing he really doesn't know any better.

            • LastQuestion
            • 6 years ago

            Why so serious?

            • jimbo75
            • 6 years ago
            • MadManOriginal
            • 6 years ago

            Maybe your next card will be bundled with a sense of humor.

            • jimbo75
            • 6 years ago
            • superjawes
            • 6 years ago

            ::ahem::

            Yes, they posted before you, but none of them resorted to name-calling, now did they? I'll agree that these are rude to some extent, but with a certain measure of vindication considering the lengthy discussion already under the 290X review.

            Now calm down and just try to enjoy the conversation. You don’t need to be so emotional about companies that, at the end of the day, are just trying to make money.

            • derFunkenstein
            • 6 years ago

            Get the troll names right. It’s Count Chuckula, jerk.

            • chuckula
            • 6 years ago

            Indeed. Who is this Duckula and why is he plagiarizing my posts?

          • SCR250
          • 6 years ago

          And with jimbo75 constantly putting his foot in his mouth all the time it is actually kind of fun seeing him become more and more animated.

        • CaptTomato
        • 6 years ago

        U work for Nvidia?

          • Airmantharp
          • 6 years ago

          He does sound like it, more than jimbo75 sounds like he works for AMD. I’ll happily keep jimbo75 around for comedic relief.

          The ‘ban this dude!’ stuff has no place on our site, though.

            • jimbo75
            • 6 years ago
            • CaptTomato
            • 6 years ago

            You’re giving him too much credit, LOL.

            • SCR250
            • 6 years ago

            AMD wouldn’t even hire you as a janitor. How sad.

          • tviceman
          • 6 years ago

          Nah I am a fireman, but I own shares of AMD.

      • HisDivineOrder
      • 6 years ago

      Considering how much pure profit they made off selling a GPU designed for $200-300 boards for $400-$500 last year (and they were hailed for giving us VALUE versus the $550+ of the Radeon 7970), I’d say they can afford quite a few. Not to mention they’ve been stockpiling GK110 for what? A year now? More?

      Somehow, I think they’ll release a cheaper, even more crippled version of GK110 before the end. Let’s call it…

      The Geforce 770 Ti. One could see that just sliding into that $400 price point. Let prices equalize, 770 for $300, 770 Ti for $400, 780 for $500, 780 Ti for $650-700.

      That’s assuming AMD ever gives them a reason to reach that low with the GK110 parts.

    • Deanjo
    • 6 years ago

    Thinking that once the price drop hits, the 770 would make for a nice card in a Steam box build.

      • Deanjo
      • 6 years ago

      Well, I picked up the EVGA GTX 770 with the Titan cooler for $329 locally, as they have already dropped their prices.

      The Steam Box is now complete. i5-3570k, intel Z75 MB, 8GB DDR3, 512 Gig SSD, GTX-770, Fractal Define Mini, bluetooth card and openSUSE.

        • Airmantharp
        • 6 years ago

        That looks awesome dude!

        Got to let us know in the forums how it performs!

          • Deanjo
          • 6 years ago

          It will do pretty well for 1080P gaming considering the GTX-580 had no issues on basically the same setup.

    • MadManOriginal
    • 6 years ago

    So I wonder if GTX 760 street prices will drop down a bit, or maybe the rumored GTX 760 Ti will come out. I'm not sure what configuration NV could have for such a card.

      • Airmantharp
      • 6 years ago

      Their options are pretty limitless (as are AMD's) when it comes to card configurations, but they'd do well (as they have in the past) to keep the product lines fairly simple and well delineated. People get turned off by information overload, and there's already plenty of that in the graphics card market!

    • jessterman21
    • 6 years ago

    Now that’s what I’m talking about!

    • kamikaziechameleon
    • 6 years ago

    price war has begun!

      • chuckula
      • 6 years ago

      [Yoda]Begun, the price war has.[/Yoda]

        • f0d
        • 6 years ago

        said, this already has been :P

          • chuckula
          • 6 years ago

          Oh yeah… my bad! Good call :)

            • f0d
            • 6 years ago

            all good – i just wanted to make another yoda reference mostly :)

    • dpaus
    • 6 years ago

    If you draw a ‘best fit’ line through the GTX 660, the GTX 760 and the GTX 770, you’ll get a price-performance slope that very closely matches the corresponding line through the R9 270X and the R9 280X. The GTX 780 doesn’t fall that far outside the Nvidia line, but the R9 290X is rather far off the AMD line. I suspect we might see a ‘price adjustment’ on the R9 290X after initial demand has been met.

    And the U.S.S. Titanic can carry on being 'the only ship in the quadrant' :)
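
    To make the 'best fit' idea concrete, here's a minimal sketch of how such a line could be computed with numpy's least-squares fit. The price/performance pairs below are placeholders made up for illustration, not TR's measured data:

        import numpy as np

        # Hypothetical (price, performance) points for the Nvidia lineup after the cuts;
        # placeholder values only, not TR's measured numbers.
        prices = np.array([180.0, 250.0, 329.0, 499.0])  # GTX 660, 760, 770, 780
        perf = np.array([60.0, 75.0, 88.0, 100.0])       # relative performance (780 = 100)

        slope, intercept = np.polyfit(prices, perf, 1)   # least-squares 'best fit' line
        print(f"~{slope:.3f} performance points per dollar")

    Running the same fit over the Radeon points would give a slope to compare against.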

      • Airmantharp
      • 6 years ago

      That’s a good way to put it, but instead of drawing a line, try a slope instead. Faster cards, like faster cars, are and should be more expensive per unit of performance in a settled market.

      And yeah, though I wouldn’t mind a price-cut on the Titan (and the effects thereof), I don’t expect it to budge much if at all :).

        • dpaus
        • 6 years ago

        Yeah, ‘premium’ products often command a non-linear price/performance differential, so the line should arc to the right at the upper end.

        I think the real challenge for Nvidia is going to be pricing the Titan after the introduction of the 780 Ti. The 780 Ti will have to have a substantial enough performance differential to set it above the R9 290X (if possible), but at a price point that’s at least vaguely along the arc you describe – which will leave the Titan waaaaay more expensive at substantially lower performance. Ooops.

          • Airmantharp
          • 6 years ago

          But it should be more expensive, if the GTX780Ti ships with the same DP compute settings as the 780, and they can always just release a ‘refreshed’ Titan to keep gaming performance at parity with the consumer cards.

          Lots of options here, of course.

    • jimbo75
    • 6 years ago
      • swaaye
      • 6 years ago

      Because the only game that matters is BF4 right? I’ve never even played the series. It seems like PC gaming is riding on that as the sole reason to even consider a high-end card purchase.

        • jimbo75
        • 6 years ago
          • Airmantharp
          • 6 years ago

          You’re going to quote Fudzilla over here too?

          Do you really think that makes your argument credible?

          There’s a reason TR only reported on availability; no one, aside from AMD, has the complete picture as to the actual supply and demand of this card, and no one is likely to get that information.

            • swaaye
            • 6 years ago

            He sounds like an AMD employee to me. We’ve had their marketing dept hanging out on here before. He’s almost always positive about their products and downplays the competition.

            • chuckula
            • 6 years ago

            Nah… I’d never insult AMD’s marketing department by implying that they’d hire jimbo75….

          • Klimax
          • 6 years ago

          ArmA III

            • Airmantharp
            • 6 years ago

            That’s a special kind of punishment ๐Ÿ™‚

            • Klimax
            • 6 years ago

            I know. I still remember Operation Flashpoint and Arma I. (unfortunately didn’t get to have Arma II nor III yet)

            ETA: Looking forward to Star Citizen. ;)

          • Sagia
          • 6 years ago

          well, perhaps because they didn't have enough stock

      • f0d
      • 6 years ago

      did you think the 7970 was also similarly overpriced when its price was slashed from the original $550?

      • Benetanegia
      • 6 years ago

      Cry me a river jimbo, you were blatantly wrong. And don’t even try to save face now by saying “wait AMD’s got something coming”, because that’s exactly what we have been telling you all along.

      No one EVER said the 780 was good value, so check your reading comprehension. We said it could be just as good to people* who care about things other than 2 extra fps, but our message was clear in saying that neither card was good value at the time and that people should wait a couple weeks (took 2 days…) until there was some movement. Those are two very different things (implying same value != good value). Now the 780 is simply better value for everybody, until, of course, AMD reacts, after which Nvidia will react, etc.

      *people who prefer quietness over 2 extra fps, OR people who use the card for long periods of time and who live in areas where electricity costs a lot, which is most of Europe, where after 2-3 years the 290X would end up costing them €50-100 more than the GTX 780 when the 780 cost €550. Now that it will cost what, €400? It's no contest.

        • superjawes
        • 6 years ago

        If I may, the power arguments have nothing to do with electricity costs.

        WARNING: MATH AHEAD.

        Okay, so the 290X uses 346 Watts at load, and the 780 uses 309. That's a difference of 37 W. For argument's sake, Chicago's electricity costs are approximately $0.12/kW-hr. That's [b<]KILO[/b<]Watt, or 1000 Watts. So if both cards are at load, the cost difference after one hour is:

        $0.12[/kW-hr] * 0.037[kW] * 1[hr] = $0.00444

        That's... not a lot. If we assume the cards are always at load, the difference after one day is $0.10656. After a week, $0.74592. After a year, $38.8944.

        [s<]Three hundred and sixty-five days, and you're under 5 bucks. Now some places in Europe might have higher prices, yes, but I highly doubt energy costs would reach €50 over three years, let alone over the card's lifetime. Which, I might add, would be greatly shortened if you're running at load 24/7, which we know isn't the case.[/s<]

        [i<]But remember the primary assumption is that you're running this card at load, 24/7/365, which is not the case even if you are a power user, and is a scenario where product longevity is more important than energy costs.[/i<]

        /MATH

        The power arguments aren't about draw, but dissipation. They are related, but the concern is basically how much heat is being pumped into your system from your card or other components. Lower draw means less heat is being pumped out regardless of cooler. Even if the heat ends up being the same, the card drawing lower power will likely be quieter because it does not need to cool as actively as the hotter card.

        THAT is why we look at power metrics. Heat, not dollars(/euros).

        EDIT: fixed some faulty math…
        EDIT: fixed some faultier math…
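
        To let anyone re-run the numbers with their own wattages and rates, here's a minimal Python sketch of the same arithmetic (the function name is mine, and the figures are just the ones from this comment):

            def energy_cost_delta(watts_a, watts_b, dollars_per_kwh, hours):
                """Cost difference of running card A vs. card B at load for `hours` hours."""
                delta_kw = abs(watts_a - watts_b) / 1000.0  # 37 W -> 0.037 kW
                return delta_kw * dollars_per_kwh * hours

            # 290X (346 W) vs. GTX 780 (309 W) at ~$0.12/kWh:
            for label, hours in (("hour", 1), ("day", 24), ("week", 24 * 7), ("year", 24 * 365)):
                print(label, round(energy_cost_delta(346, 309, 0.12, hours), 5))
            # -> hour 0.00444, day 0.10656, week 0.74592, year 38.8944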

          • Benetanegia
          • 6 years ago

          [quote<]$0.12[/kW-hr]*0.0037[kW]*1[hr] = $0.000444[/quote<]

          [quote<]EDIT: fixed some faulty math...[/quote<]

          You might have fixed some faulty math, but not all. 37 watts is 0.037 kW, not 0.0037 kW (3.7 W), so your math is off by a factor of ten. Your ~$3.89 per year suddenly becomes $38.9 per year in a relatively average/lowish-price area, and in the EU prices are often $0.30 or more (VAT included): [url<]http://www.energy.eu/[/url<]

          So does that put things into perspective or not? In my particular case, because I use my PC for work (CAD, game design) and almost all my home entertainment, the card is out of idle state most of the day (sadly AMD cards also consume a lot in multi-monitor or video playback, for reasons unknown to me: [url<]http://www.techpowerup.com/reviews/AMD/R9_290X/25.html[/url<]). I did my math and, all things considered, it amounts to about $64 per year, and maybe double that during the hot summer months, because my AC has to dissipate that same amount of heat.

            • superjawes
            • 6 years ago

            Ah, quite right that I messed up the Watt-to-Kilowatt conversion.

            However, you kind of overlooked this bit:
            [quote<]Which, I might add, would be greatly shortened if you're running at load 24/7, which we know isn't the case.[/quote<] Even if you are using your computer most of the day, it's very hard to keep a GPU at full load for 24 hours a day. Eating and sleeping alone should (hopefully) cut that down by a third, slashing energy costs considerably, but that's assuming that you're still at "load" for the remaining two-thirds of the day. The most important bit, however, is still not loads or energy costs. We concern ourselves with power draw because it generates heat, so even if you do have a 24/7 load, you shouldn't be looking at power draws to save money, but because a card with a higher draw will generate more heat and probably have a shorter life. Energy costs don't mean a thing if you have to replace your card a year or two sooner than the alternative, and if this is a work-related machine, that downtime costs you more than the energy anyway. Sure, there are some energy savings to be had, but I just want everyone to be clear that product longevity and system heat are the bigger concerns.

            • Benetanegia
            • 6 years ago

            I never made the case that these figures apply to everyone, but they surely apply to many, which is why I said that to some/many people the GTX 780 could indeed make more sense, and not only from a comfort/quietness point of view, but also an economic one.

            And look at the TPU link I posted: in multi-monitor and video playback the 290X consumes much more too; it's not only at 100% load.

            • superjawes
            • 6 years ago

            And I’m just trying to make the case that the economic perspective is and should be dominated by other concerns, that’s all.

            It’s really not you that prompted my math-infused rant. I’ve just seen people make the $/kW-hr case on hardware with smaller draw differences, and I think people should be thinking about power in terms of how long it will last, not how much it costs to power it.

            • Benetanegia
            • 6 years ago

            I understand, and I engaged in this conversation because of the exact opposite. What I see is that more often than not people completely disregard the impact of power draw differences on the lifetime cost of products, while at the same time fighting like rabid monkeys over a $50 difference in purchase price. To me, all those perf/$ charts that are so common and popular on the internet are not completely meaningful, and are often misleading, because of this; like I said, people can argue for hours on end about a $15 price difference on $150-200 cards, insults flying too, without even considering it.

            • superjawes
            • 6 years ago

            I’ve seen that, too. In fact, the 290X review is swimming in it.

            Anywho, you and I are basically on the same page. Good talk :)

            • Great_Big_Abyss
            • 6 years ago

            Honestly, an extra $3 per month on my power bill when Hydro already costs me $200/month doesn't really make me flinch too much.

            I don’t look at power consumption when making a decision about what video card to buy (unless I’m limited by my powersupply, of course).

            • Benetanegia
            • 6 years ago

            And I’m ok with that. But then, if one would be one of those millions of people who keep their cards for 3 years or more (grand mayority of people actually), meaning 3x12x3 = $108 higher cost and one were at the same time, one of those who fight and fight and fight over how the card that consumes that much more is $100 cheaper and anyone who gets the other one are idiots, I would quite certainly call that person an idiot instead.

            • Great_Big_Abyss
            • 6 years ago

            I notice an extra $100 all in one shot. An extra $3 a month is a lot less noticeable. Same amount of money over three years, yes. But perception matters a lot.

            • Benetanegia
            • 6 years ago

            I can’t agree with that. If you can’t save up an extra $100 in one or two months, for a seemingly important purchase at over $500, then you shouldn’t spend that much to begin with. But I get it I suppose, some people can’t keep money in their pockets. I’ll never understand that. It’s so simple…

            1- Acknowledge it’ would cost you $3 more per month.
            2- Save $3 every month after purchase, instead of actually paying it on electricity bills. It’s money you wouldn’t have available anyway.
            3- When purchase time comes, take that money, add it to whichever you would have paid.

            It’s all the same. Just like it doesn’t hurt you to spend $3 extra a month, it’ should hurt you even less, much less, to put those same $3 aside for a future purchase. Any other rationalization makes no sense to me.

      • l33t-g4m3r
      • 6 years ago

      Mantle? Meet fp16.

      • travbrad
      • 6 years ago

      [quote<]Lol the 780 is having it's price slashed despite all the Nvidiots telling us that it was just as good value as the 290X.[/quote<] Were those people related to Mr. Strawman by chance?

    • dpaus
    • 6 years ago

    In the context of this article, I’d find ‘Before Price Cut’ and ‘After Price Cut’ buttons below the graph to be more meaningful.

      • superjawes
      • 6 years ago

      That would be nice, but for reference to everyone else: [url=https://techreport.com/review/25509/amd-radeon-r9-290x-graphics-card-reviewed/13<]Here's the "before" from the 290X review[/url<]

      • Cyril
      • 6 years ago

      Your wish is my command.

        • superjawes
        • 6 years ago

        Well now my reply just looks silly… thanks, Cyril :P

        • dpaus
        • 6 years ago

        Cool, thanks!

        So, um, if I ‘wish’ for some of that surplus loot you have cluttering up your basement to be shipped to my home address, does that become a ‘command’ too? ๐Ÿ™‚

          • Cyril
          • 6 years ago

          Sorry, I only fulfill graph-related wishes. ;)

            • dpaus
            • 6 years ago

            I’m tempted to ask “what kind of half-assed (wish) fairy are you?!?” but I don’t want to risk the BanHammer ๐Ÿ™‚

        • Airmantharp
        • 6 years ago

        I’ll add another request, though I know it won’t be as simple:

        Given the disparity between coolers, and that GPU performance is getting to the point where it’s being driven and limited by cooling, with the tradeoff being noise, can we get a graph comparing performance/decibel?

        When we start comparing aftermarket coolers for these cards, I think it would be a critical tool to evaluate the value that various cooling solutions represent.

        As an aside, general everyday acoustic references might be helpful too, as would a leaderboard showing the quietest and loudest cards tested over time :).

        Thanks again!

          • Benetanegia
          • 6 years ago

          Maybe performance/Sone would be a better metric, because people clearly don’t understand the decibel scale.

            • Airmantharp
            • 6 years ago

            Well, a linear or relative scale would be better given that logarithms are beyond most people's reach, but I didn't get the sense that sones would be any more relatable from a quick skim of the Wikipedia article. They're all pretty complicated, which is to be expected, and at least decibels are used by other sites too- so it would make sense to report those numbers for reference, even if a more presentable scale were used for comparisons.

            And thank you for educating me on another way to approach the problem. The more you know, the more you know that you don’t know :D.
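
            As a reference point for the scale question: the standard mapping (Stevens' loudness law) has the phon level equal to SPL in dB at 1 kHz, with loudness in sones doubling for every 10 phon above 40. A quick sketch, with the function name and example levels being mine purely for illustration:

                def phon_to_sone(phon):
                    """Stevens' loudness law; a reasonable approximation above ~40 phon."""
                    return 2 ** ((phon - 40) / 10)

                for level in (40, 50, 60, 70):
                    print(level, "phon ->", phon_to_sone(level), "sone")
                # 40 -> 1, 50 -> 2, 60 -> 4, 70 -> 8: roughly, +10 dB reads as 'twice as loud'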

        • SCR250
        • 6 years ago

        I wish for a winning lotto ticket

    • cynan
    • 6 years ago

    The GTX 780 for $499 is great. But man, NVIDIA sure has some large cojones selling a slightly faster Ti version for $200 more.

    I guess this means the Titan is EOL? Or is the GTX 780 Ti just the Titan rebranded? If so, I guess the $700 price makes a bit more sense, especially if they keep the 6GB. (Yes, yes, it's a great card for the 10 people who are looking for cheaper versions of the Quadros for specific small-scale compute tasks. But that's just not how it's marketed.)

      • Airmantharp
      • 6 years ago

      See, you can’t compare a consumer card like the 780Ti to a prosumer card like the Titan- expect the Titan to still be quite significantly faster for compute workloads. It really is in a class all it’s own, and for those that can make use of it’s feature-set, it’s still quite the bargain :).

        • cynan
        • 6 years ago

        Do we know the 780 Ti is not just a Titan?

          • Airmantharp
          • 6 years ago

          Nope- but it wouldn’t make sense for them to ship it with full-speed compute, given the niche they carved out for Titan, and how successful that was.

          But if it is, Nvidia just needs to weld a little more RAM on it, and they’ve sold me two cards, maybe three- that is, when I can upgrade to LGA2011 to properly feed them :).

            • cynan
            • 6 years ago

            It’s all well and good to buy a Titan for its compute prowess. However, rebadged Titan, or tweaked GTX 780, either way, the Titan is pretty much done.

            The gaming performance between the current GTX 780 and Titan is so close that if the 780 Ti is significantly faster, it will certainly match and probably outperform the Titan in most games. As such, it's going to be a tough pill to swallow for much of the demographic who doled out the big bucks on the Titan (i.e., those who really just bought it for bragging rights and for whom the compute performance is a convenient excuse) to spend $300 more on a Titan over a faster (in most games) GTX 780 Ti.

            You're going to be disappointed if you think the Ti will have more than 6GB of RAM.

            • Deanjo
            • 6 years ago

            [quote<]It's all well and good to buy a Titan for its compute prowess.[/quote<] That is the main feature of the Titan and they have sold a ton of them because of it.

            • Airmantharp
            • 6 years ago

            I can always hope for more RAM!

            And yeah, the Titan is largely done for the gaming market- but that’s okay. Up until the release of the R9 290X, there was no other way to get what the Titan delivered, and plenty of people were willing to pay for it.

            But just for reference, this isn’t just about ‘bragging rights,’ it’s about getting the performance you need to do whatever it was you were trying to do.

            If you were trying to run 3×30″/27″ screens, for example, Titans weren’t just the best way to do it, they were the first way that actually worked. Quite the niche example, I know, but I’ve seen it done, and it applies on down the line too.

            I’ll use myself as a second example. I’m running just one 30″ panel, and a pair of GTX670 2GB cards is barely enough for games like BF3 on ‘high’ settings. Less intense games can run maxed out, except for any meaningful level of AA- that’s out. In my case, a pair of Titans would be perfect, as again, I do photo and video stuff more than I game.

            Of course, after grabbing the cards I have, I wasn't about to drop another two grand on video cards when what I have definitely still works, but you can see that there's definitely a draw there.

            • Pwnstar
            • 6 years ago

            Titan isn’t meant for the gaming market, the 780 is.

            [quote<]Titan is largely done in the gaming market[/quote<]

            • Airmantharp
            • 6 years ago

            We’re on the same page there :).

            • cynan
            • 6 years ago

            [quote<]Titan isn't meant for the gaming market, the 780 is.[/quote<]

            Hmmm. I guess [url=http://www.nvidia.com/titan-graphics-card<]Nvidia didn't get the memo[/url<]. That might be the rational approach, but it's sure not how they're marketing it. Also, I'd wager that the majority of people who bought a Titan, while many surely do, don't use it foremost for compute tasks…

            The only text on the main product page is:

            [quote<]SUPERCOMPUTER TECHNOLOGY. REVOLUTIONARY GAMING. The technology that powers the world's fastest supercomputer is [b<][i<]now redefining the PC gaming experience[/i<][/b<]. Introducing GeForce® GTX TITAN. Bring the powerful NVIDIA® Kepler™ architecture technology that drives the Oak Ridge National Laboratory's Titan supercomputer [b<][i<]to your next gaming experience[/i<][/b<].[/quote<]

            • Airmantharp
            • 6 years ago

            Well, AMD calls the 40% fan position on the R9 290X ‘Quiet Mode’, but Scott specifically states that it is far from quiet in the review…

            Marketing is fun :).

            • WaltC
            • 6 years ago

            Yea, the Titan has been extremely successful…I’ve read the comments of at least [i<]four people[/i<] who bought it!

            • Airmantharp
            • 6 years ago

            Your sampling for ‘success’ is from forum comments?

            Granted, I appreciate the perspective- but again, the Titan was far from purely just a top-end gaming card. They probably sold more for purposes other than gaming than they did for jet-setting enthusiasts.

            • Klimax
            • 6 years ago

            Even if sales were anywhere near that low, it would still be a net gain for Nvidia, because those chips couldn't be used in Tesla/Quadro cards…

        • slowriot
        • 6 years ago

        The niche of people who need the Titan’s compute abilities but don’t need Quadro/Tesla levels of support seems like it would be very small to me. I’d guess Nvidia would like to push those people towards the appropriate professional model as well. That in combination with the Titan no longer being a gaming market halo product makes me think Nvidia won’t keep Titan on the market for much longer.

          • Airmantharp
          • 6 years ago

          Don’t just think engineering/big iron prototyping, but also modeling at every level, presentation, and content-creation as well.

          It’s a pretty big market; even I considered Titans for the photo and video work I’m getting into. For the performance they bring for even consumer-level compute applications, they’re a steal.

            • slowriot
            • 6 years ago

            [quote<]It's a pretty big market; even I considered Titans for the photo and video work I'm getting into.[/quote<]

            Here's my opinion, take it or leave it. I think that statement is crazy, but it is very much in line with your typical viewpoint. I think you have a habit of significantly overstating your needs, and worse, of using those "needs" to invent a non-existent market of people who have similar thoughts. It's the same reason why half of the posts on the 290X review are you bringing up the same point ("half" is a joke, but seriously, do you think it's reasonable to make dozens and dozens of posts reiterating the same point? I don't, and I honestly feel you killed many good discussions in that article's comment section because of your "need" to bring it up over and over).

            This is just another example, in my opinion. You've taken your own crazy computing desires and applied them to far more people than is reasonable. Titan is a niche product, and it's becoming even more niche as it is no longer a reasonable purchase for gaming. That leaves a small market of professionals who need Titan levels of performance but don't need/want professional levels of support… that's just not large, and from Nvidia's perspective not really a market they'd want to continue feeding.

            • Airmantharp
            • 6 years ago

            I’ll take your opinion as-is, and I think that it’s perfectly valid.

            My ‘crazy computing desires’ are a bit niche, but remember that computer gaming is itself a pretty small niche, given the price delta over console gaming. So I’m in a niche inside a niche, inside a niche of people who play games at all.

            Now, again, I said that I considered the Titan, because it *is* a reasonably priced product for what it brings to the table, and I still consider it far too expensive for my needs. I didn't buy one, I didn't add it to a wishlist, I didn't even think about trying to budget for it- and I'd need two. That's totally out of the question.

            And unless I specifically state that my posts are about me, don't presume so- I'm interested in objective discussions, not subjective crap-slinging, and I will stand by what I post.

            Last, I’ll leave you with one last observation- even on an overclocked system with plenty of RAM, using professional software, re-rendering a single video shot from my DSLR took all night, and the results were less than impressive, meaning that I’d have to make adjustments and do another render. Given that, I find that I could use all of the performance I can get, and if deadlines were involved, with real commissions, Titans don’t look so expensive after all.

          • Deanjo
          • 6 years ago

            [quote<]The niche of people who need the Titan's compute abilities but don't need Quadro/Tesla levels of support seems like it would be very small to me.[/quote<] The Titan has always been a niche product.

            • slowriot
            • 6 years ago

            That’s my point Deanjo, the already very small market is diminishing even further. There’s less reason for Nvidia to keep Titan around. It eats sales from their professional products and is no longer a true halo product. What’s the motivation for Nvidia to keep it around much longer? Seems to be very little.

            • Airmantharp
            • 6 years ago

            You know, they could always refresh the Titan too, where it provides the same gaming performance as the upcoming 780Ti but retains the compute ability.

            • Deanjo
            • 6 years ago

            [quote<]There's less reason for Nvidia to keep Titan around.[/quote<]

            Not with the 780ti's compute performance being diminished. Titan's role is still filling that niche, and the 780ti hasn't changed any of that.

            • slowriot
            • 6 years ago

            Why wouldn’t Nvidia want to push those people towards proper Tesla and Quadro cards? Why would Nvidia want to continue providing a card for the professional market but without professional support and more importantly to Nvidia without the professional market profit margin?

            • Airmantharp
            • 6 years ago

            Why don’t you ask Nvidia? Why did they make the Titan in the first place, instead of just making the 780?

            You have to assume that they believe that there’s a market for that product, and that they’re making sales that they wouldn’t otherwise get. Remember that the Titan is still ‘prosumer’, not professional, and that as you say it lacks the support of the Quadro and Tesla lines, but it also lacks the drivers and the ECC memory and other data integrity features that are paramount to any professional visualization or compute card.

            • slowriot
            • 6 years ago

            [quote<]Why don't you ask Nvidia? Why did they make the Titan in the first place, instead of just making the 780?[/quote<]

            Because Nvidia wouldn't answer. Why did they release Titan? Several reasons. I figure they wanted a halo product they could sell to extend the shelf life of the GeForce 600 series. It's just a rebranding of an existing card they were already making, therefore straightforward to produce. It has a $1000 price tag, so it's very low volume. I believe they weighed those points against Titan potentially stealing sales from their professional series; you can live with that because you only need this product to exist for a short period, and then you kill it off.

            "There's a market for this product" means a lot of things, which is part of my argument. I believe there are two distinct markets left for Titan (now that gaming is gone). Group A) The "prosumer" or "lower half of semi-professional" market: those who can justify buying a Titan but for whom an equivalent Nvidia professional card is too expensive. Group B) The "upper half of semi-professional" or "professional" market: those who could purchase Nvidia's professional cards, or have in the past, but whose hardware needs are met by Titan- and it's only $1K, so why in the world would they spend more?

            I don't think group A generates enough profit for Nvidia to continue to feed group B. Hence, time to roll out the GTX 780 Ti and go back to making group B pay more.

            • Airmantharp
            • 6 years ago

            They could do that, and there’s no way to know how Nvidia views the ‘Titan’ experiment. I’d prefer that they keep it, since it still represents something unique, but as you say that market has shrunk now.

            • Deanjo
            • 6 years ago

            Spending $1000 on a card appeals to a far larger market than spending $2-3k+ for the equivalent Quadro/Tesla.

            • superjawes
            • 6 years ago

            Isn’t there a market for independent content developers? Specifically, I believe the Let’s Play community on Youtube is the largest, which would benefit from cards that can both play games [i<]and[/i<] do some computing, since these creators have to do their own work to put content online. I could be way off on that, but I could see it making a difference.

            • slowriot
            • 6 years ago

            Is there a market of independent content developers and creators? Yes, of course! But the subset of that market who makes enough money from it to justify buying a Titan or makes enough money independently of the content creation to justify Titan? THAT market seems very small to me, do you disagree?

            Is that market large enough for Nvidia to continue keeping Titan on the market? Do you think that market is worth keeping versus the market of people/companies who could afford Tesla cards but want to go cheap and instead buy Titans? Wouldn’t Nvidia want to eliminate the latter group?

            • Airmantharp
            • 6 years ago

            Again, Nvidia created the prosumer-card niche with Titan- why would they want to eliminate it?

            And do you assume that no one makes enough money to purchase a $1000 video card? My other hobby is photography. It makes Titans look cheap, and the lens I’d most like to acquire right now is as much as two Titans- ON SALE. And that lens, like the Titan, is worth every penny.

            • superjawes
            • 6 years ago

            [quote<]Is there a market of independent content developers and creators? Yes, of course! But the subset of that market who makes enough money from it to justify buying a Titan or makes enough money independently of the content creation to justify Titan? THAT market seems very small to me, do you disagree?[/quote<]

            Yes, I do. It's a business investment for anyone who makes money on said content. Just skimming my Youtube subscriptions turned up 9 individuals who might use such a card. That's excluding the networks/studios I am subscribed to, who might also use a Titan (but I count them differently because it would be more than one Titan). Include everyone in a network and that's a sizeable population just from Youtube. It doesn't have to be a "subset that makes enough money" to count. Better equipment = better content.

            [quote<]Is that market large enough for Nvidia to continue keeping Titan on the market? Do you think that market is worth keeping versus the market of people/companies who could afford Tesla cards but want to go cheap and instead buy Titans? Wouldn't Nvidia want to eliminate the latter group?[/quote<]

            Nvidia should never eliminate a group as long as they are making money. Teslas might be "better" suited for certain tasks, but they might also be outside the feasible price range. And you seem to have glossed over one of my points, specifically that Titan is still a decent gaming card. Being able to both game and render is a powerful thing, especially for people who would only be able to purchase one computer for a task. A company, as you mention, might be better off using 780s in SLI for gameplay footage, then exporting data to a workstation for editing and rendering. This is not typically an option for indies, since it would at least double the hardware costs.

            • jihadjoe
            • 6 years ago

            A high enough profit margin is a pretty good reason to keep a product around despite low volume sales. Titan makes Nvidia at least $500 in pure profit for each unit sold, considering they can afford to sell the 780 at half the price.

            I don’t think it’s too small a niche either. If someone is doing GPU compute on a non-workstation platform (pretty much anyone not using a Xeon), then they probably don’t need Quadro either. Not much use having ECC on the GPU when the results get sent out to a main memory that doesn’t use ECC either. In this use case Titan is the perfect compute card because one can get near K6000 performance for less than 1/4 the price.

          • Klimax
          • 6 years ago

          Not everybody can afford that.

      • SCR250
      • 6 years ago

      [quote<]The GTX 780 for $499 is great. But man, NVIDIA sure has some large cajones selling a slightly faster Ti version for $200 more.[/quote<] Need to wait to see what the actual reviews on the 7th show first.

    • Benetanegia
    • 6 years ago

    It took 2 days, jimbo75 and others, 2 days to make the GPU landscape change dramatically.

    But let’s not wait… ever.

      • chuckula
      • 6 years ago

      Given the… "availability" of the R9 parts, it seems that lots of people never got the chance to buy one before the price cuts were announced anyway.

      That just shows you how much AMD cares. They didn’t want a lot of people to order their cards at inflated prices, so they intentionally constrained the capacity to ensure that very few people would buy them at full price. The only valid criticism of anything that AMD has ever done is that they… JUST CARE TOO MUCH!

        • clone
        • 6 years ago

        are you saying Nvidia lowered the prices on their premium product because they wanted to or because AMD forced them to?

        simple question.

          • Airmantharp
          • 6 years ago

          I’d say that it was part of the same plan that they’ve both followed for over five years. Hasn’t changed.

          Why does that surprise you? Do you really need AMD to be your knight in shining armor that bad?

          • f0d
          • 6 years ago

          are you saying amd lowered the prices on their premium product because they wanted to or because nvidia forced them to?

          simple question.

          heres a reminder of what happened when nvidia released a faster card – its almost exactly the same, except replace nvidia's price cuts and the introduction of the 780ti with amd's price cuts and the introduction of the 7970ghz edition

          [url<]http://www.anandtech.com/show/6093/amd-radeon-hd-7970-7950-7870-price-cuts-inbound-7970ge-available-next-week[/url<] the only difference is nvidia was more nimble and responded faster

          • MFergus
          • 6 years ago

          Do you think AMD lowered the prices of the 7970 because they wanted to or had to? How can people not see that both companies play the exact same price game?

          • chuckula
          • 6 years ago

          [quote<]are you saying Nvidia lowered the prices on their premium product because they wanted to or because AMD forced them to?[/quote<]

          1. Yes.

          2. I said that Nvidia would cut prices before the street price of the R9 was even known, and people exactly like you said I was an idiot (for being right).

          3. AMD cuts prices in response to competition. Nvidia cuts prices in response to competition. AMD is a multi-billion dollar company that is out to make money. Nvidia is a multi-billion dollar company that is out to make money. The sun comes up in the morning.

          None of these things should be shocking, but in your twisted comic-book world AMD is like the X-Men or the Justice League, magical superheroes, and Nvidia is like the Legion of Doom or something. Sorry clone, neither company is all good or all bad. If you bothered to look at other posts I've made, you'd see that there are plenty of Nvidia fanboys (OU812 in particular) who think I'm an AMD cheerleader, so spare me the disingenuous "outrage" at my comments.

            • clone
            • 6 years ago

            a few things come to mind.

            1st, Nvidia cut prices not after the R9 was known but after the R9 was released, tested, and all elements were known. So no, Nvidia did not cut prices before the R9….. they waited, which would make you wrong; the R9 has been selling for a few days now.

            2nd, AMD didn't cut prices in response to competition, they undercut the competition: AMD priced the R9 $100 under the 780 before the 780 Ti was announced…. huge difference. at least on platitudes you've pretty much got it right, AMD and Nvidia are out to make money and yes, the sun does come up in the morning.

            nothing is shocking, Chuckula. Nvidia has done what they always do: they pushed for higher prices. the 780 Ti's MSRP is going to be $699, an upward trend which is typical of Nvidia; AMD undercutting Nvidia is equally common…… typical, not shocking.

            you haven't accomplished anything, Chuckula. you've made a lot of mistakes as usual and overblown what you don't understand…. which is typical as well….. no X-Men, no Justice League, no Legion of Doom, and most notably nothing shocking.

            disappointing, sure. it's disappointing that Nvidia as usual is pushing prices higher, but that's not unexpected.

            • superjawes
            • 6 years ago

            Let’s be clear, though. If AMD could sell the 290X for $700, they would. Why wouldn’t they?

            Sell at the highest price you think you can get away with to maximize profits, cut prices if the situation changes.

            • clone
            • 6 years ago

            what really scares me isn’t that AMD could have asked more but that AMD doesn’t need to change any of their current pricing.

            doesn’t it feel scripted? within 2 days of AMD’s setting their position Nvidia repositions their entire lineup including what at least appears to be on the surface a new high end card and this repositioning will not require AMD to do anything in response, 780 will compete directly with R290X and 780 Ti will raise the price of the high end by $150, this won’t make AMD look bad at all because it’s not competing with it, they priced themselves into a new bracket instead.

            where is the battle?…. where is the competition?

            it seems both companies knew what the other was bringing to the table and knew well ahead of time how they would position their products…. it just feels very scripted and a pox on both houses if it’s true.

            • Airmantharp
            • 6 years ago

            But both companies DID know, as did we! The questions weren't about potential performance; rather, they were how far AMD and Nvidia were willing to push their GPUs (AMD pushed HI pretty hard), and what price points they were going to try to hit. We know what the pricing is for the 780Ti; now we're waiting to find out how hard Nvidia is going to push it, and we're still waiting for concrete info on the R9 290 non-X.

            • clone
            • 6 years ago

            did we?…. did you know well ahead of time the 780 Ti was going to retail for $699?…. I just found out. I knew a Ti was coming, true, but the performance and pricing…. nope.

            AMD priced the R290X at $550. are you saying AMD knew well ahead of time that their best would be inferior to the Ti no matter how hard they pushed it?….. that would have required notable insight well ahead of release, if true.

            • Airmantharp
            • 6 years ago

            Did we know the exact specifics? No.

            Can any reasonable person apply basic trend analysis to see how this was all going to end up? Sure.

            Are you going to keep pretending that every release from these tightly-competing companies is an absolute revelation? Can you see further than your nose? Do you need glasses?

            • clone
            • 6 years ago

            sigh… Airmantharp will you please stop crying, enough with the tears.

            no revelations, just curiosity, don’t speak for me, you are terrible at it, grow up, put on the big boy pants and try acting with a modicum of respect that I previously had granted you.

            stop crying, it’s not constructive, it’s just childish.

            if Nvidia wanted to really compete and grab that 50% marketshare that AMD has, fighting on price just a little would certainly serve them well and likely offset losses.

            I suspect we are seeing collusion…. could be wrong but even if I am you really shouldn’t cry about it.

            • Airmantharp
            • 6 years ago

            You are absolutely seeing collusion. Just like Intel can't exist without AMD (the Feds would chop them up like Ma Bell), Nvidia can't exist without AMD and vice-versa. Hell, they're even taking turns letting each other put stratospherically priced products on the market for periods of time, intentional or not!

            If I were to ask anything of you, it’d be to look at the bigger picture, to see the past and apply those trends to the present market situation and the immediate future.

            Try to put things in context, instead of trying to declare absolutes. There are no ‘bests’, ‘worsts’, just tools more suited or less suited to particular jobs.

            • clone
            • 6 years ago

            so you are claiming Nvidia needs AMD to have 50% of the market or else they’ll get into legal troubles?

            ROFLMAO.

            p.s. Intel disagree’s with you.

            • Airmantharp
            • 6 years ago

            Nvidia has to have competition; Intel has been having to behave themselves now that AMD seems to have all but given up on the CPU business (and I do wish that wasn’t so). It’s the same in every industry. Monopolies get busted.

            And yeah, Nvidia could muscle AMD quite easily- they’d pay for it in the earnings, but it wouldn’t be too hard. AMD’s been on thin ice for the better part of a decade now.

            Oh, and between Intel and AMD? Intel just paid AMD a cool billion USD to make them shut up. AMD's still in a tight financial situation.

            • clone
            • 6 years ago

            1st: Nvidia does not need AMD at all, any more than consoles needed to have Nvidia & AMD graphics in them.

            2ndly, if Nvidia needed AMD, the money move would be to push AMD into the sub-20% range and maintain the illusion of competition…. which is what Intel has done.

            Nvidia trying to steal 30% of AMD's market….. a very attractive option, and a reason for shareholder rewards. Nvidia preserving 30% of the market for AMD's benefit is just a horribly bad idea on every level that deserves nothing but scorn.

            trying to pretend that shareholders wouldn't get excited about Nvidia becoming the dominant player in desktop consumer graphics and grabbing the lion's share of the coin in the process….. so flawed as to defy any and all logic.

            • Airmantharp
            • 6 years ago

            So now you’re a psychic? You know exactly how much your evil nemesis wants to take from your beloved AMD?

            And you are certain that we don’t actually enforce our antitrust laws here in the US, they’re just there for decoration?

            • Klimax
            • 6 years ago

            And highly active EU regulators.

            • clone
            • 6 years ago

            silly is what silly says I guess.

            • Diplomacy42
            • 6 years ago

            50% of the market, ROFLMAO.

            AMD has like 20% of the market.

            • clone
            • 6 years ago

            processors, they have less than 20% last I looked; gfx is closer to 50%.

            • superjawes
            • 6 years ago

            Graphics is only 50% if you exclude Intel’s component, which is surprisingly strong.

            Otherwise I think it is pretty evenly split between Nvidia and AMD, with Nvidia typically having a slight lead.

            • Klimax
            • 6 years ago

            Surprisingly? When you don’t need games or anything else too taxing, IGPs are adequate. And I think those Crystal Well parts are promising. (Just too expensive, but I still want one for tests.)

            • superjawes
            • 6 years ago

            Well, surprising considering where most of the competition is. When this type of crowd talks about GPUs, we almost exclusively mean discrete cards, which are entirely AMD or Nvidia.

            • clone
            • 6 years ago

            I was talking about discrete as it relates to the battle between AMD and Nvidia. Nvidia was pushed out of the IGP market by Intel, so that segment isn’t really up for grabs by Nvidia, whereas AMD’s discrete market share is.

            • Klimax
            • 6 years ago

            Ok. But frankly, it always pays to mention which market one means… (as seen already, confusion comes easily)

            • clone
            • 6 years ago

            this line of discussion was trivial.

            if the mentioned stat isn’t immediately obvious, if it completely clashes with your view, then perhaps it’s time to consider why, especially before fingers hit keys.

            • Klimax
            • 6 years ago

            Usually a clash with my POV is because the other person is quite wrong. Or their opinion and experience are sufficiently narrow to miss too many things.

            Oh, and if I were you, I’d consider that your own advice applies quite well to you…

          • HisDivineOrder
          • 6 years ago

          I think he was being sarcastic, so I’m not sure he was saying much of anything beyond, “AMD is a corporation. They do nothing out of the kindness of a heart they as a corporation do not have.”

            • clone
            • 6 years ago

            I know, it’s a rhetorical question.

        • Vhalidictes
        • 6 years ago

        The lack of R series cards makes perfect sense, chuckula (nice name!).

        AMD is simply waiting for the 7900 and 7800 series cards to sell out. Which they haven’t yet.

        On that note, I find it a shocking failing of TR to not have the 7950/7970 (if not the 7850/7870) on that scatter plot. They do use the same shaders and general design as the R series.

        Then again, those cards would skew it pretty badly.

      • clone
      • 6 years ago

      Benetanegia, AMD radically changed the landscape; Nvidia, in response, tweaked it…… lowering the 780 by $150 is not radical, it’s a reflection of the new reality set by AMD’s R290X.

      worse still, the new price isn’t compelling, just more interesting, which will likely be enough for the faithful, and that might be enough for Nvidia.

        • Airmantharp
        • 6 years ago

        You say this like you expected something else to happen. Would Nvidia have met your expectations if they’d kept the prices as they were?

          • clone
          • 6 years ago

          I didn’t expect anything less from Nvidia.

          Nvidia is doing what I expected them to do which is disappointing.

          they’ve tweaked their pricing in response to AMD setting the bar and are now trying to raise it with 780 Ti.

          if 780 Ti is faster than R290X (which is likely the case) and Nvidia had chosen to price it at the same level, if not $50 lower, they would have surprised me…. but they aren’t. Nvidia is doing exactly what they always do, so no surprise, no shock, yes they did meet my expectations.

            • Airmantharp
            • 6 years ago

            Good! Nvidia knows that they have the better product, and are pricing it accordingly. Glad you recognize that should be the case!

            • clone
            • 6 years ago

            should?…. I don’t know, your excitement at the prospect of paying more for a product speaks poorly, though.

            that you believe it’s only right and proper for Nvidia to be the company that demands more kinda…. sort of implies you are a fanboy?… maybe a little?

            I’d love it if Nvidia asked the same or less than R290X….. giving Nvidia more is less fun than me keeping it in my pocket.

            it’s selfish I know but it’s definitely not showing bias towards one company or the other.

            • Airmantharp
            • 6 years ago

            In your world view, is it even remotely possible for anything to be better than your beloved AMD?

            In case you didn’t see my other posts, I could give a rat’s about AMD vs. Nvidia. To me, they both make stellar products, and I buy the best one for my money. It’s been more Nvidia than AMD, but the disparity can almost completely be attributed to AMD/ATi not having a competing product, going back over a decade, of course. Remember that history discussion you glossed over?

            But I’ll give you an example. I bought a GTX670 release week. They were even widely available! I bought it to replace my two HD6950 2GB cards, even though it was technically slower, because it was, at the time, almost as fast as an HD7970 while being far, far quieter. My HD6950’s weren’t loud, but they were as loud as I was willing to go. You know what’s worse? The GTX670 put out lower framerates in my most demanding scenario- a fully-loaded BF3 multiplayer match with tons of action and destruction going on- but it felt at least twice as smooth, and twice as responsive! Go figure, TR later put the screws on AMD and revealed that AMD hadn’t done a damn thing to make sure that Crossfire was an actually effective implementation of a teamed GPU setup. They weren’t even trying, they thought everything was a-okay!

            Then I bought a second GTX670, and wow, what a night-and-day difference between SLI and Crossfire. Not only was my system noticeably quieter than with AMD’s cards, the quietest blowers they’ve ever made for a top-end product, but it was just as smooth as running a single Nvidia card!

            And you know what’s really frustrating? Had AMD addressed their frame-pacing issue way back when, I’d never have considered spending money with Nvidia. I’d still be running AMD cards; they were certainly fast enough. But that’s not what happened.

            AMD STILL HASN’T FIXED THEIR DRIVERS. They still haven’t released a driver that would address the frame-pacing issue I was seeing at 2560×1600, and they haven’t bothered to release a WHQL frame-pacing driver at all!

            If you’re solidly devoted to AMD for whatever reason, that’s fine. I like you, and I hope you stay around- really. But understand that there’s a whole lot more going on here than just the graph at the end of an article.
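
            To put a number on that smoothness point: average framerate and frame-time consistency are different measurements, which is the premise of TR’s frame-pacing coverage. Here’s a minimal Python sketch with invented frame-time traces (illustration only, not data from any review) showing how a card can win on average FPS and still lose on the 99th-percentile frame time:

            # Invented frame-time traces in milliseconds -- illustration only.
            card_spiky = [16.7] * 95 + [60.0] * 5  # mostly fast, a few big hitches
            card_even = [20.0] * 100               # slower on average, but steady

            def avg_fps(frame_times_ms):
                # Average FPS is total frames divided by total time.
                return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

            def percentile_99(frame_times_ms):
                # 99th-percentile frame time: 99% of frames finish at least this fast.
                return sorted(frame_times_ms)[int(len(frame_times_ms) * 0.99) - 1]

            for name, trace in (("spiky", card_spiky), ("even", card_even)):
                print(name, round(avg_fps(trace), 1), "FPS,",
                      percentile_99(trace), "ms 99th percentile")

            The “spiky” trace wins on average FPS (53.0 vs. 50.0), but its 99th-percentile frame time is three times worse (60 ms vs. 20 ms), which is exactly the higher-FPS-but-less-smooth behavior described above.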

            • clone
            • 6 years ago

            1: yes and had Nvidia priced 780 Ti the same as R290X I’d probably buy one…. but it’s coming in over $700 after taxes which isn’t interesting.

            on a side note you really shouldn’t speak for me, you are terrible at it, I don’t care about AMD, I care about competition, I care about pricing, I care about performance but what I don’t care about is AMD.

            2nd: if I see a post that’s got potential for discussion, whether it be interesting or egregiously flawed, I’ll respond to it. At that point I’ll read the responses to me only; I don’t read other posts unless my name comes up in a general search, so while you have a great many posts and responses on the fly, if my name isn’t in them I don’t “see” them.

            p.s. I don’t care about AMD. I’m a contrarian, and if someone spews a load of lopsided dreck about one company then I’ll highlight it; it’s why fanboys are the typical meat I eat, but it’s not because I’m a fanboy. as mentioned in another post, you won’t see me celebrate a company’s demise because that hurts competition, and you won’t see butthurt in my comments either….. I really don’t care beyond the sake of competition.

            we need more of it.

            • Airmantharp
            • 6 years ago

            Yes, we DO need more competition!

            Granted, there’s other market factors at work here- we also need more demanding games. I played a good couple of hours on BF4 last night, and the consensus seems to be- on release day no less- that the game is easier to drive than BF3, and that anything that played BF3 well can play BF4 well, even with higher-quality graphics!

            So, part of the problem is actually consumer demand for higher-end cards. The lack of extremely popular, highly punishing games lately has resulted in a slack in demand for top notch hardware. If the demand was there, both companies would have to compete to meet it, and lower prices on high-end cards would result.

            • clone
            • 6 years ago

            read above post…. covering the same discussion from a different angle.

            p.s. ridiculous response.

            • Airmantharp
            • 6 years ago

            So you believe that AMD and Nvidia will just go on making newer, faster GPUs if the gaming industry fizzles out and dies? Having new, demanding top-tier games has nothing to do with demand for high-end graphics cards at all in your mind, does it?

            Do you believe in magic? From your perspective, do all of these neat and wonderful things just form in some extradimensional ether and then magically arrive in your hands?

            • clone
            • 6 years ago

            don’t pose specious scenarios.

            • Airmantharp
            • 6 years ago

            You’re not capable of using a hypothetical scenario to evaluate a situation?

            You believe that there’s no relationship between graphically demanding games and the development and sales of high-end graphics cards?

            • clone
            • 6 years ago

            hypothetical, yes; specious = stupid.

            I’ll ask again, please stop posing specious scenarios.

            • Airmantharp
            • 6 years ago

            I’m glad you’re willing to concede the discussion. Things weren’t going too well for you :).

            • clone
            • 6 years ago

            saying it doesn’t make it so.

            I quote facts, you drift into obscurity and pretend you matter…. cheers.

            • Airmantharp
            • 6 years ago

            I don’t have to pretend :).

            Come see me in the forums if you like!

            • clone
            • 6 years ago

            god you’ve done badly.

            • Airmantharp
            • 6 years ago

            At exposing you? Hardly. I’ve had you talking solid for days, we’ll have no shortage of source material in the future!

            • clone
            • 6 years ago

            [quote=”Airmantharp”<]At exposing you? Hardly. I've had you talking solid for days, we'll have no shortage of source material in the future![/quote<]that’d be called stalking, harassment, bullying. given your fondness for joking about death threats and your admitted practice of stalking, harassment and bullying, is that next, Airmantharp?

            • Airmantharp
            • 6 years ago

            It’s reference material from information you’ve freely provided. I’m quite aware of what stalking is dear friend :).

            • clone
            • 6 years ago

            [quote<]It's reference material from information you've freely provided. I'm quite aware of what stalking is dear friend :).[/quote<]it's pathetic, practice and admission, thanks for those. it doesn't surprise me that you're familiar with stalking.... intimate would be the more accurate description.

            • Airmantharp
            • 6 years ago

            No, I’m not going to get intimate with you. And I’m not going to stalk you, either- like I said, you put the info out in the open on your own :D.

            • clone
            • 6 years ago

            apparently you are. I answered some questions, you fixated on me…. like a stalker. you then asked more and more and more… stalking. you’ve admitted you weren’t asking honestly but instead were looking for information about me…. stalking.

            you’ve then bragged about how you’ve been stalking, harassing, and bullying.

            so yes, while you are well versed in stalking, you seem to have deluded yourself that, while others would be guilty of such actions, you aren’t, despite committing the same actions.

            • Airmantharp
            • 6 years ago

            I’ve just been asking you questions on a public comment thread to try to figure out what you’re actually trying to say, given that most of what you say doesn’t make a whole lot of sense.

            But yes, it has been fun!

            • clone
            • 6 years ago

            god you’ve done badly, but hey, you win, while I’m terrible for trying to have an honest discussion about graphics and for being overly determined to have one.

            whatever, you win, congrats, you’ve succeeded at the whole “harassment, stalking, bullying thing.”

            • Airmantharp
            • 6 years ago

            Hell, I think I’ve done great! I haven’t even thought about cursing or calling names yet. This discussion is almost even civil!

            • clone
            • 6 years ago

            silly is as silly does.

        • f0d
        • 6 years ago

        both companies do the same when the opposition releases a new faster card
        the 7970 dropped from $550 to $479 then to $429 when the gtx680 came out and was faster
        did you really expect nvidia to do any differently?

        fact is that both companies always change their pricing structure when the opposition comes out with a faster card, so nvidia dropping prices is just business as usual

        • Klimax
        • 6 years ago

        Radically how? Somebody will have to tell me what is radical about GCN 1.1, because it is similar to GK110 vs. GK104.

        Game changers don’t tend to be merely on par with the competition…

          • superjawes
          • 6 years ago

          No one radically changed the landscape. The 290X is a fast card, but the chip usage is exclusive to that model and possibly the 290. Again, the rest of the 2xx series uses the same silicon as the 7xxx series. Even then the 290X is only marginally better than the 780.

          But of course, Nvidia is no better. Titan is the newest silicon, and that’s been out for a while. The 780 Ti will likely use some variant of the same chip.

          So we see AMD’s new silicon doing so-so against existing Nvidia silicon, and Nvidia releasing more of the same silicon. Boring. Hopefully AMD’s driver updates and future hardware tuning greatly increase the power of the 290X so Nvidia has to do something serious to counter it.

            • Klimax
            • 6 years ago

            That was my point. But some people here argued with me that this release is somehow magical and game-changing.

            BTW: I have the same view on these generations as you. (G-Sync might be a different story, but we have some time to kill….)

            • superjawes
            • 6 years ago

            Oh I am aware. USAFTW’s “What’s with the cold reception?” comment is still in the Top Comments box, despite several responses, including my own, outlining why we’re all taking pages out of Krogoth’s book.

            And G-Sync is a game changer. It will eventually not be called G-Sync (it will probably adopt whatever name the TV world adopts), but it solves the reconstruction problem in an elegant way and offers ways for displays to show games, movies, and Youtube videos all at ideal speeds.

            • Voldenuit
            • 6 years ago

            [quote<]And G-Sync is a game changer.[/quote<]
            Agree with that. Also agree that G-Sync in its current implementation - restricted to a single GPU architecture (Kepler) from one chipmaker (nvidia) and only available on select expensive (and TN!) monitors - is not going to turn the industry upside down.

            But the idea of decoupling screen updates from a fixed refresh rate is long overdue. I can see a scenario where people would pick the option that gives them smooth gameplay over a competing option that may be 5-10% faster but has tearing or lagged frames.

            Will G-Sync help the 780 against the 290X? I don't think so, as the benefit requires an added cost (new monitor). But in upcoming years the technology will become more pervasive and thus more compelling.

            Of course, the best case scenario for all (AMD included) would be for AMD to come up with an open standard equivalent instead of nvidia's proprietary implementation.
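
            A rough way to see why that decoupling matters: with v-sync on a fixed 60 Hz panel, a finished frame has to wait for the next refresh tick, while a display slaved to the GPU can scan out as soon as the frame is ready. A toy Python model of just that waiting time, assuming idealized behavior rather than how any actual driver or G-Sync module works:

            import math

            REFRESH_MS = 1000.0 / 60.0  # fixed 60 Hz refresh interval

            def wait_fixed(render_done_ms):
                # With v-sync on a fixed-rate display, a finished frame waits
                # for the next refresh tick before it can be shown.
                next_tick = math.ceil(render_done_ms / REFRESH_MS) * REFRESH_MS
                return next_tick - render_done_ms

            def wait_adaptive(render_done_ms):
                # With the refresh slaved to the GPU, the panel scans out as
                # soon as the frame is ready (idealized: zero added wait).
                return 0.0

            # A frame finishing 1 ms after a tick waits ~15.6 ms at 60 Hz.
            print(round(wait_fixed(17.7), 1))     # -> 15.6
            print(round(wait_adaptive(17.7), 1))  # -> 0.0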

          • clone
          • 6 years ago

          Titan’s performance for $400+ (plus taxes) less…… = radical change.

          very sad you can’t see the obvious.

            • Airmantharp
            • 6 years ago

            The more you compare AMD’s brand-new card to an eight month old prosumer card, the more people realize just how little you really understand, and yes, it is very sad :D.

            • clone
            • 6 years ago

            awww Airmantharp, now you want to demonize me for stating facts… awww that’s so sad :(

            I compared the existing high end to the new high end; I don’t care when they came out, I compare what is available to compare at the time… in a couple weeks Titan might be forgotten and I might be comparing 780 Ti, but it’s not out yet so I can’t.

            if 780 Ti introduces a new high end then I will; until then the best Nvidia has is Titan.

            • Airmantharp
            • 6 years ago

            Your ‘facts’ are cherry-picked from one very myopic time-frame to attempt to ‘prove’ a point that’s meaningless if you bothered to take even a half-step back and look at the bigger picture. It’s like being born five minutes before your twin and then screaming ‘I’m the oldest!’ to assert privilege for the rest of your life. It’s that stupid.

            If you want to convince anyone that somehow AMD’s new product is ‘the king of all cards ever’, you just might want to see how the market settles after its release. Being cheaper for a day or two does not make anything ‘the best ever’.

            • clone
            • 6 years ago

            give it up already, you’ve moved from discussion to personal attacks and your focus seems to imply I’m the source of all your issues.

            on a side note try not speaking for me, you aren’t good at it.

            not once did I say anything is perfect. what I have done repeatedly is repeat what the reviewers have said, which, despite your silliness, is not that R290X is perfect; I just repeated the results that stated R290X is the best….. gold awards, elite awards, you know, conclusions that said despite the rough edges it’s the best.

            and you’ve been complaining ever since.

            stating facts doesn’t make me a fanboy or biased, I’m just stating facts, stop complaining about it.

            • Airmantharp
            • 6 years ago

            But you can’t tell the difference between rating the GPU itself and rating the card and cooler combination as a whole?

            That’s what we’re talking about here- we know the GPU is awesome. We all agree! But that cooler isn’t, and the only way to get said award-winning GPU is with that damn cooler!

            So of course we’re disappointed. We’re all fans of GPUs here, and we were hoping that AMD would choose to actually compete with Nvidia on putting out a quality reference product, and not only did they fail, they made it worse than their last two halo releases!

            Is that too much insight for you? Let me know, I can probably break it down and spell it out if you like :).

            • clone
            • 6 years ago

            I didn’t rate anything, those that tested the card compared the card as a whole including the gpu and cooler & professed it king, the best, Gold awards, Elite awards…… “rough edges don’t mean it’s not the best”.

            and with that “damn cooler” it’s still currently the best.

            you aren’t offering any insight I haven’t already considered and it’s doubtful you could do anything but add static to the conversation, I read the same review, I drew my own conclusions, I’ve been repeating the conclusions of those that wrote the review.

            it’s not complicated.

            • Airmantharp
            • 6 years ago

            But it is complicated!

            Instead of focusing on ‘awards’, why not focus on what someone might actually buy? Let’s say someone asks the question in the SBA forum. Would you seriously tell them to buy no other card than the loudest single-GPU card to hit the market in the last three years, despite knowing both that versions with custom coolers are coming and that both companies haven’t finished releasing their products for this cycle?

            Because that’s where everyone else is coming from. We’re not debating review awards or merits, we’re debating how this release affects actual decisions we’d make for ourselves and suggestions we’d make to others.

            So, here’s a relevant question for you:

            Will TR put the reference R9 290X in the holiday buyer’s guide?

            • clone
            • 6 years ago

            yes, those that reviewed the product and gave it the awards…. most definitely they did not have buyers of these cards in mind… because they must be total idiots.

            my god, Airmantharp, what a notably horrible response.

            they just throw out awards based on graphs and ignore the pros and cons…. just hand them out without any thought, like your argument… no thought at all.

            if the loudest single GPU graphics card wasn’t annoying and also happened to be the best single gpu gfx card…. absolutely I’d recommend they buy one.

            p.s. it’s not everyone, it’s you…. this is an old discussion from another thread.

            • Airmantharp
            • 6 years ago

            Look outside your myopic sliver of a window into the outside world, outside this thread, outside this forum, outside this site- and see just what people are really saying about this release.

            The card’s not available, and it’s not the one anyone wants, and anyone that’s paying any attention at all knows that very, very soon the rest of the cards in this release cycle will be released, drivers will be refined, roundups will be conducted and published, and prices may even change again.

            If you want to throw a party over AMD releasing a part that’s actually competitive with Nvidia’s top-end silicon for the first time since they bought ATi, go right ahead. You’ve waited long enough.

            The rest of us will wait for the cards that matter.

            • clone
            • 6 years ago

            it won every award it could and drew universal praise from those that reviewed it. armed with this reality, you call the reviewers idiots and the reviews themselves flawed, and now you are trying to dismiss it as nothing.

            whatever.

            • Klimax
            • 6 years ago

            Because awarders wanted so desperately to award AMD something.

            • clone
            • 6 years ago

            it won every award it could and drew universal praise from those that reviewed it. armed with this reality, you call the reviewers idiots and the reviews themselves flawed, and now you are trying to dismiss it as nothing.

            whatever.

            • Klimax
            • 6 years ago

            And??? It in no way invalidates my remark.

            • clone
            • 6 years ago

            Klimax, while you can claim that every single website that tested R290X is on the take, saying so doesn’t make it so.

            nor is it proof.

            you are correct your comment is no less valid than it was before.

            • Klimax
            • 6 years ago

            They can award all the awards they want, but it cannot change a thing out there, because they are inherently awarding the successor to the 480. They can say whatever they want, but it doesn’t mean it will hold.

            And I suggest you reread my comment; nowhere do I accuse anybody of being on the take. I just say they wanted to give AMD an award for anything at least remotely good.

            • clone
            • 6 years ago

            sigh…. empty comments addressing issues never pushed forward.

            of course things will improve in the future….. wow? is that supposed to be a revelation or a platitude? because I and everyone else on the planet are aware of that nugget.

            p.s. no, you didn’t say money was the motivation behind every website selecting R290X as the best card, you claim they did it because they just want to give AMD awards…… a notably stupid comment.

            • Klimax
            • 6 years ago

            I have seen too many reviews giving out awards for a subpar product where only one positive thing could be said about it. What will they have left for the custom coolers, which apparently fix quite a few ugly things? This is that case. You don’t even get 480-level margins for the same characteristics.

            Sorry, but shielding your own opinions with meaningless awards, where half the reviewers ignore too many things, is not good. And it will be challenged. You need a much better counter; otherwise it’s your posts which are empty and most likely stupid.

            BTW: Neither TechReport nor Anandtech gave any award to the 290X or the 480… (PC Perspective gave both the same award – Gold)

            • clone
            • 6 years ago

            I agree, awards get thrown around too easily nowadays, but in this case it was universal…. not half, universal.

            you are correct, TR and Anandtech just said it was the best card and could certainly understand why ppl would choose to buy it despite the higher acoustics of the reference cooler.

            congrats on that……

            • superjawes
            • 6 years ago

            I can’t believe I missed this gem…doesn’t every AAA title, especially every iteration of modern warfare first person shooters, get the same sort of praise from major reviewers?

            Point being that you should read such things with a salt shaker within arm’s reach…

            • clone
            • 6 years ago

            so you guys have no faith in Tech Report….. you consider all their work junk?

            amazing…. literally, why do you come here if TR, HardOCP, Anandtech offer nothing?

            • superjawes
            • 6 years ago

            You really need to take several steps back and calm down. At the very least, stop drinking the 290X Kool-Aid.

            Sorry if that’s harsh, but you did completely miss my point (again). You were trying to make the case that “it won every award it could,” but that is no reason to blindly support the card as you have been doing this entire time. The 290X outperformed the 780, but only barely. It was priced below the 780, but now it’s not.

            Yeah, the 290X is a good, fast gaming card, but it’s not that much of an improvement, so you should take any special recognition of the card with a grain of salt (at least). There are other factors to consider, especially now that the 780 is priced below 290X, and the huge elephant in the room is still the 780 Ti, which could be poised to retake the top spot for Nvidia.
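
            The value flip is easy to put numbers on. A minimal Python sketch with placeholder figures: the FPS values below are invented stand-ins for “within a few percent of each other,” $499 is the 780’s newly cut price, and $549 is assumed as the 290X’s launch price:

            def perf_per_dollar(avg_fps, price_usd):
                # Simple value metric: average FPS per dollar of e-tail price.
                return avg_fps / price_usd

            r9_290x = perf_per_dollar(100.0, 549.0)  # invented FPS, assumed launch price
            gtx_780 = perf_per_dollar(97.0, 499.0)   # invented FPS, post-cut price
            print(round(r9_290x, 4), round(gtx_780, 4))  # -> 0.1821 0.1944

            With the performance gap that small, the cheaper card wins the simple FPS-per-dollar metric, which is why the price cut matters more than the benchmark margin.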

            • clone
            • 6 years ago

            Superjawes, can we agree that R290X at launch, upon review by multiple websites, was considered the best card at the time, or not? if you can’t agree with that, then I’ll just recommend you go back and read TR’s conclusion, and I’ll say I agree with TR: faster, cheaper, and its rough edges weren’t a deal breaker.

            best card.

            that is TR talking not me.

            I am saying no more regarding the card & no less regarding the card and I’ve said no more and no less on the matter.

            idiots have accused me of being a fanboy for quoting those that reviewed the card, idiots are claiming independent testing results are bogus, idiots are claiming independent testers can’t formulate logical conclusions.

            I’m not saying this, idiots are, I’m not drinking any kool aid, I’m just quoting the results from those who’ve tested it.

            the situation has since changed, the card is no longer cheaper, the situation looks like it’ll change again in the near future as Nvidia claims they have a response that’ll likely be superior…. I don’t disagree with that, that’s fine, it’s to be expected and when the situation changes the R290X’s ranking will reflect those changes.

            if the card was reviewed today the conclusions would be different. personally I believe the extra GB of RAM and perf is worth it, some don’t; that’s certainly up for debate and I don’t disagree with either position, it’s splitting hairs at most.

            but what happens today won’t change yesterday, the reviews were fine, the conclusions valid.

            no Kool-Aid, no “R290X is perfect and will never be anything less than perfect”; I quoted the reviews’ results. enough with ppl blaming me for them, enough with ppl accusing me of bias for quoting them.

            crap on Tech Report and HardOCP and Anandtech, but stop taking R290X’s success out on me.

            • Airmantharp
            • 6 years ago

            Best PERFORMING card with VERY REAL rough edges THAT EVERY POTENTIAL PURCHASER SHOULD TAKE INTO CONSIDERATION.

            If I type it bigger, will you read it?

            • clone
            • 6 years ago

            why do you keep venting on me because you hate TR’s and those who tested the cards conclusions?

            TR says the rough edges…. not “very rough edges” but the “rough edges”…. aren’t deal breakers.

            if you don’t like TR’s article go attack Scott repeatedly in every thread, attack him while he’s having conversations with others, you have a problem with TR’s article go cry on Scott’s shoulder about it.

            you just can’t seem to shut up about it, constantly repeating the same bullshit over and over, and attacking me for it when your real problem is Tech Report and every other website that tested the card.

            get your head out of your ass and complain to the ppl who wrote the review, I can’t change them or edit them, like you I don’t have that influence.

            • Airmantharp
            • 6 years ago

            Why complain to TR? They executed their review with the same skillful precision that no other site can match.

            But apparently their review is a little above your level. I can understand that; these guys are, after all, really, really smart. You quoting fragments of their review as the basis for your campaign is really quite insulting, though.

            • clone
            • 6 years ago

            [b<]just like the 290X's rough edges aren't deal-breakers, the GTX 780's perks aren't deal-makers. Not when one of those crisp new Benjamins is on the table. Nvidia desperately needs to cut prices, or AMD wins this round on sheer value. [/b<] you are so pathetic.

            • Airmantharp
            • 6 years ago

            …and then Nvidia cut their prices. Now, the R9 290X’s rough edges ARE deal-breakers, and the GTX780’s perks ARE deal-makers!

            Isn’t this fun! Now it’s AMD who needs to cut prices, or be cut out of the high-end game completely!

            • clone
            • 6 years ago

            [quote<]...and then Nvidia cut their prices. Now, the R9 290X's rough edges ARE deal-breakers[/quote<] thank you, again you accept that indeed R290X.... best card.

            • Airmantharp
            • 6 years ago

            Only in your mind :)

            • clone
            • 6 years ago

            your words, not mine, 3 times now.

            • Airmantharp
            • 6 years ago

            No, it really is all in your mind. You should get out more.

            • clone
            • 6 years ago

            [quote=”Airmantharp”<]Because it was the 'best' card for all of three days.[quote="Airmantharp yet again"<]For less than two weeks. Literally.[/quote<][/quote<]not mine, can't be bothered to look for the 3rd.

            • Airmantharp
            • 6 years ago

            Coherence is… lacking. Sometimes your posts can be reasonably understood despite the poor grammar, but this time you’re off in left field.

            But don’t let that stop you from trying!

            • clone
            • 6 years ago

            ok, because you did put “ ” around best, I’ll drop it.
            [quote=”Airmantharp”<]For less than two weeks. Literally.[quote="Airmantharp agreeing with TR's conclusion"<]...and then Nvidia cut their prices. Now, the R9 290X's rough edges ARE deal-breakers[/quote<][/quote<]still leaves 2 saying it’s the best. p.s. high demand has nothing to do with the card’s merits.

            • Airmantharp
            • 6 years ago

            I put ‘best’ in quotes, as explained above, because it isn’t the best. It’s the fastest, with the loudest reference cooler on a single-GPU card made in the last three years, and extremely limited availability. Comprehension and all that.

            • clone
            • 6 years ago

            updated.

            p.s. silly is as silly does, you’ll get it in a few minutes. :)

            • superjawes
            • 6 years ago

            It’s not a success, though. TR also pointed out a lot of issues with the 290X, didn’t they? There are perks to it, but there are also major drawbacks. You are using one fact, that the 290X is faster than Nvidia’s only released alternative (albeit barely), to go on repeated pro-AMD/anti-Nvidia rants.

            When the 290X release hit, the 780 Ti had already been announced, so we knew that the playing field was already set to change. There was absolutely no reason to buy a 290X on release day except for bragging rights. The realistic approach would be to wait for Nvidia’s responses (multiple) and make purchases accordingly (or not at all, since you can get great cards from both camps for much cheaper).

            Props to AMD for a good card, but there is no reason for fanfare.

            • clone
            • 6 years ago

            sure they did, R290X has issues, I agree.

            and upon consideration of both pros and cons, TR concluded that [b<]"just like the 290X's rough edges aren't deal-breakers, the GTX 780's perks aren't deal-makers. Not when one of those crisp new Benjamins is on the table. Nvidia desperately needs to cut prices, or AMD wins this round on sheer value."[/b<]

            Superjawes, I'm not on a "pro AMD, anti Nvidia rant." someone said "Nvidia is amazing for lowering card prices" and I said "you can thank AMD for that", and after a whack load of posts here we are.

            neither comment was anti anything.

            • Airmantharp
            • 6 years ago

            They’re both amazing for charging whatever they can get away with when the competition hasn’t released its cards yet. Then they both ‘amazingly’ lower prices. Isn’t the free market amazing!

            • clone
            • 6 years ago

            and again, I can’t talk to anyone else Airmantharp, you have to jump in and attack yet another post not directed at you, you are so desperate for my attention, you can’t go without it.

            stalking and harassment?

            • Airmantharp
            • 6 years ago

            Pretty sure we’re all discussing the same basic things here.

            • superjawes
            • 6 years ago

            Titan can be used for more than just games, though, so the value of the card cannot be judged on gaming benchmarks alone.

            • clone
            • 6 years ago

            it’s not about whether or not Titan is a gaming only card.

            Titan just happened to be Nvidia’s fastest gaming card (that can also do other stuff) and it cost around $1000.

            so when AMD releases a card that offers Titan’s gaming performance for $400+ (plus taxes) less, that’s great for gamers.

            you are correct in saying Titan was never targeted or priced for gamers but it doesn’t matter, it happened to be Nvidia’s best and cost $1000.

            • Airmantharp
            • 6 years ago

            And the Titan is still worth $1000- well, it’s worth a lot more than that, actually. If Nvidia just stopped making Titans today, and didn’t replace them with something else, the prices of the Titans left on the market and the prices of used Titans would go up!

            So yeah, comparing an eight-month-old prosumer card to a consumer card that’ll be on the market for less than two weeks before its competition shows up is just plain ignorant. It’s not helping your argument at all.

            • clone
            • 6 years ago

            then why don’t you buy and resell them?….. like dozens of them; they’re worth so much you must have a dozen or so on order.

            you compare what’s available at the time and adjust as the situation changes, get over it, what happens tomorrow won’t change what happened today btw.

            no matter how much you may dislike that… tough.

            • Klimax
            • 6 years ago

            Maybe because you missed it all? They are still produced, so they are worth exactly 1000 USD to the target market.

            • clone
            • 6 years ago

            read Airmantharp’s comment, then reread my response; if you don’t understand my comment, which atm you clearly don’t, then stop now.

            • Klimax
            • 6 years ago

            Yeah, you missed his point by light-years…

            • clone
            • 6 years ago

            naah, I didn’t, but whatever; simple likes to believe it’s complicated.

            • Klimax
            • 6 years ago

            You really have nothing better to say than some cheap shot? I didn’t know you’d run out of better arguments that fast.

            • clone
            • 6 years ago

            Klimax, it’s sarcasm; there is nowhere to go with it and nothing to be gained by having a discussion about sarcasm. at the time this started, every comment I made was being attacked by the same poster regardless of who I was responding to; they’d jump in and attack the comment…. it’ll likely start again tomorrow, I don’t know, but in the lull I’ll answer other questions like yours.

            1st, it was sarcasm. 2nd, I know Titan is still available… that’s a platitude; everyone who knows about Nvidia’s Titan knows it’s still available. 3rd, Titan is worth what Nvidia is asking, to some…. so?

            I answered one of your queries below and this one isn’t interesting, hopefully it’s done.

            • Klimax
            • 6 years ago

            You’d better mark your sarcasm, because it is obvious only in rare cases, and even then people can miss it. I don’t think this was one of those cases.

            As for your complaints about the other poster, they are completely baseless. Also irrelevant, and just an attempt at evasion.

            • clone
            • 6 years ago

            I’ll do what I want, and if you can’t get the obvious, you’ll be the bonus you were before my response.

            • Klimax
            • 6 years ago

            There is nothing obvious on the internet. (BTW: you may want to drop quite a few assumptions.) Hm, bonus for whom? Anyway, I’ll just treat all your posts as a joke and blast them as such. At least I won’t need to spend too much time on them…

            • superjawes
            • 6 years ago

            However, Nvidia announced the 780 Ti prior to the release of the 290X, so when reviews were published, everyone knew that Nvidia would be introducing a new, faster, gaming-focused competitor right on its heels. That makes any and all direct comparisons between the 290X and Titan pointless.

            And really, Titan is faster than a 780, but only barely. The rest of the cost difference was always explained by the “gaming+” nature of the Titan.

            • clone
            • 6 years ago

            that’s not always true; Nvidia lied through every orifice during the Fermi delays. Nvidia even held up a fake card to try to delay sales and made a number of empty promises…. Nvidia’s PR hype is kinda crap overall, to be honest.

            that’s why ppl wait till it arrives and is tested before forming a conclusion.

            • Airmantharp
            • 6 years ago

            Which is different than they do for any other product exactly how…?

            Do you expect marketing departments to not market?

            Do you think AMD will never lie to you?

            • clone
            • 6 years ago

            read Superjawes’ comment and then my response; if you can’t understand…. which clearly you don’t atm…. then stop.

            and btw why are you attacking my discussions with others?

            • Airmantharp
            • 6 years ago

            Because you’re an easy target :)

            • clone
            • 6 years ago

            nah, it’s because silly is as silly does, little one, and you can’t help yourself.

            I’d call it sad, but you already know that; I’d call it small, but you know that as well, so I’ll just call you silly :)

            • Klimax
            • 6 years ago

            And AMD is different how?

            • clone
            • 6 years ago

            read Superjawes’ comment and then my response; if you can’t understand…. which clearly you don’t atm…. then stop.

            • Klimax
            • 6 years ago

            You have a problem; I understand both of you too well. You are wrong.

            Or have we already forgotten the masters of paper launches? Have we forgotten a delay being renamed as on-track, as promised?

            • clone
            • 6 years ago

            Klimax, you’ve just said Superjawes and I are right; the worst part is you just want someone to say AMD is equally guilty in a discussion about an Nvidia announcement.

            AMD’s part is out, Nvidia says they have a new one coming, AMD’s new part isn’t up for debate, Nvidia’s is.

            • Airmantharp
            • 6 years ago

            If price/performance is ever up for debate, it’s in comparison to something else. Can’t compare a card in a vacuum.

            • clone
            • 6 years ago

            of course… so?

            why is it you offer garbage and upon failure resort to platitudes? is that how you save face?

            here I’ll offer up the more relevant platitude, “you can’t change yesterday.”

            • Airmantharp
            • 6 years ago

            But I don’t have to change anything. I just have to be able to see more than two feet in front of my face, do some basic addition, think ever so slightly outside the box, and realize that there’s more to a graphics card than raw performance, and that no single graphics card exists in a vacuum!

            • clone
            • 6 years ago

            you see what you want to see, nothing more, and when the facts don’t suit what you want to see, you take it out on ppl who agree with the conclusions.

            go complain to Scott, Kyle, Anand.

            • Airmantharp
            • 6 years ago

            The facts are in the measurements- the conclusion, and the awards, are subjective opinions. I’ve quoted both for you, but I understand if that’s a little much for you to take in at once.

            • clone
            • 6 years ago

            why are you talking to me, go complain to Scott, Kyle, Anand.

            • Airmantharp
            • 6 years ago

            I wouldn’t want to waste their time explaining how you’re misrepresenting their reviews. They have better things to do.

            • clone
            • 6 years ago

            [b<]"just like the 290X's rough edges aren't deal-breakers, the GTX 780's perks aren't deal-makers. Not when one of those crisp new Benjamins is on the table. Nvidia desperately needs to cut prices, or AMD wins this round on sheer value."[/b<] no misrepresentation, just a troll who can't stop complaining..... go complain to Scott, Kyle, Anand.

            • Airmantharp
            • 6 years ago

            Yet you ignore the entire review just for the conclusion.

            Here’s a hint: if you actually read the whole review, you wouldn’t need to read the conclusion!

            • clone
            • 6 years ago

            I did, the conclusion is just that, all factors considered and a reflection of the entire review.

            • Airmantharp
            • 6 years ago

            A conclusion can’t exist without a review…

            • clone
            • 6 years ago

            what a stupid comment.

            • Airmantharp
            • 6 years ago

            So you believe that they should just write terse conclusions and not bother posting their testing results and breaking them down? That’ll save them a boatload of time!

            • clone
            • 6 years ago

            you’ve been stalking me for a while Airmantharp, you’ve claimed you have learned things, that I’ve revealed things and yet you are still asking stupid questions that show you’ve learned nothing, that apparently I’ve revealed nothing.

            • Airmantharp
            • 6 years ago

            Maybe I just admire your spirit. You’re so very insistent in your delusions!

            • clone
            • 6 years ago

            sigh….. [b<]"just like the 290X's rough edges aren't deal-breakers, the GTX 780's perks aren't deal-makers. Not when one of those crisp new Benjamins is on the table. Nvidia desperately needs to cut prices, or AMD wins this round on sheer value."[/b<] reality.

            • Klimax
            • 6 years ago

            Not any more… reality changed, as it changed when the 480 first debuted, the 7970 appeared, and the 680 happened.

            Frankly, I’d disagree with some parts of the conclusion, but then it is only from the POV of a reviewer who’s trying to represent a portion of gamers. He cannot represent all of them, so it is just a matter of where you stand.

            • clone
            • 6 years ago

            I agree, and I’d throw in caveats if it were only one reviewer, but the praise is universal. in two weeks I doubt it’ll still be that way, but you can’t change yesterday. the reviews are that solid.

            R290X is not perfect, it’s not without flaws, it’s not the best choice for everything…. but as a gaming option, which is what interests me, the reviews said, and I agree, that for the money it is the best existing option.

            that’s not to say I don’t agree with those who complain about the cooler / fan acoustics, but I had an Nvidia 280 and a 480 and the sound levels were decent in my setup. what annoyed me far more was the fan failures that required RMAs… both, dammit, 2-week turnarounds, and one involved a fight with customs over brokerage charges regarding an RMA’d card!!!!

            while some are concerned with R290X’s operating temp, I’m the opposite and encouraged by AMD’s display of confidence in it…. I personally think it’s fantastic.

            because 780 Ti is coming I’m going to hold. I tried to grab an R290X, but both times they sold out, and now with the latest announcements I’m curious enough to wait.

            • Klimax
            • 6 years ago

            Universal praise. And? Solid reviews, when the best that can be said is that the performance is there but the other characteristics are bad. That some reviewers ignore this part in their conclusions is just a sad testament that they wanted to give AMD something. That’s all, a small subtle bias where they root for the underdog.
            Also, most reviewers are writing from a gaming POV. (IIRC there are only a few exceptions – Tom’s HW, Sky from Hardware Canucks, and IIRC Anandtech; but even those generally rate cards as gaming, not GPGPU, unless the article focuses on that aspect)

            Encouraging AMD regarding temperatures??? They are near the upper limits of the silicon itself.

            I must say the funniest thing is that most or all conclusions got invalidated by Nvidia’s announcement and the current cuts, which I think is record time for this type of change.

            • Airmantharp
            • 6 years ago

            I tried to put the lack of emphasis on noise in conclusions into perspective before, but essentially, reviewers are smart enough to realize that as bad as this cooler is, very few will buy it; though it is a little annoying that we haven’t heard much about custom cooled versions yet.

            • clone
            • 6 years ago

            stupid response that pushes the same lines of garbage yet again just because you don’t want it… I’ve responded to a few of your posts and now I’m regretting it, you just want to push the same absurd lines while offering nothing.

            when AMD wins an award, the internet is corrupt; when AMD wins an award, the web is rooting for the underdog…. garbage is all you’ve got to say, and yet you’ve whined when I’ve treated you dismissively, when apparently it’s the only appropriate response.

            go complain to Scott, go complain to Kyle, go complain to Anand. I didn’t write the reviews.

            p.s. when Nvidia wins an award or a few of them in a cppl weeks I won’t cry and whine about it, you won’t see cries of corruption in my comments, I’ll read the reviews and consider it.

            stop pissing and moaning just because you can’t grasp that AMD might make a decent part worthy of an award and on occasion make the best part.

            the narrow minded thinking narrowly.

            • Airmantharp
            • 6 years ago

            See, here’s the fun part: you actually think that the amorphous ‘internet’ is out to get AMD!

            • clone
            • 6 years ago

            Klimax claims the internet is out to reward AMD any way it can and you accuse me of what?

            misfire silly…. hypocritical as well.

            I’m just going to throw this in because you’re growing desperate and are no longer worth anything to me.

            “silly is as silly does I guess.”

            • Klimax
            • 6 years ago

            Blindness to how people want to treat the underdog is in no way my problem.

            But then, there is a reason why I mostly read TechReport and Anandtech. Neither awarded the 480 nor the 290X.

            BTW: What was your response to the 480 back then?

            • clone
            • 6 years ago

            you are correct, TR and Anandtech just said it was the best card and could certainly understand why ppl would choose to buy it despite the higher acoustics of the reference cooler.

            congrats on that……

            p.s. as for the 480, I owned one for a bit.

            • Klimax
            • 6 years ago

            Small problem: you seem to be intentionally overlooking all kinds of things regarding AMD while blasting Nvidia. Maybe I could find an excuse or two for AMD in your posts too.

            You know what double standards are?

            • clone
            • 6 years ago

            I’m not blasting anyone. Nvidia makes very good graphics cards, and I agree Nvidia’s drivers are superior to AMD’s, but in regards to R290X, it’s offering the right performance at the right price after weighing the pros and cons. (Titan gaming perf at the GTX 780’s price)

            if 780 Ti wasn’t promised in a week or two I’d have an R290X on order now or as soon as I could find one, I have a buyer for my existing card already…. but 2 weeks is close enough to wait.

            • Airmantharp
            • 6 years ago

            You’re waiting?!?

            After demanding that the whole world recognize your proclamation of the R9 290X as the very best card ever made by anyone ever, you’re not going to buy one?!?

            You must really like trolling :).

            • clone
            • 6 years ago

            silly is as silly does I guess.

            • superjawes
            • 6 years ago

            You’re accusing others of not understanding, yet your response to my comment–which is only relevant to the current situation with the 780, 290X, Titan, and the future release of the 780 Ti–is a general attack against Nvidia.

            My point in this situation is that 290X and Titan value comparisons are [i<]meaningless[/i<]. Titan does more than just gaming, the gaming gains over a 780 are minimal, and Nvidia has an actual gaming card on the way.

            Your argument was that the 290X was somehow a better deal than a Titan because of a $400 price gap. My argument is that the $400 price gap is explained by abilities Titan has that the 780 does not. If you want to make any comparison between the 290X and Titan, you are going to need to show some evidence that the 290X can do those non-gaming tasks as well as a Titan. Otherwise, the comparison is just a strawman.

            • clone
            • 6 years ago

            Nvidia says “hey we’ve got a new card coming”….. ok, today it’s true; yesterday it wasn’t, and they totally lied about it in every way for a year, but this time….. oh yeah, this time it’s true….. I’m not saying Nvidia isn’t bringing something, but putting blind faith in an announcement from any company is ridiculous, so I’ll wait for it to arrive & be independently reviewed.

            apparently that won’t be good enough for you, Superjawes, or Airmantharp, or Klimax, as all three of you have said above that reviews are completely worthless. but me, I’ll place faith in the independent testing results.

            I compared Nvidia’s fastest against AMD’s fastest. it’s not unfair, it’s fair: Titan’s performance in games for $400+ (plus taxes) less.

            it’s simple, it’s obvious and it’s fair.

            you are correct, I could have said superior performance to the GTX 780 for $100 (plus taxes) less, but the GTX 780 isn’t as fast as Titan, so while true and accurate at the time, Titan was the performance bar, not the GTX 780.

            if I could buy a car for $40,000 that was faster than a Ferrari selling for $1,000,000.00, I’d be saying “Ferrari-like performance for $900,000+ (plus taxes) less”, and everyone would be like “yeah”.

            what they wouldn’t say is “that’s not fair because the Ferrari cost $1,000,000, so it’s not fair.” one idiot has gone so far as to say “it’s an 8 month old product”…. so?

            it’s the best they’ve got, it’s getting compared, nuff said.

            • superjawes
            • 6 years ago

            And that $40,000 car will most certainly be faster than a heavy duty tractor-trailer offered by a truck manufacturer, so the comparison still holds, right?

            OF COURSE IT DOESN’T! They are different products for different purposes. You’re looking at one thing, ONE, that the 290X does better than Titan and ignoring everything else. “Fair” would be to run the 290X through all of the tasks Titan is good at. If the 290X was able to do everything better than the Titan, then [i<]and only then[/i<] would your comparison hold water.

            • Airmantharp
            • 6 years ago

            And the R9 290X would still be too damn loud :)

            • clone
            • 6 years ago

            how does that change the fact that the buyer would be getting Ferrari performance for $900,000 (plus tax) less?

            I do not give a rat’s ass that the $40,000 car is not a Ferrari….. of course it’s not a Ferrari…. it’s the Ferrari performance I want… that and the lower price.

            will you guys quit complaining about this.

            • Airmantharp
            • 6 years ago

            Because you’re not actually getting Ferrari performance? I mean really, is the concept totally lost on you, or are you just twisting words so you can keep trolling?

            • superjawes
            • 6 years ago

            But you’re not comparing a Ferrari to a similar product, and that is my issue! What if the $40,000 car you’re talking about is an F1 car? Sure, it has great performance, but it’s got no roof, no passenger seating, and no storage space (and it might not even be street legal).

            What if you’re not just after gaming performance and want some computing power? By your comparison, one might pick up the 290X in place of a Titan and be royally disappointed. Why? Because they are different products. There are other tasks Titan performs better than a 290X. That is my issue with your argument.

            • clone
            • 6 years ago

            [quote<]But you're not comparing a Ferrari to a similar product, and that is my issue! What if the $40,000 car you're talking about is an F1 car? Sure, it has great performance, but it's got no roof, no passenger seating, and no storage space (and it might not even be street legal). What if you're not just after gaming performance and want some computing power? By your comparison, one might pick up the 290X in place of a Titan and be royally disappointed. Why? Because they are different products. There are other tasks Titan performs better than a 290X. That is my issue with your argument.[/quote<]
            when looking for absolute gaming performance I look to compare the fastest that offers the best value; Titan and the GTX 780 offered both at the time. R290X offered Titan performance for $400+ (plus taxes) less; it’s a fair comparison.

            if you are looking for compute perf, Titan offers something AMD doesn’t, and it’s a fair comparison, but compute performance isn’t what anyone was looking for…. it was a gaming comparison.

            • superjawes
            • 6 years ago

            I think I've just about expended the effort available to argue this anymore. If you are going to insist on making silly comparisons, I will only provide silly responses.

            • clone
            • 6 years ago

            ok Superjawes, I go out to buy a damned fast car, I find one for $40,000 that’s as fast as a Ferrari.

            wonderful, I want it.

            it's not a Ferrari, I don't care, I didn't want a Ferrari, I was never going to pay the $1,000,000 for the Ferrari, but what I really do want is a car as fast as a $1,000,000 Ferrari… that's what I want, the performance. (in this case gaming performance, because I don't care about Titan's compute)

            am I going to have to make "sacrifices" to achieve this in comparison to the Ferrari? absolutely…. do I care? absolutely not…. will what I'm buying ever be a Ferrari? no, but it'll be as fast as one for a fraction of the cost.

            you are tired of having this remedial discussion; so am I, likely far more so than you. you are absolutely confident that I don't… "get it," and that's ridiculous, because I do get it; I don't care. Titan's gaming perf for $400 less…. wonderful.

            • superjawes
            • 6 years ago

            Congrats! You bought a car that’s as fast as a Ferrari, but has no seat, body, or safety features!

            But it’s as fast as a Ferrari, so it must be a good purchase!

            • clone
            • 6 years ago

            c'mon now Superjawes, you either know absolutely nothing at all about cars or you are just pretending to.

            you know Ferrari perf can be had for a fraction of a Ferrari's price, which is my point, and it's accurate.

            step up, be honest and admit it, this is your moment.

            • superjawes
            • 6 years ago

            No, that wasn’t your point. Your point was that it’s “fair” to compare apples to oranges, declaring apples the best based on an apple-based test. The result is a conclusion that is bananas.

            • clone
            • 6 years ago

            of course it’s fair to compare oranges to apples.

            • superjawes
            • 6 years ago

            This confirms it. You are bananas.

            • clone
            • 6 years ago

            lol, cheers on that.

            • Klimax
            • 6 years ago

            While missing the other half of Titan's capabilities. I get it, you are just an AMD gamer, nothing more, but the world of GPUs is not composed only of gaming, now is it?

            I distinctly remember what AMD fans were using against the 680. It wasn't gaming…

            • clone
            • 6 years ago

            Superjawes and Klimax, my focus on gaming performance is not an indictment against Titan.

            Titan is a wonderful product for those that are looking to use its unique abilities… I'm not one of those ppl….. yes, Titan has a place in the market, I've never said otherwise; yes, there is a market outside of gaming, never said otherwise.

            and none of this changes the reality….. R290X offers Titan's gaming perf for $400 (plus taxes) less…. wonderful.

            • Klimax
            • 6 years ago

            In that case Titan was never even aimed in your direction. (I guess one needs to make it clear what position they argue and how specific it is.)

            • Airmantharp
            • 6 years ago

            You can never be too specific with clone :).

            Had full-on dissertations ever been an acceptable comment format, I could have laid it all out quite clearly, to the point that there'd be nothing else to talk about, as could you, and I expect many others. The only thing we've really discussed has been semantics, lol.

            • clone
            • 6 years ago

            why would I need to consider buying it as a prerequisite to wanting its performance?

            what a ridiculous comment.

            • Airmantharp
            • 6 years ago

            Clarification is ridiculous?

            How about never capitalizing a sentence?

            • clone
            • 6 years ago

            silly is as silly does I guess. (it’s a closing comment for when you’ve pushed yourself out of relevance.)

            • Klimax
            • 6 years ago

            I didn’t say you needed to consider buying it, just that its target wasn’t even your direction, because you don’t know how to use half of its power.

            Semantic games are all fun, but they are infinite, thus I leave this semantic subthread. (Better things to do than argue this.)

            Thanks for playing. 😀

            • f0d
            • 6 years ago

            and the 780 gives you FireGL gaming performance for thousands less
            how much more awesome is that?!

            • clone
            • 6 years ago

            it's not, when you get more with the R290X.

            • Airmantharp
            • 6 years ago

            Sure, it absolutely must be compared, for the purposes of gaming. And as we’ve mentioned before, the Titan was never a good deal for gamers- and neither was the GTX780, really. If you look through the forums, you’ll see recommendation after recommendation not to buy anything faster than an HD7970 or GTX770 until these new cards come out and the resulting price restructuring settles in. That AMD released the first card in this refresh cycle is hardly relevant, and to say that it is the ‘best’ card is a play on a technicality, if not an outright lie. Reviewers worth their salt alluded to that point pretty clearly, including TR, AT, and H.

            The real comparison happens after all the cards are released, the prices have settled, and the drivers are finalized, and we’re not there yet. And again, no, I don’t trust AMD or Nvidia. I trust TR, they’ve earned it. They’ve verified the release date for the 780Ti, and have yet to verify a release date for custom cooled 290Xs or for the 290 non-X. So that’s where we stand.

            How’s your reference R9 290X working out for you?

            • clone
            • 6 years ago

            yes, Titan was never a good deal for gamers…. so?

            remember when Fermi was a year late…. only a stupid moron would think it wasn't fair to compare the GTX 280 to the HD 5000 series at the time.

            you compare what's available and update as required; quit spewing bullshit about it already. in a couple weeks things might be different, and they'll be discussed then. don't blame me for Nvidia's failure, and it's not biased to compare what's available; only a fanboy's mind thinks in those terms.

            [quote<]How's your reference R9 290X working out for you?[/quote<]Ncix.com had 20 cards but they were sold out before I got the order in..... bummer. selling damned fast apparently.

            • Airmantharp
            • 6 years ago

            There’s a big difference between two weeks and a year, and between the launch of an updated product and the launch of a completely new design on an untested node. So yeah, I believe Nvidia this time, and I didn’t believe them last time.

            And no, Nvidia didn’t fail- they succeeded at moving $1000 video cards!

            But I do hope you get your reference R9 290X as soon as possible. The knowledge that you’re going to have to live with that cooler every day will bring me endless joy :D.

            • clone
            • 6 years ago

            yes there is, so?

            two weeks wasn’t known until recently, and two weeks won’t change today.

            in two weeks R290X's value will be reconsidered, but that doesn't change yesterday.

            • Airmantharp
            • 6 years ago

            Right, so I have to ask- how was that AMD party you threw up in the great white north? Was it too cold for an Nvidia graphics card bonfire?

            • clone
            • 6 years ago

            why are you so childish?

            what’s wrong with you, honestly? why so pathetic?

            • Airmantharp
            • 6 years ago

            Why do you love AMD so much that you take any criticism personally?

            • clone
            • 6 years ago

            why are you so childish? what’s wrong with you, honestly? why so pathetic?

            seriously, what is wrong with you?

            • Airmantharp
            • 6 years ago

            I’m fine- what’s wrong with you?

            • clone
            • 6 years ago

            you, you are the problem. you attack every post I make, even when I'm not talking to you. you have readily admitted you are stalking me, trying as much as possible to harass me.

            you, you are the problem.

            • Airmantharp
            • 6 years ago

            I stand by every word I said, but remember, ‘stalking’ is something you brought up. We already know that your feelings are hurt, you don’t have to throw a tantrum to prove it :).

            • NeelyCam
            • 6 years ago

            After reading through this painful discussion, I’m ready to announce the results.

            Winner: Airmantharp
            Loser: clone

            • Airmantharp
            • 6 years ago

            Thanks Neely, but I wasn’t really trying! He just kept digging deeper and deeper, all I did was hand him the shovel :).

            • clone
            • 6 years ago

            sigh…. I can’t lose, I’m just quoting TR.

            [b<]"just like the 290X's rough edges aren't deal-breakers, the GTX 780's perks aren't deal-makers. Not when one of those crisp new Benjamins is on the table. Nvidia desperately needs to cut prices, or AMD wins this round on sheer value."[/b<] if you'd like to go whine, whine to TR about it, to Kyle about it, to Anand about it.

            • superjawes
            • 6 years ago

            You keep quoting that article. I do not think it says what you think it says.

            /IM

            That is a highly qualified statement from Scott. He doesn’t even give the win to AMD except in a case where Nvidia leaves prices where they are (note, they’ve already changed). Even then, he uses the term “value,” which can include a lot of things, including the 290X’s “rough edges” and the 780’s “perks.”

            The only rational way to read that statement is “it’s a mixed bag.”

            • clone
            • 6 years ago

            I never said otherwise. do you ever wonder how annoying it is that ppl can't assume even a little intelligence is contained within the opposing view?

            I’ve not once ever said R290X is a perfect card….. I had assumed that everyone on the planet could not possibly think this.

            it’s so abundantly obvious….. c’mon? really?

            what I did say was “best card” as determined by everyone who reviewed it…. that doesn’t mean perfect in any way and I’m disappointed that I have to mention something so obvious.

            & the more annoying part is that I've probably got to throw in another brutally obvious caveat….. "in its segment," because god forbid someone try to drag my "best card" claim out of context and think I'm classifying it as the optimal web-surfing option for a 90-year-old, coke-bottle-glasses-equipped granny using her 17in CRT display for e-mailing recipes…. seriously, c'mon already.

            what I'm probably most guilty of in the discussions around R290X is believing ppl would understand that my claims of Titan's gaming performance for $400 (plus taxes) less were not an indictment of Titan but instead a compliment for what AMD managed.

            the other "mistake" I may have made, but not really, is my expectation that ppl would eventually acknowledge, instead of fighting tooth and nail to the bitter end, a conclusion formed by every website that tested R290X… that overall it's the best.

            that’s it.

            I never said the cooler was perfect. I did say that despite the cooler the card is considered the best…. which, to be clear, is just a parroting of the conclusions found by everyone who tested it…. yes, it's a mixed bag, but one mixed in a way that still managed to make it the best at the time of review. even now it's arguably still the best, albeit a strong case can be made for both views, and in two weeks possibly it'll be 2nd, which still doesn't change yesterday.

            • Airmantharp
            • 6 years ago

            Being the fastest card and the best card are two different things, and both are quite relative, requiring context for any useful specificity.

            I could say that an Nvidia GT620 was the best card, or Intel’s Iris Pro is the best graphics solution, and I wouldn’t be wrong.

            • clone
            • 6 years ago

            too easy & thx, reread my comment and correct yours.

            thanks for validating my point regarding absurdity conclusively.

            p.s. you should have stuck to grammar attacks but alas, silly is as silly does…. and how many times did I have to finish with that just to say good riddance?

            lol, so silly, but then again, silly is as silly does. 🙂

            • Klimax
            • 6 years ago

            And??? AMD just added more cores and that's it. Do you know the price structure of Titan and how hard NVidia is gold-mining us? Titan's margin loses only to Apple's or Intel's.

            Sorry, but you are trying to argue something which is both illogical and soon not to be true either. (Unless NVidia gets a sudden case of dumb.)

        • Benetanegia
        • 6 years ago

        It’s all the same. The landscape is radically different now than it was when 290X launched.

        Nvidia did exactly what AMD did the “last generation” after Nvidia did the exact same thing that AMD just did.

        Or do we have short memories? Selective memories, maybe? The 7970 launched at $550, $150 more than its predecessor. It was 15% faster than the then 14-month-old GTX 580, which was selling for $480 or so, and about 30% faster than its own predecessor, which sold for $300. Not really a breakthrough; AMD just released the card where it supposedly belonged, perf/$-wise, always comparing to the competition and not their own "lesser" products in the stack.

        2 months later Nvidia released the GTX 680 for $500, a 10% faster card for a 10% lower price, and it was quieter and more efficient to boot. They could have limited themselves to matching AMD's price, or even asked a little premium; instead they decided to undercut AMD with a better product.

        AMD responded like they should (and with the only option they really had), slashing the price by $100 and releasing a faster product afterwards to keep prices higher.

        Oh and game bundles. Lots of them…

        Nvidia has not done anything different now, it’s the nature of the game.

        Why, I wonder, do some people have such selective memories when it comes to AMD?
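
        (As a sanity check on the perf/$ figures above, here is a minimal Python sketch using the prices and relative-performance percentages quoted in this comment. The performance values are rough normalizations of those quotes, not benchmark data.)

        # Minimal sketch: normalized performance per dollar for the launches
        # described above. Perf values are rough, normalized to GTX 580 = 1.00,
        # based on the percentages quoted in the comment -- illustrative only.
        cards = {
            "GTX 580 ($480)": (480, 1.00),
            "HD 7970 ($550)": (550, 1.15),  # ~15% faster than the GTX 580
            "GTX 680 ($500)": (500, 1.27),  # ~10% faster than the HD 7970
        }

        for name, (price, perf) in cards.items():
            # Performance points per hundred dollars; higher is better value.
            print(f"{name}: {100 * perf / price:.3f} perf per $100")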

      • Diplomacy42
      • 6 years ago

      and in 7 more days, it will change again

    • UnfriendlyFire
    • 6 years ago

    Ah, I love it when prices come tumbling down. The only slightly annoying part is when you buy a new GPU, and its price tumbles like 30% within a month.

    Now if only the same could be said for Intel…

      • RDFSteve
      • 6 years ago

      It could, if Intel had any competition.

        • chuckula
        • 6 years ago

        One reason that Intel didn’t cut prices is that the most recent competitor from AMD launched with a $900 price tag…. we’re lucky that Intel didn’t [b<]raise[/b<] their prices given that type of competition.

          • Airmantharp
          • 6 years ago

          We’re extremely lucky that Intel has chosen to compete with themselves instead of doing what AMD and Nvidia have done lately and pricing new products above current products.

          For what, four generations now, they’ve been releasing their new products into the same market positions as their old products, sometimes even with a price cut. It actually seems oddly magnanimous given most of their history.

        • esterhasz
        • 6 years ago

        In some segments, they do. Just look at the Haswell Celeron (dual-core, 1.4GHz) in the new Acer Chromebook, which sells for $280 (the whole notebook, not just the chip). That's practically dumping.

      • WaltC
      • 6 years ago

      Don’t limit yourself to Intel…it’s a classic mistake…;)

        • Airmantharp
        • 6 years ago

        Yup- AMD did the exact same thing every time they had Intel soundly beat.

      • NovusBogus
      • 6 years ago

        Well, to be fair, that's how old-school Intel operated…because $500 only got you an entry-level CPU that was guaranteed unable to run new stuff a year later. The ARM threat is preventing both AMD and Intel from playing pricing games.

        • Airmantharp
        • 6 years ago

        ARM's got nothing to do with the desktop market. Intel could play games, and they probably are, given how slowly top-end performance has risen these past few years as Intel has focused more on mobile power efficiency, to good effect.

        AMD has just mostly not shown up to the fight, for whatever reason. Their technology (Bulldozer et al) is quite promising.

    • thanatos355
    • 6 years ago

    A GTX 780 for $499? Mmmmmm. Yes, please!

      • WaltC
      • 6 years ago

      290X for a piddling $50 more? Yes, yes, please!

        • sschaem
        • 6 years ago

        1 more GB of GDDR5, full Mantle support, Tier 2 tiled-resource support, TrueAudio compute hardware, the GCN compute engine found in the next-gen consoles, much better performance at 4K, and XDMA for better CrossFire scaling.

        Is that really worth $50?

        – thumb up this post if you think so
        – thumb down otherwise

          • Airmantharp
          • 6 years ago

          +1 with a rebuttal 🙂

          More memory helps today- BF4 has shown that. But we’ll need even more in the not too distant future, certainly within the useful lifetime of any card bought right now.

          The benefit of Mantle is still an open question. We really can’t base an objective opinion off of marketing slides, and we know that the ‘competition’ isn’t standing still.

          Tier 2 tiling support, GCN compute, and XDMA are all moot to the user- either the card performs, or it doesn't. XDMA specifically helps to fix AMD's multi-GPU implementation, possibly bringing it up to par with the capability that Nvidia has been shipping for over half a decade.

          But is it worth $50? Absolutely. Just don’t buy the reference version.

            • f0d
            • 6 years ago

            nvidia also has shadowplay and gsync to think about on their side, and as you said, we don't know what mantle will bring (how many games will support it?)
            imo you have to think about what works and is supported NOW, not what might be in the future, as you don't know what will or won't work out

            personally, gsync and shadowplay are 2 important things for me, as i record a lot of gameplay, and being able to do that using the onboard video encoder without framerate loss is awesome. i love the idea behind gsync and having a smooth display, especially after seeing how well lightboost turned out on my 144hz monitor, and how lots of people that know (tech report for one) say it's a fantastic thing

            • sschaem
            • 6 years ago

            nvidia should be greatly rewarded for taking the gsync initiative, and I hope it catches on as a spec in one form or another. (I would only consider buying a 4K gaming display with variable refresh…)

            For shadowplay, let's hope AMD releases some competing software now that nvidia has made it a selling point, since HW H.264 encoding is also a core HW feature of GCN (leveraged on the Xbox One and PS4).

          • MFergus
          • 6 years ago

          Doesn't XDMA CF still scale worse than SLI atm? Its bandwidth is future-proofed though, and they are very close.

            • Airmantharp
            • 6 years ago

            That definitely needs quite a bit more investigation. As it stands, it’s just different; there are a whole lot of variables to take into account to make a determination.

            What we've seen so far is that Hawaii CrossFire performance is probably smoother than Southern Islands, but we don't know just how well it's working. At the very least, I'm waiting for frame-time measurements from TR.
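
            (For context, the frame-time analysis referenced here is straightforward to reproduce once you have per-frame render times. Below is a minimal Python sketch in the spirit of that metric; the sample frame times are invented for illustration.)

            # Minimal sketch of a frame-time smoothness metric, in the spirit
            # of TR's "inside the second" analysis. Sample data is made up.
            frame_times_ms = [16.7, 16.9, 17.1, 16.5, 33.4,
                              16.8, 16.6, 17.0, 45.2, 16.7]

            def percentile(values, pct):
                # Nearest-rank percentile: the frame time that pct% of frames beat.
                ordered = sorted(values)
                rank = max(1, round(pct / 100 * len(ordered)))
                return ordered[rank - 1]

            avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
            print(f"average FPS: {avg_fps:.1f}")
            print(f"99th-percentile frame time: {percentile(frame_times_ms, 99):.1f} ms")
            # A high 99th-percentile number relative to the average reveals
            # the stutter that a plain FPS average hides.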

            • MFergus
            • 6 years ago

            PCPer's review of 290X CF has the 780 being smoother and scaling better at non-4K resolutions. If you plan to buy a 4K monitor anytime soon, though, the 290X is the definite choice.

            I don't know whether at 4K the VRAM is the limiting factor or the SLI bridge is bandwidth-limited, but it's most likely the 780's VRAM.

            • Airmantharp
            • 6 years ago

            That’s the thing- I respect PCPer’s work, but I want to see what TR has to say. They uncovered this mess, and this release is a good chunk of the result.

            As for 4k, well, I’m half dependent on how well that market develops, and whether or not G-Sync gets adopted. But I’m in no hurry, personally :).

            • MFergus
            • 6 years ago

            Someone who intends to get a 4K monitor anytime soon has money to get new graphics cards anyway. Someone who cares which card is more future-proof is most likely not going to be buying a 4K monitor soon, as they will be too expensive.

            • Airmantharp
            • 6 years ago

            You can get ones with half-assed controllers for <$1000, so it's not a stretch to imagine a decent, affordable 4K monitor hitting a reasonable price in the next six months or so, which means that any video card upgrade decision would be influenced by 4K performance. Mine certainly is.

            • MFergus
            • 6 years ago

            AMD would be the easy choice if you're planning to get a 4K monitor over any G-Sync monitor, IMO. As long as my current card was still decent, though, I just wouldn't buy a new one until I have a 4K monitor, if they're going to be buyable soon.

            • Airmantharp
            • 6 years ago

            That’s where I’m at. I could use a little more performance and a little more RAM, but that wouldn’t make games any smoother- it’d just let me turn the settings up a bit and use some AA. Neither is enough reason to drop a grand on video cards.

            And yeah, I’m definitely waiting to see where G-Sync goes- both to see what kind of monitors it winds up in, and to see if AMD or Intel get aboard, if they can. The technology behind G-Sync should really be implemented in every display panel and every GPU everywhere, be it in a car, in your TV, or on your cellphone. It’s that disruptive of a technology, and I definitely want it on my desktop.

            • Pwnstar
            • 6 years ago

            I prefer keeping my minimum FPS above 60 and using Lightboost to reduce motion blur instead of Gsync. Luckily, Gsync can be traded for Lightboost, so that option will be available.

            • Airmantharp
            • 6 years ago

            If there was a way to guarantee that FPS would stay at whatever target you set, I’d be on board with that too- but such a solution doesn’t exist, and likely can’t exist without making extensive compromises elsewhere. id tried with Rage, for example.

            • Pwnstar
            • 6 years ago

            You can come close to that idea by using a powerful GPU and/or reducing graphics options. That’s what I meant by trying to keep a [i<]minimum[/i<] FPS.

            • Airmantharp
            • 6 years ago

            I knew where you were going :).

            Thing is, much as many have tried to do it, it’s just not possible- any good game has too much variability, so you either give up a significant level of detail, or you deal with uneven frame rendering.

            • Pwnstar
            • 6 years ago

            It works for me. I have the AMD 7950, and I try to shoot for an 80 FPS average so the minimum frame rate isn't too low. I do have to turn down graphics to get that FPS, but so far the games still look good. When they stop looking good at 80 FPS, I'll upgrade my GPU.
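
            (The headroom arithmetic behind that approach is easy to spell out. A quick Python sketch, using the 60 and 80 FPS figures from the comment above; the numbers are illustrative only.)

            # Quick sketch of the headroom math behind targeting an 80 FPS
            # average to keep minimums above 60 FPS. Figures are illustrative.
            target_min_fps = 60
            average_fps = 80

            budget_ms = 1000 / target_min_fps      # 16.7 ms: slowest acceptable frame
            typical_ms = 1000 / average_fps        # 12.5 ms at the 80 FPS average
            headroom = budget_ms / typical_ms - 1  # how far a frame can spike

            print(f"frame budget: {budget_ms:.1f} ms, typical frame: {typical_ms:.1f} ms")
            print(f"a frame can run {headroom:.0%} longer than typical "
                  f"before dropping below 60 FPS")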

            • Airmantharp
            • 6 years ago

            That’s about what I’m doing at 1600p with a pair of GTX670’s and a 2500k at 4.5GHz. Works for BF3, for the most part, though I don’t think anything could make that game perfectly smooth.

            • Pwnstar
            • 6 years ago

            Those cheap 4k monitors are not worth the money as they are limited to 30Hz.

            • Airmantharp
            • 6 years ago

            Sure- but the panels are good for 120Hz, so all that’s missing is the controller. They’re also pretty old now, so newer versions will likely be here fairly soon.

            • Pwnstar
            • 6 years ago

            You’re right. I was just pointing out that the cheap ones are not worth it currently. Don’t want somebody here to buy one and be unpleasantly surprised.

            • Airmantharp
            • 6 years ago

            Cool :). I'm a bit forward-thinking; there are very few things I do that can be described on a public forum without thinking :D.

            • Diplomacy42
            • 6 years ago

            Interesting qualification you made there…

          • Sagia
          • 6 years ago

          Does it support PhysX GPU-accelerated games? Does it support CUDA? How about the Linux driver? Can it beat NVENC encoding? Does it support ShadowPlay? Does it support 3D Vision? G-Sync or Adaptive V-Sync? TXAA? No screen tearing when using Nvidia Surround?

          The GTX 780 is still too expensive even with all these features, when the AMD 290X clearly beats it on performance.

          If you don't care about those features, better to buy the 290X.

            • Starfalcon
            • 6 years ago

            G-Sync is in the monitor; it has nothing to do with the video card.

            • Airmantharp
            • 6 years ago

            Yeah… about that.

            • Sagia
            • 6 years ago

            I know about that. So you're suggesting that an ATI GPU can run a monitor that has G-Sync?

            • Airmantharp
            • 6 years ago

            Two things-

            The G-Sync hardware is in the monitor, but the GPU has to be able to interface with it over DP. Without any more information, it’s absolutely conceivable that AMD can put G-Sync support in their GPUs.

            What we don’t know is whether Nvidia will let them, and whether adding G-Sync support to AMD GPUs may be done by a software/firmware update or will require a hardware modification and new cards.

            • Sagia
            • 6 years ago

            Well, I got the idea from this site: [url<]http://www.extremetech.com/gaming/169091-nvidias-g-sync-promises-better-faster-monitors-that-will-revolutionize-pc-gaming[/url<] . Like I said, dude, do you expect Nvidia to open this technology to others for free, or AMD to pay license fees to Nvidia just to use G-Sync? One company is too cheap but has most of the features; the latter is too proud of themselves (to license CUDA and PhysX) and believes they can do even better. [url<]http://www.extremetech.com/computing/82264-why-wont-ati-support-cuda-and-physx[/url<]

            • Airmantharp
            • 6 years ago

            Well, unlike PhysX that was mostly reduced to a ‘graphics enhancement’, G-Sync will make a difference for literally everything you do, gaming and beyond. Further, while CUDA was absolutely necessary for Nvidia, supporting it (like Nvidia supporting Mantle) would likely result in hardware design decisions to better support a competitor’s API. It would have helped AMD get into the industrial compute scene that’s currently dominated by Tesla and Phi cards, but I don’t think AMD really wanted to support Nvidia here. Instead, they’ve focused more on the late-blooming OpenCL API, and they’ve made big enough waves in certain sectors already. And we’re all better off if they keep pushing OpenCL, despite CUDA’s advantage on Nvidia hardware over OpenCL.

            Really, only TrueAudio and G-Sync are truly disruptive technologies, and G-Sync is the only one with a true, universal defined benefit. I’m skeptical that TrueAudio is going to enable anything revolutionary, at least not immediately, though I’m definitely looking forward to seeing what they can do with it.

            Last, unlike CUDA/PhysX, G-Sync is a fairly simple change to the DP protocol on the GPU side. While it might require new hardware, actually implementing it will be fairly straightforward compared to the level of software development necessary for AMD cards to take advantage of CUDA or PhysX.

          • Zyrusticae
          • 6 years ago

          Just sayin’, this works both ways.

          On the Nvidia side I'm pretty much married to SGSSAA, downsampling, strong SLI support, HBAO+, ShadowPlay, plus G-Sync coming in the future (I'm about due for a monitor upgrade anyway).

          Downsampling alone is a feature I’d pay for, honestly. It really makes a big difference in a lot of games (especially DX11+ games with no strong anti-aliasing options – BF4, I’m lookin’ at you).

          • brucek2
          • 6 years ago

          Worth $50, yes. Worth $50 and substantially more noise and heat to the point where a dual card set up is completely untenable (for my preferences), no. Neither card is really 4K capable on its own and it doesn’t sound like a dual card AMD set up will be very appealing in practice despite its performance.

          Also entering my calculus would be nvidia's work in game streaming, which might end up being very appealing for my house, and my leftover AMD hostility due to my perception of their unethical conduct: sticking gamers with defective drivers that offered marketing performance rather than actual performance for a long time, until the press finally caught them at it. They seem to have fixed that problem, but who knows what they would do to me next.

        • HisDivineOrder
        • 6 years ago

        So far, every R9 290X is too loud for me and the 780 reference system is not. The performance gap is not so large that I would endure those volume levels.

        Plus, reading through the threads of people saying they were early owners, I noticed at least a few describing new levels of coil whine the likes of which have never been seen.

        I do not like coil whine, either. Knowing that the 7970 and 7950 boards have been widely reported to have lots of problems with that, I’m inclined to believe them. I’d hoped AMD had fixed the problems they had with that, but it would appear they have not. Before you say, “They couldn’t ever go to production with a problem like that,” I’d point you to their frame latency and Crossfire snafus.

        $50 less for a substantially quieter system (due to fan noise and due to no coil whine) seems like a solid deal.

          • sschaem
          • 6 years ago

          You can lower the 290X's perf by about 10% to get around 780 performance, dropping the noise level.

            • Airmantharp
            • 6 years ago

            Well, it’s twice as loud at stock- not sure 10% less fan speed is going to cut it :).

            • MFergus
            • 6 years ago

            That wouldn't make any sense now that the 780 is cheaper. Why pay more to downclock the 290X to 780 levels?

            • HisDivineOrder
            • 6 years ago

            I suppose a reason would be if you just hate nVidia with an all-consuming rage that will allow you no thought of owning anything nVidia even glanced at.

            Hating a company so much is self-defeating.

    • Jon1984
    • 6 years ago

    Even so, the 280X looks like a fine alternative to the GTX 770. Looking forward to seeing its performance this week 🙂

    • chuckula
    • 6 years ago

    This never happened.

    I have it on good authority from multiple forum posters that Nvidia can’t cut prices because they are evil and AMD is awesome. Consequently, this is all a lie and every Nvidia card ever made is a ripoff.

      • Airmantharp
      • 6 years ago

      With all the vitriol from those still in an orgasmic state over the R9 290X release, I wonder if the death threats will start coming from regulars now :-p

        • clone
        • 6 years ago

        are you serious?… is anyone really getting death threats?

        this isn't funny; it wasn't before and it isn't now….. it's amazing how radicalized this forum appears to be getting, not just with the personal attacks, the prolonged hair-splitting, and the fanboy rants, but now ppl joking about fanboys issuing death threats?

        if it’s true then TR has a serious problem on their hands and it’s not funny, at all.

          • Airmantharp
          • 6 years ago

          Yes, actually, quite a few were.

          • Klimax
          • 6 years ago

          I've seen it. And I think I even received one too.

            • clone
            • 6 years ago

            I can’t find anything funny about it at all.

            literally it speaks so badly of the individual.

            • superjawes
            • 6 years ago

            [url=https://techreport.com/forums/viewtopic.php?f=32&t=56186&start=60<]On the topic of death threats...[/url<] In hindsight I think Airmantharp was in the wrong to joke about the death threats, especially now that there seems to be confusion about it. All the "legitimate" death threats are coming from spammers, and as JBI suggested, probably just to drive people away with a false narrative (essentially). So everyone take a step back and a few deep breaths...

            • Airmantharp
            • 6 years ago

            I joked about them because they were a joke. Calling them ‘death threats’ is giving the individual(s) responsible a little too much credit :).

            • superjawes
            • 6 years ago

            I agree with that, and I even laughed at the ones I received, but to someone joining the site today it might not be received in the same light, and sets a bad first impression.

            And my concern here wasn’t over the legitimacy of the “death threats,” but the confusion that they might cause. They are spam, not threats, and everyone can just stay calm and…Gerbil on?

            • Airmantharp
            • 6 years ago

            Yup 🙂

            • Klimax
            • 6 years ago

            Agreed.

            • Klimax
            • 6 years ago

            They were so over the top, I couldn't take them seriously. Had they been toned down quite a bit, it would change things…

      • clone
      • 6 years ago

      given Nvidia announced the 780 Ti will carry an MSRP of $699, nothing's really changed, and most of the accusations are more true than not.

      nice try though, Chuck; if looked upon with no consideration, it was almost accurate.

        • Airmantharp
        • 6 years ago

        What accusations? Are you just talking to yourself?

          • MFergus
          • 6 years ago

          Well clone is either blatantly trolling or incredibly biased. One or the other.

            • Klimax
            • 6 years ago

            How about both?

            • clone
            • 6 years ago

            it's not trolling to state facts. you won't read "butthurt" in my comments, and you won't see me celebrate either company's failure or demise; I state facts and try to see through the fog…. this isn't a condemnation of Nvidia, and Nvidia tweaking its lineup to suit a new product and respond to AMD is nothing if not expected.

            • Airmantharp
            • 6 years ago

            Well, you do state the discrete facts pretty well, but you also refuse to accept any context that would put those facts into perspective. And as with most things, the fact itself is entirely useless without context to relate it to the bigger picture.

            • clone
            • 6 years ago

            I quote a conclusion made by the ppl who reviewed the card and you shoot your mouth off that I’m not quoting in context?

            god that’s a horrible response, absolutely horrible.

            p.s. I'm not going to run multiple discussions like the other thread; I wasn't even talking to you when I posted anywhere in this thread, but whatever, condense it under one roof or let it die.

          • clone
          • 6 years ago

          is accusations the wrong word? did using it leave you notably confused?

          reread Chuck's comment and substitute whatever descriptor you wish that suits the tone of Chuck's comments: "Nvidia can't cut prices, Nvidia is evil, AMD is awesome, Nvidia is a ripoff, it's all a lie."

            • Airmantharp
            • 6 years ago

            Did you not realize that he was making a joke? The Count can troll quite well if he chooses to, but this isn’t an example of his trolling ability :).

      • Hattig
      • 6 years ago

      The 290X release wouldn’t have made the same ripples had NVidia priced the 780 reasonably in the first place. A $150 price drop is not a small amount.

      Let's see how a $399 290 looks against this price-dropped 780. AMD also needs to get their game bundles active on their new cards as soon as possible.

        • HisDivineOrder
        • 6 years ago

        As much as I’d love a $399 R9 290, I don’t think AMD will do that. I think they’re going to go on with their original plan of putting out a $450-500 R9 290. I think if they ever considered bumping it up to $500, they probably won’t now. I doubt a $400 R9 290 is in the cards, though.

        At least not this year. Maybe in a few months.

          • Airmantharp
          • 6 years ago

          Here, I’ll disagree with you- I think that $399 is the perfect price point for the R9 290 non-X, and my prediction is that they’ll release it at that price.

          Hell, if you live anywhere near North Texas, I'll wager a beer!

    • superjawes
    • 6 years ago

    Nvidia price cuts on the 780…called it.

    Then again, so did 500 other people on this site 😆

    Now we just need to see how the 780 Ti performs, which will be more telling than the 290X review, I’m guessing. The Ti will certainly be faster than the 780, but it might not be able to beat the 290X, which could leave AMD with the top-end gaming card.

      • SCR250
      • 6 years ago

      [quote<]Now we just need to see how the 780 Ti performs, which will be more telling than the 290X review, I'm guessing. The Ti will certainly be faster than the 780, but it might not be able to beat the 290X, which could leave AMD with the top-end gaming card.[/quote<] If Nvidia is really pricing the 780 Ti at $699, then it's doubtful that it will be slower than the 290X.

        • superjawes
        • 6 years ago

        Well, we still haven’t seen the actual specs on the 780 Ti, so I’ll reserve judgement until then. My gut tells me that they will get above the 290X, but whether or not it’s “enough” faster is a different story.

        Just interesting, it is.

      • Pwnstar
      • 6 years ago

      Titan’s price was not cut, just like I predicted! So I called that one.

        • MFergus
        • 6 years ago

        It's exactly why comparing the price of the Titan to the 290X, or even the 780, never made sense. The Titan isn't expensive because of its gaming performance; it's expensive for its compute capabilities. They can't lower its price or it may cut into their Quadro sales more. There's nothing wrong with the 780 Ti being cheaper and a faster gaming card than the Titan.

        • superjawes
        • 6 years ago

        And Titan isn’t exactly a competitor to the 290X. It’s a good gaming card, but it’s not targeted at a gaming-only crowd like the 290X/780 and below are, which was also established earlier.

        • Klimax
        • 6 years ago

        Although I would love a Titan price cut (a second one for my system would be good), I didn't expect it. They don't like to torpedo Tesla too much…

      • JustAnEngineer
      • 6 years ago

      Considering the size of these cuts, is it any wonder that the folks that bought the NVidia cards a few weeks back are so mad at the world now?

        • Airmantharp
        • 6 years ago

        They should be pretty mad at themselves for not doing any real research. We've known this stuff was coming for months :/.

        • Klimax
        • 6 years ago

        So AMD buyers should have been mad at AMD when they slashed 7970 and introduced game bundles?

    • f0d
    • 6 years ago

    begun, the price war has 🙂

    this is good news for all involved, nvidia and ati fans alike

    • Star Brood
    • 6 years ago

    Lots of good NVIDIA announcements today.
