Nvidia announces GeForce RTX 2080 Ti, RTX 2080, and RTX 2070

During CEO Jensen Huang's pre-Gamescom keynote, Nvidia unveiled three new GeForce graphics cards using the Turing architecture. The RTX 2080 Ti uses a Turing GPU capable of up to 14 TFLOPS of single-precision floating-point performance for traditional rasterization, while the RTX 2080 weighs in at about 10.5 TFLOPS and the RTX 2070 is capable of about 8 TFLOPS.
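
Those figures line up with the usual back-of-the-envelope estimate of two FP32 operations (one fused multiply-add) per shader per clock. A rough sketch using the Founders Edition shader counts and boost clocks from the table further down:

    // Rough FP32 throughput estimate: 2 FLOPs (one fused multiply-add) per
    // shader per clock, using the Founders Edition boost clocks listed below.
    #include <cstdio>

    int main() {
        const struct { const char* name; int shaders; double boost_mhz; } cards[] = {
            {"RTX 2080 Ti", 4352, 1635.0},
            {"RTX 2080",    2944, 1800.0},
            {"RTX 2070",    2304, 1710.0},
        };
        for (const auto& c : cards) {
            double tflops = 2.0 * c.shaders * c.boost_mhz * 1e6 / 1e12;
            std::printf("%-12s ~%.1f TFLOPS FP32\n", c.name, tflops);  // 14.2, 10.6, 7.9
        }
        return 0;
    }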

Turing GPUs also include RT cores for accelerating operations important to ray tracing, such as bounding-volume-hierarchy testing and ray-triangle intersection, plus Nvidia's tensor cores for performing AI denoising of ray-traced scene elements. Huang says the processing power of these accelerated elements of the chip will require new measures of performance to account for the heterogeneous use of resources in producing hybrid-rendered scenes, since those resources can contribute tens of tera-ops beyond the typical single-precision FLOPS count.
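
For a sense of the work the RT cores offload, here is a minimal CPU-side sketch of the two tests named above: a ray-versus-bounding-box "slab" test (used while walking a BVH) and Moller-Trumbore ray-triangle intersection. These are generic textbook routines, not Nvidia's hardware implementation:

    // Textbook versions of the two inner-loop tests of BVH traversal; RT
    // cores accelerate this kind of math in fixed-function hardware.
    #include <algorithm>
    #include <cmath>
    #include <utility>

    struct Vec3 { float x, y, z; };
    static Vec3 sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    static Vec3 cross(Vec3 a, Vec3 b) { return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x}; }
    static float dot(Vec3 a, Vec3 b)  { return a.x*b.x + a.y*b.y + a.z*b.z; }

    // Does a ray (origin o, direction d) hit the axis-aligned box [bmin, bmax]?
    bool rayHitsBox(Vec3 o, Vec3 d, Vec3 bmin, Vec3 bmax) {
        float tmin = 0.0f, tmax = INFINITY;
        const float ro[3] = {o.x, o.y, o.z}, rd[3] = {d.x, d.y, d.z};
        const float lo[3] = {bmin.x, bmin.y, bmin.z}, hi[3] = {bmax.x, bmax.y, bmax.z};
        for (int i = 0; i < 3; ++i) {               // per-axis slab test
            float inv = 1.0f / rd[i];
            float t0 = (lo[i] - ro[i]) * inv, t1 = (hi[i] - ro[i]) * inv;
            if (inv < 0.0f) std::swap(t0, t1);
            tmin = std::max(tmin, t0);
            tmax = std::min(tmax, t1);
        }
        return tmin <= tmax;
    }

    // Moller-Trumbore: true (and hit distance t) if the ray hits triangle v0-v1-v2.
    bool rayHitsTriangle(Vec3 o, Vec3 d, Vec3 v0, Vec3 v1, Vec3 v2, float& t) {
        const Vec3 e1 = sub(v1, v0), e2 = sub(v2, v0);
        const Vec3 p = cross(d, e2);
        const float det = dot(e1, p);
        if (std::fabs(det) < 1e-8f) return false;   // ray parallel to the triangle
        const float invDet = 1.0f / det;
        const Vec3 s = sub(o, v0);
        const float u = dot(s, p) * invDet;
        if (u < 0.0f || u > 1.0f) return false;
        const Vec3 q = cross(s, e1);
        const float v = dot(d, q) * invDet;
        if (v < 0.0f || u + v > 1.0f) return false;
        t = dot(e2, q) * invDet;                    // distance along the ray
        return t > 0.0f;
    }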

The new cards rely on Nvidia's RTX stack to blend the performance of rasterization with the visual fidelity of ray tracing for effects like highlights, shadows, reflections, and ambient occlusion. RTX serves as a back end for Microsoft's DirectX Raytracing (DXR) API, announced back in March, and Epic Games has already integrated DXR features into Unreal Engine 4, to name just one example of industry adoption.
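
On the API side, an engine can ask Direct3D 12 whether the installed driver and GPU expose DXR before enabling any raytraced effects. A minimal sketch of that capability check, assuming an already-created ID3D12Device and a Windows SDK new enough to include the DXR additions:

    // Sketch: query D3D12 for DXR support before turning on raytraced effects.
    // Assumes `device` was created elsewhere; error handling trimmed.
    #include <windows.h>
    #include <d3d12.h>

    bool SupportsDXR(ID3D12Device* device) {
        D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
        if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                               &opts5, sizeof(opts5)))) {
            return false;  // feature query not recognized: no DXR
        }
        return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
    }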

The GeForce RTX series introduces a new Founders Edition cooler design that drops the centrifugal blower-style fan we know and love for a dual-fan open-air design that's closer to partner boards than designs of the past. Founders Edition cards have three DisplayPorts, a USB Type-C connector capable of hooking up to future VirtualLink devices, and an HDMI port. The RTX 2070 trades a DisplayPort for a DVI port.

                 Shader      Base clock  Boost clock  Memory  Memory  Memory bus  Memory      Board   Suggested
                 processors  (MHz)       (MHz)        type    size    width       bandwidth   power   price
RTX 2080 Ti FE   4352        1350        1635         GDDR6   11 GB   352-bit     616 GB/s    250 W   $1199
RTX 2080 FE      2944        1515        1800         GDDR6   8 GB    256-bit     448 GB/s    215 W   $799
RTX 2070 FE      2304        1410        1710         GDDR6   8 GB    256-bit     448 GB/s    185 W   $599

The GeForce RTX 2080 Ti Founders Edition will carry a suggested price of $1,199, while the RTX 2080 Founders Edition will list for $799 and the RTX 2070 will go for $599. Partner cards will apparently start at $999 for the RTX 2080 Ti, $699 for the RTX 2080, and $499 for the RTX 2070. Nvidia will begin taking pre-orders for the RTX 2080 Ti and RTX 2080 cards today, and they'll begin shipping on or around September 20. Details about partner cards are still rolling in, and we'll be writing them up as soon as we're able.

Comments closed
    • BIF
    • 1 year ago

    I want to know how well the 2080 TI is going to fold. It’s time for a warp core refit!

    • Gastec
    • 1 year ago

    After reading the comments below (but not only on this site) the following leitmotif emerges: "GTX 2080 Ti price is insane, but I've never said that because Nvidia is a for-profit-only company" 🙂
    So, in case those hallucinogenic mushrooms or your cryptocoin-speculator easy money make you lose track of numbers late at night, let me refresh your Clouded #resisting-to-common-sense minds: the MSRP of the GTX 1080 Ti FE was $699 at launch. The price of the GTX 2080 Ti FE is $1199. That's a 71.4% increase over the past generation. No, that's not insanity, it's just GREED! And it should be punished accordingly. PS: then again, all the crypto-miners and speculators will have no qualms about buying these cards, as they didn't actually work for the money, and Nvidia clearly knows it 😉

      • BIF
      • 1 year ago

      But have you seen the size of that chip? It’s as big as a wing, and that is expensive to make.

      • stefem
      • 1 year ago

      Maybe you are the one hallucinating. Turing is more than 50% bigger than Pascal, and cost doesn't scale linearly with size…

    • Austin
    • 1 year ago

    🙁 Oh nVidia. The prices may have been justified IF the performance merited it and although we don’t have solid data yet you can make some educated estimations. From memory for traditional non RT rendering the 2080 is likely to be SLOWER than the 1080TI albeit 20%ish faster than the 1080.

    😮 Eagle-eyed people have revealed the RT Tomb Raider demo was running on the big daddy 2080 Ti and only hit between 30 and 70 FPS… at 1080p! Who is going to plunk down $1000+ to run games at 1080p? Worth trying out sure, interesting certainly, but playable? Not even on the 2080 Ti, and how do you think the 2070 would manage?

    🙂 Okay so this is really early stuff and nothing is fully optimised or independently tested yet but RT is looking like it’s far too early and the cards simply aren’t fast enough in traditional rendering for the suggested pricing especially after a long 2 year wait. Why didn’t nVidia give benchmarks in their presentation and why hasn’t NDA lifted to show us all how these cards perform?

      • Krogoth
      • 1 year ago

      Wow, a trip down memory lane.

      It has been a while since you have last posted here.

      • techguy
      • 1 year ago

      “likely to be slower than 1080 Ti”

      /eyeroll

      https://techreport.com/news/34022/nvidia-releases-its-first-official-benchmarks-for-the-rtx-2080

        • Krogoth
        • 1 year ago

        The 2080 will probably struggle to match the 1080 Ti in 1440p/4K gaming performance simply because it doesn't have the ROPs and raw shading performance of GP102. It'll get close, but no cigar; it will at least consume less power doing it.

        The 2080's gimmick over the 1080 Ti will end up being that it is smaller and easier on power, and it happens to support "ray-tracing" acceleration.

          • techguy
          • 1 year ago

          https://techreport.com/news/34022/nvidia-releases-its-first-official-benchmarks-for-the-rtx-2080

          ROP and geometry performance become less relevant at higher resolutions as the bottleneck shifts toward shading and bandwidth constraints.

            • Krogoth
            • 1 year ago

            Actually, ROPs and geometry still matter. Just ask Hawaii, Fiji, and Vega about that.

            • techguy
            • 1 year ago

            That’s a different matter altogether as you’re now comparing architectures from different IHVs, and you don’t have to tell me that AMD GPU architecture is inferior to Nvidia’s in certain metrics (geometry likely foremost among them):

            https://techreport.com/forums/viewtopic.php?t=115821

    • freebird
    • 1 year ago

    The silver lining is that previously over-priced GTX 10xx-series cards (and RX 580s) are now very affordable with the glut of GTX 10xx cards due to the drop in crypto demand…

    If I didn't have several 1070s and Vega 56s right now, I'd be buying 1 or 2 1080 Tis now that you can get them for less than what the 1080 was selling for just a month or 2 ago.

    For only $526.50 + tax:

    https://www.amazon.com/ZOTAC-GeForce-352-bit-Graphics-ZT-P10810D-10P/dp/B06XXVVQYH

    • Chrispy_
    • 1 year ago

    The one thing I will say about that 2080Ti is that the reference cooler looks sweet.

    And by sweet, I mean high-quality, precision-engineered, compact, and absolutely no robot dragon monster military camo RGB-LED mecha-streisand montage of Taiwanese styling.

    • f0d
    • 1 year ago

    Wow, so many upset about the price.
    If it's too expensive for the price/performance, nobody will buy it.

    You don't need the fastest to play games; there will be cut-down cheaper versions that will be capable GPUs, as well as specials on older GPUs, and the competition will have some sales to counter this new release.
    So it's a win overall for consumers, imo.

    You can bet most (not all) people complaining are those that never bought the highest-end GPU anyways, or they are fans of the competition.
    Most people buying the fastest of anything usually don't care much about the price.

      • Gastec
      • 1 year ago

      So which are you, one who never complains or one who never cares? 🙂

        • f0d
        • 1 year ago

        It's out of my price range and I just don't care.
        And why should I? It's not like my computer is going to stop because Nvidia released a card.
        I never was able to get into the computer fanboy wars; sure, I'm a tech enthusiast, but I never cared who made my stuff, just that new stuff is made and it's all interesting.

    • WhatMeWorry
    • 1 year ago

    Just wanted to put out some thoughts/opinions of mine. Or raw meat to hungry wolves 🙂

    I'm not trying to disparage ray tracing in general, but for games, at least, when all is said and done, isn't ray tracing just for making games look more realistic?

    I'm not a huge game player, but at 1440p resolution, I can't recall ever thinking to myself that this game is inferior graphically and all it needs is ray tracing to become acceptable or just better.

    I believe ray tracing (and thus these cards) will be important, but just in more limited, non-game niches. I know about Moore's law, but I don't see this much functionality ever getting into the cost and power constraints of the mass-market console and PCs. Not to mention the software to be written and adopted. Hell, the industry can't even agree on a graphics standard: DirectX versus OpenGL. And the bastards at Apple have dropped OpenGL for Metal.

    In short, I predict that ray tracing will kinda end up like Virtual Reality.

      • Krogoth
      • 1 year ago

      Ray-tracing will most likely remain in the professional graphical scene for the foreseeable future.

      These new features are going to be game changers in the professional 3D graphics world. You no longer have to resort to renting out massive render farms for most projects.

      It will be another five to ten years before it'll start taking serious hold on the gaming scene (the next generation of consoles or two will have to support it).

      • techguy
      • 1 year ago

      What many people seem to not realize is that ray tracing (combined with insane levels of SSAA) is literally the reason why movie CGI looks so good compared to video games. When we have that level of power on-tap and can achieve that level of graphics in real-time, we’ll have reached the final plateau for graphical fidelity. I’m on board.

    • techguy
    • 1 year ago

    At this point we're just waiting on reviews. Actually, we're waiting to find out when the reviews will be out. I can't believe that wasn't communicated by NV yesterday. At previous events they made a big hubbub about handing out cards to the press, and reviews were up in a week or two. This time: nothing.

    When will we have independently-verified performance testing for RTX parts?

      • chuckula
      • 1 year ago

      "Actually, we're waiting to find out when the reviews will be out. I can't believe that wasn't communicated by NV yesterday."

      They claim the parts will be on sale as of September 20 (a month after their dog-n-pony show). That's probably a strong bet for your review date availability too.

        • techguy
        • 1 year ago

        I'm well aware of the sell date. If reviews aren't up until that day, that's quite a leap of faith NV is asking of buyers.

          • chuckula
          • 1 year ago

          I don’t care who the company is: Never trust a pre-order. Never.

            • techguy
            • 1 year ago

            Hence why I’m asking “when will the reviews be available” instead of just pre-ordering…

            • K-L-Waster
            • 1 year ago

            This times a Tera-

    • thx1138r
    • 1 year ago

    Lot of people complaining about the $1200 price tag. Not me. It’s really just a ploy to make the $600 RTX 2070 look reasonable. Hint, it’s not. Honestly, you’d need a serious monitor/PC setup to make such a card worthwhile and the sad fact is that the slightly sharper graphics are simply not going to make that game you are playing any more fun. $200 GPU’s FTW.

      • tipoo
      • 1 year ago

      The old Gold Apple Watch to make us forget that the Steel was still eye watering, plus every restaurant menu

      • Johnny Rotten
      • 1 year ago

      Here's the problem: UWD (3440×1440) or 4K monitors at high refresh rates (120 Hz+)…

      Widescreen *does* change the experience imo. And a 1080 Ti is stretching to drive that at high framerates (>100 Hz) with today's games.

        • thx1138r
        • 1 year ago

        Sure, all the extra bells and whistles like 1440p, ultra-wide, 4K, high refresh rate, etc. all make a noticeable difference to the gaming experience. I'm not saying they don't. My contention is that the difference they make is relatively small and increasingly fleeting. From my own experience, after getting engrossed in a game solidly for an hour or two, I've often been surprised by looking at the settings afterwards and seeing that I've been playing the game sub-optimally in some respect, say with medium instead of high settings, and you know what, it made absolutely no difference to my enjoyment of the game. A high-end graphics card simply cannot turn a bad game into a good one, and a good game is too engrossing for you to notice the minutiae of graphics card settings and performance.

    • LoneWolf15
    • 1 year ago

    Looks like it's time to buy a closeout GTX 1070 and go SLI.

    Not a single one of those is priced in a range I'd be interested in.

      • derFunkenstein
      • 1 year ago

      The same was true for me when the 1080 and 1070 launched. Cheaper parts always come later.

    • Chrispy_
    • 1 year ago

    AMD needs to quit slacking on the GPU front:

    - Polaris is amazing if you don't overclock it and ruin the power efficiency (aka the RX 580)
    - Vega is a bit meh, but it's the best Freesync experience I can get right now, and the HBM2 supply issues mean that availability and pricing are still higher than everything else, leading to poor performance/$

    Even if you're not a fan of AMD's GPUs, we need them to keep Jensen Huang's price creep in check.

      • Krogoth
      • 1 year ago

      AMD RTG doesn’t see any strategic reason to invest its limited R&D resources on a market that is about to undergo significant demand destruction in the next decade.

      They are going back to ATI's pre-Radeon days with the more dependable route of securing iGPU and semi-integrated solutions in the next wave of OEM systems and laptops.

        • NoOne ButMe
        • 1 year ago

        more like AMD knows even if they make a product that has similar performance, more VRAM (more future proof), at the same cost and uses slightly more power, and allows a cheaper build w/ freesync… They will be–no, they were–outsold 10x by the Nvidia part.

        Still, look at 480/470 and 580/570 sales versus 1060 3GB/6GB. It happened.

        AMD’s branding issues/Nvidia’s great branding-marketing led to this.
        Oh, and AMD’s complete and total failure to keep marketshare in gaming laptops. That hurts a lot.

          • Krogoth
          • 1 year ago

          It is because Nvidia stole the thunder with the massive marketing and fiscal successes of the GeForce 8xxx-9xxx family. It was so successful that it managed to keep sales of the GTX 2xx family afloat despite the fact that the HD 4xxx family were compelling options at the time (I know countless people who got a GTX 260/GTX 275 over an HD 4850, 4870, or 4890).

          It also helped them weather the disappointment of the GTX 480/470. The GTX 460/560 outsold practically every AMD discrete SKU at the time despite the fact that its direct competitor, the HD 68xx series, was the better option outside of tessellation (a niche at the time).

          Kepler's resounding success against GCN 1.0/1.2 pretty much sealed AMD RTG's fate in the high-end discrete market. Polaris appears to be a short-lived mid-range fluke that fell off as soon as the 1060 came out. The 970 utterly crushed GCN 1.2/1.4 in sales despite the whole 3.5GiB/0.5GiB memory partition controversy.

            • anotherengineer
            • 1 year ago

            Pretty much. Even if AMD RTG put out something twice as good for 1/2 the price of Nvidia, people would still pay for nvidia, it’s almost like the Apple RDF.

            I think AMD knows this, and would probably love some of those high margin profits, but wouldn’t get much if they can’t sell any cards.

            My budget has always been in the $200-$300 range and that’s where it’s staying. The way things are going, integrated will probably be fine for me once my current card goes out of driver support.

            • techguy
            • 1 year ago

            Nonsense. I buy AMD cards when they make sense for my budget and my performance demands. The last time for me that was the R9 290 (crossfire). Before that it was the 7950. Prior to that was all the way back to the X1900 (crossfire).

            AMD needs to compete for my money, they’re not a charity.

            • synthtel2
            • 1 year ago

            580s and below have been priced at parity with or below their Nvidia equivalents for a couple months now. How many times has “it’s a shame AMD isn’t competing on price” been said in that time, even just on this site?

            • techguy
            • 1 year ago

            Most people that are active on PC hardware sites such as this one aren’t interested in $200 graphics cards. Clearly we’re not talking about them in this news post discussion thread. AMD hasn’t competed at the high end for more than a brief stretch over the last 10 years in CPUs or GPUs, and never both at the same time.

            I will consider their products when they meet my needs. Until then, they’ll continue to play second fiddle.

            • synthtel2
            • 1 year ago

            The $200 segment looks less relevant now than it has in the past because progress in general is slowing down (so if we enthusiasts want real gains we’re going to end up paying more), but the $200 segment still massively outsells everything above it, and Nvidia still massively outsells AMD in that segment even when AMD is price-competitive in it. Available performance and price-performance ratios are clearly not the only things going on here.

            • NoOne ButMe
            • 1 year ago

            TL;DR: AMD had and has competitive products and still was outsold by over 10:1 in the competitive segments per the most recent Steam Hardware Survey.

            AMD competed. They had cards as fast or close to as fast as Nvidia's, at similar performance/dollar ratios.

            And Nvidia outsold them, oh, about 10:1 (current steam marketshare 12.5% to 1.14%) with the 1060 3/6GB against the 470/480 and their 500 series counterparts.

            Some of that will be due to miners having taken more AMD cards. But I don’t think it truly makes a serious impact.

            Then the 1050/Ti against the 460/560? 15.78% for Nvidia to 0.63% for AMD.

            Over 20:1. A large chunk of this is notebooks, but still, "only" 10:1.

            AMD had competitive products, AMD had a cheaper ecosystem.
            Hell, they still have it for those segments, but we’ll still see Nvidia cards sell 10x+.

            Yeah, AMD had slightly higher power usage (matters to a very small, <5% of market). A tad slower. But that might make you think NVidia outsells, oh, 2:1, maybe 3:1, even 4:1-5:1 on Steam now given AMD was purchased more heavily by miners.

            • anotherengineer
            • 1 year ago

            I buy them also, however I know people here that have bought a 1060 3GB for $50 more than an RX580 with 8GB, just cause they think or believe it’s better or superior in every way.

            • Redocbew
            • 1 year ago

            That’s true of just about any brand which becomes widely known to some degree. People are creatures of habit, and as much as it seems ridiculous at times the practice of branding wouldn’t be so ubiquitous if it didn’t work.

            That said, there do seem to be some companies which are better at it than others.

    • Zizy
    • 1 year ago

    I guess I will have to buy a console for the first time 🙁

      • Kretschmer
      • 1 year ago

      Or…buy one of the exceedingly capable Pascal or Vega GPUs that exist.

      • ColeLT1
      • 1 year ago

      Lots of 1080ti’s on sale

      • Klimax
      • 1 year ago

      From high-end to crippled console experience? Doesn’t make sense.

      • tipoo
      • 1 year ago

      Not like you have to buy the bleeding edge, or else then jump all the way down to consoles. If the PS5 launches in 2020 with a Navi GPU around the 1080s performance (as is expected), you’ll still be able to do better with options cheaper than these RTX parts are today.

      Granted if it’s Zen + Navi, a console at launch is a perfectly good cost/performance ratio.

      • jihadjoe
      • 1 year ago

      That China-exclusive AMD Fenghuang looks mighty attractive if someone can hack PS4/XBone firmware onto it.

        • thx1138r
        • 1 year ago

        Man I hope somebody puts one on a mini-ITX board… or a shuttle/Nuc style barebones box….. or even a Mac Mini 😉

        • tipoo
        • 1 year ago

        "if someone can hack PS4/XBone firmware onto it."

        So it's not attractive, because that's not happening? Heh. Even when a console is blown wide open for homebrew, that's a very different thing from an OS dump, which is next to impossible working backwards from hardware. Operating systems are compiled down to binary before install, so even if you went to extreme lengths to extract it from the storage, you wouldn't have usable source code, and good luck running a black-box OS from a binary you don't know anything about on new hardware. All homebrew usually does is bypass the thing that says "yes, this is a valid token to install."

    • Klimax
    • 1 year ago

    Now all I need is Iray to use it and DAZ to get that new version. Cutting down substantially on render times will be worth that price.

      • BIF
      • 1 year ago

      I agree with this. I think it will happen, and look forward to DAZ making an announcement.

    • travbrad
    • 1 year ago

    I become more and more glad I stuck with a 1080p 144hz monitor. Looks like my 970 is going to be in use for a loooong time.

    • Laykun
    • 1 year ago

    A little worrying. This seems like a "wait for RTX gen 2" kind of moment. For starters, they boast mostly about how many RTX-OPS the card can perform, but during NONE of the demos do they comment on the most important factor, frames per second. A fully RTX-processed scene (the ILM Star Wars demo) ran at 45 ms a frame, which is pretty worrying for people wanting to run games at 144 FPS (45 ms translates to 22.2 FPS). The very fact that actual real-world performance is not mentioned once during their demo presentations should be a warning sign. Sure, it's nice having all this RTX-OPS horsepower, but what does it actually translate into at 1440p/4K, even with the hybrid approaches? I have a bit of a problem with these cards being advertised as 6x the performance of the last generation, as Joe Blow is now going to instantly think that will translate into 6x speed in traditional rendering, whereas the numbers show a much more modest increase in rasterized rendering performance. It feels disingenuous to not make a point of how it'll perform while rasterizing.

    Furthermore, will the imbalance of processing between traditional rasterized rendering and raytracing mean one pipeline will effectively be waiting on another to finish and thus cause one or the other to go effectively unused? It looks like the new SM design combines RTX and FP32/INT32 into the same unit with FP32 and RTX running in parallel, but if there's more work for RTX than FP32, it really just seems you're going to leave FP32 idle and waiting while the frame finishes rendering. That sounds like a graphics programmer's nightmare (almost seems like going back to the days of fixed vertex and pixel pipelines).

    I’m very cautiously optimistic, it definitely seems like the future but I just can’t help but see caveats.
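
    For reference, the frame-time arithmetic above in one place (the 45 ms figure is the one quoted for the ILM demo; the other entries are just common refresh-rate targets):

        // Frame time to FPS: fps = 1000 / milliseconds-per-frame.
        // 45 ms/frame is roughly 22 FPS; a 144 Hz target leaves ~6.9 ms per frame.
        #include <cstdio>

        int main() {
            const double frame_times_ms[] = {45.0, 33.3, 16.7, 6.94};
            for (double ms : frame_times_ms) {
                std::printf("%5.2f ms/frame -> %6.1f FPS\n", ms, 1000.0 / ms);
            }
            return 0;
        }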

    • Techtree101
    • 1 year ago

    No HDMI 2.1? Sad :(. I guess the spec wasn’t ready in time. Not exactly needed this year, but next year I’d like to see an HDMI 2.1 video card paired with a 21:9 4K(~) monitor.

      • LostCat
      • 1 year ago

      Only thing I’ve seen qualified for HDMI 2.1 is the X1X…still waiting on the actual displays and such :/

      • BorgOvermind
      • 1 year ago

      HDMI is obsolete.

        • Krogoth
        • 1 year ago

        Nope, HDMI isn’t obsolete. It is the universal interface for the A/V world.

        For computers, it replaced the old S-Video, component, and composite ports that used to be on video cards.

    • NovusBogus
    • 1 year ago

    $600 for a x70 card? Holy smoke, and here I was thinking I’d get one of those to replace my aging 960.

    Worth noting, though, is that next-gen silicon is almost always more efficient but the 2070’s listed power requirement is 175-185W vs. 150W for the 1070 so it’s quite likely they bumped the performance to match the new pricing. One wonders if perhaps NV is running out of low hanging fruit and is going to try goosing the speeds while they search for the next big thing.

      • shiznit
      • 1 year ago

      This isn’t next gen silicon. TSMC 12nm is a custom 16nm.

    • renz496
    • 1 year ago

    BTW, when can we expect reviews to come out?

      • Pwnstar
      • 1 year ago

      Mid September.

    • marvelous
    • 1 year ago

    What a rip. Electronics prices have come down, except GPU memory and CPUs.

    The 2080 is supposed to perform similarly to the 1080 Ti. I knew they weren't going to kill sales of their old stock just to sell their new cards.

      • OptimumSlinky
      • 1 year ago

      I dunno, CPU prices are pretty solid right now. Ryzen brought much needed bang-for-your-buck.

        • Krogoth
        • 1 year ago

        Threadripper and Ryzen 7 completely upstaged the entire HEDT scene. It has never been a better time for a power user to set up a DIY workstation/low-end server.

    • synthtel2
    • 1 year ago

    When games aren’t leaning on the new raytracing features, the 2070 is probably similar to Vega (8 TFLOPS and 448 GB/s look like the only reliable numbers we’re going to get). A die without the raytracing hardware could crush Vega in perf/$, but the market isn’t quite such that it’s in Nvidia’s best interests to do that (maybe when Intel dGPUs hit the scene again). Selling a small die for $500 and raking in the profits would be normal, but taking the opportunity to bring in a whole new die-area-intensive feature set as standard really is more Nvidia’s style.

    An often-misunderstood thing is that we’re not looking at replacing conventional rasterization anytime soon. If you try to raytrace all the things in realtime, it’s going to look like garbage and/or run like a slideshow. What’s going on here is just replacing some particular fiddly effects with raytracing, which is still potentially slow (too slow for a 1080 Ti) but saves graphics programmers a lot of time and effort. Of course they aren’t putting their new tech up against any seriously stiff competition; we’re at the bottom end of raytracing’s capabilities and mostly the top end of other techniques, so there’s a lot more room to scale, but I’ll stay skeptical of its IQ/perf ratios even on these new cards until we see it out in the wild.

    As far as reflected environments go, it's pretty much the holy grail everyone thinks of it as. The usual way of doing that is a janky mess, and I can't wait until raytracing is standard enough that we can forget about the current way. Shadows, maybe. Raytraced shadows really are great, but the competition from shadow mapping is competent. I'm very skeptical about raytraced diffuse lighting (AO in this case). It just takes too many rays.
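
    To put rough numbers on "too many rays": a back-of-the-envelope sketch of the ray budget that AO alone would eat. The 4-samples-per-pixel count and the ~10 billion rays/s throughput are assumptions for illustration, not measured figures:

        // Back-of-the-envelope ray budget for raytraced AO alone.
        // Assumptions: 4 AO rays per pixel, a 60 FPS target, and roughly
        // 10 billion rays/s of total ray throughput on the top card.
        #include <cstdio>

        int main() {
            const double rays_per_pixel = 4.0;   // assumed AO sample count
            const double budget         = 10e9;  // assumed rays per second
            const struct { const char* name; double w, h; } modes[] = {
                {"1080p", 1920, 1080}, {"1440p", 2560, 1440}, {"4K", 3840, 2160}};
            for (const auto& m : modes) {
                double per_frame = m.w * m.h * rays_per_pixel;
                double per_sec   = per_frame * 60.0;          // at 60 FPS
                std::printf("%-6s %5.1f Mrays/frame, %4.2f Grays/s (%4.1f%% of budget)\n",
                            m.name, per_frame / 1e6, per_sec / 1e9,
                            100.0 * per_sec / budget);
            }
            return 0;
        }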

      • caconym
      • 1 year ago

      I think AO should be at least passable, given how well nvidia’s denoiser seems to work in other renderers. I do worry that frustum-culling of geometry is going to diminish the perceived realness of raytraced reflections somewhat, unless that’s somehow addressed in an API.

      We never had to worry about culling impacting reflections before, since everything was cube-mapped or screen-space.

        • synthtel2
        • 1 year ago

        I’m sure they can make AO work with the help of that denoiser, it just seems unlikely to be very good at IQ per time taken to render compared to world-space AO that already exists. That’s never stopped Nvidia before, though. “Hey gamers, RTAO is the best thing since sliced bread! Don’t look at those comparisons showing it takes 3x as long to run and doesn’t look any better, just buy a 2080 Ti, slam all the settings to ultra, and don’t worry about it!” (… the story of half of GameWorks technologies.)

        I expected it to be storing the whole scene and doing incremental updates; is it not? Dealing with the whole scene makes being strictly immediate-mode infeasible, but if they did want to stay immediate, picking a bounding box or sphere centered on the camera seems like a better bet. Sticking with the camera’s frustum is faster, but a pretty glaring weakness in image quality.

    • Thresher
    • 1 year ago

    Holy crap these are expensive.

    But if there is something all the mining taught them, it was that people were willing to pay that much to get a card.

      • Krogoth
      • 1 year ago

      Exactly, the whole mining craze was an accidental marketing test. Just to see if people were willing to go the extra mile on inflated prices. Nvidia didn’t have to risk mindshare and prestige to do it.

      They got the data and found that there were still enough buyers that would still be comfortable with higher MSRP price points and they could sell off their old stock without having to do price cuts.

      • albundy
      • 1 year ago

        I guess they wanna kill gaming. I wonder how this will affect game publishers' sales.

        • K-L-Waster
        • 1 year ago

        Seeing as the existing cards that existing gamers have in their existing systems *still work*… probably not much at all.

    • mikepers
    • 1 year ago

    I was tempted to pick up a Zotac 1080 mini @$449 via Newegg – but I’m guessing the 2070 will likely match or beat it performance wise for not much more money…

    Plus I like that the 20XX cards have the USB Type C connectivity. Did anyone catch if that is a regular port and can drive a monitor that has USB-C input? (not sure if “VirtualLink” is just another mode or something more specific)

      • DancinJack
      • 1 year ago

      Pretty sure it's solely for VL. VL is for VR, but I haven't seen much hype around it anywhere else.

        • mikepers
        • 1 year ago

        Yeah, I did a bit more digging and I’m leaning that way as well.

        The VL web site mentions it’s an alt mode of USB Type-C so maybe??

        Bottom line – guess I won’t be picking one of these up until they ship and I start seeing some reviews…

        • mikepers
        • 1 year ago

        Saw this over at Anandtech:

        “Finally, for display outputs, NVIDIA has confirmed that their latest generation flagship once again supports up to 4 displays. However there are actually 5 display outputs on the card: the traditional 3 DisplayPorts and a sole HDMI port, but now there’s also a singular USB Type-C port, offering VirtualLink support for VR headsets. As a result, users can pick any 4 of the 5 ports, with the Type-C port serving as a DisplayPort when not hooked up to a VR headset.”

        So looks like that port might be able to drive a monitor…

    • DeadOfKnight
    • 1 year ago

    Is that…a lack of SLI capability?

      • derFunkenstein
      • 1 year ago

      Multi-GPU is dead. Long live multi-GPU capability!

        • DeadOfKnight
        • 1 year ago

        Supposedly getting a new life with VR, but I haven’t heard much about it in awhile.

      • chuckula
      • 1 year ago

      So all your other posts are conspiracy-theory-level whining about how bad these cards suck and the fact that they cost money.

      And now you are upset that Ngreedia doesn't force you to pay for two of them?

      You need to pick a consistent position here.

      Incidentally, the higher-end 2080 & 2080 Ti cards do have SLI, although I doubt it's worth the effort:

      "While this wasn't mentioned in NVIDIA's Gamescom presentation itself, NVIDIA's GeForce 20 Series website confirms that SLI will once again be available for some high-end GeForce RTX cards. Specifically, both the RTX 2080 Ti and RTX 2080 will support SLI. Meanwhile the RTX 2070 will not support SLI; this being a departure from the 1070 which did offer it."

      https://www.anandtech.com/show/13249/nvidia-announces-geforce-rtx-20-series-rtx-2080-ti-2080-2070

        • DeadOfKnight
        • 1 year ago

        I never argued a position. I have a GTX 1080 in my system and will probably upgrade to one of these after the benchmarks come out. But you can’t deny that the way Nvidia sells their cards is a bunch of smoke and mirrors. Not to me though, I’ll wait for the reviews. All that smoke and mirrors is unnecessary. Is the product good? Sure. Is it 10x better? No, it’s not, and this happens every time. It’s better in a select few cases where their proprietary tech is used.

      • DancinJack
      • 1 year ago

      Dual GPUs via NVLink bridge. Gonna be essentially the same function with a better interface. /shrug

    • Xenolith
    • 1 year ago

    I’m assuming I have to wait for the 2060 before getting a mini-itx variant.

    • mikepers
    • 1 year ago

    EVGA 2080 XC GAMING is up for preorder at Newegg for $750…

    https://www.newegg.com/Product/Product.aspx?Item=N82E16814487404&ignorebbr=1

    The prices go up from there...

      • Freon
      • 1 year ago

      Paid $760 for my Gigabyte Auros 1080 Ti almost a year ago. Not a lot of progress. 🙁

    • DeadOfKnight
    • 1 year ago

    I love how they always manage to limit the scope of their performance comparisons to how well they can run this new feature rather than how well it can play most games. Notice also that they only say that it runs it faster, not that it runs fast.

    Don’t get me wrong, real time raytracing is probably going to be a big deal in the coming years, but here it joins gameworks as the next gimmick Nvidia uses to sell cards, likely implemented in a proprietary manner such that no other card can run it.

      • rahulahl
      • 1 year ago

      Firstly, if something runs faster than existing options, it is considered fast.
      Second, ray tracing is part of DirectX now. So if AMD want to, they can come up with a solution that lets their cards run it as well. It's not a GameWorks gimmick. Unlike HairWorks, it actually makes a lot of sense. Not to mention I love the idea of cheaper AA based upon the deep learning.

        • DeadOfKnight
        • 1 year ago

        All good points, except for one. Just because it’s faster, doesn’t make it fast. If enabling RTX mode causes fps to drop below 30, it’s going to be considered an unusable feature by most people. It’s an important feature for sure, as it moves the industry in the direction it needs to go for the future, but it should not be the main selling point for these cards. The DLAA definitely sounds interesting, and I look forward to seeing it in action. That should be the headlining feature for gamers IMO.

    • JosiahBradley
    • 1 year ago

    Seeing as anything negative gets auto downvoted, I’m just going to rate the announcement on the Krogoth Scale.

    78T Krogoths.

    Edit: Yes yes let the hate run through you. This is a terrible launch.

      • chuckula
      • 1 year ago

      Take that Navi card and strike me down! Do it boy!

      [Looks at the throne]

      Ok crap I forgot: It hasn’t launched yet. Ok, take a couple of those Vegas over there and try to maybe give me a flesh wound or something. It’s a start!

    • DancinJack
    • 1 year ago

    Still would have liked Adaptive Sync, but DP 1.4 is a step in the right direction as far as DP goes.

      • JosiahBradley
      • 1 year ago

      According to nVidia they are supporting DP 1.4a which is a variant of eDP 1.4a which includes panel self refresh or VRR. Let the rumor mill mill away.

      https://nvidianews.nvidia.com/news/10-years-in-the-making-nvidia-brings-real-time-ray-tracing-to-gamers-with-geforce-rtx

    • ptsant
    • 1 year ago

    I am old enough to remember the exact same reactions when nVidia first touched the $500 barrier.

    The fact is that high-end GPUs are becoming more like cars. Yes, there is improvement between generations, but a 2005 Porsche is still a Porsche, not the equivalent of a 2018 Prius. So you might be paying top dollar, but it's probably going to last quite a bit, unlike the time of the GeForce 6600 GT or the Radeon 9800, when performance doubled every 18 months.

    • Eversor
    • 1 year ago

    Guys, the crazy expensive card is a 750mm² GPU… there is no place on this planet where that can be cheap, margins aside. While Nvidia has been charging a lot for GPUs in all segments, a chip this size is always going to be particularly expensive, like the Titan V was.

    Hopefully, after it has settled, we will see the RTX 2080 go to $459-$499 and the 2070 somewhere below $400. The Ti? They’ll go to datacenters instead.

    Note: 8800 GTX was a 484mm² die with a launch price of $599.

      • NoOne ButMe
      • 1 year ago

      It's probably not just a 754mm^2 die, given the Quadro RTX 5000 exists. Given Nvidia's volumes, there's likely a smaller die too, about 60-70% the size of the large one. "Just" 500mm^2 or so.

        • Eversor
        • 1 year ago

        The 2080 Ti is using a TU102 GPU, which is that big die. The smaller TU104 is for the 2080, and it seems closer to 500mm². The original GTX 1080 was 314mm².

    • Johnny Rotten
    • 1 year ago

    So no actual benches from anyone for another month? No seed cards for review sites?

    • Leader952
    • 1 year ago

    Nice to see the real published official board power of 185 W for the RTX 2070, 215 W for the RTX 2080, and 250 W for the RTX 2080 Ti, instead of the leaked PNY 285 W figure that was spread about as FUD.

      • Pwnstar
      • 1 year ago

      Because leaks are always correct, right?

      • Phartindust
      • 1 year ago

      Was that PNY board configured with stock settings? May have been slightly overclocked, hence the extra 35w, or just an engineering sample that was pulling a bit more juice.

      Either way, doesn’t seem out of the realm of possibility we will see hot clocked cards pulling 285w

    • Kretschmer
    • 1 year ago

    My 1080Ti is a…value card?

    To me the relevance of these cards depends on how quickly Nvidia can push ray-tracing down to their mass market SKUs. If the new technique requires a 2070 or above and requires a lot of effort to implement, we probably won’t see RT often outside of a few Nvidia-sponsored titles. If it is easy to implement or available via a shiny new 2060 or 2050, then these cards will be (expensive) monsters.

      • Beahmont
      • 1 year ago

      Yeah, the 2060 and 2050 are the real cards to watch out for. If they have merely double the ray tracing performance of a 1080, then it's pretty much game over for traditional rasterization.

      And from what the developers are indicating, adding ray tracing is tedious, but not difficult. It's just adding information to the models and world details. So it's something they can just throw a few interns and a project lead at and have it done. From their perspective it's practically free performance and practically free image quality.

      • derFunkenstein
      • 1 year ago

      I think it’ll be a while before RTX hits $300-and-under graphics cards. There’s a reason this is the largest piece of silicon Nvidia ever shipped, and it’s not because of the traditional rendering pipeline.

    • brucek2
    • 1 year ago

    Is there still a lot of Pascal inventory out there? If so I wonder if one purpose for announcing high prices now is to convince some buyers that they are safe to buy previous generation today at good prices?

    • blastdoor
    • 1 year ago

    Hmm…. I was starting to find myself persuaded that perhaps discrete GPUs would indeed become a niche, but if real time ray-tracing becomes a thing everybody wants, then discrete GPUs might be around for quite a while longer.

    Or…. perhaps discrete CPUs will become a niche 🙂

      • chuckula
      • 1 year ago

      What if Threadripper 2 and Turing were clear proof that…. INTEGRATED GRAPHICS IS DYING!

      • Pwnstar
      • 1 year ago

      The top of the GPU market will always be discrete, due to size and power reasons. But you are correct that that segment might become niche.

    • USAFTW
    • 1 year ago

    Will the FE tax also spill over to partner boards like with the initial batch of the 10-series? AFAICR, no partner cards worthy of note were priced below the FE version.

    • derFunkenstein
    • 1 year ago

    I think before we cry too much about the price tag (which is elevated because it’s a Founder’s Edition pre-order) we should first look at performance.

    And let’s face it, part of this is a fee for getting into real-time ray tracing at the ground level. Either you’re willing to pay it or you’re not.

    • MOSFET
    • 1 year ago

    I wanted to find out why the name changed from Jen-Hsun to Jensen, and I learned some things.

    “After college he was a Director at LSI Logic and a microprocessor designer at Advanced Micro Devices, Inc. (AMD). He is the uncle of AMD CEO, Lisa Su.” -Wikipedia

      • derFunkenstein
      • 1 year ago

      And today he was the daddy of AMD.

    • Stochastic
    • 1 year ago

    So when can we expect 7nm cards? Late 2019? 2020? My guess is that’ll be the true next big performance jump.

    Also, if raytracing acceleration isn’t baked into Navi, we probably won’t see widespread support for it for a long time. Raytracing would indeed be a paradigm shift in realtime computer graphics, but I don’t think we’re there yet. It also remains to be seen what the performance hit is for enabling RT.

      • chuckula
      • 1 year ago

      Do I expect Nvidia to have some 7nm product out in 2019? Yes.

      Do I expect it to be a *consumer* product (and that even includes $1000+ halo parts)? No. Instead I'm expecting the *really* high-end Volta replacement to ship in some capacity, and even that is probably late 2019.

      • K-L-Waster
      • 1 year ago

      So you’re expecting that games won’t use ray tracing unless AMD implements it in hardware?

      I’m not expecting it to be widespread any time soon either, but it would be short sighted of game makers to ignore the chip maker that has 70% of the AIB market…

        • NTMBK
        • 1 year ago

        It’ll probably be in as an optional Ultra graphics setting, especially if Nvidia sponsor the game.

          • Leader952
          • 1 year ago

          No, it will be a simple switch (or auto-detected) for ray tracing to be enabled. Look at the examples of game trailers published today; it was a simple on/off type switch.

            • NTMBK
            • 1 year ago

            You… seem to be describing a graphics setting?

        • Stochastic
        • 1 year ago

        I would agree with you IF the PS5/Xbox Whatever use Nvidia chips. If they use AMD chips, then AMD still wields a ton of influence among developers. Of course, this is assuming it’s difficult to swap from rasterization to raytracing. Nvidia is making it sound like this will be relatively trivial for AAA studios. I guess we’ll find out soon enough.

          • Beahmont
          • 1 year ago

          You’re not understanding the process then. Developers love this. It makes their life stupidly easier. They get better graphics for just building better world models. They don’t have to do much of anything to get much better graphics performance and looks. Ray Tracing on Turing is all internal to the card, it just has to have the information to work. It’s not hard at all for the developers to just add the information needed to the models.

            • smilingcrow
            • 1 year ago

            The point is that they are doing extra work that only benefits a small percentage of users when you add together consoles and PCs.
            Which is why NV will need to subsidise this a lot initially I imagine which is another reason why the cards are so expensive maybe!
            Buy these cards and you are indirectly paying developers to implement features unique to these cards.

            • Beahmont
            • 1 year ago

            Umm… No. That’s not how this works actually.

            Adding the additional world info is just data entry. It's cheap and easy. On the other side, they get to stop doing much more expensive post-processing work and additional art work to clean up graphics issues with traditional rasterisation.

            Developers like Nvidia’s Ray Tracing implementation because it reduces their overall work and increases performance and quality.

            • nanoflower
            • 1 year ago

            I think it's you that is missing the point being made. It's not about how easy it is to implement. It could be done by Nvidia and it wouldn't matter if only a small percentage of gamers can take advantage of it (because the cards are so expensive). It's not until a large percentage of gamers have cards that can use DXR features that developers "get to stop doing that much more expensive post-processing work and additional art work to clean up graphics issues with traditional rasterisation." Given how the market typically goes, we are looking at at least a few years before the market penetration of DXR features in game machines is high enough, so those gains you are talking about will take some time to arrive.

            • VincentHanna
            • 1 year ago

            Or they could simply half a$$ the “mid-range” post, design for ray-tracing, save 1000s of work hours, and get very pretty marketing shots and demos with little-to-no effort.

            Of course, I’m sure no corporation would ever think that way.

          • Klimax
          • 1 year ago

          It's in Unreal Engine. There shouldn't be much difficulty in enabling the option. Most expenses related to this feature will be in testing.

        • NoOne ButMe
        • 1 year ago

        https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam

        Based on MSRP pricing, it will be something around 3-4% of Steam users surveyed. Optimistically 10% of the target market. Excluding consoles.

        • VincentHanna
        • 1 year ago

        I mean, the entire purpose of Ray Tracing is to make the life of game designers easier. Anyone who continuously beats their head against the wall of designing realistic looking lighting and/or stage design when RTX is a thing, would be particularly foolish.

      • HisDivineOrder
      • 1 year ago

      I expect AMD to announce something like OpenRTX that competes with nVidia and completely negates the feature for several generations because few companies will want to wade into the waters of a format war.

        • Leader952
        • 1 year ago

        So you want AMD to ignore Microsoft’s DirectX 12: DirectX Raytracing (DXR) and produce a proprietary API. That is really stupid.

          • Pwnstar
          • 1 year ago

          A hypothetical OpenRTX would not be proprietary, what are you talking about?

            • Phartindust
            • 1 year ago

            Well AMD already has Radeon Rays available. I don’t know enough to say that it is the same or similar to RTX, but it seems to be doing the same thing. Would someone with more knowledge of game dev and APIs care to comment?

            https://gpuopen.com/gaming-product/radeon-rays/

            • BurntMyBacon
            • 1 year ago

            As I understand it, Radeon Rays is equivalent to what you'd get if nVidia had implemented RTX as CUDA code running on top of the GPGPU cores. There does not appear to be any fixed-function or ray-tracing-dedicated hardware involved here.

    • smilingcrow
    • 1 year ago

    Those diamond lined Ostrich leather jackets don’t come cheap folks.
    Jensen has to pay for them somehow and he wears a new one every day; only joking about the dry cleaners.

    As expensive as it is I suspect that the majority would prefer one of these alongside an 8C CPU rather than a TR 2950X with a $400 GPU.
    So maybe this is HEDT for graphics cards?
    If they’d called it a Titan maybe they’d have got away with it but people will compare it to the 1080 Ti and their brains will explode.

    • USAFTW
    • 1 year ago

    Can’t wait to read up on how they managed to beat a Titan Xp with that 2070 despite the significantly fewer number of ALUs. Will the reviews also be up a month from now?

      • DancinJack
      • 1 year ago

      Sept 20 retail release, so no later than that.

      edit: FWIW that’s for the 2080/Ti, not the 2070. I suspect more like a Nov/Dec timeframe for 2070.

    • K-L-Waster
    • 1 year ago

    And based on the pictures we don’t even get a leather jacket this time around…

    • christos_thski
    • 1 year ago

    So Jensen Huang managed to talk approx 2 hours presenting a new GPU without a single effing mention of performance relative to previous gens? (please don’t tell me that nvidia’s arbitrary RTflops metric counts, that’s like comparing hairworks performance. let’s call it “shinyflops” until a game actually makes use of raytracing for anything better than an additional reflection here and there).

    Whoa.

      • NTMBK
      • 1 year ago

      Yeah, I’m curious too. They blew a bunch of die area on features that most games won’t use (yet), Tensor Cores and RT Cores. Might not be that big a jump over the 1080ti in traditional games.

        • christos_thski
        • 1 year ago

        I was thinking that too, though we might be wrong. But I would be shocked if there were substantial raster improvements and he glossed them over just to hype RT.

        Guess AMD and intel get another chance at competing, then.

    • tipoo
    • 1 year ago

    I actually find I don't have sticker shock: they substantially improved traditional raster performance at the same time as adding a massive new logic block for RT, on a 12-nm node. It was going to be expensive.

    I feel like the real game starts with 7nm, when this feature set can be sold at a smaller size/price.

    I’m also wondering if AMD has a response to this – this is a whole new architecture that took years of work, and if they’re going to have an RT response soon, they’d have had to have been working at it for years.

      • leor
      • 1 year ago

      Did I miss the part where he spoke about traditional raster performance? I didn’t catch the beginning, but it sounded like he wanted to re-frame how we look at performance with RT only.

        • drfish
        • 1 year ago

        You didn’t miss it. Everything was based on fancy new made-up metrics.

        • rahulahl
        • 1 year ago

        It's in the part where he mentioned the Infiltrator demo running at 78 FPS on the 2080 Ti compared to 30 FPS on the 1080 Ti?

        I might be misremembering the exact numbers, but it was in there.

          • Demetri
          • 1 year ago

          Wasn’t that with the neural network AA technique he was talking about which uses tensor cores? DLSS they called it. I’m still not clear if it was a fair comparison.

            • rahulahl
            • 1 year ago

            If DLSS is something you can apply via the control panel to existing games, then I would consider that a fair comparison. To me it basically is a cheaper AA, which is good because I have AA enabled in all my games. So decreasing the cost of AA sounds like a fair boost in performance to me.

    • NTMBK
    • 1 year ago

    Poor Volta

      • chuckula
      • 1 year ago

      #WeWereRight

      #WeNeverSaidPoorTuring

      #TuringMakesYouPoor

        • Srsly_Bro
        • 1 year ago

        #PoorUs

    • End User
    • 1 year ago

    I learned my lesson. I’m going to stick with my GTX 1080 Founders Edition until the 2080 Ti partner cards ship with kick ass coolers.

      • DancinJack
      • 1 year ago

      I don’t disagree with what you said, but these coolers look significantly better than the previous Founders Edition blowers.

        • End User
        • 1 year ago

        Perhaps. I’ll wait for the benchmarks to be sure. I can wait a few months this time around.

        • Leader952
        • 1 year ago

        They are also quoted as 1/5 the noise level of the GTX 1080 Ti at full power. If that is true I don’t believe that 3rd party cooling will be any quieter.

        • End User
        • 1 year ago

        Boom!

        https://www.evga.com/technology/icx2/

      • Kretschmer
      • 1 year ago

      I like the blower cooler on my 1080 Ti FE. It’s great for a mITX case.

        • End User
        • 1 year ago

        I do like the blower style from the point of view of it dumping the hot air directly to the outside of the case.

        If I go with an open air cooler I’ll have to rethink my case fan arrangement. 🙁

      • DavidC1
      • 1 year ago

      Dual-fan setups by their nature cool better and are quieter than blower ones. With a blower, not only do you have one fan, meaning it needs to spin at higher RPMs, it also needs to expel the air all the way to the back of the card, creating that vacuum-cleaner noise.

    • DancinJack
    • 1 year ago

    So it looks like all the OEM cards can (and should) start 50-100 bucks less than the Founders Editions. Jensen said starting at 499 so I assume that means 2070 OEM price.

    edit: Actually
    2080 Ti from 999
    2080 from 699
    2070 from 499

    Those prices are better, for sure. Now let’s hope OEMs don’t jack them up too far.

      • DavidC1
      • 1 year ago

        Please go to Newegg.com and tell me how much the cards are.

        MSRPs have been misleading since the 1080 and Founder's Edition came out. The FE has the crappiest coolers and charges the most. AIBs are going to price theirs high, as they have better-quality PCBs and fans.

        • DancinJack
        • 1 year ago

        $749 for the lowest 2080. 50 bucks isn't INSANE, but it'd still be nice to see models at $699.

        https://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709%20601321492%20601321493&IsNodeId=1&bop=And&Order=PRICE&PageSize=36

    • chuckula
    • 1 year ago

    For all the outrage, let’s do the timewarp again:

    "AMD's Radeon Vega Frontier Edition graphics card launches today, and we have official answers for most of the questions that have arisen about the Vega GPU's specs and capabilities over the past couple of months. The Frontier Edition family will comprise a pair of cards: an air-cooled model that's launching today for $999, and a liquid-cooled version that's launching in Q3 for $1499. AMD isn't talking about the specs or performance of the liquid-cooled card today, but it's divulging enough for me to conclude that my predictions of the chip's clocks and peak performance in May were correct for the most part."

    https://techreport.com/news/32163/updated-radeon-vega-frontier-edition-launches-today-for-999-and-up

    Before you say "but that's not the same!", the "Frontier Editions" aren't official workstation cards; those are the "WX" series parts that launched later (in fact the $1000 Vega-56 WX equivalent literally just launched within the last week... so the $999 "Frontier Edition" was cheap in comparison).

    I think the difference in the level of outrage here is that nobody was really all that interested in the Frontier Edition products, so there was little whining about price. However, the level of interest in Turing is a whole lot higher, hence the increased level of indignation.

      • cynan
      • 1 year ago

      But that's not the same! Frontier Edition cards *were not* gaming cards. And with the HBM2 delays, there were only like 10 for sale at launch anyway, regardless of consumer interest.

      People can be outraged all they want, but if Nvidia can get $1000+ for their flagship gaming card, then so be it. People were paying way more for a 1080 Ti only a few months ago.

        • derFunkenstein
        • 1 year ago

[quote<]Frontier Edition cards were not gaming cards[/quote<]

Please. They used the same GPU as the current Vega 64. They use the same drivers. AMD could say they're not gaming cards but that doesn't change what they actually are.

          • synthtel2
          • 1 year ago

          Being a gaming card requires a price tag not massively out of line with gamers’ expectations. Some non-gaming segments are at least as much defined by wanting the best available and being insensitive to cost as by any particular feature that may not be available on gaming cards. Neither of them is Firepro/Quadro-class, and regardless of why exactly the price is so high, gamers aren’t going anywhere near that stuff unless they’ve got money coming out their ears.

          Personally, I think Titan and Frontier are both ridiculous. Whether all that extra money is buying some dregs of extra performance because Nvidia’s in a position to halo like crazy or slightly earlier access because AMD promised investors it’d be a product in 1H doesn’t change the picture much in my mind.

          • Krogoth
          • 1 year ago

          Actually, the drivers and firmware for Frontiers have support for some professional-tier stuff and have 10-bit color.

          It is more accurate to say that Frontier Vegas are really just Vega-based FirePro “rejects” that ate too much power at load.

Frontiers are AMD RTG's answer to Nvidia's Titan line-up. Both are really just marketing experiments in selling off Quadro/FirePro "rejects" as "prosumer/hobbyist" SKUs.

            • derFunkenstein
            • 1 year ago

            ACKSHOOLEY every GCN Radeon and later had [url=https://community.amd.com/thread/200047<]10-bit color support[/url<] in full-screen applications. And I can't find anything to prove that it works with windowed apps like Photoshop. So here's a big [citation needed] for your trouble. I know they marketed it as a "not gaming card" but let's be honest about what it is.

            • Krogoth
            • 1 year ago

Actually, none of the consumer-tier Radeon SKUs have proper 10-bit color support. It is locked behind firmware and software, just like on Nvidia's consumer-tier SKUs, despite the fact that the silicon can easily do it and the cards are equipped with the right ports for it (HDMI/DisplayPort).

            You need to get a Quadro/Titan/Frontier/FirePro if you want proper 10-bit color support.

            • derFunkenstein
            • 1 year ago

            ACKSHOOLEY you still need to back up your argument. I submitted evidence that directly cites AMD and Club 3D sources and you say “no that’s not right”. Well, fine then. Prove it.

            • Krogoth
            • 1 year ago

If you actually read your sources, it is painfully obvious that proper 10-bit color support is still locked behind firmware and software limitations, not hardware. The software is looking for Quadro/FirePro/Titan/Frontier cards, not GeForce/Radeon cards.

It is good old market segmentation at work. I have a bloody RX Vega 64 on hand and I can tell you first-hand that it doesn't have proper 10-bit color support. I cannot use 10-bit color in the Windows 10 UI or in most applications that actually support it.

            • derFunkenstein
            • 1 year ago

[quote<]in Windows 10 UI[/quote<]

Exactly. That doesn't mean that they can't do 10-bit color, though.

            • Krogoth
            • 1 year ago

            The silicon can easily do it. GCN 1.x/Post-Fermi silicon are able to do 10-bit color. The problem is that proper 10-bit color support is locked behind firmware and software limitations due to market segmentation.

            Nvidia and AMD RTG are not going to be cannibalizing the healthy margins on their 2D professional artist and video editing market.

          • cynan
          • 1 year ago

I've lost count over the years of how many times workstation cards have had the exact same silicon as gaming SKUs. Often, the only difference has been drivers. And there very much was a "PRO" driver for the FE at launch, irrespective of whether it works with RX Vega drivers.

If we're being honest, the real reason the Frontier Edition ever existed was so AMD could get a limited-production (relative to gaming SKUs) Vega-based card to market, in the midst of the HBM2 supply fiasco, by the end of H1 2017, because Lisa Su promised. But I don't see why that disqualifies it from being a workstation card. You (and many) may think it was just a glorified RX Vega (though it did have more VRAM), and you have a point. But given the tenuous distinctions between pro and gaming SKUs in the past, I don't see why the FE is being singled out.

      • synthtel2
      • 1 year ago

      Frontier was a Titan-equivalent. That one really doesn’t compare.

      • Demetri
      • 1 year ago

      Nobody was interested in Frontier because we knew products with the same game performance at half the price were only a couple months away. If you told me a $500 2080 Ti would be available in November, I wouldn’t be shaking my head at today’s $1000+ price tag.

      • NoOne ButMe
      • 1 year ago

      AMD was up front and clear about what these cards were, and included extra driver support.
      Bad buys, and same cooling solutions as reference cards, yes. But AMD wasn’t trying to tell everyone they should buy these cards instead of AIB cards.

Nvidia charged an extra $50-100 for the reference cooler.

      • DragonDaddyBear
      • 1 year ago

My memory may be fuzzy, but couldn't you switch between gaming and professional drivers with that card? Is that something all Vega cards can do?

      • Redocbew
      • 1 year ago

      Wha…?

Oh, yeah. There was a product launch today, so there's going to be drama. I wonder if there's a quantum mechanical explanation for the constant churn and speculation about these things. You might expect it to die down after a while, but it never really goes away, sort of like the vacuum fluctuations that permeate so-called empty space.

      Maybe this is why I’ve never understood the whole fanboy outrage thing.

      • Sahrin
      • 1 year ago

What a bunch of idiotic tribal whataboutism.

        • chuckula
        • 1 year ago

So basically every post you've ever made is a real example of "idiotic tribal whataboutism," but a real-world, factually accurate example that doesn't conform to your prejudicial hatred is suddenly "tribal whataboutism." Check.

So we're back to the usual "anything AMD does is always good because of my subjective emotional state"... again.

Once again, the major takeaway here, and the thing that REALLY gets under the skin of shills like you: when Nvidia charges a lot for a product, people get upset [b<]because they want the product[/b<]. When AMD does the same thing -- and they do -- people don't get upset [b<]because nobody cares[/b<]. As for the prices, Nvidia makes you pay a lot for interesting technology. Since Vega launched, AMD has been making you pay a lot to support the mining market.

          • thx1138r
          • 1 year ago

          Sorry, how does buying a Vega mean you are supporting the mining market exactly? Surely if you buy a new Vega to play games you are taking one away from the miners, and making it slightly harder for them to get one themselves, i.e. hurting the mining market.

          • Sahrin
          • 1 year ago

          Lol, I’ve been on tr longer than you’ve been alive, kid.

            • chuckula
            • 1 year ago

            You just insulted me as a neckbeard yesterday and today I’m supposedly less than 12 years old?

            Fascinating…..ly stupid on your part.

            Funny how you also literally just said that you can’t keep up with a 12 year old.

            • Jeff Kampman
            • 1 year ago

            Enough of the insult-posting from both of you, thanks.

      • Sahrin
      • 1 year ago

      What price did AMD’s top-line 1080 Ti competitor Vega 64 launch at?

You're also rather stupid for implying that we were all OK with the launch price of Vega. I don't remember saying anything positive about those numbers, either.

    • DavidC1
    • 1 year ago

    I don’t think the pricing has to do with mining, nor inventory.

    I think it has to do with wanting higher revenue. They did that for the past few years.

      • Krogoth
      • 1 year ago

The mining-craze price spike tested the waters for higher MSRPs, and there were non-miners who were still buying cards at inflated prices.

I was expecting a $999 MSRP for the 2080 Ti, but Nvidia is being bolder with $1,199. I suspect it will still sell like hotcakes.

        • kuraegomon
        • 1 year ago

        If it turns out to be a big step up in 4K performance, then yep, many hotcakes will be sold.

        • K-L-Waster
        • 1 year ago

Last gen, the FE cards were ~$100 over the MSRP for third-party boards. If the same happens this time around, the custom AIB 2080 Ti may be closer to $1,099... which is still vertigo-inducing, of course.

        • DavidC1
        • 1 year ago

        Little to do with mining.

If you can convince the majority of buyers to pay more, that results in a direct increase in revenue. That's what they want.

Every quarter they say revenue increased by a lot and that PC gaming isn't dead. But how much of that is a volume increase, and how much is the increased price? This is a tactic.

          • Krogoth
          • 1 year ago

The mining craze indirectly caused it. Etailers milked the market when demand outstripped supply, and that tested the waters to see if the market was willing to accept higher MSRPs. It turns out the high-end market is more than willing to, and Nvidia wants a bigger piece of the pie.

            • DavidC1
            • 1 year ago

            Mining was merely a distraction.

Price increases have been happening for generations. Top cards used to be $600 max. Slowly, they raised prices. They charged a mere $649 for the 980 Ti, a 600 mm² part.

Then they called the next gen "1080" and priced it starting at $599 for a 314 mm² card. Really, though, you were paying $650 to $700, because the reference version (no, wait, "Founder's Edition") was $699 with a noisy blower, and expecting AIB cards with much better coolers to cost $100 less than that was crazy.

            Nevermind the Titan cards.

You think it's limited to graphics?

Intel's Extreme Editions were considered overpriced at $999. The 7980XE cost what, $2k? Companies have become much better at extracting money and increasing revenue. It's not just the tech world.

            • Beahmont
            • 1 year ago

So basic economics and plain inflation don't happen? Card prices can't stay the same unless costs stay the same. If costs change, the price will change. Costs to manufacture have gone up, so prices are going to go up as well. Thinking you can pay the same price for a top-end card or CPU forever is just plain dumb.

And that's not even taking into account that performance has also been increasing with each generation of GPUs. And if Nvidia are to be believed, performance of these cards is well more than double the previous generation but cost didn't double.

This analysis just isn't rational.

            • DavidC1
            • 1 year ago

            “And if Nvidia are to be believed, performance of these cards is well more than double the previous generation but cost didn’t double. ”

Double the performance in ray tracing is not double the performance overall. The increase in non-ray-traced rendering is likely going to be 30% or so.
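For a rough sanity check on that figure, here's some back-of-envelope arithmetic using the shader counts and boost clocks from the table above plus the GTX 1080 Ti's published specs; this is napkin math, not a benchmark:

[code<]GTX 1080 Ti: 3584 shaders x 2 FLOPs/clock x 1.582 GHz ≈ 11.3 TFLOPS
RTX 2080 Ti: 4352 shaders x 2 FLOPs/clock x 1.635 GHz ≈ 14.2 TFLOPS
14.2 / 11.3 ≈ 1.26  ->  roughly 25-30% more raw shader throughput[/code<]

Actual game performance could land above or below that depending on memory bandwidth, sustained clocks, and how much Turing's architectural changes help in traditional rasterization.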

      • tipoo
      • 1 year ago

Seriously wondering what the AMD answer to this is going to be. The RT bits are a whole new architecture years in the making; if AMD is going to have a response soon, they'd have to have been working on it for years already.

    • Star Brood
    • 1 year ago

$1,200 for a video card. I really don't care how insane the price/performance ratio is. These would be used for video games. Video games. Sorry man, no. For pro use, charge whatever you want. But VIDEO GAMES. Come on. These prices are astronomical. High-end cards used to be priced completely reasonably.

      • K-L-Waster
      • 1 year ago

      If the price is too high (and it may well be — that’s a lotta sticker shock right there…) they won’t sell much and NV will drop the prices to move units.

      Wouldn’t recommend being an early adopter at the very least. Wait 6 months or so and see how it shakes out.

        • renz496
        • 1 year ago

They will drop prices when they're able to move more Pascal stock and when AMD releases something new to the consumer market.

        • albundy
        • 1 year ago

        considering this, it looks like i might hang on to my 970ti for a decade.

          • techguy
          • 1 year ago

          I would do that too if I had a one-of-a-kind card. Should probably get it signed by Jen-Hsun and put it in a museum.

        • Sahrin
        • 1 year ago

        No they won’t. We’re competing with big data and the finance industry for good dice, and they’re willing to pay a lot more than we are.

        If they don’t sell to us, nVidia will just turn around and sell them to the ibanks and 2nd-tier deep learning companies for five times what we are paying.

That's what this increase is all about: enforcing on gamers the higher margins they've been getting from the pro space.

          • Krogoth
          • 1 year ago

Actually, gaming-tier silicon isn't attractive to big data and major AI customers because it eats too much power (thermal limitations for clustering).

            • Sahrin
            • 1 year ago

            Right, that must be why nVidia’s biggest growth segment is deep learning.

            Volta is literally their first datacenter-only product and the most popular version of that card comes with a display head.

          • K-L-Waster
          • 1 year ago

          No one is forcing you to buy one.

          Seriously, folks, newly launched graphics cards being too expensive is the epitome of first world problems. If you feel it’s too expensive, keep your money in your wallet.

      • renz496
      • 1 year ago

And video games are not a necessity in life. Some people get mad that their expensive hobby is becoming more expensive, haha. But realistically, if you just want to play games, there is no need to go "high end" for it; current mid-range hardware is already capable of that, and it does a very good job of it. The thing is, we consumers keep falling for GPU-maker marketing. 1080p? No, we should push for 4K. Already got 4K? Then we should push for 4K 144 Hz. And we want to play everything with maxed-out settings without sacrificing those 144 FPS, or else we can't call it "gaming."

        • Gastec
        • 1 year ago

        I’m pretty sure the ones selling us these electronic products don’t have the “I really don’t need to make so much money” attitude.

      • rudimentary_lathe
      • 1 year ago

      A few years ago a new GTX 970 could be had for ~$300 USD. Two generations later that same tier now goes for $600 USD. Sure the new cards are much more capable, but that’s always been the case and the pricing has never blown up to this extent. And Nvidia’s margins suggest there isn’t much cost pressure.

      So… monopoly pricing anyone?

        • techguy
        • 1 year ago

Yeah, the pricing situation is not good for consumers. We can't lay the blame entirely at the feet of Nvidia, though; they simply would not be able to sell these products at these inflated prices (compared to previous generations) if there were ANY competition in graphics. At this point, AMD's highest-tier SKU will compete with the RTX/GTX 2060 – maybe. Without a 7nm refresh to bring down prices, we might just see Vega go EOL, as who in their right mind would buy a $600 card that competes with a $300 card and consumes 2-3x the power?

          • BurntMyBacon
          • 1 year ago

          Two points of interest:
          1) I thought Vega 7nm was for the professional market only. IIRC Navi was the 7nm chip for the consumer market. So isn’t Vega going EOL in the consumer market anyways? (O_o)

          2) Given the theoretically low (based on clocks and resources) improvement of the announced RTX2xxx series cards over their GTX1xxx series counterparts in traditional rasterization, I’m having some doubts as to whether the RTX/GTX 2060 you mentioned will match Vega64. I’m going to wait for some reviews of the current cards before I pass judgement.

That's not to say that Vega isn't priced poorly for its performance. The combination of mining and the use of more expensive memory technology hasn't been kind here. Even if AMD continues to use HBM for its highest-end SKUs, I think AMD would do well to release GDDR-based high-end options instead of relegating them to midrange and lower.

        • Krogoth
        • 1 year ago

They are competing against themselves, and their shareholders don't want them to give up the healthy margins on old stock yet.

The idea is that Nvidia wants customers to clear out its excess Pascal stock (mainly GP104 and GP106) before the lesser Turing SKUs enter the fray.

        • K-L-Waster
        • 1 year ago

        AMD repeatedly shooting themselves in the foot in the GPU space doesn’t make NVidia a monopolist. It isn’t Jensen’s fault only one team showed up ready to play.

      • aspect
      • 1 year ago

      I bet this isn’t even the true Ti version in the sense that they’ll release something like a Ti Black edition in under a year for like $1500.

      • sonofsanta
      • 1 year ago

I just checked a 980 launch review, and that top-end card came out at $549. That's cheaper than the third-tier card here (because we all know, full well, the partners [i<]won't[/i<] bring their cards out at the suggested price, but as close to the FE price as they can get away with). Absolutely ridiculous. This is even more absurd market inflation than the iPhone X.
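For what it's worth, ordinary inflation doesn't come close to explaining the gap. Assuming roughly 1.5-2% US inflation per year since the GTX 980's late-2014 launch (an assumption, not an official CPI figure), the napkin math looks like this:

[code<]GTX 980 (2014): $549 x ~1.07 cumulative inflation ≈ $590 in 2018 dollars
RTX 2070 FE: $599   (roughly the inflation-adjusted 980)
RTX 2080 FE: $799   (~1.35x it)
RTX 2080 Ti FE: $1,199 (~2x it)[/code<]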

      • Chrispy_
      • 1 year ago

      $1200 buys you [i<]a complete gaming PC[/i<] that achieves 95% of what is possible from current games.

      • Kretschmer
      • 1 year ago

      Don’t be a bleeding edge adopter, then. Nvidia knows that it can soak these folks, and they know that they’re being soaked.

      Pascal is still a fine choice.

      • Eversor
      • 1 year ago

      For perspective:

[code<]- 2080 Ti, TU102, 754 mm²
- 2080   , TU104, ~500 mm² (it's smaller than TU102)
- 1080 Ti, GP102, 471 mm²
- 1080   , GP104, 314 mm² ($599 at launch)
- 8800GTX, G80  , 484 mm² + I/O chip ($599 at launch)[/code<]

Now, please ELI5 how they're "massively jacking up prices" when these are such different chips from last gen. The 2080 Ti is a "prosumer" card that packs a lot of silicon... ask someone for the math on yields at those die sizes.

The Pascal cards were overpriced: you got midrange-sized chips at high-end prices, with a 256-bit memory bus. Are these overpriced too? The Founders Editions surely are, or all of them are if rendering current game code isn't a reasonable improvement over Pascal.
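To make the yield point concrete, here's a rough sketch using a standard dies-per-wafer approximation and a Poisson yield model. The 0.1 defects/cm² figure is an assumed, illustrative number, and salvage parts (the 2080 Ti uses a cut-down TU102) would improve the real economics:

[code<]300 mm wafer area ≈ 70,700 mm²
Dies/wafer ≈ (wafer area / die area) - (pi x 300 / sqrt(2 x die area))
  TU102 (754 mm²): ≈ 94 - 24 ≈ 69 candidate dies
  GP102 (471 mm²): ≈ 150 - 31 ≈ 119 candidate dies
Yield ≈ exp(-defect density x die area), assuming 0.1 defects/cm²:
  TU102: exp(-0.754) ≈ 47%  ->  ~32 good dies per wafer
  GP102: exp(-0.471) ≈ 62%  ->  ~74 good dies per wafer[/code<]

Under those made-up but plausible assumptions, a good TU102 die consumes roughly 2.3x as much wafer as a good GP102 die, before memory, board, and cooler costs.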

      • freebird
      • 1 year ago

      Yeah, I would like to see more games implement DX12 Multi-GPU support.
      Several games do.

[url<]https://linustechtips.com/main/topic/860201-dx12-multigpu-game-support/[/url<]

I'm sure many PC gamers would appreciate the ability to add a second (reasonably priced) GPU as a mid-life kicker/upgrade. I would love to find a tech/gaming website that tests how well multi-GPU gaming does or doesn't work... (hint)

      • TurtlePerson2
      • 1 year ago

Look at cars: you can pay $12,000 or $500,000 for a two-seat car with a small trunk. One accelerates two or three times as fast as the other. Some people pay for the more expensive car and think it's worth it.

      The way I see it, it’s nice that there are people willing to pay ridiculous sums for things, as it subsidizes the low-to-medium tier products that I buy.

      • kvndoom
      • 1 year ago

We've passed the $1000 price point for smartphones. "Whatever the market will bear," as they say. Ugh.

      • BorgOvermind
      • 1 year ago

So we get cards with 10-to-12% more performance at 10+% higher prices (comparing each of the 70, 80, and Ti tiers).
Pass. My 1070 is fine for a few more years.

That dude makes a face like he's presenting a fake card, just like nV did before. (Let's recall the fake one they used wood screws for: [url<]http://overmind.ro/Rev/images/FakeF_NoCnx.jpg[/url<] )

And why do they have to name them like Radeons? They haven't run out of alphabet yet; G was fine. Oh... maybe because they took a little (more) inspiration from the Radeon 4:1 shader architecture since the 1k series.

    • aporetic
    • 1 year ago

    My guess is nVidia saw the eyewatering prices people were willing to pay during the mining craze and realized that they were leaving a lot of money on the table that otherwise went to scalpers and middlemen. It’s frustrating, but unless/until AMD can cook up a proper response there’s no reason we should expect anything different.

      • kuraegomon
      • 1 year ago

      This is probably the most intelligent response to the, ahem, [i<]elevated[/i<] prices that I've seen here. This is simple supply/demand curve stuff. NVidia now has purchasing data that suggests that their dominant performance and market-share position on the high end can be translated into (much) higher margins. Only AMD can save us from this, by becoming more competitive. NVidia is a publicly-traded company, with a responsibility to its shareholders to maximize revenue and profits. Their prices will drop if-and-only-if the market responds by not buying sufficient units at their current price.

        • Thresher
        • 1 year ago

        It will be interesting to see if the 1080 demand stays level, even with the availability of these new cards.

        If it does, then it means people don’t see the value in the new cards.

          • Krogoth
          • 1 year ago

The 1080 has already been phased out in favor of the 1070 Ti (much cheaper to make, and it yields 98% of the 1080's performance).

        • VincentHanna
        • 1 year ago

You are assuming something that is kind of central here.

        That because the price went up, their margins went up.

        This chip is 60% larger and has the equivalent of a spare GPU dedicated to accurate light rendering. Maybe the extra $100 went there?

      • Eversor
      • 1 year ago

If mining remains viable, there is a kind of subsidy on the GPU's price, since you can do something very useful with otherwise idle computing resources. So you have to put the price into the perspective of these new "subsidies."

You could always do useful things with idle hardware (folding), but this is probably one of the first eras in the history of computing where you also get money back for it. Keep that in mind.

        • drkskwlkr
        • 1 year ago

        Mining has not been viable since at least late February, early March. No GPU purchased at or above retail price in 2018 will ever mine enough currency to pay for itself, let alone cover the electricity bill.

      • adamlongwalker
      • 1 year ago

I agree. I also think they priced it this way to keep the 10-series relevant, since Nvidia has an overstock of more than a million of these chips (rumored to be 3 million).

The lowest third-party price I've found today for a new 1080 Ti is about 600 bucks.

That's only a 100-dollar drop from its launch price. That's not a huge drop for an older chip, and the same goes for the regular 1080.

I've already seen Asus putting a premium on their cards, so their 2070 is going for around 870 dollars at this moment.

You see what I'm getting at? This is why the launch date is so late; I believe they are trying to get rid of old stock ASAP. $600 for a new 1080 Ti is rather tempting.

Nvidia, however, has a virtual monopoly on the video card market, to the point of having major influence over the video gaming industry, an industry that I am well versed in.

To put it mildly, I am concerned about all of this because of the lack of competition right now. Nvidia can do whatever it pleases and everyone else has to toe the line.

      • JustAnEngineer
      • 1 year ago

      NVidia’s evil marketing geniuses must have had a really good [url=https://www.youtube.com/watch?v=IGqwqxRF598<]laugh[/url<] when they saw those prices.

      • NovusBogus
      • 1 year ago

      This, pretty much. NV will charge what it thinks the market will bear, and the market will either bear it or not. It’s how these things work.

      • anotherengineer
      • 1 year ago

Nah, they saw the markups Apple can get away with and said "let's do that too," and people paid it, so that's the way it's going to be going forward.

    • DancinJack
    • 1 year ago

One thing I will drag Nvidia for: no mention of VRR/VESA Adaptive-Sync. I wasn't necessarily expecting it, but it'd still be nice.

      • brucethemoose
      • 1 year ago

      Is Turing HDMI 2.1?

        • DancinJack
        • 1 year ago

        [quote=”Nvidia”<]DisplayPort 1.4, HDMI 2.0b3[/quote<] 🙁

      • Krogoth
      • 1 year ago

      Only for mobile SKUs.

    • chuckula
    • 1 year ago

    Cheap: No.

Overpriced: If the RTX 2070 is about on par with the GTX 1080 Ti but has the ray-tracing extras, then probably not. If it's only on par with the GTX 1080, then we'll see.

      • mutercim
      • 1 year ago

Yes it is. $1,200 is an insane price for a GPU intended for games, even if it can hit 3,000 fps at 8K in every game.

That said, it's also insane to complain that a for-profit company with no competitors in its field is asking what it thinks people will pay. That's what companies DO. If you don't like the price, don't buy the product. It's that simple.

        • torquer
        • 1 year ago

        Purely subjective. Plenty of people spend WAY more than $1200 on hobby stuff:

        RC cars
        Golf Clubs
        Road bikes
        Offroad bikes
        Car parts
        Workout equipment

        etc etc etc. “insane” to one person is standard fare to another

          • ptsant
          • 1 year ago

Although you have a perfectly valid argument, the fact is that people who CHOSE golf or horse riding or yacht sailing as a hobby were aware of the costs and planned for them. Conversely, when you choose PC gaming as a hobby, you have certain expectations.

          If a $500/year hobby becomes a $1000/year hobby, people are likely to complain. So, yes, it is subjective but that doesn’t mean it exists in a vacuum.

          EDIT:
          Apparently I wasn’t very clear. I’m not arguing that the cards should be priced differently. In fact, as long as $200-$400 cards improve, I’m happy. But I don’t buy the argument that just because a person X spends $100K on racing cars another person Y should not care about GPU prices, just because both are hobbies. Past prices shape individual expectations and norms. Just because the market can bear a certain price, doesn’t mean that I, as an individual, have to accept this price as normal within my subjective frame of reference. Again, this is not a guide to how GPUs (or anything) should be priced, but rather an explanation of why I’d rather compare with the past than extrapolate from different hobbies.

            • derFunkenstein
            • 1 year ago

            The only valid argument is what the market will bear. If Nvidia can’t sell these things, expect price drops. If they can, then is it really that insane?

            • ptsant
            • 1 year ago

            I never said the prices are insane. But the prices that make sense for nVidia, the market in general, PC enthusiasts and specific individuals do not have to agree. If the market can bear $5000 GPUs but the readership of this site can’t, this seems a very legitimate reason to complain (even though complaining won’t change anything).

            • DancinJack
            • 1 year ago

Yeah, because a set of good golf irons used to cost 1,200 bucks? C'mon man. That's a really weak argument.

            • derFunkenstein
            • 1 year ago

Yeah, there's no way you can spend more today than you could a year ago on a [url=https://www.golfgalaxy.com/p/callaway-epic-irons-steel-17cwympcstl4wirn/17cwympcstl4wirn?uniqueID=3115245<]set of golf irons[/url<].

Note: I know nothing about golf, but no hobby on earth gets cheaper as time passes, unless it's collecting dust.

            • DancinJack
            • 1 year ago

            I’ve been golfing since I was 8. The comparison is just ludicrous. Golfing is insanely more expensive than it used to be.

            • derFunkenstein
            • 1 year ago

            I think we’re in agreement here, but you’re getting negged for it.

            • DancinJack
            • 1 year ago

            Oh we are, I was just emphasizing that I have considerable experience in this particular arena.

            • jihadjoe
            • 1 year ago

            I think the difference is that those irons will still be pretty good in 10 years time, but today’s top-dog GPU will be midrange fare in 3 years.

            • smilingcrow
            • 1 year ago

Except that, as with golf clubs, there is a wide range of price points.
If your budget is $500, then you stick with that, so nothing has really changed.

            • K-L-Waster
            • 1 year ago

The difference is that with GPUs there's a segment of the gamer population who seem to think they have some sort of inalienable right to top-drawer performance at bargain-basement prices...

            • Waco
            • 1 year ago

            Nobody needs these cards to play games. GPUs from years ago (and modern APUs) are still perfectly capable of driving an enjoyable experience assuming you aren’t addicted to sliders that go to 11.

            • Redocbew
            • 1 year ago

            Ackshully, I’m pretty sure it does. Regardless of circumstance if you can’t wheel and deal your way out of paying X amount for Y widget, then you’re stuck with the choice to buy or not. The fact that it might suck doesn’t mean much unless everyone else agrees that it sucks and refuses to buy.

            Aside from that, whingers will always find a reason for whinging. They don’t need help.

          • Goty
          • 1 year ago

          Can confirm. At one point, my road bike was worth more than my car.

          • Redocbew
          • 1 year ago

          Can also confirm.

          Bike people are weird. There was a point for me in which I thought anyone who had such a fascination with the inner workings of the wheel must surely be a geek at heart, and therefore share some common interests. There was no such thing and I did not recognize the place in which I found myself. Bad reference point, I suppose.

          • f0d
          • 1 year ago

I have gone back to cars as a hobby, and they are way more expensive than computers.
My Mitsubishi Delica obsession has cost me over 12k over the last few years.
My custom watercooled beast (two 360 rads, massive case, skt2011 when it came out, and a bunch of stuff) only cost me about 3k when I built it, and I'm still using it.
I don't even think GPUs have gotten much more expensive; I bought a TNT2 Ultra from Asus with shutter glasses in the 90s for $550, and that felt much more expensive than $1,200 does now.

            • Waco
            • 1 year ago

            Yep. Cars can be far more expensive, even on the cheap end of the hobby. Hell, my QuickJacks were $1200. I appreciate them a lot more than I would a new GPU, though.

          • cphite
          • 1 year ago

          How many of those things do you have to replace every couple of years?

          A good RC car can last several years if it’s maintained; same thing for bikes, and even workout equipment. A good set of golf clubs can last you decades if you take care of them.

          And for each of these things, while there are certainly people who will go out and buy the latest and greatest whenever it comes out, that is mostly optional. It’s not a requirement just to maintain the same level within the hobby.

You can buy a good bike, keep it maintained, and it'll ride just as well after five years as it did the day you bought it. You can ride the same routes, handle the same obstacles; the only variable is you. It's incredibly unlikely you're ever going to find that you can't ride the same courses your friends are riding unless you buy a new bike.

          If your hobby is playing the newest games at the highest settings, you basically have to plan on upgrading every couple of years. And, even if you’re willing to buy something less than cutting edge – say, the $600 card instead of the $1200 card and play at less-than-maxed-out setting – if you expect to maintain the same level of performance, you still have to replace every few years.

            • techguy
            • 1 year ago

            What your analogy fails to account for is the fact that new video games are more demanding than old ones. It would be akin to expecting your 5-year-old Huffy to go tackle some insane downhill mountain course at 70mph.

          • WhatMeWorry
          • 1 year ago

          And don’t forget the always popular booze, gambling, and whores.

      • CScottG
      • 1 year ago

-For consumer cards, judged against prices BEFORE "mining", these are absolutely overpriced.

THEN you add in the pent-up demand for a new card.. (..again, another consequence of "mining".)

..and at that point "overpriced" is rather meaningless.

      • Austin
      • 1 year ago

The specifications indicate the RTX 2080 will be about 20% faster than the GTX 1080 but 15% slower than the GTX 1080 Ti.

Early (okay, very early) indications are that even the RTX 2080 Ti will struggle to ray-trace a game at 1080p, so forget trying it on the 2080 or 2070. Traditional rendering at 4K, or struggling with RT at 1080p, on a $1000+ card. I can see why nVidia didn't reveal any benchmarks and why there's still an NDA in the way of independent reviews.

That RT performance estimate is based on the new Tomb Raider demo, which showed the 2080 Ti achieving only 30-70 FPS at 1080p.

        • techguy
        • 1 year ago

        But they did, and I’ve already mentioned this to you:
        [url<]https://techreport.com/news/34022/nvidia-releases-its-first-official-benchmarks-for-the-rtx-2080[/url<]

    • DancinJack
    • 1 year ago

    Bunch of whiners every time Nvidia does anything.

    What you guys should be upset about is AMD severely underperforming in the GPU space.

      • karma77police
      • 1 year ago

      You go ahead and pay $1,200

        • DancinJack
        • 1 year ago

        I won’t be, but there are plenty that will. There just isn’t another card that can get you that type of performance. Definitely not from AMD. Until then, keep whining all you want at Nvidia I guess.

          • karma77police
          • 1 year ago

There is nothing wrong with Vega 64. It performs right between the 1080 and 1080 Ti, and in some games runs better than the 1080 Ti. We have yet to see AMD's new card; I'm sure it will land between the 2080 and 2080 Ti, but for way cheaper.

            • DancinJack
            • 1 year ago

            I doubt it. But here’s to hoping!

            • derFunkenstein
            • 1 year ago

            with a 2019 launch, it better be! 😆

            • drkskwlkr
            • 1 year ago

            AMD decided not to compete on the discrete GPU market but instead to make money from CPUs and gaming consoles. At this point in time, Intel is more likely able to offer a competitive alternative to high-end Nvidia than AMD. Get over it.

      • Demetri
      • 1 year ago

      I’m sure AMD would like to compete if they had the means, but they don’t. I can’t really blame AMD for something that isn’t their choice. Nvidia jacking up prices is their choice, so yeah I’ll throw some shade at them for that.

        • DancinJack
        • 1 year ago

        What are you talking about? Ryzen is doing just fine. They’ve put out amazing GPUs in the past. What don’t they have now that they had in the past decade?

          • Demetri
          • 1 year ago

          The only reason Ryzen is somewhat competitive is because of Intel’s missteps, and it still gets killed in single threaded performance. AMD was lucky Intel has been stagnating since Sandy.

          “What don’t they have now that they had in the past decade?”

          Money. R&D.

            • DancinJack
            • 1 year ago

            When in the last decade did they have Money and R&D?

        • HisDivineOrder
        • 1 year ago

Nobody forced AMD to buy ATI, go deep into the red doing so, and drag the entire PC industry down by effectively only having enough cash afterward to compete in either CPUs or GPUs at any one time.

        Nvidia knows AMD is in their “compete with Intel” period, so they can do whatever they like.

          • DancinJack
          • 1 year ago

          In the meantime, Intel is putting a ton of resources into building a legit GPU.

          /shrug

      • HisDivineOrder
      • 1 year ago

      I can be upset about both.

        • DancinJack
        • 1 year ago

        Hard way to live my friend.

    • PrincipalSkinner
    • 1 year ago

    $1200?!
    nGreedia strikes again.

      • chuckula
      • 1 year ago

      But they didn’t strike FIRST.

        • PrincipalSkinner
        • 1 year ago

        It’s the union workers that did.

    • karma77police
    • 1 year ago

$1,199 for the 2080 Ti and $800 for the 2080? F. you, Nvidia. A 1080 Ti for under $600 is the best option right now... seriously. And forget 4K gaming... a 2K 144 Hz monitor will do the job.

      • Srsly_Bro
      • 1 year ago

No doubt. This doesn't make me feel bad about the 1080 Ti I picked up earlier this year. I wonder if the high price is meant to launch a new GPU line while also clearing out inventory. The high price is high.

        • karma77police
        • 1 year ago

It is time to get an Nvidia 1080 Ti. I was waiting and finally decided the 1080 Ti is the right card at the right price now, at <= $600. 4K gaming, huh? Like I want to pay $1,200+ for a 4K 144 Hz G-Sync monitor; another f. no. I am happy with a regular 2K 144 Hz monitor, it does the job.

          • Srsly_Bro
          • 1 year ago

          Unless the 2070 is faster for $600, it could be smart to wait a bit longer.

          • renz496
          • 1 year ago

No, you should be happy with 1080p at 60 FPS. If you do that, you will never get mad at the insane pricing of high-end GPUs.

        • Krogoth
        • 1 year ago

        Bingo

      • GasBandit
      • 1 year ago

      Still sittin pretty at 60hz in 1080p with my 1060 3 gig, and loving it.

        • Srsly_Bro
        • 1 year ago

        But have you experienced 1080Ti 1440 144hz Gsync?

          • JustAnEngineer
          • 1 year ago

          Who can afford expensive proprietary G-Sync?

            • Krogoth
            • 1 year ago

            “G-Sync Tax” is a drop in the bucket when compared to the cost of these new SKUs though.

            • ptsant
            • 1 year ago

            Not when you have to buy a new monitor to get G-Sync. It’s not like you just pay a $100 upgrade for your existing monitor, it’s the fact that migration from FreeSync to G-Sync means a whole new monitor (yeah, you can sell the old one).

            • K-L-Waster
            • 1 year ago

            The premium over a non-G-Sync is what, the cost of 3 AAA games? It’s not like we’re talking about a second mortgage here…

            • JustAnEngineer
            • 1 year ago

            Take the 144 Hz IPS 27″ 2560×1440 Nixeus EDG-27 at $440 as your reference for what variable refresh rate monitors cost when they follow the VESA adaptive sync standard.

            • jihadjoe
            • 1 year ago

            Or the price difference between Vega 64 and a GTX 1080. Thanks, miners!

            • Kretschmer
            • 1 year ago

            The people who buy G-Sync monitors?

        • Redocbew
        • 1 year ago

        Stay strong, bro.
