AMD’s Radeon R9 295 X2 graphics card reviewed

Several weeks ago, I received a slightly terrifying clandestine communique consisting only of a picture of myself in duplicate and the words, “Wouldn’t you agree that two is better than one?” I assume the question wasn’t truly focused on unflattering photographs or, say, tumors. In fact, I had an inkling that it probably was about GPUs, as I noted in a bemused news item.

A week or so after that, another package arrived at my door. Inside were two small cans of Pringles, the chips reduced to powder form in shipping, and a bottle of “Hawaiian volcanic water.” Also included were instructions for a clandestine meeting. Given what had happened to the chips, I feared someone was sending me a rather forceful signal. I figured I’d better comply with the sender’s demands.

So, some days later, I stood at a curbside in San Jose, California, awaiting the arrival of my contacts—or would-be captors or whatever. Promptly at the designated time, a sleek, black limo pulled up in front of me, and several “agents” in dark clothes and mirrored sunglasses spilled out of the door. I was handed a document to sign that frankly could have said anything, and I compliantly scribbled my signature on the dotted line. I was then whisked around town in the limo while getting a quick-but-thorough briefing on secrets meant for my eyes only—secrets of a graphical nature, I might add, if I weren’t bound to absolute secrecy.

Early the next week, back at home, a metal briefcase was dropped on my doorstep, as the agents had promised. It looked like so:

After entering the super-secret combination code of 0-0-0 on each latch, I was able to pop the lid open and reveal the contents.

Wot’s this? Maybe one of the worst-kept secrets anywhere, but then I’m fairly certain the game played out precisely as the agents in black wanted. Something about dark colors and mirrored sunglasses imparts unusual competence, it seems.

Pictured in the case above is a video card code-named Vesuvius, the most capable bit of graphics hardware in the history of the world. Not to put too fine a point on it. Alongside it, on the lower right, is the radiator portion of Project Hydra, a custom liquid-cooling system designed to make sure Vesuvius doesn’t turn into magma.

Mount Radeon: The R9 295 X2

Liberate it from the foam, and you can see Vesuvius—now known as the Radeon R9 295 X2—in all of its glory.

You may have been wondering how AMD was going to take a GPU infamous for heat issues even with only one chip on a card and create a viable dual-GPU solution. Have a glance at that external 120-mm fan and radiator, and you’ll wonder no more.

If only Pompeii had been working with Asetek. Source: AMD.

The 295 X2 sports a custom cooling system created by Asetek for AMD. This system is pre-filled with liquid, operates in a closed loop, and is meant to be maintenance-free. As you can probably tell from the image above, the cooler pumps liquid across the surface of both GPUs and into the external radiator. The fan on the radiator then pushes the heat out of the case. That central red fan, meanwhile, cools the VRMs and DRAM on the card.

We’ve seen high-end video cards with water cooling in the past, but nothing official from AMD or Nvidia—until now. Obviously, having a big radiator appendage attached to a video card will complicate the build process somewhat. The 295 X2 will only fit into certain enclosures. Still, it’s hard to object too strongly to the inclusion of a quiet, capable cooling system like this one. We’ve seen way too many high-end video cards that hiss like a Dyson.

There’s also the matter of what this class of cooling enables. The R9 295 X2 has two Hawaii GPUs onboard, fully enabled and clocked at 1018MHz, slightly better than the 1GHz peak clock of the Radeon R9 290X. Each GPU has its own 4GB bank of GDDR5 memory hanging off of a 512-bit interface. Between the two GPUs is a PCIe 3.0 switch chip from PLX, interlinking the Radeons and connecting them to the rest of the system. Sprouting forth from the expansion slot cover are four mini-DisplayPort outputs and a single DL-DVI connector, ready to drive five displays simultaneously, if you so desire.

So the 295 X2 is roughly the equivalent of two Radeon R9 290X cards crammed into one dual-slot card (plus an external radiator). That makes it the most capable single-card graphics solution that’s ever come through Damage Labs, as indicated by the bigness of the numbers attached to it in the table below.

|                    | Peak pixel fill rate (Gpixels/s) | Peak bilinear filtering, int8/fp16 (Gtexels/s) | Peak shader arithmetic rate (tflops) | Peak rasterization rate (Gtris/s) | Memory bandwidth (GB/s) |
| Radeon HD 7970     | 30  | 118/59  | 3.8  | 1.9        | 264 |
| Radeon HD 7990     | 64  | 256/128 | 8.2  | 4.0        | 576 |
| Radeon R9 280X     | 32  | 128/64  | 4.1  | 2.0        | 288 |
| Radeon R9 290      | 61  | 152/86  | 4.8  | 3.8        | 320 |
| Radeon R9 290X     | 64  | 176/88  | 5.6  | 4.0        | 320 |
| Radeon R9 295 X2   | 130 | 352/176 | 11.3 | 8.1        | 640 |
| GeForce GTX 690    | 65  | 261/261 | 6.5  | 8.2        | 385 |
| GeForce GTX 770    | 35  | 139/139 | 3.3  | 4.3        | 224 |
| GeForce GTX 780    | 43  | 173/173 | 4.2  | 3.6 or 4.5 | 288 |
| GeForce GTX Titan  | 42  | 196/196 | 4.7  | 4.4        | 288 |
| GeForce GTX 780 Ti | 45  | 223/223 | 5.3  | 4.6        | 336 |

Those are some large values. In fact, the only way you could match the bigness of those numbers would be to pair up a couple of Nvidia’s fastest cards, like the GeForce GTX 780 Ti. No current single GPU comes close.
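For the curious, those peak figures fall out of unit counts and clock speeds in a fairly simple way. Here's a quick back-of-the-envelope check in Python; note that Hawaii's shader-ALU count isn't listed in the table above, so the 2,816 figure below is an assumption layered on top of the clocks and memory specs mentioned earlier.

```python
# Back-of-the-envelope check on the R9 290X's peak-rate figures.
# Assumption not stated above: Hawaii has 2,816 shader ALUs, each doing
# one fused multiply-add (2 FLOPS) per clock, and its 1,250MHz GDDR5
# runs at an effective 5 Gbps per pin.

shader_alus    = 2816        # assumed Hawaii shader count
clock_ghz      = 1.0         # the 290X's 1GHz peak clock
bus_width_bits = 512         # the 512-bit memory interface
gddr5_gbps     = 1.250 * 4   # 1,250MHz command clock -> 5 Gbps effective

tflops    = shader_alus * 2 * clock_ghz / 1000   # ~5.6 tflops
bandwidth = bus_width_bits / 8 * gddr5_gbps      # ~320 GB/s

print(f"R9 290X: {tflops:.1f} tflops, {bandwidth:.0f} GB/s")
# Doubling both roughly reproduces the 295 X2's row: ~11.3 tflops, 640 GB/s.
```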

There is a cost for achieving those large numbers, though. The 295 X2’s peak power rating is a jaw-dropping 500W. That’s quite a bit higher than some of our previous champs, such as the GeForce GTX 690 at 300W and the Radeon HD 7990 at 375W. Making this thing work without a new approach to cooling wasn’t gonna be practical.

Exotic cooling, steep requirements

AMD has gone out of its way to make sure the R9 295 X2 looks and feels like a top-of-the-line product. Gone are the shiny plastics of the Radeon HD 7990, replaced by stately and industrial metal finishes, from the aluminum cooling shroud up front to the black metal plate covering the back side of the card.

That’s not to say that the 295 X2 isn’t any fun. The bling is just elsewhere, in the form of illumination on the “Radeon” logo atop the shroud. Another set of LEDs makes the central cooling fan glow Radeon red.


Oooh.

I hope you’re taken by that glow—I know I kind of am—because it’s one of the little extras that completes the package. And this package is not cheap. The suggested price on this puppy is $1499.99 (or, in Europe, €1099 plus VAT). I believe that’s a new high-water mark for a consumer graphics card, although it ain’t the three frigging grand Nvidia intends to charge for its upcoming Titan Z with dual GK110b chips. And I believe the 295 X2’s double-precision math capabilities are fully enabled at one-quarter the single-precision rate, or roughly 2.8 teraflops. That makes the 295 X2 a veritable bargain by comparison, right?

Well, whatever the case, AMD expects the R9 295 X2 to hit online retailers during the week of April 21, and I wouldn’t be shocked to see them sell out shortly thereafter. You’ll have to decide for yourself whether the 295 X2’s glowy lights, water cooling, and other accoutrements are worth well more than the $1200 you’d put down for a couple of R9 290X cards lashed together in a CrossFire config.

You know, some things about this card—its all-metal shroud, illuminated logo, secret agent-themed launch, metal briefcase enclosure, and exploration of new price territory—seem strangely familiar. Perhaps that’s because the GeForce GTX 690 was the first video card to debut an all-metal shroud and an illuminated logo; it was launched with a zombie apocalypse theme, came in a wooden crate with prybar, and was the first consumer graphics card to hit the $1K mark. Not that there’s anything wrong with that. The GTX 690’s playbook is a fine one to emulate. Just noticing.


The Radeon HD 7990 (left) and R9 295 X2 (right)

Assuming the R9 295 X2 fits into your budget, you may have to make some lifestyle changes in order to accommodate it. The card is 12″ long, like the Radeon HD 7990 before it, but it also requires a mounting point for the 120-mm radiator/fan combo that sits above the board itself. Together, the radiator and fan are 25 mm deep. If you’re the kind of dude who pairs up two 295 X2s, AMD recommends leaving a one-slot gap between the two cards, so that airflow to that central cooling fan isn’t occluded. I suspect you’d also want to leave that space open in a single-card config rather than, say, nestling a big sound card right up next to that fan.

More urgently, your system’s power supply must be able to provide a combined 50 amps across the card’s two eight-pin PCIe power inputs. That wasn’t a problem for the Corsair AX850 PSU in our GPU test rig, thanks to its single-rail design. Figuring out whether a multi-rail PSU offers enough amperage on the relevant 12V rails may require some careful reading, though.
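If you're wondering where the 50-amp figure comes from, it's essentially the card's 500W rating divided by the 12V rail voltage, plus a chunk of headroom. A purely illustrative bit of arithmetic, not AMD's official derivation:

```python
# Illustrative arithmetic only, not AMD's published derivation of the spec.
board_power_w = 500      # the 295 X2's peak board power rating
rail_voltage  = 12.0     # PCIe power connectors feed from the 12V rail(s)

min_amps = board_power_w / rail_voltage   # ~41.7 A just to feed the card
headroom = 50 - min_amps                  # ~8.3 A of margin in the 50 A spec
print(f"{min_amps:.1f} A minimum, {headroom:.1f} A of headroom")
```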

Now for a whole mess of issues

The Radeon R9 295 X2 is a multi-GPU graphics solution, and that very fact has triggered a whole mess of issues with a really complicated backstory. The short version is that AMD has something of a checkered past when it comes to multi-GPU solutions. The last time the company debuted a new dual-GPU graphics card, the Radeon HD 7990, it resulted in one of the most epic reviews we’ve ever produced, as we pretty much conclusively demonstrated that adding a second GPU didn’t make gameplay anywhere near twice as smooth as a single GPU. AMD has since added a frame-pacing algorithm to its drivers in order to address that problem, with good results. However, that fix didn’t apply to Eyefinity multi-display configs or even to a single 4K panel. (The best current 4K panels use two “tiles” and are logically treated as dual displays.)

A partial fix for 4K came later, with the introduction of the Radeon R9 290X and the Hawaii GPU, in the form of a new data-transfer mechanism for CrossFire known as XDMA. Later still, AMD released a driver with updated frame pacing for older GPUs, like the Tahiti chip aboard the Radeon R9 280X and the HD 7990.

And, shamefully, we haven’t yet tested either XDMA CrossFire or the CrossFire + 4K/Eyefinity fix for older GPUs. I’ve been unusually preoccupied with other things, but that’s still borderline scandalous and sad. AMD may well have fixed its well-documented CrossFire issues with 4K and multiple displays, and son, testing needs to be done.

Happily, the R9 295 X2 review seemed like the perfect opportunity to spend some quality time vetting the performance of AMD’s current CrossFire solutions with 4K panels. After all, AMD emphasized repeatedly in its presentations that the 295 X2 is built for 4K gaming. What better excuse to go all out?


Source: AMD.

So I tried. Doing this test properly means using FCAT to measure how individual frames of in-game animation are delivered to a 4K panel. Our FCAT setup isn’t truly 4K capable, but we’re able to capture one of the two tiles on a 4K monitor, at a resolution of 1920×2160, and analyze performance that way. It’s a bit of a hack, but it should work.

Emphasis on should. Trouble is, I just haven’t been able to get entirely reliable results. It works for GeForces, but the images coming in over HDMI-to-DVI-to-splitter-to-capture-card from the Radeons have some visual corruption in them that makes frame counting difficult. After burning a big chunk of last week trying to make it work by swapping in shorter and higher-quality DVI cables, I had to bail on FCAT testing and fall back on the software-based Fraps tool in order to get reliable results. I will test XDMA CrossFire and the like with multiple monitors using FCAT soon. Just not today.

Fraps captures frame times relatively early in the frame production process, when frames are presented as final to Direct3D, so it can’t show us exactly when those frames reach the screen. As we’ve often noted, though, there is no single place where we can sample to get a perfect picture of frame timing. The frame pacing and metering methods used in multi-GPU solutions may provide regular, even frame delivery to the monitor, but as a result, the animation timing of those frames may not match their display times. Animation timing is perhaps better reflected in the Fraps numbers—depending on how the game engine tracks time internally, which varies from game to game.

This stuff is really complicated, folks.

Fortunately, although Fraps may not capture all the nuances of multi-GPU microstuttering and its mitigation, it is a fine tool for basic performance testing—and there are plenty of performance challenges for 4K gaming even without considering frame delivery to the display. I think that’ll be clear very soon.

One more note: I’ve run our Fraps results through a three-frame low-pass filter in order to compensate for the effects of the three-frame Direct3D submission queue used by most games. This filter eliminates the “heartbeat” pattern of high-and-then-low frame times sometimes seen in Fraps results that doesn’t translate into perceptible hitches in the animation. We’ve found that filtered Fraps data corresponds much more closely to the frame display times from FCAT. Interestingly, even with the filter, the distinctive every-other-frame pattern of multi-GPU microstuttering is evident in some of our Fraps results.
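For reference, here's a minimal sketch of the sort of filter I mean, assuming a simple three-frame moving average; the filter actually used for our results may differ in its details.

```python
def low_pass(frame_times_ms, window=3):
    """Smooth Fraps frame times with a simple moving average.

    A minimal sketch of a three-frame low-pass filter: each frame time is
    averaged with its neighbors, which damps the alternating high/low
    "heartbeat" pattern caused by the three-frame Direct3D submission queue.
    """
    smoothed = []
    n = len(frame_times_ms)
    for i in range(n):
        lo = max(0, i - window // 2)
        hi = min(n, i + window // 2 + 1)
        smoothed.append(sum(frame_times_ms[lo:hi]) / (hi - lo))
    return smoothed

# A synthetic heartbeat pattern: the 8/25 ms swings are damped considerably.
print(low_pass([8.3, 25.0, 8.3, 25.0, 8.3, 25.0]))
```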

The 4K experience

We’ve had one of the finest 4K displays, the Asus PQ321Q, in Damage Labs for months now, and I’ve been tracking the progress of 4K support in Windows, in games, and in graphics drivers periodically during that time. This is our first formal look at a product geared specifically for 4K gaming, so I thought I’d offer some impressions of the overall experience. Besides, I think picking up a $3000 4K monitor ought to be a prerequisite for dropping $1500 on the Radeon R9 295 X2, so the 4K experience is very much a part of the overall picture.

The first thing that should be said is that this 31.5″ Asus panel with a 3840×2160 pixel grid is a thing of beauty, almost certainly the finest display I’ve ever laid eyes upon. The color reproduction, the uniformity, the incredible pixel density, the really-good-for-an-LCD black levels—practically everything about it is amazing and wondrous. The potential for productivity work, video consumption, or simply surfing the web is ample and undeniable. To see it is to want it.

The second thing to be said is that—although Microsoft has made progress and the situation isn’t bad under Windows 8.1 when you’re dealing with the file explorer, desktop, or Internet Explorer—the 4K support in Windows programs generally is still awful. That matters because you will want to use high-PPI settings and to have text sizes scaled up to match this display. Reading five-point text is not a good option. Right now, most applications do scale up their text size in response to the high-PPI control panel settings, but the text looks blurry. Frustrating, given everything, but usable.

The bigger issues have to do with the fact that today’s best 4K displays, those that support 60Hz refresh rates, usually present themselves to the PC as two “tiles” or separate logical displays. They do so because, when they were built, there wasn’t a display scaler ASIC capable of handling the full 4K resolution. The Asus PQ321Q can be connected via dual HDMI inputs or a single DisplayPort connector. In the case of DisplayPort, the monitor uses multi-stream transport mode to essentially act as two daisy-chained displays. You can imagine how this reality affects things like BIOS screens, utilities that run in pre-boot environments, and in-game menus the first time you run a game. Sometimes, everything is squished up on half of the display. Other times, the image is both squished and cloned on both halves. Occasionally, the display just goes black, and you’re stuck holding down the power button in an attempt to start over.

AMD and Nvidia have done good work making sure their drivers detect the most popular dual-tile 4K monitors and auto-configure them as a single large surface in Windows. Asus has issued multiple firmware updates for this monitor that seem to have helped matters, too. Still, it often seems like the tiling issues have moved around over time rather than being on a clear trajectory of overall improvement.

Here’s an example from Tomb Raider on the R9 295 X2. I had hoped to use this game for testing in this review, but the display goes off-center at 3840×2160. I can’t seem to make it recover, even by nuking the registry keys that govern its settings and starting over from scratch. Thus, Lara is offset to the left of the screen while playing, and many of the in-game menus are completely inaccessible.

AMD suggested specifying the aspect ratio for this game manually to work around this problem, but doing so gave me an entire game world that was twice as tall as it should have been for its width. Now, I’m not saying that’s not interesting and maybe an effective substitute for some of your less powerful recreational drugs, because wow. But it’s not great for real gaming.

Another problem that affects both AMD and Nvidia is a shortage of available resolutions. Any PC gamer worth his salt knows what to do when a game doesn’t quite run well enough at the given resolution, especially if you have really high pixel densities at your command: just pop down to a lower res and let the video card or monitor scale things up to fill the screen. Dropping to 2560×1440 or 1920×1080 would seem like an obvious strategy with a display like this one. Yet too often, you’re either stuck with 3840×2160 or bust. The video drivers from AMD and Nvidia don’t consistently expose even these two obvious resolutions that are subsets of 3840×2160 or anything else remotely close. I’m not sure whether this issue will be worked out in the context of these dual-tile displays or not. Seems like they’ve been around quite a while already without the right thing happening. We may have to wait until the displays themselves get better scaler ASICs.

There’s also some intermittent sluggishness in using a 4K system, even with the very fastest PC hardware. You’ll occasionally see cases of obvious slowness, where screen redraws are laborious for things like in-game menus. Such slowdowns have been all but banished at 2560×1600 and below these days, so it’s a surprise to see them returning in 4K. I’ve also encountered some apparent mouse precision issues in game options menus and while sniping in first-person shooters, although such things are hard to separate precisely from poor graphics performance.

In case I haven’t yet whinged enough about one of the coolest technologies of the past few years, let me add a few words about the actual experience of gaming in 4K. I’ve gotta say that I’m not blown away by it, when my comparison is a 27″ 2560×1440 Asus monitor, for several reasons.

For one, game content isn’t always 4K-ready. While trying to get FCAT going, I spent some time with this Asus monitor’s right tile in a weird mode, with only half the vertical resolution active. (Every other scanline was just repeated.) You’d think that would be really annoying, and on the desktop, it’s torture. Fire up a session of Borderlands 2, though, and I could play for hours without noticing the difference, or even being able to detect the split line, between the right and left tiles. Sure, Crysis 3 is a different story, but the reality is that many games won’t benefit much from the increased pixel density. Their textures and models and such just aren’t detailed enough.

Even when games do take advantage, I’m usually not blown away by the difference. During quick action, it’s often difficult to appreciate the additional fidelity packed into each square inch of screen space.

When I do notice the additional sharpness, it’s not always a positive. For example, I often perceive multiple small pixels changing quickly near each other as noise or flicker. The reflections in puddles in BF4 are one example of this phenomenon. I don’t think those shader effects have enough internal sampling, and somehow, that becomes an apparent problem at 4K’s high pixel densities. My sense is that, most of the time, lower pixel densities combined with supersampling (basically, rendering each pixel multiple times at an offset and blending) would probably be more pleasing overall than 4K is today. Of course, as with many things in graphics, there’s no arguing with the fact that 4K plus supersampling would be even better, if that were a choice. In fact, supersampling may prove to be an imperative for high-PPI gaming. 4K practically requires even more GPU power and will soak it up happily. Unfortunately, 4X or 8X supersampling at 4K is not generally feasible right now.
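To make the supersampling idea concrete, here's a toy sketch of an ordered-grid resolve: render at twice the width and height, then average each 2x2 block of sub-samples down to a single output pixel. This is purely illustrative, not how any particular driver implements SSAA.

```python
def downsample_2x2(high_res, width, height):
    """Box-filter a 2x-supersampled image down to its target resolution.

    'high_res' is a row-major list of (r, g, b) tuples rendered at
    (2 * width) x (2 * height); each output pixel is the average of its
    four sub-samples. A toy ordered-grid resolve, nothing more.
    """
    out = []
    src_w = 2 * width
    for y in range(height):
        for x in range(width):
            samples = [high_res[(2 * y + dy) * src_w + (2 * x + dx)]
                       for dy in (0, 1) for dx in (0, 1)]
            out.append(tuple(sum(channel) / 4 for channel in zip(*samples)))
    return out
```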

Don’t get me wrong. When everything works well and animation fluidity isn’t compromised, gaming at 4K can be a magical thing, just like gaming at 2560×1440, only a little nicer. The sharper images are great, and edge aliasing is much reduced at high PPIs.

I’m sure things will improve gradually as 4K monitors become more common, and I’m happy to see the state of the art advancing. High-PPI monitors are killer for productivity. Still, I think some other display technologies, like G-Sync/Freesync-style variable refresh intervals and high-dynamic-range panels, are likely to have a bigger positive impact on gaming. I hope we don’t burn the next few years on cramming in more pixels without improving their speed and quality.

Our testing methods

As ever, we did our best to deliver clean benchmark numbers. Our test systems were configured like so:

| Processor       | Core i7-3820 |
| Motherboard     | Gigabyte X79-UD3 |
| Chipset         | Intel X79 Express |
| Memory size     | 16GB (4 DIMMs) |
| Memory type     | Corsair Vengeance CMZ16GX3M4X1600C9 DDR3 SDRAM at 1600MHz |
| Memory timings  | 9-9-9-24 1T |
| Chipset drivers | INF update 9.2.3.1023, Rapid Storage Technology Enterprise 3.6.0.1093 |
| Audio           | Integrated X79/ALC898 with Realtek 6.0.1.7071 drivers |
| Hard drive      | Kingston HyperX 480GB SATA |
| Power supply    | Corsair AX850 |
| OS              | Windows 8.1 Pro |
|                        | Driver revision    | GPU base core clock (MHz) | GPU boost clock (MHz) | Memory clock (MHz) | Memory size (MB) |
| GeForce GTX 780 Ti     | GeForce 337.50     | 875 | 928  | 1750 | 3072 |
| 2 x GeForce GTX 780 Ti | GeForce 337.50     | 875 | 928  | 1750 | 3072 (x2) |
| Radeon HD 7990         | Catalyst 14.4 beta | 950 | 1000 | 1500 | 3072 |
| XFX Radeon R9 290X     | Catalyst 14.4 beta |     | 1000 | 1250 | 4096 |
| Radeon R9 295 X2       | Catalyst 14.4 beta |     | 1018 | 1250 | 4096 (x2) |

Thanks to Intel, Corsair, Kingston, Gigabyte, and OCZ for helping to outfit our test rigs with some of the finest hardware available. AMD, Nvidia, and the makers of the various products supplied the graphics cards for testing, as well.

Also, our FCAT video capture and analysis rig has some pretty demanding storage requirements. For it, Corsair has provided four 256GB Neutron SSDs, which we’ve assembled into a RAID 0 array for our primary capture storage device. When that array fills up, we copy the captured videos to our RAID 1 array, made up of a pair of 4TB Black hard drives provided by WD.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.

In addition to the games, we used the following test applications:

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Crysis 3


Click on the buttons above to cycle through plots of the frame times from one of our three test runs for each graphics card. You’ll notice that the lines for the multi-GPU solutions like the R9 295 X2 and two GTX 780 Ti cards in SLI are “fuzzier” than those from the single-GPU solutions. That’s an example of multi-GPU micro-stuttering, where the two GPUs are slightly out of sync, so the frame-to-frame intervals tend to vary in an alternating pattern. Click on the buttons below to zoom in and see how that pattern looks up close.


The only really pronounced example of microstuttering in our zoomed-in plots is the GTX 780 Ti SLI config, and it’s not in terrible shape, with the peak frame times remaining under 25 ms or so. The thing is, although we can measure this pattern in Fraps, it’s likely that Nvidia’s frame metering algorithm will smooth out this saw-tooth pattern and ensure more consistent delivery of frames to the display.

Not only does the 295 X2 produce the highest average frame rate, but it backs that up by delivering the lowest rendering times across 99% of the frames in our test sequence, as the 99th percentile frame time indicates.
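For anyone new to the metric, the 99th-percentile frame time is simply the frame time that 99% of the frames in a run come in under. A minimal sketch using the nearest-rank method; the exact interpolation our tools use may differ slightly.

```python
import math

def percentile_99(frame_times_ms):
    """Return the 99th-percentile frame time via the nearest-rank method.

    99% of the frames in the run rendered at least this quickly; dividing
    1000 by this value gives a 99th-percentile FPS figure.
    """
    ordered = sorted(frame_times_ms)
    rank = math.ceil(0.99 * len(ordered)) - 1   # nearest-rank index
    return ordered[rank]
```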

Here’s a broader look at the frame rendering time curve. You can see that the 295 X2 has trouble in the very last less-than-1% of frames. I can tell you where that happens in the test sequence: it’s when my exploding arrow does its thing. We’ve seen frame time spikes on both brands of video cards at this precise spot before. Thing is, if you look at the frame time plots above, Nvidia appears to have reduced the size of that spike recently, perhaps during the work it’s done optimizing this new 337.50 driver.


These “time spent beyond X” graphs are meant to show “badness,” those instances where animation may be less than fluid—or at least less than perfect. The 50-ms threshold is the most notable one, since it corresponds to a 20-FPS average. We figure if you’re not rendering any faster than 20 FPS, even for a moment, then the user is likely to perceive a slowdown. 33 ms correlates to 30 FPS or a 30Hz refresh rate. Go beyond that with vsync on, and you’re into the bad voodoo of quantization slowdowns. And 16.7 ms correlates to 60 FPS, that golden mark that we’d like to achieve (or surpass) for each and every frame.
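Here's roughly how that accumulation works, as a sketch of the general idea rather than the exact code behind our graphs: for each frame slower than the threshold, only the excess time counts toward the total.

```python
def time_beyond(frame_times_ms, threshold_ms):
    """Total milliseconds spent past a frame-time threshold.

    A sketch of the general idea: a 70 ms frame contributes 20 ms against
    the 50 ms threshold, while frames under the threshold contribute nothing.
    """
    return sum(t - threshold_ms for t in frame_times_ms if t > threshold_ms)

# Thresholds discussed above: 50 ms (20 FPS), 33.3 ms (30 FPS), 16.7 ms (60 FPS).
```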

Per our discussion above, the GTX 780 Ti SLI aces this test by never crossing the 50-ms threshold. The R9 295 X2 is close behind—and solidly ahead of a single Hawaii GPU aboard the Radeon R9 290X. That’s the kind of real-world improvement we want out of a multi-GPU solution. This is where I’d normally stop and say we’ll want to verify the proper frame delivery with FCAT, but in this particular case, I’ll skip that step and call it good. Subjectively speaking, Crysis 3 on the 295 X2 at 4K is amazingly fluid and smooth, and this game has the visual fidelity to make you appreciate the additional pixels.

Assassin’s Creed 4 Black Flag


Uh oh. Click through the plots above, and you’ll see occasional frame time spikes from AMD’s multi-GPU solutions, both the HD 7990 and the R9 295 X2. Those same spikes are absent from the plots of the R9 290X and the two GeForce configs. The spikes have a fairly modest impact on the 295 X2’s FPS average, which is still much higher than a single 290X card’s, but they’re reflected more clearly in the latency-sensitive 99th percentile metric.


The 295 X2 is still faster than a single R9 290X overall in Black Flag, but its multi-GPU scaling is marred by those intermittent slowdowns. Meanwhile, the GTX 780 Ti SLI setup never breaches the 33-ms barrier, not even once.

Battlefield 4

Thanks to the hard work put in by Johan Andersson and the BF4 team, this game is now an amazing playground for folks who want to understand performance. I was able to collect performance data from the game engine directly here, without the use of Fraps, and I grabbed much more of it than I can share in the context of this review, including information about the CPU time and GPU time required to render each frame. BF4 supports AMD’s Mantle, where Fraps cannot go, and the game now even includes an FCAT overlay rendering option, so we can measure frame delivery with Mantle.

I’m on board for all of that—and I even tried out two different frame-pacing options BF4 offers for multi-Radeon setups—but I didn’t have time to include it all in this review. In the interests of time, I’ve only included Direct3D results below. Trust me, the differences in performance between D3D and Mantle are slight at 4K resolutions, where the GPU limits performance more than the CPU and API overhead. Also, given the current state of multi-GPU support and frame pacing in BF4, I think Direct3D is unquestionably the best way to play this game on a 295 X2.

Still, we’ll dig into that scrumptious, detailed BF4 performance data before too long. There’s much to be learned.



Check each one of the metrics above, and it’s easy to see the score. The R9 295 X2 is pretty much exemplary here, regardless of which way you choose to measure.

Oddly enough, although its numbers look reasonably decent, the GTX 780 Ti SLI setup struggles, something you can tell by the seat of your pants when playing. My insta-theory was that the cards were perhaps running low on memory. After all, they “only” have 3GB each, and SLI adds some memory overhead. I looked into it by logging memory usage with GPU-Z while playing, and the primary card was using its RAM pretty much to the max. Whether or not that’s the source of the problem is tough to say, though, without further testing.

Batman: Arkham Origins



Well. We’re gliding through the rooftops in this test session, and the game must be constantly loading new portions of the city as we go. You’d never know that when playing on one of the GeForce configs, but there are little hiccups that you can feel all along the path when playing on the Radeons. For whatever reason, this problem is most pronounced on the 295 X2. Thus, the 295 X2 fares poorly in our latency-sensitive performance metrics. This is a consistent and repeatable issue that’s easy to notice subjectively.

Guild Wars 2



Uh oh. Somehow, the oldest game in our roster still doesn’t benefit from the addition of a second GPU. Heck, the single 290X is even a little faster than the X2. Not what I expected to see here, but this is one of the pitfalls of owning a multi-GPU solution. Without the appropriate profile for CrossFire or SLI, many games simply won’t take advantage of additional GPUs.

Call of Duty: Ghosts



Hm. Watch the video above, and you’ll see that the first part of our test session is a scripted sequence that looks as if it’s shown through a camera lens. This little scripted bit starts the level, and I chose to include it because Ghosts has so many fricking checkpoints riddled throughout it, there’s practically no way to test the same area repeatedly unless it’s at the start of a mission. By looking at the frame time plots, you can see that the Radeons really struggle with this portion of the test run—and, once again, the multi-GPU configs suffer the most. During that bit of the test, the 290X outperforms the 295 X2.


Beyond those opening seconds, the 295 X2 doesn’t perform too poorly, although the dual 780 Ti cards are still faster. By then, though, the damage is done.

Thief

I decided to just use Thief‘s built-in automated benchmark, since we can’t measure performance with AMD’s Mantle API using Fraps. Unfortunately, this benchmark is pretty simplistic, with only FPS average and minimum numbers (as well as a maximum, for all that’s worth).

Watch this test run, and you can see that it’s a struggle for most of these graphics cards. Unfortunately, Mantle isn’t any help, even on the single-GPU R9 290X. I had hoped for some gains from Mantle, even if the primary benefits are in CPU-bound scenarios. Doesn’t look like that’s the case.

As you can see, Thief‘s developers haven’t yet added multi-GPU support to their Mantle codepath, so the 295 X2 doesn’t perform at its best with Mantle. With Direct3D, though, the 295 X2 easily leads the pack.

Power consumption

Please note that our “under load” tests aren’t conducted in an absolute peak scenario. Instead, we have the cards running a real game, Crysis 3, in order to show us power draw with a more typical workload.

Yeah, so this is the same test rig in each case; only the graphics card changes. Dropping in the R9 295 X2 raises the total system power consumption at the wall outlet to an even 700W, over 130W higher than with dual GTX 780 Ti cards.

Noise levels and GPU temperatures

The good news here is that, despite its higher power draw and the presence of a water pump and an additional 120-mm fan, the Radeon R9 295 X2 isn’t terribly loud at all. This is progress. A couple of generations ago, the Radeon HD 6990 exceeded 58 dBA in the same basic test conditions. I’m not sure I want to see all future dual-GPU cards come with a radiator appendage hanging off of ’em, but I very much prefer that to 58 dBA of noise.

We couldn’t log the 295 X2’s temperatures directly because GPU-Z doesn’t yet support this card (and you need to log temps while in full-screen mode so both GPUs are busy). However, the card’s default PowerTune limit is 75°C. Given how effective PowerTune is at doing its job, I’d fully expect the 295 X2 to hit 75°C during our tests.

Notice, also, that our R9 290X card stays relatively cool at 71°C. That’s because it’s an XFX card with an excellent aftermarket cooler. The card not only remained below its thermal limit, but also ran consistently at its 1GHz peak clock during our warm-up period and as we took the readings. Using a bigger, beefier cooler, XFX has solved AMD’s problem with variable 290X clock speeds and has erased the performance difference between the 290X’s default and “uber” cooling modes in the process. The performance results for the 290X on the preceding pages reflect that fact.

Conclusions

Let’s sum up our performance results—and factor in price—using our world-famous scatter plots. These overall performance results are a geometric mean of the outcomes on the preceding pages. We left Thief out of the first couple of plots since we tested it differently, but we’ve added it to a third plot to see how it affects things.

As usual, the best values will tend toward the top left of the plot, where performance is high and price is low, while the worst values will gravitate toward the bottom right.
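For reference, the overall number behind each point is a geometric mean of the per-game results, as noted above. A minimal sketch with hypothetical inputs:

```python
import math

def overall_score(per_game_results):
    """Geometric mean of per-game results, as used for the scatter plots.

    A minimal sketch; the exact set of inputs (and the separate handling
    of Thief) is described in the text above.
    """
    logs = [math.log(x) for x in per_game_results]
    return math.exp(sum(logs) / len(logs))

# Hypothetical example: five per-game 99th-percentile FPS results.
print(round(overall_score([55.0, 42.0, 61.0, 38.0, 47.0]), 1))
```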


As you can see, the 295 X2 doesn’t fare well in our latency-sensitive 99th percentile FPS metric (which is just frame times converted to higher-is-better FPS). You’ve seen the reasons why in the test results: frame time spikes in AC4 and Arkham Origins, struggles in a portion of our Call of Duty: Ghosts test session, and negative performance scaling for multi-GPU in Guild Wars 2. These problems push the R9 295 X2 below even a single GeForce GTX 780 Ti in the overall score.

AMD’s multi-GPU struggles aren’t confined to the 295 X2, either. The Radeon HD 7990 is, on paper, substantially more powerful than the R9 290X, but its 99th percentile FPS score is lower than a single 290X card’s.

The 295 X2 does somewhat better if you’re looking at the FPS average, and the addition of Thief makes the Radeons a little more competitive overall. Still, two GTX 780 Ti cards in SLI are substantially faster even in raw FPS terms. And we know that the 295 X2 struggles to produce consistently the sort of gaming experience that its hardware ought to provide.


Source: AMD

I’ve gotta say, I find this outcome incredibly frustrating and disappointing. I believe AMD’s hardware engineers have produced probably the most powerful graphics card we’ve ever seen. The move to water cooling has granted it a massive 500W power envelope, and it has a 1GB-per-GPU advantage in memory capacity over the GeForce GTX 780 Ti SLI setup. Given that we tested exclusively in 4K, where memory size is most likely to be an issue, I fully expected the 295 X2 to assert its dominance. We saw flashes of its potential in Crysis 3 and BF4. Clearly the hardware is capable.

At the end of the day, though, a PC graphics card requires a combination of hardware and software in order to perform well—that’s especially true for a multi-GPU product. Looks to me like the R9 295 X2 has been let down by its software, and by AMD’s apparent (and, if true, bizarre) decision not to optimize for games that don’t wear the Gaming Evolved logo in their opening titles. You know, little franchises like Call of Duty and Assassin’s Creed. It’s possible AMD could fix these problems in time, but one has to ask how long, exactly, owners of the R9 295 X2 should expect to wait for software to unlock the performance of their hardware. Recently, Nvidia has accelerated its practice of having driver updates ready for major games before they launch, after all. That seems like the right way to do it. AMD is evidently a long way from that goal.

I dunno. Here’s hoping that our selection of games and test scenarios somehow just happened to be particularly difficult for the R9 295 X2, for whatever reason. Perhaps we can vary some of the test scenarios next time around and get a markedly better result. There’s certainly more work to be done to verify consistent frame delivery to the display, anyhow. Right now, though, the 295 X2 is difficult to recommend, even to those folks who would happily pony up $1500 for a graphics card.

I occasionally post pictures of expensive graphics cards on Twitter.

Comments closed
    • novv
    • 5 years ago

    This is definitely a great product, a real achievement from the engineering point of view. I see most of the comments saying that this is not worth the asking price. But is a Porsche 911 worth the money? If you can’t afford it, then don’t buy it, but respect the achievement made by the manufacturer. Is there any other single card out there that can compete with this? I don’t think so. Is there another CPU that can compete with the top of the line from Intel? No, but no one is saying that Intel charges too much for their products. I just see, in many comments, the envy generated by a great product.

    • ronch
    • 6 years ago

    I was just reading about 3dfx over at Wikipedia. Amazing how far desktop 3D graphics has evolved since the early days of 3D gaming.

    • Arclight
    • 6 years ago

    It would have been better for AMD to just fine tune the R9 290x and launch a higher clocked, better cooled version. BTW how are stocks doing in the US for the R9 series? Are the cards still overpriced?

      • adam1378
      • 6 years ago

      Paid $430 for the Xfx 290 Double D. Tigerdirect too.

    • deruberhanyok
    • 6 years ago

    “Early the next week, back at home, a metal briefcase was dropped on my doorstep, as the agents had promised. It looked like so:”

    I love these promo cases! I’ve got an NVIDIA one from back in the day when they sent me a 6800GT to review. I can’t bring myself to get rid of it, despite the fact that it’s too small to store much of anything. I imagine this one is a fair bit larger; the 6800GT wasn’t exactly a gigantic card. Good to see promo swag is still fun sometimes. 🙂

    • WaltC
    • 6 years ago

    “I’ve gotta say, I find this outcome incredibly frustrating and disappointing. I believe AMD’s hardware engineers have produced probably the most powerful graphics card we’ve ever seen.”

    Well, then reading this: http://www.hardocp.com/article/2014/04/08/amd_radeon_r9_295x2_video_card_review ...should make you feel a whole lot better and completely restore your faith in AMD....;)

    • CaptTomato
    • 6 years ago

    Another idiotic card, less idiotic than the 3k Titan, but idiotic.

      • Airmantharp
      • 6 years ago

      I think its a very elegant card with the best solution to an old problem we’ve seen so far, but please, don’t let that keep the hate from flowing through you!

        • CaptTomato
        • 6 years ago

        Pardon me, but I’m entitled to like or dislike whatever, in this case, not only will one have to fork over an arm and leg, they’ll still have to suffer XFIRE problems which aren’t going away.
        Are you planning on buying one…..I doubt it as you know it lacks value, unlike a single card solution.

          • Airmantharp
          • 6 years ago

          It has plenty of value, but not for my purposes. I’ll elaborate:

          -It still doesn’t have enough RAM for real 4k, i.e., where the textures and other details are befitting the increased clarity provided by the jump in resolution
          -I’d only want to use two GPUs at most, and I’m not currently seeking to set up a portable system that this solution would facilitate
          -I am really, really looking forward to more premium G-Sync monitors, which AMD has yet to provide a full competing solution for

          So no, I’m not in the market for this card- but I still find it to be quite a good way to approach the problem.

            • CaptTomato
            • 6 years ago

            IOW, you’ve bought into the hype+ the further insanity of the ultra class of GFX cards.

            • Airmantharp
            • 6 years ago

            Video cards are just tools that let me play what I want to play, how I want to play it. Some people don’t mind their 19″ 1024×768 monitors, I do.

            • CaptTomato
            • 6 years ago

            You just finished telling us that it has plenty of value but not for your purposes…….isn’t that because it lacks specs and value, if so, what it’s value, LOL?

            • Airmantharp
            • 6 years ago

            A BMW M3 has value, but not for my purposes.

            How hard is that to understand?

        • Krogoth
        • 6 years ago

        You’ve got to cut back on the green kool-aid. It is making you look very silly.

        Titan Z is a decent value at best if you want to do GPGPU-related stuff, but it is a terrible value for gaming uses. 780 Ti SLI yields the same level of performance for half of the cost.

        Unless space is at a premium, 2x Titan Blacks make more sense at 2/3 of the cost and the same level of performance.

        • ronch
        • 6 years ago

        Elegant, except for the power consumption.

          • Airmantharp
          • 6 years ago

          I can agree on that, though I’ll note that at least it gets the resulting heat out of the case. Not going to help the power bill, though!

    • LoneWolf15
    • 6 years ago

    “Alongside it, on the lower right, is the radiator portion of Project Hydra”

    They better have Hugo Weaving in the advertising campaign.

    HAIL HYDRA!!!

      • NeelyCam
      • 6 years ago

      I kinda like the new TV show Marvel’s Agents Of S.H.I.E.L.D.

        • LoneWolf15
        • 6 years ago

        If you like that, watch “Arrow” in its second season on CW. While Agents of S.H.I.E.L.D. isn’t bad and has gotten better with time, “Arrow” beats it hands down.

    • daviejambo
    • 6 years ago

    Can somebody please explain the issue with AMD drivers as for 3 years I’ve used their cards (and in crossfire). No issues with them

      • Airmantharp
      • 6 years ago

      That’s easy: you’re lucky.

      • moose17145
      • 6 years ago

      Yea I am not seeing it either, and I have been running ATI/AMD cards almost exclusively since the 9800 Pro days (epic card btw), and yea no show stopping issues here. I am currently running a R9 290 that I am extremely pleased with. I have never run SLI or CrossFire though, so that makes me immune to 90+% of any potential problems right there though… seems like many of these “problems” only come into existence when a second GPU is added. What I am curious about is what a few others have asked, is if adding a third or fourth GPU for tri / quad sli / xfire helps with some of the microstutter issues. I have heard a few reports that it does. Granted adding even more GPUs can add to even more issues… but I think it would be fascinating to see the results none the less

      • Bensam123
      • 6 years ago

      They’re niche scenarios. Running crossfire with eyefinity (2+ monitors linked together for one big display) or 4k, which almost no one owns.

      I’ve also not had issues with AMD drivers. TR seems to only report on AMD issues.

        • Airmantharp
        • 6 years ago

        To be fair, AMD has had more issues, and more serious issues, over time. They don’t seem to ever stop.

      • daviejambo
      • 6 years ago

      So nobody can tell me. I’ve got 440 games on steam probably ran them all at least once without an issue ..

        • Airmantharp
        • 6 years ago

        If you didn’t run the newest games as they came out, you probably missed a lot of the issues; hence, you’re lucky :).

          • daviejambo
          • 6 years ago

          I buy games on launch all the time, near enough all AAA games

          Could it be that there is nothing wrong with the drivers ?

            • Ph.D
            • 6 years ago

            I think you have to accept that your personal (positive) experience does not mean much if so many other people do not have the same experience.

            E.g. I am running 2x 6970s in crossfire and things like updating my drivers has been a huge challenge more often than not.

      • daviejambo
      • 6 years ago

      Nobody has actually given me any examples of AMD drivers being rubbish

      Actually, thinking about it, Sleeping Dogs would hang on the loading screen on release day, but I fixed it by installing the beta driver at the time

      Am just not seeing it guys

        • Airmantharp
        • 6 years ago

        I switched back to Nvidia after the crap that was 6900 CFX, so I don’t have a recent example, sorry- but that was hell, and I rarely pick up games on release.

    • Bensam123
    • 6 years ago

    I don’t think most gamers will be buying anything close to 4k anytime soon. It’d definitely be nice if you guys tested something more down to earth, say like high refresh rate monitors (120/144hz). I’m sure 4k is great for watching 4k videos and what not, but completely putting aside the ability to drive it and all the other issues associated with 4k, content in games simply isn’t that high definition and probably wont be for awhile.

    On the other hand pretty much every game in existence can benefit from a high refresh rate monitor.

    “Still, I think some other display technologies, like G-Sync/Freesync-style variable refresh intervals and high-dynamic-range panels, are likely to have a bigger positive impact on gaming. I hope we don’t burn the next few years on cramming in more pixels without improving their speed and quality.”

    Why isn’t testing focused on high refresh rates then instead of high PPI? :l

    It’s curious why GuildWars 2 was left on there, but Thief was taken out when GW2 doesn’t seem to support crossfire, unless it was meant as a ‘shame on you’ to AMD. There are quite a few other games that AMD properly supports in crossfire that could’ve been tested instead.

    • Chastened
    • 6 years ago
    • BIF
    • 6 years ago

    Can we TRY folding with this card? Even just one round of two WUs? Pretty please?

    • NeelyCam
    • 6 years ago

    Sooo… NVidia is winning and AMD losing? Again? That about sums it up?

      • Wild Thing
      • 6 years ago

      Hmm interesting observation.
      I would have thought it was the other way around.
      This new thing looks to be the fastest video card on the market, is faster than GTX780Ti SLI and runs pretty cool and quiet.
      Other than that your post might have some merit.

    • l33t-g4m3r
    • 6 years ago

    “My sense is that, most of the time, lower pixel densities combined with supersampling (basically, rendering each pixel multiple times at an offset and blending) would probably be more pleasing overall than 4K is today.”

    That’s generally my opinion as well. 4k is more problems than it’s worth, not to mention wastefully expensive. The only thing I can think of to fix 4k shimmering would be fxaa, or some variation of it, because there’s no good way to supersample that resolution.

    • UnfriendlyFire
    • 6 years ago

    I should mention that AMD’s drivers for single GPUs are adequate. Especially if you install RadeonPro.

    Crossfire, no experience with it.

    I should also mention that the drivers for Intel’s Sandybridge IGPs are piece of shit. Mine keeps crashing when resuming from sleep and locking up the laptop, and often breaks Waterfox when its built-in PDF reader is running upon resuming. Autodesk? Driver goes into the lalala-land.

    Also, the movie quality on Intel’s IGPs is noticeably lower than when running Nvidia’s or AMD’s GPUs. Especially if it’s trying to render a fuzzy circle in the movie, which turns into blocky stuff.

      • USAFTW
      • 6 years ago

      RadeonPro is great. I wish AMD would buy out the guy and implement its dynamic v-sync into its drivers. Not to mention FPS cap and clock profiling for individual applications.

    • JustAnEngineer
    • 6 years ago

    Did you run ANY benchmarks with a WQHD (2560×1440 or 2560×1600) display?

    • Meadows
    • 6 years ago

    Finally!

    The CPU branch merged with the GPU branch at AMD.
    At last we can get videocards that consume more power for less performance.

    • Billstevens
    • 6 years ago

    To be fair, AMD is performing well in two game tests that probably actually matter, ie Battle Field 4 and Crysis 3. The Cryengine 3 and Frostbite 3 game engines will be used in a number of next gen games so you want your next high end video card to handle these well.

    Arkham origins uses the unreal 3 engine but it may not be representative of other unreal 3 games which honestly aren’t very demanding except for heavy direct x 11 settings.

    The other games… is anyone really buying high end cards worrying about whether they can handle Call of Duty or Assassins Creed? These guys need to stop making their own crap engines and start using one of the high end game engines…

    Unreal 3 engine games looked great and ran silky smooth for almost half a decade, then your computer would get crippled by some one off crap game engine that looked like mud next to the unreal engine.

    Those resources they wasted making a useless game engine could have been spent making their game not suck.

      • Airmantharp
      • 6 years ago

      Pretty much this- most of the games I’m actually interested in will be on Frostbite 3, and BF4 will probably remain the best test case for that engine given the melding of decent graphics, physics, and multiplayer hysterics.

      Only game outside of that niche will likely be the new The Witcher game :).

        • Billstevens
        • 6 years ago

        Yeah, really looking forward to all Bioware games using that engine, they look great! Battle Field 3 Frostbite graphics were probably the last update that impressed me prior to the original Crysis Engine and Unreal 3.

        Haven’t made the transition to BF 4 yet. Trying to hold out on upgrading till the next wave of video card so there is a better chance I can handle these engines well.

        I am looking forward to seeing how initial Unreal 4 games look. My only interest in the Crysis 3 engine is the impending release of StarCitizen.

          • Airmantharp
          • 6 years ago

          BF4 is surprisingly easier to drive than BF3, I’ve found. No one in my community really had an issue with BF4 if they were already playing BF3 :).

    • Chrispy_
    • 6 years ago

    I’ve been dealing with idiots all day so (ironically) please forgive my unforgiving mood:

    • Five Hundred Watts.
    • One 120mm radiator.

    AMD has no idea how cooling works; NO IDEA!! /facepalm.

      • chuckula
      • 6 years ago

      Despite the claims by the M3gatrons of the world that I’m somehow wholly against this part, I think the cooling solution is actually pretty effective. The noise levels that TR recorded were quite reasonable and the chips seem to operate within their designed temperature range.

      Like I said, having to hang the cooler is a little inconvenient, but from a technical standpoint AMD seems to have done a solid job with the cooler.

        • Chrispy_
        • 6 years ago

        Seems to be unoverclockable;
        The throttling limit has been brought down to 75C to reduce leakage from higher temps,

        The 8-pins are delivering 213W each;
        that’s already over-spec and I’m not sure I’d want to push it very far.

          Whilst the noise levels are reasonable, it’s still both the loudest card tested, and the cards chosen for comparison are among the noisiest cards available today. You’d hope that added watercooling would at least make it a little quieter than two 290X or 780Ti cards....

          • Airmantharp
          • 6 years ago

          I’d hope that the third-party guys are cooking up a 140mm or dual-120mm radiator setup to cool their overclocked versions, but as Chucky says, the current implementation seems very effective all things considered.

        • M3gatron
        • 6 years ago

        You are. Don’t flatter yourself.
        The testing methodology of this blog makes the 295x2 look worse than pretty much any
        other tech site’s. You can’t deny that, even if you and others tried. You all failed.

      • Firestarter
      • 6 years ago

      The card is relatively cool and quiet for a 500W monster, I fail to see the problem

      • cynan
      • 6 years ago

      While I agree that 3×120 would probably have been closer to an ideal target for that kind of heat dissipation, particularly with any overclocking in mind, there is the obvious trade off of making this cooling “solution” as compatible as possible. Which means a single fan-sized radiator.

      But yeah. It is just a tad reminiscent of the sub-optimal stock cooling that shipped with Hawaii when it debuted.

        • Lazier_Said
        • 6 years ago

        Compatible? A $1500 card with an external cooler and a 500W power draw has already thrown compatibility out the window.

        Going all the way to water and then making it noisy anyway is doing it wrong.

          • Airmantharp
          • 6 years ago

          You know, the original R9 290x cooler was arguably loud. This thing isn’t.

      • Waco
      • 6 years ago

      Water cooling GPUs and not CPUs allows for MUCH higher water temperatures. Deltas of 10-15 C over room temperature are not out of the ordinary…

      • Srsly_Bro
      • 6 years ago

      m3gatron’s alt account here. ^^ check the IP

    • GeForce6200
    • 6 years ago

    I’m confused by the 500 watt claim that this card can draw. Please correct me if my math is wrong, but two 8-pin PCIE connectors are 150W max each (2x150=300) from the connectors, then through the PCIE bus is 75W for 1.0, 2.0, 2.1. Can PCIE 3.0 allow 150 or 300 watts from the slot itself? Been a while since I have looked at the specs.

      • Airmantharp
      • 6 years ago

      Probably just a safety thing to try to keep people from trying to use borderline junk PSUs.

      • f0d
      • 6 years ago

      the two 8 pin connectors at 150w are a pciesig requirement yes but most power supplies are able to greatly exceed this requirement

      i ran a single 6pin cable to each of my brothers gtx670s (which required 2 of them on each) running sli and he had no troubles at all running it for a couple of years now with a corsair AX760i

      sure the pcie specs say 150w max per 8pin cable, but it has been well known for a while now that quality power supplies can greatly exceed that

      • Machupo
      • 6 years ago

      not to mention the “500W requirement” and then the immediately following “50A [@ 12VDC] requirement”…

      Very surprised that there isn’t a third 8-pin power connector on the card, or that 500W nonsense could be as Airmantharp said, just FUD to preclude crappy PSU blowout. Hopefully, someone who would drop 1.5 kilobux on a GPU would have the sense not to use a Deer PSU, lol.

        • entropy13
        • 6 years ago

        “Very surprised that there isn’t a third 8-pin power connector on the card”

        Over at TPU, someone addressed that by saying they ran out of room: http://www.techpowerup.com/reviews/AMD/R9_295_X2/images/front.jpg

    • ssidbroadcast
    • 6 years ago

    Gee whiz, I sure hope liquid-cooled doesn’t become the standard with future video cards. Remember the good ol’ days when a decent video card like an ATi 9800 Pro only took 1 slot? Double-wide cards are enough. The line must be drawn heeyah!!

      • Airmantharp
      • 6 years ago

      I’m all in for quiet solutions that get all that heat out of the case- especially in situations where using more than one card is preferable due to workload.

    • alientorni
    • 6 years ago

    i have one last thing to say on this article.

    what happened to that glorious nvidia directx driver? is it running on this review? LOL

      • Airmantharp
      • 6 years ago

      Where are all of those glorious Mantle games?

      I mean really, it’s not like Nvidia needed some savior of a driver to begin with. Most of the work here will be on Microsoft’s part.

        • alientorni
        • 6 years ago

        Mantle games are out there and are doing what they were made for: reducing CPU overhead.

          • Airmantharp
          • 6 years ago

          Check, spigzone.

    • Chrispy_
    • 6 years ago

    I’m still reading the review, but it’s just another example of 4K not being ready.

    So many things about 4K are in beta, ugly workarounds, or still unavailable because of paper launches. We don’t have the bandwidth for 4K streaming, there’s no readily available 4K media, and 4K screens are really only 4K panels, because everything from the DSP to the input hardware has to be doubled up and taped together behind the scenes.

    Then, finally, there’s GPU performance. 4K requires four times the fill rate of a 1080p panel, yet even SLI and Crossfire users are having to dial back settings just to run at 5760×1080, which is only three times the resolution, not four.
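
    The pixel math backs that up (simple resolution arithmetic, nothing measured):

```python
# Pixel counts relative to a single 1080p panel.
base = 1920 * 1080
for name, w, h in [("4K (3840x2160)", 3840, 2160), ("Surround (5760x1080)", 5760, 1080)]:
    print(f"{name}: {w * h / base:.0f}x the pixels of 1080p")
# -> 4K is 4x a 1080p panel; triple-wide 5760x1080 is only 3x
```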

      • Airmantharp
      • 6 years ago

      MST blows, and GPUs aren’t regularly shipping with enough RAM, 4GB being the bare minimum for current-gen games.

      But everything else is there. Hell, even cellphones can shoot 4k video, and consumer-level 4k cameras are available from every major manufacturer; even YouTube and Netflix are ready to stream it.

      And on the computing front, when it comes to any sort of gaming or graphics work, we’ve been ready for 4k. All we really need is for the industry to pull their heads out.

        • Concupiscence
        • 6 years ago

        Nominal 4K’s nice, I guess. But what’s the de facto bitrate? Are YouTube and Netflix eagerly working on h.265 support to optimize what bandwidth they have available? If not, what’s the plan? It feels like there’s a huge push to try and duplicate the excitement behind the unveiling of 1080p, but the infrastructure isn’t there to support it. What’s being lashed together as an interim solution until that backbone exists seems like a bunch of penguins fumbling around in the dark, each trying to knit part of a tacky Christmas sweater with no fingers or thumbs.

          • UnfriendlyFire
          • 6 years ago

          And then you have the ISPs.

          “Bob, we need to deliver four times the bandwidth for the 4K streaming.”

          “Implement a 200 Gb data cap and strangle Netflix. No more traffic congestion issues.”

      • puppetworx
      • 6 years ago

      Also textures and shading still aren’t great for 4K as noted in the review. Until 4K screens are mainstream games are unlikely to target that resolution. Early adopters beware.

      • stdRaichu
      • 6 years ago

      Indeed, I was all set to snap up a couple of the new Dell UP2414Q 24″ 4K monitors (for use in photography more than anything else, although I’m also relishing not being able to see the individual pixels any more) until I found out they were still using MST – I don’t think there’s a single 4K monitor out there yet that doesn’t. People have had lots of issues with that part of the monitor (e.g. one half of the screen picks up changes, the other doesn’t, etc.).

      Holding out for monitors that’ll present themselves as a 4k display natively and then we’ll finally see how it stacks up…

    • Milo Burke
    • 6 years ago

    [quote<]AMD may well have fixed its well-documented CrossFire issues with 4K and multiple displays, and [b<]son[/b<], testing needs to be done.[/quote<] Shouldn't that be "soon", not "son"?

      • Firestarter
      • 6 years ago

      son, I am disappoint

        • Vaughn
        • 6 years ago

        Maybe I missed an internet meme.

        But why do people keep posting “I am Disappoint”?

        The word is “disappointed”, is it not?

        Sounds like a high school dropout!

          • Airmantharp
          • 6 years ago

          For your education:

          [url=http://i1.kym-cdn.com/photos/images/newsfeed/000/003/866/nfNeT7YvTozx0cv7ze3mplZpo1_500.gif<]Son, I am disappoint[/url<]

      • ronch
      • 6 years ago

      Purportedly, yes.

      • Phartindust
      • 6 years ago

      Reading comprehension fail.

    • Anovoca
    • 6 years ago

    The radiator design is kinda interesting. From the look of it, to make the water pump properly, it is fed from one GPU to the next before hitting the radiator to be cooled. Depending on which direction the flow goes, wouldn’t the health and longevity of the second processor be significantly worse than the first’s? I would be curious to see the individual GPU temps. The unavoidable nature of this design is that one of the processors is always going to be washed in already-hot liquid.

      • Airmantharp
      • 6 years ago

      The water doesn’t really heat up enough to make a real thermal difference- maybe the second GPU runs ~5°C hotter under load, if that. This is especially true if the flow rate is kept high, so it’s really not a big deal. Custom water-cooling rigs are regularly built with GPUs in series to no ill effect.
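
      A rough back-of-the-envelope shows why (my own assumed numbers, not anything from AMD or the review: roughly 225W of heat dumped per GPU and about 1 L/min of pump flow):

```python
# Coolant temperature rise across the first GPU block in a series loop.
# Assumed inputs (not measured): ~225 W of heat per GPU, ~1 L/min of flow.
HEAT_PER_GPU_W = 225.0     # assumed heat load of one GPU
FLOW_L_PER_MIN = 1.0       # assumed closed-loop pump flow rate
WATER_C = 4186.0           # specific heat of water, J/(kg*K)

mass_flow_kg_s = FLOW_L_PER_MIN / 60.0   # ~1 kg per litre of water
delta_t_c = HEAT_PER_GPU_W / (mass_flow_kg_s * WATER_C)
print(f"Water reaching the second block is ~{delta_t_c:.1f} C warmer")
# -> roughly 3 C, which is why GPUs plumbed in series barely notice
```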

        • Waco
        • 6 years ago

        Even with my dual 4870X2s and a hot Phenom II heavily overclocked my water delta (from output of radiator to output of the second GPU block) was under 1 C with good flow.

        It’s a non-issue.

      • BIF
      • 6 years ago

      All that matters when the water hits the second GPU’s block is that it still has available heat-holding overhead.

      Remember how cooling works: We’re actually transporting heat energy away from the components; not transporting cold (negative energy) to the components.

      To have a separate loop per GPU would double the cost of the cooling solution (2 pumps, 2 radiators, hoses, fittings, clamps, and water), but would not provide “double the cooling”, and this card is already complex enough and expensive enough as it is.

    • Ominence
    • 6 years ago

    As Scott pointed out, the results are a little puzzling to TR themselves… I assume that’s so, anyway. Given the somewhat 1:5 conclusions rating, I’d like to just throw this out there.

    For memory limitation tests and also processing capability:

    1. Use DX as the control API, or OpenGL
    2. Driver settings at default
    3. In-game settings maxed out- no skimping on any detail. However, disable TXAA, FXAA, HBAO, PhysX, etc. on Nvidia, and disable AAA, MLAA, HDAO, TressFX, etc. on AMD
    4. MSAA, SSAA, and SSAO allowed on either, with proprietary-only features disabled, when comparing true generic vanilla-variety apples to apples (think of it as the same type of apples from two different farms with very similar farming techniques but different geographic locations and cultivating environments)
    5. Minimum AA level equivalent to 4xMSAA

    It’s not my place to suggest stuff to the experts who investigated FTV and micro-stutter, but I believe there are significant advantages in mature proprietary implementations that are disadvantageous to the other side… All the same, it’s the gaming environment and experience that matters above anything else, so we do not necessarily need to level the playing field either.

    As an owner of a UHD monitor, I know when my games start to choke, and usually it takes 4xAA to do that with a 3GB frame buffer. 2x is effectively a non-issue with 3GB.

    Hope this was helpful.

      • Damage
      • 6 years ago

      And here I was just gonna compare performance and memory use to a Titan. 🙂

        • Ominence
        • 6 years ago

        Response to the TR noob from da man himself! How cool is that:)

        …Although not exactly reflecting the AA comparison I suggested in my OP… some food for thought below.

        Unigine Valley- not updated since release- so must work?

        4x AA, maxed out at UHD: 15-19 min FPS
        8x AA: 2-4 min FPS

        Benchmarks are usually optimised for, to an extent, in driver updates, and my setup couldn’t keep up on a single screen! So 3GB and possibly even 4GB is not overkill, and frame buffer limits are very real in high-stress scenarios. Those results are repeatable as well- it happens in the rain scenes.
        Top it up with game code that is not optimised and you can end up with a real memory hog that will potentially cripple performance. Efficient design isn’t really a strong point in the PC gaming industry:(

        3960x, 32GB 1866, 4x CFX 7970GHz (CPU/GPU on water)

    • albundy
    • 6 years ago

    Hope the R&D was worth it. Not sure how they plan to sell this very expensive card.

      • Airmantharp
      • 6 years ago

      It’s more about the prestige, wouldn’t you think?

        • Prestige Worldwide
        • 6 years ago

        Worldwide-wide-wide…..

    • Ninjitsu
    • 6 years ago

    So…I guess Mantle is even more pointless, isn’t it? I mean, other than the concept of low CPU overhead…there’s nothing worth the effort of another API.

    Look at how BF4 and Thief launched, and you end up wondering how much of that was because they had to support 6 different code paths (PS3, PS4, Xbone, 360, DX11, Mantle). Thief still looks like it’s buggy and unoptimised.

    What happens in the end? DX11 performance is equal or better at graphically bound settings, and dual GPU works. And Nvidia also does almost as well with DX11.

    I really don’t think AMD should keep pushing Mantle anymore; they’ve already achieved the lower-overhead part of the equation.

    Might as well save the money and resources there and pump them back into the DX and OpenGL drivers. Contribute to DX12 and all, but let MS take the brunt of the costs.

    • bwcbiz
    • 6 years ago

    Somebody in the AMD marketing department skipped a step in that super-secret delivery. Obviously the combination on the briefcase should have been 2-9-5, 2-9-5 (295 X2).

      • Ninjitsu
      • 6 years ago

      Yeah like Nvidia with the crop circle, before Tegra K1 was announced (it had “192” written all over it in Braille, to represent the stream processor count of a Kepler SMX).

    • flip-mode
    • 6 years ago

    God, that sucked.

      • danny e.
      • 6 years ago

      I think it blew.

        • Meadows
        • 6 years ago

        Neither, it’s filled with liquid.

          • Haserath
          • 6 years ago

          So it… Flowed?

            • puppetworx
            • 6 years ago

            It convected.

            • USAFTW
            • 6 years ago

            I think it’s better to say it pumped.

          • cynan
          • 6 years ago

          Can’t any fluid be sucked or blown?

    • USAFTW
    • 6 years ago

    What kind of world are we living in! A $1500 card is a bargain compared to a $3000 competitor which no one has reviewed yet.

      • SomeOtherGeek
      • 6 years ago

      A world that is way out there…

    • USAFTW
    • 6 years ago

    As an AMD fan, I don’t think it’s okay for them to release a card that doesn’t comply with PCI-E power regulations. This card has a peak power consumption (in TechPowerUp’s review) of 646 watts. That, minus 75 watts for the PCI-E slot, is 571 watts. Each 8-pin connector is specified for 150 watts. 571/150 ≈ 3.8.
    So this card needs another two 8-pin connectors.
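
    Taking the TechPowerUp peak at face value, here’s that arithmetic as a quick sketch (their 646W figure plus the usual spec limits; nothing measured by me):

```python
import math

# How many 8-pin connectors a measured peak draw would need to stay in spec.
# The 75 W slot and 150 W per-8-pin limits are the usual spec numbers;
# 646 W is the TechPowerUp peak quoted above.
def eight_pins_needed(peak_w, slot_w=75, per_8pin_w=150):
    return math.ceil((peak_w - slot_w) / per_8pin_w)

print(eight_pins_needed(646))   # -> 4, i.e. two more connectors than the card has
```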

      • sschaem
      • 6 years ago

      Not sure you have the math right.

      First, a CPU uses more power under load than at idle.

      A Core i7-3820 is rated at 130W, and the system seems to use ~70W at idle, with an 80% efficiency power supply.

      So (700W - 70W) × 87% = 548W, and 548W - 130W = 418W.

      150W + 150W + 75W = 375W

      So yeah, above spec…

      Checked; Anand seems to have a very different power usage model.
      95W idle, 683W load, giving ~381W.

      My guess is that the card actually peaks at 375W ± 5%.
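
      Here’s the same arithmetic as a quick sketch, for anyone following along; the 87% efficiency and 130W CPU load are my assumptions, not measured values:

```python
# Rough card-only power estimate from at-the-wall readings.
# psu_efficiency and cpu_load_w are assumptions, not measurements.
def card_power_w(wall_load_w, wall_idle_w, psu_efficiency=0.87, cpu_load_w=130):
    """Subtract idle draw, remove PSU losses, then subtract the CPU's share."""
    dc_delta_w = (wall_load_w - wall_idle_w) * psu_efficiency
    return dc_delta_w - cpu_load_w

print(card_power_w(700, 70))   # ~418 W with the numbers above
print(card_power_w(683, 95))   # ~382 W with the Anand-style numbers
```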

        • USAFTW
        • 6 years ago

        I thought they were taking PSU inefficiencies into account. Either way, it’s above spec.
        Also, TechPowerUp’s numbers are for the GPU only, with the rest of the system excluded. Or maybe that’s just what they claim.

          • sschaem
          • 6 years ago

          For sure, Anand reports power draw at the wall.

          But in any event, I think you pointed out a valid reason for concern.

          The Tech Report and Anand numbers seem to show a possible limit of 375W ± 5%;
          TechPowerUp’s numbers are completely out there and don’t match…

    • jessterman21
    • 6 years ago

    Love the difference between this review and the glowing reviews on other sites. I wonder if the Radeon hardware team absolutely hates the driver team…

    I definitely agree that 4K is mostly pointless – most games just don’t have the textures and models to look any better at a resolution higher than 1200p. Case in point, I play most of my older games at a custom resolution of 2160×1350 (highest my monitor will display), which either makes games look great, or makes flaws more visible. I had to turn Arkham Asylum back down to 1920×1200 for this reason…

    • bfar
    • 6 years ago

    Are AMD drivers really up to scratch?

    Any thoughts from those who have recently owned cards from both the green and red team?
    Older games? Last gen product support? API support? Linux support? Multi gpu profile support? Are they up to speed with the competition?

    For someone who has occasionally and tentatively hovered the mouse cursor over the buy button on an AMD card, it has to be asked.

      • odizzido
      • 6 years ago

      I’ve generally found that ATI is better with old games, as Nvidia cards just seem to stop working properly with older titles. With newer games, however, I’ve had better luck with Nvidia cards. And they both have stupid, annoying bugs they won’t fix.

        • Airmantharp
        • 6 years ago

        You’d think that ATI would only be good for older games since they’ve been gone for like a decade now :).

    • wirerogue
    • 6 years ago

    awesome!!! i can’t wait to try some mining with this card.

    • rogue426
    • 6 years ago

    Damage, Are you keeping this card when testing is done or does it go back to AMD?

    • entropy13
    • 6 years ago

    Apparently it’s 500W in a typical gaming scenario (TechPowerUp, and it’s only the card’s consumption measured there), so that means the 295 X2 accounts for at least 70% of power usage…

    Just to add that W1zzard would probably have a new GPU-Z version out ASAP (although I think there’s a beta already? LOL). He was able to measure load temps of around 60°C. Temps aren’t compared across cards there, but he measured a 68°C load temp for PowerColor’s 290X with the PCS+ cooler, so it’s somewhat comparable with what the TR crew got.

    • Prestige Worldwide
    • 6 years ago

    4K…. irrelevant to 99% of all users

    Then again, so is a card like this or the Titan Z

      • Krogoth
      • 6 years ago

      At least the Titan Z is a beast at GPGPU-related stuff that involves DP, and a decent deal (when you compare it to real Quadros and Teslas).

        • M3gatron
        • 6 years ago

        Nice try.
        Titan Black SLI would do the same job for $1000 less.
        So yeah, the Zombie is useless.

          • Ninjitsu
          • 6 years ago

          Unless you want a single card solution.

            • M3gatron
            • 6 years ago

            And pay a $1000 premium for that??? And get less performance.
            The Titan Black is already overpriced, and a dual-GPU card should cost no more than $2000.
            Nvidia fans keep saying the Zombie is a workstation card but forget about the Titan Black. What is that for??

            • maxxcool
            • 6 years ago

            Too bad the litecoin miners will make this card $1500 🙂 and unavailable to most.

        • Ninjitsu
        • 6 years ago

        Well, GCN isn’t a slouch at OpenCL stuff either…

    • Krogoth
    • 6 years ago

    Not impressed.

    This guy is the total opposite of the GeForce 690: slower than 290X CF, eats more power, requires exotic cooling, and despite having such cooling it is still kinda loud.

    Not even a die-hard ATI fanboy and cryptominer would want this.

      • derFunkenstein
      • 6 years ago

      Actually the die-hard miner is the only person I expect to really want this. 14 top-end AMD GPUs in 7 PCI-express slots. Lots of power required for that, though.

        • Krogoth
        • 6 years ago

        Not really; the up-front cost is too high, and it is cheaper to get two normal 290s for two-thirds of the cost and get nearly the same throughput.

      • Airmantharp
      • 6 years ago

      I’ll have to agree, and then agree to disagree- yes, it lacks the GTX 690’s overall finesse, but for sheer performance relative to the amount of noise it makes, and given that its ‘exotic cooling’ isn’t all that exotic anymore, it makes a whole lot of sense. Particularly if you find yourself a winning lottery ticket and want to put two of them in a system, something you’re just not going to do with R9 290Xs :).

    • kamikaziechameleon
    • 6 years ago

    Here we have the same bottleneck AMD has struggled with for the last 10 years: GPUs with AMAZING hardware and generally solid dollar-to-GPU value, paired with AMD’s primitive driver approach that has never been consistently good. They are too busy producing Mantle to address the fact that they have never gotten a handle on how to reliably and consistently produce the software portion of their products. This raises the question: what is the future of Mantle if they’ve never been able to consistently deliver basic driver support for their products?

    🙁

    At least they’ll always have bitcoin mining.

    • sunaiac
    • 6 years ago

    AC4, Batman, and a DX9 game 😀
    How do you make absolutely sure you can destroy a dual-GPU AMD card? Choose your games right 🙂

      • Ninjitsu
      • 6 years ago

      There’s like, ONE DX9 game in there.

        • sunaiac
        • 6 years ago

        Huge mistake totally invalidating my point corrected sir 🙂

    • chµck
    • 6 years ago

    You win again, Nvidia
    -Someone who wishes AMD was more competitive.

    • ronch
    • 6 years ago

    Waitaminit… how would someone using this thing fit another water cooling solution to cool his/her FX-9590?

      • HisDivineOrder
      • 6 years ago

      Now imagine it:

      FX9590
      Quadfire (2x R9 295 X2).

      I dub thee, The AMD Raging Piledriver of the South Seas.

      • Wild Thing
      • 6 years ago

      That’s a pretty wicked piece of hardware…do want!
      Just had a read of the testing at HardOCP, and they gave it the Editor’s Choice Gold Award!
      It sure does well in BF4 against the GTX780Ti SLI setup.
      Wish it wasn’t so expensive tho 🙁

      Not sure why that says @ronch….I must have clicked some wrong button somewhere..oops

      • DPete27
      • 6 years ago

      Mount the CPU radiator to the top case panel.

      • Prestige Worldwide
      • 6 years ago

      Could easily be mounted in the front or bottom air intakes, or a top exhaust.

        • Airmantharp
        • 6 years ago

        Yup.

        You’d expect that changing out a ~$100 chassis isn’t going to be an issue for someone buying a $1500 GPU, if their current enclosure is so antiquated that it can’t be reconfigured to support this card.

    • killadark
    • 6 years ago

    BF4 does seem to be a memory hog; it uses 3.3GB of the 4GB on my 290 in almost every game I play.
    It has not broken that barrier yet, though; maybe it’s limited, not sure.

      • Ninjitsu
      • 6 years ago

      As strange as it sounds, this may be because of consolitis.

    • ronch
    • 6 years ago

    First it was the 225W FX-9370 and FX-9590. Then this. Look, AMD, someone out there cares about power consumption, you know. Power ain’t free.

    Like I’ve said before, finesse is very important. A 2.0L normally-aspirated Honda engine that puts out 240 hp earns a lot more respect than a 4.0L pushrod V8 engine from, say, GM or Dodge that puts out that same amount of power.

      • HisDivineOrder
      • 6 years ago

      Shhhhhh. The hardcore who don’t give a fig about performance per watt or even the fact their computers are on the verge of catching on fire might hear you.

      And why should that matter to you? Because they have computers that are on the verge of erupting into fireballs that they could throw at you. That’s why.

      • killadark
      • 6 years ago

      It sure is cheap here, at least in Saudi Arabia ;P
      And believe it or not, literally free in Kuwait.

        • ronch
        • 6 years ago

        Oh, but isn’t it hot enough in Saudi Arabia? An FX-9590 rig with one of these things will make your country hotter by 1C.

      • sschaem
      • 6 years ago

      ??? Yes, it will use 25% more power than a dual-card SLI system, but it can also be 10% faster.
      So it’s not crazy on the power front.

      What makes no sense is for the 295 X2 to be priced HIGHER than two 780 Tis…
      Until you start to test OpenCL and more varied GPGPU workloads that leverage integer functions.

        • ronch
        • 6 years ago

        Well, that water cooling kit helps raise prices above the competition.

          • sschaem
          • 6 years ago

          And this also keeps it quieter than a single 780 Ti for gaming.
          But this should only add ~$70 to the production cost.

          The issue, mainly, is that two $499 MSRPs turn into $1500…
          so $500 for one less slot and quieter operation?

            • ronch
            • 6 years ago

            AMD was inspired by the Titan Z.

        • USAFTW
        • 6 years ago

        Well, it doesn’t make sense for Nvidia to launch a dual-GPU Titan with slower frequencies and triple-slot cooling that’s slower than dual Titans and costs $1000 more. Also, for gaming, it costs twice as much as two 780 Tis and performs worse. And being three slots wide means no quad-SLI. So, what’s the merit in that?
        I’m gonna go as far as saying this makes more sense than a Titan, even though it doesn’t make any sense to me. Why not just wait for Maxwell and save yourself from being laughed at?

      • Airmantharp
      • 6 years ago

      Not if said 2.0L engine needs to get to 7000RPM+ to actually use all of that power, when the pushrod V8 has it from 1500RPM.

    • El_MUERkO
    • 6 years ago

    I returned a pair of 290Xs and picked up a pair of Titan Blacks because AMD’s CrossFire implementation was diabolically bad. SLI is better, but not by much.

    • jdaven
    • 6 years ago

    Lol! The reviews at sites like TR, AT and Techpowerup are all coming to different conclusions. Pick your site, pick your game and this is either the worst card ever or the best. What is a consumer to do?

    Oh, that’s right: nothing, because the card costs ONE THOUSAND FIVE HUNDRED DOLLARS!!!

    Can’t wait to see what the $3000 Titan Z card can do for the uber rich. /sarcasm

      • spuppy
      • 6 years ago

      I trust the reviews that:

      -Disclose their testing methodology
      -Show a video of their exact benchmark run
      -Disclose exactly what settings were used for each game
      -Use settings – either resolution or details – that push the new card to its limit
      -Redo benchmarks each time when new drivers are released
      -Use either FRAPS or FCAT or both, using frame time measurements, not just FPS

        • derFunkenstein
        • 6 years ago

        I agree, and I would also add to it that I trust reviews that don’t have vendor-sponsored editorial sections. Anandtech takes a huge credibility hit on every GPU and CPU review they do because they take AMD’s money. And what did they say?

        [quote<]Between the price tag and the unconventional cooler it’s certainly a departure from the norm, but for those buyers who can afford and fit this beastly card, it sets a new and very high standard for just what a dual-GPU should do.[/quote<] Huh...

          • slowriot
          • 6 years ago

          I don’t visit AT so I don’t know their exact situation. But now seems like a time to point out that TR’s Graphics forum section is sponsored by Nvidia.

      • HisDivineOrder
      • 6 years ago

      Well, between those three sites, AT is bought and paid for by AMD.

      That’s not opinion. That’s fact. So I don’t trust a word they say about AMD. They’ve got their special “AMD Section” that was paid for as a pseudo-review, but is mostly an advertising part of the site. Considering that reviews, or anything AMD in fact, fall under the purview of that AMD Red section, I’d suggest not trusting anything they say without verifying it somewhere else.

      Like TR.

        • derFunkenstein
        • 6 years ago

        not sure who minused you, but I fixed it.

        • Ninjitsu
        • 6 years ago

        Tom’s Hardware seems to have come to a similar conclusion as AT, though from what I gather THG is not very popular with you folks.

      • Krogoth
      • 6 years ago

      It is AMD’s attempt to steal the Titan Z’s thunder.

      It is too bad that 2x 780 Ti runs circles around the 295 X2 for the same cost and less power consumption.

        • chuckula
        • 6 years ago

        Due to the Titan-Z’s ludicrous pricing, AMD did succeed on the first point, but you are right that the 780TI in SLI seems like a saner solution… if anything in this price range can be called “sane”.

        • Wild Thing
        • 6 years ago

        Well, it seems not all testers think that…
        [url<]http://www.hardocp.com/images/articles/1396914364xmjh6xHKlw_3_2.gif[/url<] That seems pretty definitive.

      • entropy13
      • 6 years ago

      For TPU, once the score goes below 9.0, it’s a case of “still good but there are better alternate options”.

      • Ninjitsu
      • 6 years ago

      Well, AT’s conclusion is basically:

      For 1440p, go 780 Ti SLI.
      For 4K, go 295 X2.

      Combine it with TR’s result (apart from the observation that 4K gaming just isn’t quite worth it yet), and you get the conclusion that if you really want to spend that much, 780 Ti SLI or Titan Black SLI are much better options.

      • puppetworx
      • 6 years ago

      PCPer gave it a Gold Award. They tested with some different games, included FCAT results and included results for 2560×1440 as well as 4K. Their R9 295 beat SLI 780 Ti in FPS and frame-variance in almost every game.

      PCPer and TR are the two sites I trust most for reviews but on this occasion it seems like the games you choose to test can result in very different conclusions. Since you can’t realistically test every single game the truth lies somewhere between the two conclusions.

    • UnfriendlyFire
    • 6 years ago

    Hopefully this GPU won’t be absurdly priced since the mining craze is starting to end.

    “I think I’ll wait for the 290X’s launch price to drop from $400 down to something more reasonable.”

    *Three months later*

    “$1000… Mother of god…”

    • UnfriendlyFire
    • 6 years ago

    Reminds me of their Volcanic Islands GPUs’ and Kaveri’s launches. Iffy beta drivers for everyone.

    A shame that the hardware engineering was undone by their software.

    EDIT: The 295’s BF4 and Crysis 3 performance looks interesting. But who buys a GPU to play two or three games, other than the occasional folks who buy Nvidia GPUs just for PhysX?

      • Airmantharp
      • 6 years ago

      I bought two GPUs to play BF3- but that was the most demanding title at the time, and that setup ensured that anything else I wanted to play ran quite well, aside from AMD’s drivers.

    • southrncomfortjm
    • 6 years ago

    Always fun to look at the reviews of the high end cards. I just can’t see myself upgrading to anything higher than what is needed for solid 1080p gaming until 40 inch 1600p monitors come out at sub $500 price points.

      • Billstevens
      • 6 years ago

      Oculus Rift and Star Citizen = next required upgrade. Please make it work OR and RSI : )

    • superjawes
    • 6 years ago

    [quote<]Alongside it, on the lower right, is the radiator portion of Project Hydra, a custom liquid-cooling system designed to make sure Vesuvius doesn't turn into [u<]magma.[/u<][/quote<] Found an error in the article. When exposed to atmosphere, it is called "lava" 😉

      • Ninjitsu
      • 6 years ago

      I’m not sure a volcano can “become” magma/lava anyway…

        • superjawes
        • 6 years ago

        Well the rocky bits can. If you defined a volcano as the mountain plus the magma system that created it, then part of it is already magma (+lava if it is presently erupting)..

          • Ninjitsu
          • 6 years ago

          Oh…no I just consider a volcano to be an opening that can serve (or has served) as an outlet for magma…the mountain is at times created by solidified lava from a previous eruption, but then that can’t become lava <i>again</i>, not on the surface at least.

          I mean, the mountain can’t melt, can it?

    • oldDummy
    • 6 years ago

    Nice review, sounds like a royal pita to test.
    Bleeding edge 4K hardware doesn’t appear to be ready for prime time, yet.
    What a relief, these things are expensive.
    Keeping the GTX690 with my Korean 27″ thank you.

      • Krogoth
      • 6 years ago

      I doubt it will be ready for a while. We are getting close to the limits of what silicon can handle on the manufacturing end.

        • Airmantharp
        • 6 years ago

        If these cards were manufactured on Intel 14nm I’d be inclined to agree- but TSMC’s aging and soon to be replaced 28nm process is far from the limits of silicon manufacturing.

        As far as 4k- the only real obstacle is getting fully-compliant DisplayPort connections going in the monitors. More VRAM is needed, but that’s already figured out and is only a marketing decision away from happening.

          • Krogoth
          • 6 years ago

          Economies of scale and the law of diminishing returns are rearing their ugly heads. It is getting harder and harder to cram in more transistors without turning the silicon into a blast furnace, while still making it economically feasible to produce in mass quantities.

            • Airmantharp
            • 6 years ago

            I don’t disagree, I just don’t think that we’re there yet 🙂

    • Jon1984
    • 6 years ago

    Great card, crappy drivers… They should have waited to fine-tune them before launching it. Disappointing.

      • alientorni
      • 6 years ago

      Those drivers aren’t going to look any better if you test games like Assassin’s Creed IV and Batman: Arkham Origins. Both are Nvidia GameWorks games, and they have done everything possible so AMD can’t optimize their drivers for those games.
      That’s aggressive and exclusionary, and it’s one of the reasons I hate Nvidia and their politics. Never inclusive, never one ecosystem. Always pushing their own side.

        • Jon1984
        • 6 years ago

        I have an AMD card, and my drivers are fine for the moment. It’s their CrossFire implementation that needs to be polished, especially if you’re launching a $1500 card.

        Although reviews on other sites say the opposite.

        But I’ll stick with the conclusions from those which I’m pretty sure are not biased 😉

          • daviejambo
          • 6 years ago

          I have had twin 7970s for a while.

          When they first came out, I got a bit of microstutter every now and then.

          Now I never notice it.

          I often wonder what folk go on about when they say things about AMD drivers, as I’ve never had an issue.

          • alientorni
          • 6 years ago

          Guess what: I have an AMD GPU too. And all I can say is that AC4 and Batman AO perform deliberately badly. You can see that those games’ frame times and FPS have nothing to do with what you’re seeing on your screen. Strange behaviours.

          Have you seen the results, or just read the text? Those frame time and FPS problems aren’t just in CF; they’re also there on single GPUs in those particular games. If you think the issue is CrossFire, then TR has washed your brain to the point you can’t even analyse what you see.

            • daviejambo
            • 6 years ago

            I don’t know what you are talking about; both of those games ran fine. AC4 was a bit up and down in the middle of a city with loads of things going on, but I’d imagine it’s like that on Nvidia cards too.

            Batman AO was a solid 60fps at 1440p for me, although I’ve only played it for a couple of hours.

            • Jon1984
            • 6 years ago

            Hum, don’t know about those games, but so far with a single card I have zero problems.

            I can draw my own conclusions though 😉

    • anotherengineer
    • 6 years ago

    Nice review Scott.

    I may have read right over it without noticing, but how was the monitor connected to the card, display port, or DVI?

      • Damage
      • 6 years ago

      DisplayPort. DVI doesn’t cut it for 4K.

        • ClickClick5
        • 6 years ago

        Are there any games that render in a 4K feel? In other words, I can emulate Mario Kart 64 at 4K, and it is running in 4K, but the game will just be a crisp, clean 50×50 texture copy/paste. Are there any games now that have 4K in mind, with textures to match? Metro Last Light has 2048×2048 textures and looks amazing, so when a game comes with 4K textures…goodbye VRAM.

          • Damage
          • 6 years ago

          Look at the games I chose to test (or tried to, like Tomb Raider). Most of them have detailed enough assets to take advantage of 4K. That’s part of why they’re in there. Crysis 3, BF4, CoD Ghosts, and AC4 are particularly good. They really are 4K-ready. Some other recent-ish games like Max Payne 3, Far Cry 3, and AC3 are obviously good candidates, too.

            • alientorni
            • 6 years ago

            Yeah, but we all know that AC4 and Batman AO are GameWorks games whose code has been locked down by Nvidia, so they aren’t an accurate picture of this GPU’s performance. Those games will never perform better, and you know AMD can do nothing about it. It’s all about Nvidia and their aggressive politics. Test more “agnostic” games, and one from each side. Like old times.

            As an aside, Nvidia’s increasingly aggressive politics are only harming PC gaming.

        • anotherengineer
        • 6 years ago

        Thought as much. Thanks.

        I’m surprised you didn’t try a four-way 1080p-panel Eyefinity setup with this guy. Same number of pixels; I wonder if it would have worked better than the single 4K display?

    • M3gatron
    • 6 years ago

    Your testing methodology makes AMD cards look like crap.
    Three games that give a clear-as-day advantage to Nvidia cards.
    I don’t know whether it was intentional or not.

    You should add at least two games to even the field.

      • chuckula
      • 6 years ago

      How DARE TR test every single Mantle-enabled title on the market in this review! AMD doesn’t stand a chance with unfair biased ambush tactics like that!

      How DARE TR attempt to test Tombraider that uses AMD’s TressFX with an AMD video card!!

      Any so-called “reviewer” who has the unmitigated gall to run so-called “tests” on these cards instead of merely repeating the perfectly good marketing material that AMD has generously published should be shot!

        • M3gatron
        • 6 years ago

        AC4 and the Batman games have always clearly favored Nvidia, no matter what AMD did. CF doesn’t work in Guild Wars 2, so that was a useless test; they also used the ugly FXAA, which clearly favors… you got it. And lastly, Call of Duty: Ghosts.
        Two Mantle games, you say?? They don’t amount to much, and Mantle fares well against the “wonder driver” which was supposed to be faster in those exact games.

        Tomb Raider is not broken on AMD, as other reviewers did manage to test the game at 4K with the 295 X2.
        Metro Last Light is also great for testing, as it stretches the cards to the max.
        Grid 2 is also a good, unbiased game, as are Far Cry 3 and BioShock.

          • chuckula
          • 6 years ago

          [quote<]2 Mantle games you say?? [/quote<]
          The mark of the fanboy: there are 2 -- count ’em! -- 2 Mantle titles on the market. TR tested all of them.

          Rational person (who actually has a brain) response: “Gee, AMD should work on getting more Mantle titles out on the market!”

          Kool-Aid drinker response: “TR IS EVIL SCUM FOR IMPLYING THAT MANTLE DOESN’T RUN EVERY GAME IN EXISTENCE!”

            • M3gatron
            • 6 years ago

            That’s all you’ve got??
            It figures.

            Those 2 Mantle games are the most interesting part of this review, as Nvidia’s PR was claiming that their 337 “wonder driver” would make them faster in those exact games, which didn’t happen.

            You didn’t do anything to dismantle my point.
            I gave examples of unbiased games which would give a better overall representation of what the 295 X2 can really do in a system.

            This review was like testing games where CF scales very well and Nvidia scales very badly, and then concluding that CF is much faster, which isn’t true.

            • maxxcool
            • 6 years ago

            337 did make them faster. the single card solution is defeated by mega-super-nvidia by 4 frames in bf4 !!! Amd defenseforce retreat to helisuper carrier!!

            • M3gatron
            • 6 years ago

            CF is still faster, and on other sites the difference is much smaller, like 1-2 FPS, if the 290X isn’t still faster even as a single card.

            • Corion
            • 6 years ago

            [quote<]You didn't do anything to dismantle my point.[/quote<]I dunno, it sure seemed like he dissed Mantle to me.

          • superjawes
          • 6 years ago

          Alright, the AMD Defense Force is reeeeeeally getting old. Every time TR publishes a review that doesn’t show a clear advantage to AMD, someone has to cry about the methodology. “The games favor Nvidia!” So? That doesn’t invalidate the results. The games Scott tested are pretty mainstream, so if the AMD card stumbles, you should blame AMD, not Scott.

            • Billstevens
            • 6 years ago

            I have to agree. You can’t exclude major AAA game releases that people expect a high-end card to crush. So what if Nvidia has close ties with certain games to optimize performance? AMD straight-out paid to have Battlefield 4 optimized for Mantle and their cards. You don’t see Nvidia’s performance sucking because AMD paid for an advantage.

            I haven’t seen numbers, but it seems the root of the problem is that AMD’s graphics driver team is likely much smaller than Nvidia’s, and as such they have to prioritize optimizing for fewer games than Nvidia does. Granted, they seem to have their priorities straight, because single-card optimization looks decent while the less mainstream dual-card setups are what’s suffering.

            If you want to bitch that AMD is getting 100% locked out of optimizing their hardware for a specific game because Nvidia paid off the game maker, then you have to bring some proof, because that is a newsworthy offence, but that is not what is happening here.

            • alientorni
            • 6 years ago

            It’s more than that. I posted about this a long time ago and was downvoted.

            [url<]http://www.extremetech.com/extreme/173511-nvidias-gameworks-program-usurps-power-from-developers-end-users-and-amd[/url<] Maybe that’s because the article is a little biased, but that doesn’t deny the facts behind GameWorks.

            • Billstevens
            • 6 years ago

            The real lesson from all of this, one we all have to learn now, applies to both politics and business: they are all full of sh**. It’s all PR and marketing sorcery. Before I continue, I would also like to state that I skimmed your link and read around a bit just to see what the general feeling about this GameWorks issue has been.

            Nvidia wins a design-in to get their GameWorks libraries into a few AAA games, none of which rated all that well btw, and they are able to make their hardware run a bit better. AMD gets upset, realizing that they would likely have to pay Nvidia to get the same advantage or do a costly workaround, so they instead accept that they got screwed and complain to media outlets that Nvidia is being anti-competitive.

            AMD comes out with Mantle and pays for exclusive optimization rights for AAA titles like BF4 and major upcoming games that use the Frostbite engine, and they try to spin it like they are the angels of openness because they want Nvidia to be able to use Mantle too.

            It’s all PR and marketing. There is a little bit of truth to each side of the story, but in reality both companies are just in heated competition, and they are both trying to weave a narrative that makes you buy their card.

            Everyone has their preference, and we all know these companies are pulling anti-competitive stunts to get ahead, but please don’t reinforce the PR spin that either the losing or winning company is shoving down our throats. TR is very objective in their testing; they included some games which favored AMD and also some which clearly favored Nvidia. If they threw out every game which had a development bias toward one side or the other, we wouldn’t be able to have a review.

            • M3gatron
            • 6 years ago

            […]AMD straight out paid to have Battle Field 4 to be optimized for mantle and their cards[…]

            Yes, but Nvidia also got attention from DICE; their hardware also got specific optimizations for BF4.
            Batman and AC4 use Nvidia’s closed libraries, so there is limited optimization for AMD.

            AC4 uses the same engine AC3 used, and Batman, well, no need to explain. There has always been a big gap between AMD and Nvidia in these games. Don’t tell me AMD did not want to optimize these games and ignored them for years. That is stupid.
            Nvidia doesn’t have to pay companies to cripple AMD, just pay them to use their closed libraries, and that is what they did.

            • Billstevens
            • 6 years ago

            I am not defending Nvidia. But you have to look at the reality, and that’s that all we have to go on here is AMD’s claim that they were treated blatantly unfairly by the developers of the last Batman. That’s the spin they have chosen to give in response to poorer optimization for this game. I am sure there is some grain of truth to what they say, but it is not likely to be the whole truth about why they didn’t get the performance they wanted in the game.

            The takeaway has to be that a review venue like TR cannot take sides and cherry-pick games to use for a review based on the current state of PR battles between two companies. We have no idea what happened behind closed doors with the Batman developers, AMD, and Nvidia. We don’t know what made them choose GameWorks over a more open solution, or why AMD wasn’t able to work around this. All we have are PR statements handed down to us from two competing companies. And I guarantee you that both companies’ statements about that situation are biased and only have as much truth as they are willing to share.

            To call a review biased because it is unwilling to act on the biased statements of a company’s PR machine is just wrong.

            • maxxcool
            • 6 years ago

            Sounds fair to me… That’s business.

            • ClickClick5
            • 6 years ago

            Here is something that has made me laugh for nearly 10 years…

            The Xbox 360 (the “chosen” console of last gen) was powered by a beta Radeon HD 2000 design. Games were made in favor of the console and ported to PC… where Nvidia won?

            The current-gen consoles are BOTH Radeon-powered… and yet… Nvidia wins playing the same port on the PC.

            Yet the Radeon cards (hardware-wise) have an incredible amount of power, and still fail.

            So, this points back to the drivers. 🙁 Come on AMD. Hire at least one more guy.

            • superjawes
            • 6 years ago

            Are you kidding? They don’t have the money to double the engineering staff!

            • nanoflower
            • 6 years ago

            You know, I would laugh if that weren’t so true. They just don’t have the money to put enough people on the drivers (and everything else they are working on), so they will continue limping along. It’s not a bad solution, but anyone buying an AMD card needs to realize that it may have some issues getting the best performance with the latest games. One day I hope that will change and AMD can throw the same sort of engineering talent at their drivers that Nvidia and Intel do, but that day is not today.

            • cphite
            • 6 years ago

            [quote<]Every time TR publishes a review that doesn't show a clear advantage to AMD, someone has to cry about the methodology. "The games favor Nvidia!" So? That doesn't invalidate the results.[/quote<] At some point, can we just come to the conclusion that the games favoring Nvidia is not skewing the results; but rather, that the games favoring Nvidia ARE the results...? I mean seriously, we all know that there are games that favor one over the other... but it seems to me that more often than not, the ones that favor AMD are a lot less noticeable. Of course, it may just seem that way due to the endless whining 😉

            • superjawes
            • 6 years ago

            I am against intentional rigging, back room deals, and generally anything that a business does to exploit the market instead of offering a better product for the consumer, but that’s not what people complain about.

            The complaints essentially amount to “Those games make AMD look bad! Use these games so AMD looks better!”

      • ClickClick5
      • 6 years ago

      Quake II, UT 3, Freelancer, Doom 3, Serious Sam, Need For Speed: Hot Pursuit, and finally to throw in an “older game”, Lemmings.

      May the 4K test begin.

        • chuckula
        • 6 years ago

        YOU FORGOT ZORK!!

        Don’t be eaten by a grue, benchmark Zork in 4K.

        • l33t-g4m3r
        • 6 years ago

        Speaking of classics, I think we need to maintain a list of must play favorites for reference. Benchmark games aren’t always the most fun to play.

      • Neutronbeam
      • 6 years ago

      And those games would be what exactly? If you’re gonna call somebody out at least stand up and be specific about it…so name the games.

        • derFunkenstein
        • 6 years ago

        Diablo II with the resolution hack and SNES9x doing 4K filtering on Final Fantasy III

      • superjawes
      • 6 years ago

      Yes, how dare TR test games that gamers would actually want to play!

      • HisDivineOrder
      • 6 years ago

      And last year when AMD had nearly every benchmark franchise (games often used as benchmarks) in Gaming Evolved, did you scream about the methodology then?

      These things go in cycles. Moreover, it seems like he did try to honestly compare Mantle games, too. It’s just Mantle’s advantages are… well, hard to find and more difficult to discern.

      Hell, he went to the trouble of testing Thief in Mantle even when he had to change everything about how he was testing the game to do it.

        • M3gatron
        • 6 years ago

        So it’s AMD’s fault that they can’t access the game libraries for the AC4 and Batman games, because those libraries are locked, and so can’t do proper optimization??? Does Nvidia face this kind of limitation on Gaming Evolved titles??

        As I said, AC4 and the Batman games are clear examples of biased games that alter the overall results too much; nobody can deny that, nobody.

          • chuckula
          • 6 years ago

          I’m sure you say the EXACT same thing when Nvidia loses in a Mantle-enabled benchmark.

          You sir are a paragon of intellectual honesty and a beacon of fair-minded integrity for the whole world!

            • M3gatron
            • 6 years ago

            Mantle doesn’t have anything to do with Nv, it doesn’t forbid them from optimizing the game.

            Compared to you I am, no doubt.

            • Billstevens
            • 6 years ago

            I’ll go one step further. If Nvidia convinces a game company to use their proprietary libraries to run a game, that’s more or less a design loss for AMD. It’s competition, plain and simple. Is it fair? Probably not. Is it good for gamers? Not really. But it’s the reality of competition.

            AMD would and does do the same crap, so neither of these companies is a saint; they are always looking for an exclusive advantage. AMD, being at a driver disadvantage right now, may talk a mean line about wanting to make the field more open so that everyone can make better-optimized games, but at the end of the day it’s all PR and just another part of their competition.

            We all want AMD to do well, but we can’t sugarcoat the situation. It’s not like the last Batman game was any good anyway, and AMD is making plenty of sales on the game through their exclusive console hardware…

            • nanoflower
            • 6 years ago

            Exactly. AMD would love to see game developers spend all their time optimizing for Mantle and leave Nvidia (and Intel) to suffer along with a poorly optimized DirectX path. That isn’t happening because Mantle isn’t that prevalent a solution (and likely never will be, thanks to DirectX 12), but that sure doesn’t mean AMD wouldn’t love to see it happen. That’s good for their business. GameWorks does the same thing for Nvidia, and I’m sure Intel would do something similar if they ever got into discrete graphics cards.

            • maxxcool
            • 6 years ago

            but with no doubt you are I the same ?

            • jessterman21
            • 6 years ago

            Has anyone called Spigzone on this guy yet?

            [i<][b<]SPIGZONE!!![/b<][/i<]

          • Billstevens
          • 6 years ago

          If AMD can’t access the libraries of a major game release, i.e. the source code, then it means they aren’t willing to pay for it… These things are for sale; they are not top secret. If you have proof that Nvidia paid off a company to deny competitors access to source code, you should link it. And please, no conspiracy-theory blogs…

      • Prestige Worldwide
      • 6 years ago

      ” Check each one of the metrics above, and it’s easy to see the score. The R9 295 X2 is pretty much exemplary here, regardless of which way you choose to measure.

      Oddly enough, although its numbers look reasonably decent, the GTX 780 Ti SLI setup struggles, something you tell by the seat of your pants when playing.”

      Oh yeah, that just reeks of nVidia bias! Nice try, move along now.

      • Krogoth
      • 6 years ago

      Actually, it shows underlying issues with multi-GPU solutions.

      They are extremely dependent on the software to realize their potential. Nvidia simply invests more time and resources in SLI than AMD does in CF, and it shows.

        • superjawes
        • 6 years ago

        Nvidia probably invests more time and resources in EVERYTHING. AMD doesn’t have any money, and hasn’t had any for some time. Until they can give themselves some breathing room, the best they can try is brute force to outperform Nvidia.

          • beck2448
          • 5 years ago

          At the high end I always go Nvidia. The CrossFire debacle illustrated perfectly why: Nvidia has the resources to deliver WORKING software in a timely fashion, while AMD took years to even begin to address the fact that their premium solution actually did NOT WORK. AMD is still having frame-pacing issues and quality problems, e.g. the heating problem with the 290X, where the card could not maintain its advertised speeds. The 780 Ti is outstandingly fast while also being cool and quiet, no liquid needed. Competition is great, but quality is king.

      • sschaem
      • 6 years ago

      Including Guild Wars is a puzzling choice…

      But otherwise, 2 of the 5 games TR tested are not Nvidia “the way it’s meant to be played” games.

      But yeah, it does feel a little like TR focused more on Nvidia-optimized titles.

        • nanoflower
        • 6 years ago

        Not really that much of a surprise. They are trying to provide some continuity with previous benchmark results. Otherwise they have to test everything from scratch and it makes comparing current gen cards with last gen cards a pain because the benchmarks are completely different. I expect by this time next year we will be looking at mostly new benchmark games like Watchdogs or The Division.

      • M3gatron
      • 6 years ago

      Quite pleased with the Nvidia army rage. Go ahead and vote me down; that doesn’t change what I’ve said, it just shows your impotence.

        • maxxcool
        • 6 years ago

        😛 rage ? You have less than 30 replies at this writing.. boring..

        • Waco
        • 6 years ago

        I’m surprised anyone is even paying attention to your comment. Good troll I guess…

          • M3gatron
          • 6 years ago

          Troll?? I stated something obvious that can’t be denied. The Nvidia fan army here tried to deny it, but all of them failed and just wasted their time.

            • Waco
            • 6 years ago

            You’re not stating the obvious at all. You’re attacking the methodology of one of the most respected tech sites in the history of the Internet with invalid logic and outright fantasy.

            So yes, troll. Go play somewhere else. 🙂

            • M3gatron
            • 6 years ago

            Again with these exaggerations.
            What is this site, the GOD of hardware testing or something?

            The conclusion was influenced mostly by the poor selection of games.

            AC4 – SLI is supported at an application level, but CF is only forced through the drivers. So yeah, SLI will be faster no matter what, because it has better support.

            Batman AO – extremely Nvidia-biased; a single 780 Ti is almost as fast as the 295 X2. Thief also uses Unreal Engine 3, but you don’t see that kind of difference there; in fact, DX11 CF is faster than SLI.

            Guild Wars 2 – doesn’t have support for CF, so that was a useless game to test.

            COD – another game that does what GameWorks games do best.

            Am I wrong to point this out?
            Am I mistaken?

            • maxxcool
            • 6 years ago

            lol.. 40 replies.. still no NV rage.. hard fail by ‘oil of the snake’

      • Airmantharp
      • 6 years ago

      Chucky, Max, Click, I just want you guys to know-

      I’m proud of you.

      • Wildchild
      • 6 years ago

      The number of people who took the time to dislike every one of M3gatron’s comments is hilarious.

        • Airmantharp
        • 6 years ago

        He signed up just to diss one of the most respected review houses because he disagrees with the selection of titles used in the test- and then he continued to pour fuel on the fire. He should have seen it coming.

        • Billstevens
        • 6 years ago

        Gold members, so divide his downvotes by 3. Not that many people…

          • nanoflower
          • 6 years ago

          Or not.. I cast my one down vote for a few of his comments. Not so much because I disagreed with them as that they appeared to be completely baseless. I don’t mind someone having a different opinion but be prepared to back it up. I’m not tied to either vendor as my last video card was ATI and my current one is Nvidia and my next will be whatever looks to be the best option for what $$$ I have to spend at the time.

    • chuckula
    • 6 years ago

    [quote<]Here's an example from Tomb Raider on the R9 295 X2. I had hoped to use this game for testing in this review, but the display goes off-center at 3840x2160. I can't seem to make it recover, even by nuking the registry keys that govern its settings and starting over from scratch. Thus, Lara is offset to the left of the screen while playing, and many of the in-game menus are completely inaccessible.[/quote<]
    In a game that has bent over backwards to work with AMD [TressFX anyone?], that sort of result with AMD's premier product that is specifically intended to push performance on 4K displays just isn't cutting it. AMD has had PLENTY of time to get the drivers working, so I'm expecting better at this point.

    The failure of Mantle in Thief is slightly more forgivable, since AMD has made noises that Mantle really isn't working in Crossfire mode quite right yet... but then again, that's an excuse for why it [b<]didn't[/b<] work instead of a positive performance win for a $1,500 card... At least in BF4, Mantle appeared to have close-to-D3D performance, although the frame time issues appeared to be a problem.

    Props overall on the cooling solution, albeit a bulky one, especially if the CPU is already water cooled. Props also on the price when compared to the Zombie-edition $3K Titan Black... although not so props vs. a 780TI in crossfire mode. Basically: AMD has definitely got a better proposition than the Titan-Zombie, but that's not really saying very much.

    [Edit: Thanks for the downthumbs with no rational replies, guys. I'll do a survey, and every downthumb is from an AMD fanboy who secretly agrees with everything I posted but just doesn't want to publicly admit it's true.]

      • sweatshopking
      • 6 years ago

      DUAL GPU CARDS ARE STUPID. BUT DAMN, I WANNA MINE WITH 4 OF THESE!!

      • HisDivineOrder
      • 6 years ago

      The thing is, if I were looking at ANOTHER dual-GPU card from AMD, I’d look at how they treated the last one. Or the one before that. Neither was really treated well or released at even remotely the right time in terms of driver support. Hell, the last card was released in the heat of the Frame Latency Fiasco, which they didn’t set right for months afterward.

      So if you’re having driver problems of ANY kind with the R9 295 X2, you shouldn’t count on them being fixed for months. And by the time they were fixed?

      7990s were being fire-saled with tons of games and lower prices. So it pays to wait for the fixes, because by then prices will be a lot lower. That is, by the time the card is working properly, you can pay less, too.

      • Wild Thing
      • 6 years ago

      After reading many of your posts here, it’s probably not AMD fanboys downvoting you because they secretly agree with your posts and can’t admit it; it’s most likely because you seem to be really obnoxious and unlikeable.
      You do seem obsessed with the voting system here in an almost narcissistic way.
      Maybe you should try to be a pleasant fellow and you’ll get all the upvotes you are chasing.
      (Not quite sure of the end result you seek in accumulating them, tho…)

        • Airmantharp
        • 6 years ago

        Just to add balance to the force, I freely admit that I love me some Chucky. He has intelligent jackass down to an art-form.

          • chuckula
          • 6 years ago

          Being called a jackass. It’s good to see that somebody around here appreciates me!

      • Wildchild
      • 6 years ago

      I’ll be sure to down-vote all of your comments whenever I remember since it seems to bother you so much.

        • chuckula
        • 6 years ago

        And the trap is sprung…

        • Billstevens
        • 6 years ago

        Wait why do you hate Chuckula? This must be carry over from another post war…

      • USAFTW
      • 6 years ago

      Say what? 780Ti in crossfire? So apparently nVidia has decided that crossfire works better than SLI and decided to implement it in its cards.

      • steenss
      • 6 years ago

      The downthumb was just for being an ass… 😉
