An early peek at the Radeon HD 4870 X2

AMD has quite a hit on its hands in the Radeon HD 4850 and 4870, and pretty much everyone already knows it’s developing another product code-named R700, a high-end graphics card based on two 4870s paired together. We have an early engineering sample of this beast, and we can offer a preliminary look at how the card—to be called the Radeon HD 4870 X2—performs. You might be expecting big things, and you might be right. Keep reading to see what we found.

R700 up close

Although this product’s code name, R700, follows a naming convention similar to past high-end Radeon GPUs, it’s not really a new GPU at all. Instead, it’s just two RV770 graphics processors having a party together on one PCB, pretty much like the Radeon HD 3870 X2 was in the last generation. In fact, the new X2 looks an awful lot like the old one at first glance.

The Radeon HD 4870 X2

We don’t have the full scoop on the 4870 X2 at this early date. This is just an engineering sample, not a retail product, and it may be subject to change by the time the boards ship to customers. That’s scheduled to happen some time around the middle of next month, which is soon enough that we don’t expect major changes to the card between now and then.

The board itself is a healthy 10.5″ long, and it sports a couple of PCIe auxiliary power plugs, one 2×4-pin and one 2×3-pin.

Flip it over, and you can see the dual retention brackets for the heatsinks attached to each GPU. Here, if you’re a hopeless geek, you might pause to reflect on the 4870 X2’s likely specs. The RV770 GPUs on our early sample are clocked at 750MHz, the same speed as the regular 4870. As a result, the X2’s 1600 total stream processors have a peak computational rate of 2.4 teraflops. That’s, erm, considerable—beyond the obvious graphics applications, that’s the sort of computing power that may one day enable men to figure out what women want.
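
That 2.4 teraflops figure is just arithmetic, for anyone who wants to check it: each RV770 stream processor can retire a multiply-add (two floating-point operations) per clock. A quick back-of-the-envelope sketch in Python:

    # Back-of-the-envelope peak throughput for the 4870 X2's two RV770s.
    # Assumes each stream processor retires one multiply-add (2 FLOPs) per clock.
    stream_processors = 1600      # 800 per RV770, two GPUs on the card
    core_clock_hz = 750e6         # 750MHz engineering-sample clock
    flops_per_sp_per_clock = 2    # a multiply-add counts as two operations

    peak_tflops = stream_processors * flops_per_sp_per_clock * core_clock_hz / 1e12
    print(f"Peak single-precision rate: {peak_tflops:.1f} TFLOPS")   # -> 2.4 TFLOPS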

Remove the card’s cooling apparatus, and it looks like so:

In the center of the board sits a PLX PCIe switch chip, flanked by a pair of RV770 GPUs. We couldn’t find this particular model of PCIe switch listed on PLX’s website, but when installed, the card shows up as a PCI Express 2.0 x16 device. Each GPU shows up as being PCIe 2.0 x16 capable in AMD’s driver control panel, as well. So our best guess is that we’re looking at a PCI Express Gen2 switch that has 48 total lanes—16 routed to each GPU and 16 connected to the PCIe slot.
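
For what it’s worth, the bandwidth math behind that guess is simple enough. The sketch below assumes standard PCIe 2.0 signaling rates; the 16+16+16 lane split is our inference, not anything AMD or PLX has confirmed.

    # PCI Express 2.0 signals at 5GT/s per lane with 8b/10b encoding,
    # which works out to 500MB/s per lane in each direction.
    lane_rate_bytes = 5e9 * (8 / 10) / 8          # bytes/s per lane, one direction
    x16_link_gb_s = 16 * lane_rate_bytes / 1e9
    print(f"x16 Gen2 link: {x16_link_gb_s:.0f} GB/s per direction")   # -> 8 GB/s

    # Our presumed switch layout: 16 lanes to each RV770, 16 up to the slot.
    lane_split = {"GPU 0": 16, "GPU 1": 16, "PCIe x16 slot": 16}
    print("Total switch lanes:", sum(lane_split.values()))            # -> 48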

Rumor has it the R700 may include a faster GPU-to-GPU CrossFire interconnect in order to help with performance scaling in difficult cases, but we don’t yet have definitive info on how much of a bandwidth boost it may offer.

We do know, however, that the two GPUs on the card don’t share memory. The board has eight Hynix GDDR5 memory chips per graphics processor, four on the front and another four around back. Those chips are 1Gb each, so each GPU has a total of one gigabyte of memory to call its own. Cumulatively, the 4870 X2’s effective memory size is still 1GB, since data must be replicated into each GPU’s memory space.

Although the Hynix chips on our engineering sample are rated for up to 4Gbps operation, on this board, they run at the same 900MHz base clock and 3600MT/s data rate as on the Radeon HD 4870. That ain’t exactly shabby, though. All told, the R700 has an aggregate 512-bit path to memory that theoretically peaks at 230GB/s. To put that into perspective, its likely closest competitor, the GeForce GTX 280, has “only” 142GB/s of peak memory bandwidth.
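
The 230GB/s figure works out the same way, assuming the RV770’s standard 256-bit memory interface on each GPU:

    # Aggregate memory bandwidth across the X2's two GPUs.
    bus_width_bits = 256          # per-RV770 memory interface width
    data_rate = 3600e6            # GDDR5 transfers per second per pin
    gpus = 2

    per_gpu_gb_s = (bus_width_bits / 8) * data_rate / 1e9
    total_gb_s = per_gpu_gb_s * gpus
    print(f"{per_gpu_gb_s:.1f} GB/s per GPU, {total_gb_s:.1f} GB/s for the card")
    # -> 115.2 GB/s per GPU, 230.4 GB/s in aggregate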

With that in mind, the big question about the 4870 X2 is: How does it perform? If you’ve answered “About like two Radeon HD 4870s in a CrossFire setup,” you’re on the right track. Compared to a dual-card config, this puppy has the potential benefit of a faster CrossFire interconnect between the GPUs and twice the effective memory size (single 4870s currently have 512MB), but it has the possible disadvantage of those GPUs having to share PCI Express bandwidth to the rest of the system via that PLX switch. Which, of course, is why we test these things….

Our testing methods

As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and the results were averaged.

Our test systems were configured like so:

Processor: Core 2 Extreme QX9650 3.0GHz (both systems)
System bus: 1333MHz (333MHz quad-pumped)
Motherboard: Gigabyte GA-X38-DQ6 / EVGA nForce 780i SLI
BIOS revision: F9a / P05p
North bridge: X38 MCH / 780i SLI SPP
South bridge: ICH9R / 780i SLI MCP
Chipset drivers: INF update 8.3.1.1009 with Matrix Storage Manager 7.8 / ForceWare 15.17
Memory size: 4GB (4 DIMMs) in both systems
Memory type: 2 x Corsair TWIN2X20488500C5D DDR2 SDRAM at 800MHz
CAS latency (CL): 5
RAS to CAS delay (tRCD): 5
RAS precharge (tRP): 5
Cycle time (tRAS): 18
Command rate: 2T
Audio: Integrated ICH9R/ALC889A with RealTek 6.0.1.5618 drivers / Integrated nForce 780i SLI MCP/ALC885 with RealTek 6.0.1.5618 drivers
Hard drive: WD Caviar SE16 320GB SATA
OS: Windows Vista Ultimate x64 Edition
OS updates: Service Pack 1, DirectX March 2008 update

Where two values appear above, the first applies to the Gigabyte X38 system and the second to the EVGA nForce 780i SLI system.

Graphics cards tested on the Gigabyte X38 system:
Radeon HD 2900 XT 512MB PCIe with Catalyst 8.5 drivers
Asus Radeon HD 3870 512MB PCIe with Catalyst 8.5 drivers
Radeon HD 3870 X2 1GB PCIe with Catalyst 8.5 drivers
Radeon HD 4850 512MB PCIe with Catalyst 8.501.1-080612a-064906E-ATI drivers
Dual Radeon HD 4850 512MB PCIe with Catalyst 8.501.1-080612a-064906E-ATI drivers
Radeon HD 4870 512MB PCIe with Catalyst 8.501.1-080612a-064906E-ATI drivers
Dual Radeon HD 4870 512MB PCIe with Catalyst 8.501.1-080612a-064906E-ATI drivers
Radeon HD 4870 X2 2GB PCIe with Catalyst 8.52-080708a-066515E-ATI drivers
MSI GeForce 8800 GTX 768MB PCIe with ForceWare 175.16 drivers
XFX GeForce 9800 GTX 512MB PCIe with ForceWare 175.16 drivers
XFX GeForce 9800 GTX XXX 512MB PCIe with ForceWare 177.39 drivers
GeForce 9800 GTX+ 512MB PCIe with ForceWare 177.39 drivers
XFX GeForce 9800 GX2 1GB PCIe with ForceWare 175.16 drivers
GeForce GTX 260 896MB PCIe with ForceWare 177.34 drivers
GeForce GTX 280 1GB PCIe with ForceWare 177.34 drivers

Graphics cards tested on the EVGA nForce 780i SLI system:
Dual XFX GeForce 9800 GTX XXX 512MB PCIe with ForceWare 177.39 drivers
Palit GeForce GTX 280 1GB PCIe + XFX GeForce GTX 280 1GB PCIe (SLI) with ForceWare 177.39 drivers

Thanks to Corsair for providing us with memory for our testing. Their quality, service, and support are easily superior to no-name DIMMs.

Our test systems were powered by PC Power & Cooling Silencer 750W power supply units. The Silencer 750W was a runaway Editor’s Choice winner in our epic 11-way power supply roundup, so it seemed like a fitting choice for our test rigs. Thanks to OCZ for providing these units for our use in testing.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.

We used the following versions of our test applications:

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Half-Life 2: Episode Two

We used a custom-recorded timedemo for this game, as well. We tested Episode Two with the in-game image quality options cranked, with 4X AA and 16X anisotropic filtering. HDR lighting and motion blur were both enabled.

The 4870 X2 isn’t quite as fast as a two-card 4870 CrossFire config here, but it’s close, running only a handful of frames per second behind at 2560×1600. Compared to the X2’s intended target, the GeForce GTX 280, it’s not even close. This game scales nicely with multi-GPU setups, and the X2 takes a decisive lead over the GTX 280.

Of course, the GTX 280 also churns out over 65 frames per second at the highest settings tested, so we’re not talking about huge differences in competency when running this game.

Enemy Territory: Quake Wars

We tested this game with 4X antialiasing and 16X anisotropic filtering enabled, along with “high” settings for all of the game’s quality options except “Shader level,” which was set to “Ultra.” We left the diffuse, bump, and specular texture quality settings at their default levels, though. Shadows and smooth foliage were enabled, but soft particles were disabled. Again, we used a custom timedemo recorded for use in this review.

Here’s confirmation of what we saw on the previous page. Once more, the 4870 X2 is just a little bit slower than two 4870 cards in CrossFire, but it’s still quite a bit quicker than a GeForce GTX 280. Again, we’re dealing with a game that scales well with multiple GPUs, and again, the R700 comes out looking very strong.

Crysis

Rather than use a timedemo, I tested Crysis by playing the game and using FRAPS to record frame rates. Because this way of doing things can introduce a lot of variation from one run to the next, I tested each card in five 60-second gameplay sessions.

When playing, I’m on a hillside in the recovery level, having a firefight with six or seven of the bad guys. As before, I’ve tested at two different settings: the game’s “High” quality presets and its “Very high” ones.

These tests involve manual gameplay, so I wouldn’t focus too much on minor performance differences between the cards. For all practical purposes, the 4870 X2 ties with the GeForce GTX 280 using Crysis’ “high quality” presets. With the “very high” presets, the X2 proves to be a little quicker than the GTX 280. In both cases, the X2 performs very similarly to the Radeon HD 4870 CrossFire dual-card config.

Race Driver GRID

I tested this absolutely gorgeous-looking game with FRAPS, as well, and in order to keep things simple, I decided to capture frame rates over a single, longer session as I raced around the track. This approach has the advantage of letting me report second-by-second frame-rate results.
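
If you’re curious how the second-by-second numbers come together, turning a log of per-frame timestamps into per-second frame rates is a simple bucketing exercise. Here’s a minimal sketch in Python; the input format is a stand-in for whatever your capture tool writes out, not FRAPS’ exact file layout.

    # Minimal sketch: convert per-frame timestamps (milliseconds since the start
    # of a run, one entry per rendered frame) into second-by-second frame rates.
    from collections import Counter

    def fps_per_second(timestamps_ms):
        """Count how many frames landed in each whole second of the run."""
        buckets = Counter(int(t // 1000) for t in timestamps_ms)
        return [buckets.get(sec, 0) for sec in range(max(buckets) + 1)]

    # Example: three frames in second 0, two frames in second 1.
    sample = [10.0, 350.5, 900.2, 1200.0, 1800.7]
    print(fps_per_second(sample))   # -> [3, 2]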

Even a single Radeon HD 4870 is faster than the GeForce GTX 280 in this game, and performance scales pretty well from there with the X2, so that it nearly doubles the average frame rate of the GeForce GTX 280.

This game doesn’t always scale well with multiple GPUs, though. The Radeon HD 3870 X2’s results illustrate that. Even with the latest public driver release for the 3870 X2, Catalyst 8.6, the game doesn’t make use of both GPUs. The Radeon HD 4000-series drivers we’re using are newer and have a CrossFire profile for GRID included. This is the issue, of course, with multi-GPU configurations, including the 4870 X2. Newer games will often require driver updates before the card can live up to its potential.

Power consumption

We measured total system power consumption at the wall socket using an Extech power analyzer model 380803. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. The cards were plugged into a motherboard on an open test bench.

The cards were tested under load running Half-Life 2 Episode Two at 2560×1600 resolution, using the same settings we did for performance testing. At AMD’s request, we’ve omitted power consumption results at idle because this early engineering sample doesn’t yet have its power-saving PowerPlay functionality enabled, so its idle power draw isn’t representative of the final products. That’s unfortunate, because we’re curious to see how the 4870 X2 will do at idle, especially when compared to the incredibly low idle power draw of the GeForce GTX 280.

Going with an X2 instead of two separate 4870 cards will save you a little bit on power consumption, but not much. The X2’s power draw is manageable, but here’s one place where going with two smaller chips instead of one larger chip may prove to be a drawback: the GeForce GTX 280 draws over 100W less power under load.

Then again, this is an early sample, and as I’ve said, PowerPlay isn’t yet enabled. It’s possible AMD may find a way to reduce power use, even under load, in the finished product.

Noise levels

We measured noise levels on our test systems, sitting on an open test bench, using an Extech model 407727 digital sound level meter. The meter was mounted on a tripod approximately 12″ from the test system at a height even with the top of the video card. We used the OSHA-standard weighting and speed for these measurements.

You can think of these noise level measurements much like our system power consumption tests, because the entire systems’ noise levels were measured, including the stock Intel cooler we used to cool the CPU. Of course, noise levels will vary greatly in the real world along with the acoustic properties of the PC enclosure used, whether the enclosure provides adequate cooling to avoid a card’s highest fan speeds, placement of the enclosure in the room, and a whole range of other variables. These results should give a reasonably good picture of comparative fan noise, though.

Our early 4870 X2 sample looks to be part of the trend toward relatively loud coolers in newer video cards, at least under load. Given the amount of power the card consumes and the amount of resulting heat its dual-slot cooler must dissipate, these results aren’t much of a surprise. Subjectively, the X2 does seem to be pretty noisy, although to my ears, the dual GTX 280 SLI rig seemed even louder, despite what our sound level meter said.

GPU temperatures

Per your requests, I’ve added GPU temperature readings to our results. I captured these using AMD’s Catalyst Control Center and Nvidia’s nTune Monitor, so we’re basically relying on the cards to report their temperatures properly. In the case of multi-GPU configs, I only got one number out of CCC. I used the highest of the numbers from the Nvidia monitoring app. These temperatures were recorded while running the “rthdribl” demo in a window. Windowed apps only seem to use one GPU, so it’s possible the dual-GPU cards could get hotter with both GPUs in action. Hard to get a temperature reading if you can’t see the monitoring app, though.

All of the Radeon HD 4800-series cards we’ve tested have produced some relatively high GPU temperatures, and this early X2 card is no exception. When we asked AMD about this issue in relation to the 4850 and 4870 cards now shipping, they told us the products are qualified at even higher temperatures (over 100°C) and tuned for low noise levels. In other words, these temperatures are more or less by design and not necessarily a problem.

Conclusions

I’ve given quite a bit of thought to the issues raised by AMD’s new strategy of pairing up two mid-range GPUs to serve as its only high-end product. I’ve already said that, given the choice in a vacuum between one big GPU and two smaller ones, I’d rather have the single, big GPU. Our experience with the Radeon HD 3870 X2 in GRID is but one example of a problem we’ve seen time and again with multi-GPU cards: without a profile in the drivers, you’re usually stuck running on just a single GPU. Infuriatingly, you’re most likely to run into this problem when playing a brand-new game, right when you want that GPU power the most. Ugh.

However, playing with this early sample of the 4870 X2 is a vivid reminder that we don’t make these choices in a vacuum. The reality is that a single Radeon HD 4870 GPU is nearly fast enough to keep pace with the GeForce GTX 280. Even if you’re running a game that lacks a driver profile or simply doesn’t scale well with more than one GPU, the 4870 X2 ought to perform awfully well. And when it does get both GPUs going, as our results show, it’s by far the fastest single video card we’ve ever tested. If this is how AMD rolls, it’s hard to complain.

Of course, now it’s up to AMD to deliver the final product, to price it right, to get its power saving mojo going, and to keep its drivers updated with CrossFire profiles for all of the latest games in a timely fashion. We’d really like to see AMD work more closely with game developers to implement profiles before new games are released—something Nvidia has historically done much better than AMD. One would hope that committing to this dual-GPU path for high-end products would force AMD’s hand a little bit in this regard, but we’ll have to wait and see.

Comments closed
    • Bensam123
    • 11 years ago

    “We’d really like to see AMD work more closely with game developers to implement profiles before new games are released”

    Or develop a method that doesn’t rely on game profiles all together to enable teaming.

    • damtachoa
    • 11 years ago

    Like I said before, these graphic cards can’t play over 60 fps on Crysis at high resolution are still suck. They’re not worth the money. Too much power comsumption.

      • Meadows
      • 11 years ago

      Like everyone else with bigger brains have said before, it’s not the cards that suck, it’s the game.

        • swaaye
        • 11 years ago

        Or you are too excited over this latest round of “exciting newfangled thing”. What’s with making the “unoptimized” call anyway? Did you disassemble the engine yourself and discover the developers’ great incompetence?

        The game is doing stuff that no other game does on the graphics front, and few do on the gameplay front. Do you not see that? Which “optimized” game do you know of that does what this game does? I can think of:

        – Oblivion. Runs arguably worse, considering it stutters very frequently when loading its cells, regardless of your hardware. And Oblivion certainly didn’t do graphics all that well (esp out in the distance).
        – FarCry. Blasting through today’s hardware but ran pretty bad back in its time too. I also wasn’t trying to push it at 1920×1200 back then either.
        – Just Cause. Not all that great. Not really at Far Cry’s level IMO.
        – Test Drive Unlimited. Yeah, it’s true, it’s a big open world first person game. Does it pretty well too, except there is no night at all. And its simulation is obviously limited to racing. One of the best racers out there IMO.
        – GTA3/4. I suppose it fits in with them. Certainly not bug free. Not really my thing though.
        -Morrowind and the older TES games. Hard to compare considering their simplicity overall.
        – Trespasser 🙂 Yeah it sucked, but it was ahead of everyone else in open world gameplay. Did some cool things with physics, actually. And this would be an example of “poor code” because it will still chop out today! This was seriously breakthrough real-time rendering at the time though.

        Hell, I remember Unreal running worse than Crysis, but at 640×480! All we had was Voodoo1 to power that. You needed the best hardware around to make it run really playable.

          • Meadows
          • 11 years ago

          The proof is in the pudding (and the reviews). Crysis is coded badly because it doesn’t scale with a top-end processor, it doesn’t scale with 3 top-end videocards, it doesn’t scale with motherboards, RAM, hard drives or whatever else you want to blame in total desperation.

          Being ahead of its time is one thing, but sucking this deep is another. It should be running fine on today’s cards now – it doesn’t. I don’t know what the game is doing, but it might as well be Trespasser 2.

          • Krogoth
          • 11 years ago

          Just to nitpick, but Unreal 1 at this heyday was actually CPU-limited not GPU-limited. 😉

          PII-300 the fastest x86 chip at the time, was not enough to keep the game happy. Voodoo 1 runs the good ok at 640×480 even 800×600 with a better CPU at the helm. Besides, Voodoo 2 came out around the same time as Unreal and SLI setup of it offer smooth gameplay at 800×600 and almost could do 1024×768.

          • ish718
          • 11 years ago

          Crysis is not optimized.
          Developers were actually rushed by EA to make the deadline. They spent plenty of time developing the engine but when it came to making the game they took short cuts, they are an relatively small company.

      • wingless
      • 11 years ago

      CRYSIS FANBOY!

      These GPUs spank any other game engine on the planet but you would use Crysis as a base to determine if a GPU is worthy? You sir are an asshat. Cry Engine 2 is the last gaming engine anybody should be looking at. It by no means reflects what a GOOD graphics engine should be. These cards have insance performance/watt performance even at such power draw. A GTX280 SLI or 4870X2 setup is a fair amount of future proofing. You most likely wouldn’t have to upgrade your GPU/s until late 2009 or early 2010 to keep up with games. Today’s GPU performance from ATI and Nvidia is simply unheard of at the kinds of resolutions they’re testing at.

      OPEN YOUR EYES AND OPEN YOUR MIND.

    • onlycodered
    • 11 years ago

    Wow. This thing looks sweet. I think I’ve found my next GPU.

    • gooch02000
    • 11 years ago

    I ran this game on my 8800gtx perfectly fine @ 1680×1050 everything on high, plus the extra goodies from dx10 ran in dx9, and averaged 27-33 fps the whole game (except certain spots). It was very playable, and gorgeous.

    I understand the lashing out about the performance scaling, but the game was and is playable on good hardware from the last 1.5 years or so.

    • shaq_mobile
    • 11 years ago

    “That’s, erm, considerable—beyond the obvious graphics applications, that’s the sort of computing power that may one day enable men to figure out what women want.”

    ^_^ i might have to spring for one now.

    • blubje
    • 11 years ago

    “you’re usually stuck running on just a single GPU”

    I wouldn’t say usually, and I definitely don’t think single-GPU is the way of the future. Bad implementation doesn’t imply that the concept was bad. multi-GPU is potentially cheaper (throwing away a single chip is less costly), and driver implementations could potentially bring it up to speed, considering the complete data separation graphics programming languages force at some level (e.g. blocks in nvidia cuda).

    • VILLAIN_xx
    • 11 years ago

    AMD luv’s you at TR.

    lol some one got butt hurt though.

    http://www.guru3d.com/news/amd-messes-up-another-launch–4870-x2/

    • michael_d
    • 11 years ago

    Very disappointing Crysis benchmarks. 2 GPUs, 1600 Stream processors and 2GB of the fastest video memory available, yet it cannot deliver decent frame rates at high resolution with the highest settings. I don’t think that this game is coded “badly”, it is just ahead of its time. We will have to wait for Radeon 5800 or even 6800 to play this game properly.

      • cegras
      • 11 years ago

      No, it’s coded badly. Why? Because it was released almost a year ago and hardware a year from then can’t play it.

      Why do people make such a big fuss about being able to play Crysis on high? It’s got zero replay value. Absolutely zero. I’d rather they benchmark multiplayer games like TF2 or HL2 or w/e.

        • WaltC
        • 11 years ago

        I agree completely. I have the game and find it pretty darn boring. I think the only reason it makes so many hardware reviewer’s lists is because of how poorly the game is coded–the low frame rates make them think that only “the fastest gpus” will ever run this game at an acceptable frame rate. The truth is that the game engine needs a massive rewrite, imo, because it takes the long way around and does things the hard way most of the time–which is precisely why it’s so slow. I’ve got games that look just as splendiferous graphically speaking but which run a whole lot faster at even very high resolutions. We all do–which is what makes the Crysis myth so perplexing.

        But to me the funniest thing of all is how nearly 99% of all the hardware reviewers out there operate under the pretense that Crysis can only be run at 2560 x1536 or 1920×1200 screen resolutions. Heh–wonder what happens to the Crysis frame-rate at 800×600 @ 8x or 16x FSAA, for instance? I wouldn’t know as I’ve never seen this information posted in a hardware review that tests Crysis among other software. I guess I’ll have to try it myself because doing this probably hasn’t occurred to anybody else? Heh…;)

        • no51
        • 11 years ago

        I must be weird for playing through Crysis again, nay, I must be crazy for actually liking it.

          • Space Bags
          • 11 years ago

          You are. The online game play isn’t fun in my opinion. It’s repetitive, there is no diversity to the game play. I get annoyed that I have to play the 32 bit when I have a 64 bit OS because Punkbuster is only coded in 32 bit, and I get annoyed that I get kicked after about two minutes of play because of punkbuster. I also get annoyed that Crytek has basically dropped support for the game and moved on to coding Crysis Warhammer, which I will be sure not to buy.

          • cegras
          • 11 years ago

          I played it, I liked it.

          I won’t play it again, though.

            • yogibbear
            • 11 years ago

            After all the fuss over Crysis when it came out, and my Pentium 4 not being up to the task in the demo, i only just picked up Crysis last week. I’d say i’m about 8 hours in so maybe halfway? :S But I really don’t know what pissed so many people off asides from the inferred bad performance because we can really only run it on all high with no AA at native resolutions. So far i am absolutely loving the gameplay, it just feels really non-linear in the way i get all creative and try to kill people in all sorts of different ways. (And yes i know that this isn’t the definition of non-linear games ala Deus Ex, but i’m referring to the options i have in combat, not game-world affecting choices or whatnot, this is an FPS after all) After perusing the forums and reading all the reviews and deciding crysis wasn’t for me, i’m glad to say that i really was silly to only just pick it up now, it is so far really enjoyable. Plus the 30 bucks off the RRP was also handy. Hope my sale eventually gets tallied in those 1.5 million sales that Yerli is so annoyed about.

            EDIT: Forgot to add i’m playing it on hard. And it is hilariously hard without the cloak. I spam that cloak all the time.

            • Meadows
            • 11 years ago

            There are combat options in many games, but you really have to do that perfectly to amaze players. Let me quote a terrible example: BioShock. You really had an astounding variety when it came to dealing with bad guys, but what did you do instead? You used the lightning bolt/wrench combo. Why? Because the other options were fun to try once, but beyond “once”, they were awful.

            Crysis is the same. You can throw objects at people, throw people at objects, or shoot them. You’ll end up shooting most of the time (usually from a distance, trying to hit a head) but even then you’ll notice that the difficulty of this action is just wrong. Your chances for a headshot range from “terrible” (when aiming down the sight) to “zero” (in all other cases), and people are very reluctant to give in if you shoot anywhere else, even if you empty 2 clips (also because you’re likely to miss 80% of the time because the weapons never shoot where you want them to).

            That’s not so many options if you ask me. I could shoot and shoot in Call of Duty 4 too, but it was a tad more fun sometimes, especially the stealth mission – there was variety, as opposed to Crysis, where there are two sorts of missions: “go here” and “kill these” (sometimes a combination). That’s very sad for a game that’s been in development for so long, and Shacknews says they didn’t really improve with Crysis Warhead either. You’d think the developers of a much hated game would try to improve in the customers’ eyes.

            • nstuff
            • 11 years ago

            For me, I really enjoyed the first half of Crysis. Without giving any spoilers, i enjoyed that, although it was mostly linear, you could take as much or as little time to accomplish each goal as you wanted. You could rush in with guns blazing or sneak in and take out the bad guys silently. The graphics were beautiful, especially in the jungle areas.

            The game, for me, became more mainstream and boring in the second half. Although the change in pace was refreshing, it got old quick. I wanted to go back to how it was in the beginning, but it never did.

            So, I’ve actually played the first half several times, having only beat the whole game once.

            • ish718
            • 11 years ago

            Crysis is shit.

            • Meadows
            • 11 years ago

            Crysis may have sold well (Yerli’s greed is the only thing that’s whining about it), but I doubt anyone will even glance at Warhead when it hits the shelves.

            • no51
            • 11 years ago

            You could have not played it you know…

            • ish718
            • 11 years ago

            Crysis, meh.
            Maybe Crytek will start optimizing their games. Oh yeah thats right, they have no choice to optimize their games now since they plan on going multi platform.

            • A_Pickle
            • 11 years ago

            “Playability” used to be a mandatory part of games. Why is it being touted like a feature in Crysis: Warhead? The more we allow this behavior to go unpunished, we can basically sign off on the demise of the PC.

            • A_Pickle
            • 11 years ago

            Agreed. In Half-Life 2, you actually could use the Gravity Gun as an effective weapon, and on most Valve-made levels, it’s damned effective in Half-Life 2: Deathmatch. Put me in a player-made level and I don’t begin to do nearly as well, because players are bitchy and hate that the Gravity Gun usually results in a one-hit kill.

            But Crysis? You’re absolutely right. You shoot people. Just like you do in CoD 4. Only CoD 4 makes shooting people nice, like a nice three-course meal or something. Rather than a slop of mac and cheese (that would be Unreal Tournament — there’s beauty there too), it’s got all sorts of variety that each player can fine tune and optimize to their liking. Hell, Crysis wouldn’t have even been that bad if they just… didn’t overhype it to death… oh, and included the “playability” feature of the newly-announced Warhead to Crysis…

            • Meadows
            • 11 years ago

            The game would’ve been a huge success if Crysis Warhead would’ve been “Crysis” to begin with, complete with the optimizations and stuff, and two campaigns you could choose from, Nomad’s or Psycho’s, both having the improved AI and weapons and freedom sorts of things – effectively culling issues with short gameplay, shoddy AI, unmerciful performance and others.

            This whole large thing should’ve been the original single game. No wonder people were/are disappointed.

      • madgun
      • 11 years ago

      I would always consider COD4 as the cornerstone of modern gaming. Amazing replay value and great multiplayer.

    • endersdouble
    • 11 years ago

    I’m not sure I buy AMD’s line on this. My new HD 4850 was…seriously unstable…until I hand-edited a profile to force the fan on. Idling at 80 C, as far as I can tell (yes, not a certainty) was causing real problems. Now, it makes a decent but acceptable amount of noise, and idles at 45 C. 🙂

    • tr1kstanc3
    • 11 years ago

    this thing is disgusting

    • swaaye
    • 11 years ago

    Check out that power consumption! How positions have changed for NV and ATI, huh? Performance is great sometimes, but yowsers. This thing can certainly replace a room heater in the home and I believe that could be a beneficial marketing bullet! They should have launched in November for optimum market reception methinks tho. 🙂

    Personally, I’m never ever going to trust a corp to support a uber low volume product like this that relies heavily on custom game profiles to perform. Maybe today it’ll work okay, but in a year or two? This thing has the speed to last that long but by then we’ll have the Next Bestest Thing and they will want you to forget this ever existed. And the truth is it won’t work with everything even today because they don’t make profiles for every single game out there and some games simply do not scale with the way these setups split up the work.

    When these dualie boards are as dependable as a single GPU design, then I’ll consider them viable options. Well maybe not actually because there just aren’t any games on the radar that are demanding enough to require something like this unless you need to push 3kx3k res or some such.

      • Krogoth
      • 11 years ago

      Unfortunately, the trend of going multi-GPU is becoming more economically viable then sticking with tried, proven monolith philosophy. Physics is making it very difficult to keep up double the transistor count without requiring absurd power and thermal requirements.

      That is party why AMD’s (ATI) gamble has pay-off so well and allow them to pull ahead in this round.

        • swaaye
        • 11 years ago

        Yes but they don’t always pull ahead do they now? Performance will be dodgy as with SLI, CF, R680 and GX2. A 100% perfect dual GPU experience is kinda rare, as some owners of them will tell you openly. I read posts from people who went the dualie route and are now on the way back to the single GPU way. The extra software reliance of this hardware is ugly stuff.

        I know you hate Crysis, but it’s also not the only example of a game that doesn’t scale well. A couple of years ago, Everquest 2 was another. I’m not that up on which recent games are problematic though.

        And then there’s the rather amazing power usage that is not any sort of win.

        The value isn’t in these dualie cards anyway. It’s in the new ~$200 midrange that can play everything quite well even at high resolutions. These cards are just mascots for the companies to parade and build up popular mind share among crazy folk. 🙂

        • Meadows
        • 11 years ago

        “Unfortunately”? So you say that something better and cheaper is actually something that’s worse? Sure, there are more delays in software support (though you’d find nVidia’s control panel quite supportive in the matter) but that’s basically it.

          • Krogoth
          • 11 years ago

          Well, because the monolith strategy has proven to work well in the past. I do not imply that it will continue to work without any problems. Physics being the primary one.

          There is a reason why GT2xx family cannot clock high and yields seemed to be pretty low. Huge chips are difficult to manufacture without issues and managing power envelops are tricky these days. Moore’s Observation has already been dead for some time. It was murdered by Prescott and later IBM’s problems with their Power970 chips. To a lesser, extent AMD’s K10 family.

          Multi-GPU path is quite a bit less certain in software department and immature at the moment. It makes more economical sense at this point of time. Fabbing smaller chips is easier and managing power envelops is a bit more simple. GPUs are already extremely parallel by nature. The problem again is adapting the software to change which takes some time.

          Unless, there is some major breakthrough in semiconductors that makes monolith chips very economically practical once again.

            • ChangWang
            • 11 years ago

            Not only did nv have physics working against them, I think they just plainly took the wrong direction with their design. Instead of adding a third shader unit to their SPs, they should have left them in pairs and added 5 more to get the same SP count. OR even just add 3 pairs instead of 5 and add more texturing capability instead of the other 2 pairs.

            Updating their SPs to do DX10.1 would have been nice too, instead of using the same thing we saw in the G80+the small cache

    • StashTheVampede
    • 11 years ago

    Where are the 3-way crossfire benches?

    • Valhalla926
    • 11 years ago

    Does anyone have a guess how this will scale with another 4870 (crossfireX)? If the way it’s implemented in this card is different enough (Don’t remember the specifics), could it limit the possibility for 3-way crossfire solutions?

    • Thresher
    • 11 years ago

    The more I look at these numbers, the better a Crossfired pair of 4850s looks. For under $350, you get a helluva lot of performance for the money.

    I know ASUS came out with a 3850X2, I wonder if there would be a market for a 4850X2.

      • cynan
      • 11 years ago

      “The more I look at these numbers, the better a Crossfired pair of 4850s looks”

      I agree. I have been on the fence about what new vid card setup to go with ever since I was finally spurred toward a new system build upon finding a pair of 4850s on June 20th from BestBuy for $150 each. I think these results are finally compelling enough for me to de-shrink wrap the retail boxes and get to some gaming. The only thing that smarts is the need to pay a bit extra for the X38/48 boards over the P45 chipsets to realize the full CrossFire performance from the dual PCIe x16 slots… Oh well, can’t have everything…

        • ChangWang
        • 11 years ago

        I, too, got in on that bestbuy deal. And while I did wind up buying an X48 board, I still paid less for all that hardware ($150+$150+$225) compared to what a GTX 280 was going for at the time (which was around $700 on newegg).

        And yeah, the performance gains of the P35 to X38/X48 are real! I made a spreadsheet if anyone is interested

      • mesyn191
      • 11 years ago

      Yea I’m eagerly awaiting a 4850X2, the rumour mill says there will be one from several OEM’s out soon but who knows when “soon” is…

    • BoBzeBuilder
    • 11 years ago

    I’m surprised to see all the tested games scaled well for the 4870 X2, given its only an engineering sample and probably using beta drivers.

    • flip-mode
    • 11 years ago

    Red fan in black shroud looks badass.

      • Staypuft
      • 11 years ago

      I second that.

      • vdreadz
      • 11 years ago

      I agree also.

    • thermistor
    • 11 years ago

    Having designed cooling systems for diesel engines for emissions control, ATI has the classic problem on their hands. They can increase coolant (chip) temperature and say it’s A-OK to run at those higher temps and consequently get by with a smaller radiator (heat sink/fan).

    That lowers their cooling envelope size, but also necessitates higher air flow…which increases noise because axial fan tip speed is the main predictor of noise.

    And it will shorten the life of the engine (chip).

    I agree with #39. They should have done something like make a “triple” slot cooler or put on a ton of copper…or something. Nice new chips deserve a better cooling solution.

    • Mavrick88
    • 11 years ago

    These results are also with a set of basically beta drivers because there hasn’t been an “official” set of drivers for the 4800 series, just a “hot fix.” The 4800 series doesn’t even show up in their “find your card” list for drivers. I bet when the drivers come out, these test results will improve even more.

    • Inabil1ty
    • 11 years ago

    So I can use this card with my existing 2-year-old Asus P5W64 WS motherboard, which has only one PCI-Express x16 slot, and it will run almost as fast as if I were to run two 4870s on a new PCI-e 2.0 motherboard? If so, that would save me a lot of dough!

      • Meadows
      • 11 years ago

      Basically. That’s the very point of “dual-GPU-on-a-stick”, to broaden the market.

        • poulpy
        • 11 years ago

        /[

          • Meadows
          • 11 years ago

          You’re forgetting that a lot of people have a single PCI-e slot in their computers, and some fraction of those are willing to fill it with something that kicks.

            • poulpy
            • 11 years ago

            No no haven’t forgotten that but it’s just business as usual IMO with a super high end card that just happens to have 2 GPUs because ATi doesn’t develop massive GPUs anymore. Not worth a massive debate anyway 🙂

    • JustAnEngineer
    • 11 years ago

    Crossfire-X?

    I see the connector on that HD4870X2. Don’t tell me that you weren’t tempted to pair it with one or two of your HD4870 cards for that little extra something.

    “These go to eleven.”

      • Damage
      • 11 years ago

      I tried! The drivers wouldn’t cooperate.

    • deruberhanyok
    • 11 years ago

    I really like the strategy AMD is using with these parts… and I also like that it provides more incentive for them to keep improving crossfire functionality.

    Also, I am completely thrilled with how open AMD is being with these parts. Early reviews on the 4850s, engineering samples of the 4870 X2… they’re even being pretty up-front about the Phenoms. That’s a huge change from this time last year when we barely ever heard a peep out of them.

    Card looks impressive. Way out of my price range, but impressive nonetheless.

    • Fighterpilot
    • 11 years ago

    Pretty good results for an ES card.
    Lucky NVidia have panic slashed the price on GTX280 cause here’s what [H] had to say about it….
    “Holy cow, what else is there to say really? It blows the doors off of the GTX 280.”

    • Meadows
    • 11 years ago

    Like I said last time somewhere, the GTX cards need higher clock speeds – they don’t seem to be memory-bound. Look at the first two games, they lead by considerable margins at the top resolution with antialias. That’s not the resolution that the majority will use though, which is why they need clock speeds to fix the balance.

    However, this review made me face something weird.
    Crysis.
    What _[

      • Corrado
      • 11 years ago

      The game is coded horribly. Thats the bottleneck. It simply can NOT be run at a decent framerate at high resolutions.

        • Krogoth
        • 11 years ago

        No kidding, where Crysis barely looks better then what the Unreal 3 Engine can provide.

        This game is a victim of modern EA syndrome.

          • Meadows
          • 11 years ago

          Have you even seen the two games next to one another? Crysis does look considerably better – especially for still shots. Max it out perfectly in Dx10 mode sometime and walk around the frosty levels, then look at the plants – you’ll see why the game is so slow, right there.

          Aim down the sight towards a frozen leaf, and you’ll see randomized, bump-mapped, distortion-mapped ice stains and several shader effects on each and every leaf – detail that’s not even conceivably visible at a medium resolution (not to mention nobody gives a remote damn, since people are looking for weapons and aliens). While that’s appreciated for computer-animated cinema shows, it’s complete stupidity in a game today, especially in the way it was coded.
          I’m willing to bet Crysis uses more effects and shaders altogether than the number of days you’ve been alive – but clearly in vain.

          Unreal Tournament 3 has much less detail, and there are lots of cut corners in it too, but that’s exactly why it runs a lot better (I don’t why it hogs the CPU under Vista, ask Epic – but you need at least 3 cores for good gameplay under Vista because of this shitty game issue).
          One place where they were cutting corners is the water. It looks terribad. Sure, it looks great in the heat of action while you’re trying to hit things, but if you ever stop for a moment to look at it, you’ll see how lame it actually is. And why is it invariably moving all the time? Even small amounts of water in truck wheel trails seems to “move” at the same speed rivers do – that’s just not buena.

          It’s clearly apples to oranges – don’t compare Crysis to UT3. Crysis uses heavy LOD stuff in an attempt to help make itself run, while UT3 uses half the effects and somewhat lower detail overall (colours are a different thing, but it’s also ugly as hell with all those washed-out, boring, vomit-inducingly ugly palettes). Not really comparable.

          • swaaye
          • 11 years ago

          Krogoth, you have no idea what you’re talking about. UE3 is nice, but Crysis is in a league of its own for graphics technologies. UE3 runs well for a number of reasons and one of those is definitely its relatively simple graphics compared to Crysis. There is no UE3 game that I’ve played that has any area with anything like the level of detail that most of Crysis displays.

          I am also amazed that some folks miss the little details, beyond the obvious. It’s not just the graphics, but the environment simulation, physical simulation, destructible objects, AI, etc. The game pulls things together that I’ve been waiting to see for years.

            • Krogoth
            • 11 years ago

            Then why the said engine has poor scalability on modern systems? Not even multi-GPU solutions put that much of a dent over a single-GPU solution. That to me screams poor optimization or coding.

            Supreme Commander and Forged Alliance got tons of flak for it, why should Crysis be spare from such criticism?

            Crysis Demo utterly failed to impress me and save me $49 of disappointment. Its overall graphical quality is hardly any better then what Doom 3 and UT3 engines can provide. Physics are still a freaking joke and arcadish , not that UT3 is any better. The only thing that Crysis did that was better was “fake” sunrise/sunset cycle.

            The gameplay by itself was a cheesy, half-arsed rip-off of Deus Ex I. The plot and most of environment are just recycled content from Far Cry. Just replace the hill-billy mercs with North Korean grunts, and mutant freaks with aliens.

            It even fails as a tech demo for Crytek engine 3. Crappy performance is going to scare away any potential third developers from picking up your tools and engines.

            • swaaye
            • 11 years ago

            I guess you ought to ask yourself just what your expectations really are. That is what posts like that stink of. You don’t really know what you want, but Crysis doesn’t deliver whatever it is and you can’t and don’t want to appreciate what it does offer. The classics (to each his own, eh) you list each have their own major disappointments too.

            As for why it doesn’t scale: that seems like a deep technological issue and I’m sure the devs know exactly why and NVIDIA+ATI probably know exactly why too. NVIDIA dumped a decent number of engineer hours into that engine, if you don’t remember that. Guessing and bitching about why it doesn’t scale isn’t going to change it.

            • Krogoth
            • 11 years ago

            My expectations are to have a game that is like “fun” to play and *gasp* have a coherent, captivating plot to it. I still play Tie Figther and X-Wing to this day (OMFG pre-3D accelerated graphics days!). I can tell you that graphics are not reason why I come back to them. Doom 3 and its eye candy-laced kin are easily forgettable.

            The industry has gone way overboard with eye candy that it is downright sad. Graphics are a nice lure, but they are no substitute for a lasting gameplay experience.

            For graphics, the only thing that is sorely lacking are photorealistic shadowing, lighting effects, clouds/fog and texture quality. There is nothing on the market that gets all of them right. Only a handful gets one of those four right, but fails at the others.

        • ChangWang
        • 11 years ago

        OMG… Finally someone who agree’s with me. I’ve been telling my friends this for months…. I mean when you throw the kind of horsepower these cards can deliver at all these titles, and Crysis is STILL the only title that doesn’t scale worth a damn… well that should say something in itself. I’m just glad its more apparent for the non-believers now.

          • swaaye
          • 11 years ago

          Unless you code 3D game engines, you are just blind guessing. If you were a great coder, you probably wouldn’t be making these claims at all because you would know how complex game development is and appreciate every little step a colleague in the industry makes.

          I suggest you explore the flexibility of the engine with the editor and check out the number of options within the configuration files of the game itself.

            • ChangWang
            • 11 years ago

            Blind guessing or not, fact is reality. Crysis runs like poo on everything, and thats a fact. The proof is in the numbers and the numbers are consistent all over the interwebs.

            • swaaye
            • 11 years ago

            It also puts out the best visuals of any game out there. Or do you want to dispute that too? There is plenty of proof of that in just the game itself, but you can also go check out what modders are doing just by editing the configuration files.

            Seems to me that more complex graphics rendering usually equals more demands on the hardware. And it also seems to me that that is why the hardware is always getting better and the games prettier, back and forth. It’s why the games that run fast today do run fast.

            Your argument sounds like you want to stay where we are. Stick with the games that run fast. So, Source engine (static everything) and UE3 (plastic world)? WoW (DirectX 7)? You want things to advance more slowly and for devs to play the safer card instead of pushing things really aggressively.

            I really am not bothered by a game that noticeably gets results by pushing real-time 3D to new limits. It’s why I game and it always has been. How many 3D cards have you gone though? I liked my Voodoo1 and how fast some games ran on it too, but it came and went. Everything does. I have a feeling that there are many, many people who aren’t bothered by advancement either.

            • Krogoth
            • 11 years ago

            Crysis’s models still looks like plastically and lighting effects are still lacking in being convincingly real. I mostly blame this to be a problem of rasterization in general.

            • swaaye
            • 11 years ago

            True but it is still the best of what’s out there. The models look better than those in any game-related tech demo I’ve even seen. Better than NV’s DX10 demo and much better than 3DMark stuff.

            It’s just proof of how many effects need to be layered on a polygon to make it look like flesh or just a real material in general. Lighting was good outside, but didn’t do nearly as well inside. Full dynamic lighting and shadowing as shown in Doom3/FEAR carries some serious constraints (no outdoors).

            The display tech we have is also a major limitation. LCDs are terrible for color accuracy, greys, and just plain black. The sharp pixels make aliasing naturally worse. They are certainly no improvement over CRTs in any respect other than power consumption, weight and sheer brightness.

            • ChangWang
            • 11 years ago

            Dude, pushing the graphical envelope is pointless if the title runs like a slideshow. At that point, my Rebel XTi will output better visuals and more consistent visuals than this game. LOL We are on what… the second generation of hardware since the game was released…. I’m predicting that the next generation of hardware/refresh won’t make a difference either.

            • swaaye
            • 11 years ago

            I played it on Radeon 3850 just fine though. Oh gosh, couldn’t run VHQ. So what? HQ looks beautiful too. 8800GTX (2006) ran it better yet, but yea you had to pay big to play it that well. The game is demanding, but it’s hardly some lost cause.

            The most demanding areas are the snow levels and the endgame. Other parts are actually quite playable on GF 8600 and older high-end DX9 cards.

      • eitje
      • 11 years ago

      we may never know.

    • moritzgedig
    • 11 years ago

    how does 1920×1200 compare to 2xAA?
    If I played at 1280×1024 with 2xAA, would I get similar results?

    • Kaleid
    • 11 years ago

    About noise and temps.

    “tuned for low noise levels” so how come it is at the top in noise level with 280 SLI? Where’s the low noise part? Low noise by being the loudest card you can buy?

    The way I see the fans are still too bad on pretty much all cards.
    “In other words, these temperatures are more or less by design and not necessarily a problem. ”

    It is a problem, I think my 4850 died prematurely just because of that. The fans on these cards, especially because of how hot they run is clearly inadequate. Make them larger by default and perhaps there would be no reason to purchase third party coolers. Enough of these crappy tiny fans. Give us bigger low RPM fans!

      • MadManOriginal
      • 11 years ago

      Your 4850 died, after what, two weeks? That’s pretty bad, I’m not sure that can be assigned directly to heat unless your case airflow was terrrible too. Usually heat failures over time would take longer I’d think, but maybe some crappy component gave out due to overheating and killed the card.

        • Kaleid
        • 11 years ago

        2 days actually. Ran Ati’s own overclocking tool and after that had nothing but artifacts even at regular speed. It overclocked way too high on the memory I suppose.

        Edit: And it cannot be blamed on poor cooling from my part and I had not done any voltmods.

          • Usacomp2k3
          • 11 years ago

          Doesn’t the “ATI overclock” still come with a disclaimer that doing so will void the warranty? I know that the nVidia Coolbits trick did.

          • cegras
          • 11 years ago

          That’s .. technically your fault, isn’t it?

            • Meadows
            • 11 years ago

            That’s right, blame the customer.
            One good move shouldn’t suddenly blindfold everyone in AMD’s favour. Face it, their cooling is shoddy.

            Many, if not all videocards are capable of overclocking to the physical max without reaching thermal limits, particularly ones from nVidia. ATI cards should be no exception. It’s probably true that the GPU itself is durable, but the RAM chips are under the same heatsink, and I’ve yet to find memory that will smile at me at 90 degrees. No wonder they bake irreversibly.

            • Krogoth
            • 11 years ago

            Dude, running any part beyond its recommended operational specifications and breaking it is entirely the customer’s fault.

            It is like trying to run a car that manufacturer requires you to use expensive synthetic oil, and it is explicitly stated in the owner’s manual. However you opted for the much cheaper 5W30. You run into engine problems shortly down the road. This entirely the fault of customer who failed to follow the operational specifications.

            There is also shoddy cooling business on Nvidia side with their mid and low-range products (same range as 4850). It is because it helps the bottom line for AIBs.

            I remember back with 7900GTs that there were a large number of them developing premature problems due to sub-standard stock cooling and some of them were factory-overclocked.

            • MadManOriginal
            • 11 years ago

            Not a very good analogy, this is more like running an engine over the redline.

            • poulpy
            • 11 years ago

            yes I second that, this represents way more accurately the situation.
            No denial that the 4850 runs hot with its single slot cooler though but tinkering with frequencies isn’t always consequence free.

            • cegras
            • 11 years ago

            Actually, no, since the engine is spec’d to run into the redline.

            The autotune on Catalyst makes NO PROMISES about performance and warns about potential damage.

            And better analogue would be trying to start your car in 3rd gear from a stop. Sure, if your engine can do it it’ll do it, but most likely it’ll stall. So you’re taking a risk by doing it.

            • TurtlePerson2
            • 11 years ago

            AMD and nVidia need to focus on making cards run cooler instead of just faster. Video cards keep getting longer, wider, hotter, and louder. I’m all for performance gains, but I’d be willing to pay an extra $30 for the same clock speeds if it wasn’t one of those extra long cards.

            • coldpower27
            • 11 years ago

            You can, but you’d probably only get the kind of minor increases you see when you shrink a CPU, say going from a Core 2 Quad Q6600 to a Q9550.

            You can’t increase performance by nearly 100% and have it be much cooler too; there just isn’t enough technology headroom to do both at the same time.

            You’re not fooling Mother Nature. With nearly 2 billion transistors in that thing, its power consumption is not bad, considering.

            • cegras
            • 11 years ago

            • Kaleid
            • 11 years ago

            No. It’s a substandard product. I can do the same with my 3870 and all I get is some artifacts, but the 4850 probably got toasted because of it.
            I would’ve used my Arctic Cooling S1 + 120mm fan to cool it down, but I had not yet received heatsinks for the VRMs (there’s another part that is extremely poorly cooled in general).

            And I certainly didn’t expect ATI to have placed a button in the driver that pretty much says “SELF DESTRUCT”. It is not my fault that it is so poorly named.

            Substandard, crappy cooling. An idiotic overclocking feature that cannot detect when it has gone TOO far. ATITool does a much better job; why can’t ATI, with all of their expertise, do something better?

          • MadManOriginal
          • 11 years ago

          You mean the ‘Auto-tune’ in CCC Overdrive?

            • Kaleid
            • 11 years ago

            Yes. Should be called “self destruct”.

            • Kaleid
            • 11 years ago

            But actually I’m not really sure why it died. I certainly had better-than-average case cooling. One would also think the fan would ramp up its speed out of some sort of interest in self-preservation.

    • Philldoe
    • 11 years ago

    Geez, that’s one warm card. Well, I suppose it’s not so bad since all of the heat from that card is forced directly out the rear end of the case. Buuuut, I think I’ll order an aftermarket cooler for it. Maybe even a couple of waterblocks, as hot as it is =/

      • Jigar
      • 11 years ago

      I don’t think there is any aftermarket cooler for X2 cards…

    • Flying Fox
    • 11 years ago

    Now the scrambling price cuts by Nvidia make sense. Bring on the competition!

    • VILLAIN_xx
    • 11 years ago

    Weren’t the 4870 X2s supposed to be 15% faster than CrossFired 4870s? I saw them neck and neck in all the results.

    and

    It’s been a while since I’ve been physically attracted to a video card. I like the black a lot.

      • TurtlePerson2
      • 11 years ago

      That was an engineering sample. I don’t think that’s the final look of it.

      • TheEmrys
      • 11 years ago

      Wait for the drivers to mature. And factor in that it’s an engineering sample.

    • cygnus1
    • 11 years ago

    I don’t think I can fault AMD for the no-super-high-end-chip strategy. What kind of numbers have they ever sold on those kinds of parts in the past? The high-end parts’ sales numbers are probably utterly dwarfed by mainstream and low-end sales.

    They’re putting their effort into squeezing more sales out of the midrange, probably with higher margins, since no money went into designing something more complex, and their design is probably more cost-efficient as well.

    I say good for them.

      • sigher
      • 11 years ago

      I don’t quite see how you can say that. I’d say they made a super high-end chip and just put it across the entire range. Have you read what’s in there, and compared it to Nvidia’s super high-end? If this isn’t super high-end then I don’t know what is; it’s more advanced than Nvidia’s offering, I dare suggest.

        • Flying Fox
        • 11 years ago

        He meant a super high end single chip.

    • Legend
    • 11 years ago

    #5, Indeed it will be interesting. Anyone think this part will see a <$500 release?

      • Umbragen
      • 11 years ago

      No, but I expect the 4850 X2 to be priced substantially lower than $500, maybe even below $400, even with 2GB of GDDR3. Personally, I’d rather have a souped-up single 4870 with 1GB and a good custom cooler, but who knows when those will hit the market.

        • Valhalla926
        • 11 years ago

        I hope something comes from the 4850 X2 rumors. I recall seeing a 2600 X2 on Newegg, but it’s gone now. Sapphire (or ASUS; either way, only one company) made it, and it had a terrible rating: no drivers, poor cooling, and horribly overpriced. I hope the 4850 X2, if it comes to fruition, has a good level of support. It’d be interesting to see a dual-GPU “mid-range” offering.

          • Pax-UX
          • 11 years ago

          These 4850 X2 cards don’t make sense to me. Fine, if you buy a 4850 and then go CrossFire, that’s understandable. But the price difference wouldn’t be that far off a 4870, which overall would be a better choice.

            • Valhalla926
            • 11 years ago

            Well, I think it could be an attempt to put a card at every $100 increment or so. There is some performance increase between a 4850 CrossFire setup and a single 4870, and the price is about $80 or so more. It seems AMD could really be making multi-GPU cards a standard part of their card hierarchy, if they do this.

            I wonder what a 3-GPU card would look like. Probably too long for any case made so far. But if they “borrowed” Nvidia’s idea of two boards sandwiched together, we could get an interesting Frankenstein’s monster out of this.

    • PRIME1
    • 11 years ago

      • MadManOriginal
      • 11 years ago

      That’s the one small area where I’m disappointed in the ATI drivers versus the NV ones I’d had recently. Not even for dual-card setups, but for single-card use, the ability to make automatic game profiles in the NV drivers was nice. There are ways to do effectively the same thing in the ATI drivers, but getting the profiles to apply when you launch a game is not nearly as slick and seamless.

        • odizzido
        • 11 years ago

        I thought the game profiles were cool too until I found out they kept resetting themselves. I don’t know if it’s fixed because I just stopped using them.

          • homerdog
          • 11 years ago

          It’s fixed now, but I still prefer nHancer.

    • UberGerbil
    • 11 years ago

    http://www.boingboing.net/2008/06/24/large-hadron-collide.html

      • ssidbroadcast
      • 11 years ago

      Yeah, I rolled my eyes when I read that line. It’s impossible; I don’t care how many FLOPS you throw at that question.

        • Pax-UX
        • 11 years ago

        FLOPS don’t work with women, they need HARDware. ¬:( I know….

      • stmok
      • 11 years ago

      The comment is flawed, because women aren’t logical to begin with. They’re emotional.

      Throwing massive amounts of computing power at this subject will do nothing more than waste time and the electricity coming out of the wall socket.

      It’d be wiser to use this GPU power on some other application. GPGPU programming, anyone? 🙂

        • elty
        • 11 years ago

        Instead of folding proteins we should be simulating neurons.

      • Usacomp2k3
      • 11 years ago

      42

      • eitje
      • 11 years ago

      Figuring out women…

      …you’re doing it wrong.

    • Shobai
    • 11 years ago

    FP? Did I manage it?

      • UberGerbil
      • 11 years ago

      Yes, but you didn’t manage to say anything in it. Though perhaps that’s the point.

        • Shobai
        • 11 years ago

        Yeah, didn’t think that far ahead…

        Not to worry: now that I have actually read the article and potentially have something intelligent to add to the discussion, I don’t really have anything to say. Them’s the brakes.

        Seriously though, even though this is an engineering sample, it looks like a solid product. It’ll be interesting to see how they’re priced…

          • Usacomp2k3
          • 11 years ago

          Isn’t it ‘breaks’?

        • sigher
        • 11 years ago

        Yeah, I always think on boards where such sad behaviour is prevalent that an FP without content doesn’t really ‘count’, since it isn’t actually a post at all, just the equivalent of spam, and it reduces the one who did it to the level of slugs.
