Here’s why the CrossFire Eyefinity/4K story matters

Earlier this week, we posted a news item about an article written by Ryan Shrout over at PC Perspective. In the article, Ryan revealed some problems with using Radeon CrossFire multi-GPU setups with multiple displays.

Those problems look superficially similar to the ones we explored in our Radeon HD 7990 review. They were partially resolved—for single displays with resolutions of 2560×1600 and below, and for DirectX 10/11 games—by AMD’s frame pacing beta driver. AMD has been forthright that it has more work to do in order to make CrossFire work properly with multiple displays, higher resolutions, and DirectX 9 games.

I noticed that many folks reacted to our news item by asking why this story matters, given the known issues with CrossFire that have persisted literally for years. I have been talking with Ryan and looking into these things for myself, and I think I can explain.

Let’s start with the obvious: this story is news because nobody has ever looked at frame delivery with multi-display configs using these tools before. We first published results using Nvidia’s FCAT tools back in March, and we’ve used them quite a bit since. However, getting meaningful results from multi-display setups is tricky when you can only capture one video output at a time, and, rah rah other excuses—the bottom line is, I never took the time to try capturing, say, the left-most display with the colored FCAT overlay and analyzing the output. Ryan did so and published the first public results.

That’s interesting because, technically speaking, multi-display CrossFire setups work differently than single-monitor ones. We noted this fact way back in our six-way Eyefinity write-up: the card-to-card link over a CrossFire bridge can only transfer images up to four megapixels in size. Thus, a CrossFire team connected to multiple displays must pass data from the secondary card to the primary card over PCI Express. The method of compositing frames for Eyefinity is simply different. That’s presumably why AMD’s current frame-pacing driver can’t work its magic on anything beyond a single, four-megapixel monitor.
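
To put rough numbers on that limit, here is a quick back-of-the-envelope check (a sketch in Python; the display modes are the ones discussed in this story, and reading "four megapixels" as roughly 4.1 million pixels is my assumption):

```python
# Rough pixel counts for the display modes discussed in this story.
# The CrossFire bridge reportedly tops out around four megapixels per frame.
modes = {
    "2560x1600 single display": 2560 * 1600,      # ~4.10 MP: right at the limit
    "3x 1920x1080 Eyefinity":   3 * 1920 * 1080,  # ~6.22 MP: over it, so PCIe
    "3840x2160 tiled 4K":       3840 * 2160,      # ~8.29 MP: over it, so PCIe
}

for name, pixels in modes.items():
    print(f"{name}: {pixels / 1e6:.2f} megapixels")
```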

We already know that non-frame-paced CrossFire solutions on a single display are kind of a mess. Turns out that the problems are a bit different, and even worse, with multiple monitors.

I’ve been doing some frame captures myself this week, and I can tell you what I’ve seen. The vast majority of the time, CrossFire with Eyefinity drops every other frame with alarming consistency. About half of the frames just don’t make it to the display at all, even though they’re counted in software benchmarking tools like Fraps. I’ve seen dropped frames with single-display CrossFire, but nothing nearly this extreme.
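
For those wondering how one counts dropped frames in a video capture at all: the FCAT overlay tints each rendered frame with the next color in a fixed, repeating palette, so a color that never shows up in the captured output marks a frame that never reached the display. Below is a minimal sketch of that counting logic in Python; the palette length and the per-frame color indices are illustrative assumptions, not the actual FCAT tooling.

```python
# Illustrative dropped-frame counting from an FCAT-style capture.
# Assumes we've already extracted one overlay-palette index per captured
# frame, and that the overlay cycles through a palette of assumed length 16.
PALETTE_LEN = 16  # assumption for illustration

def count_dropped(indices):
    """Count frames whose overlay color never appeared in the capture."""
    dropped = 0
    for prev, cur in zip(indices, indices[1:]):
        if cur == prev:
            continue  # same frame displayed again: a repeat, not a drop
        # One palette step between captured frames is normal delivery;
        # a larger step means the in-between frames were never displayed.
        dropped += (cur - prev) % PALETTE_LEN - 1
    return dropped

# Example: every other frame missing, as described above.
observed = [0, 2, 4, 6, 8, 10]   # hypothetical extracted palette indices
print(count_dropped(observed))   # -> 5
```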

Also, Ryan found a problem in some games where scan lines from two different frames become intermixed, causing multiple horizontal tearing artifacts on screen at once. (That’s his screenshot above.) I’ve not seen this problem in my testing yet, but it looks to be different from, and a little worse than, the slight “leakage” of an old frame into a newer one that we observed with CrossFire and one monitor. I need to do more testing in order to get a sense of how frequently this issue pops up.

The bottom line is that Eyefinity and CrossFire together appear to be a uniquely bad combination. Worse, these problems could be tough to overcome with a driver update because of the hardware bandwidth limitations involved.

This story is a bit of a powder keg for several reasons.

For one, the new marketing frontier for high-end PC graphics is 4K displays. As you may know, current 4K monitors are essentially the same as multi-monitor setups in their operation. Since today’s display ASICs can’t support 4K resolutions natively, monitors like the Asus PQ321Q use tiling. One input drives the left “tile” of the monitor, and a second feeds the right tile. AMD’s drivers handle the PQ321Q just like a dual-monitor Eyefinity setup. That means the compositing problems we’ve explored happen to CrossFire configs connected to 4K displays—not the regular microstuttering troubles, but the amped-up versions.

Ryan tells me he was working on this story behind the scenes for a while, talking to both AMD and Nvidia about problems they each had with 4K monitors. You can imagine what happened when these two fierce competitors caught wind of the CrossFire problems.

For its part, Nvidia called together several of us in the press last week, got us set up to use FCAT with 4K monitors, and pointed us toward some specific issues with their competition. One of the big issues Nvidia emphasized in this context is how Radeons using dual HDMI outputs to drive a 4K display can exhibit vertical tearing right smack in the middle of the screen, where the two tiles meet, because the tiles aren’t being refreshed in sync. This problem is easy to spot in operation.

GeForces don’t do this. Fortunately, you can avoid this problem on Radeons simply by using a single DisplayPort cable and putting the monitor into DisplayPort MST mode. The display is still treated as two tiles, but the two DP streams use the same timing source, and this vertical tearing effect is eliminated.

I figure if you drop thousands of dollars on a 4K gaming setup, you can spring for the best cable config. So one of Nvidia’s main points just doesn’t resonate with me.

And you’ve gotta say, it’s quite the aggressive move, working to highlight problems with 4K displays just days ahead of your rival’s big launch event for a next-gen GPU. I had to take some time to confirm that the Eyefinity/4K issues were truly different from the known issues with CrossFire on a single monitor before deciding to post anything.

That said, Nvidia deserves some credit for making sure its products work properly. My experience with dual GeForce GTX 770s and a 4K display has been nearly seamless. Plug in two HDMI inputs or a single DisplayPort connection with MST, and the GeForce drivers identify the display and configure it silently without resorting to the Surround setup UI. There’s no vertical tearing if you choose to use dual HDMI inputs. You’re going to want to use multiple graphics cards in order to get fluid gameplay at 4K resolutions, and Nvidia’s frame metering tech allows our dual-GTX 770 SLI setup to deliver. It’s noticeably better than dual Radeon HD 7970s, and not in a subtle way. Nvidia has engineered a solution that overcomes a lot of obstacles in order to make that happen. Give them props for that.

As for AMD, well, one can imagine the collective groan that went up in their halls when word of these problems surfaced on the eve of their big announcement. The timing isn’t great for them. I received some appeals to my better nature, asking me not to write about these things yet, telling me I’d hear all about AMD’s 4K plans next week. I expect AMD to commit to fixing the problems with its existing products, as well as to unveil a newer, more capable high-end GPU. I’m looking forward to it.

But I’m less sympathetic when I think about how AMD has marketed multi-GPU solutions like the Radeon HD 7990 as the best solution for 4K graphics. We’re talking about very expensive products that simply don’t work like they should. I figure folks should know about these issues today, not later.

My hope is that we’ll be adding another chapter to this story soon, one that tells the tale of AMD correcting these problems in both current and upcoming Radeons.

Comments closed
    • DiMaestro
    • 6 years ago

    Who is winning right now? Single GPU setups increasing the amount of pixels they can push, or high pixel density monitors?

    Right now, single GPU setups are winning.

    32.8% of all Steam users are running at 1920×1080; anything over that besides 1920×1200 (at 2.71%) is less than 1%. Yes, any resolution above 1920×1200 is below 1%.

    This article, with its multi-GPU-centric angle, appeals to 1.37% of the total PC gaming market. While technically interesting, it is useless.

    • kamikaziechameleon
    • 6 years ago

    I know there are lots of guys on this and other forums pointing to this as a problem for the rich and not the average gamer. If that is how you feel, don’t read these articles and don’t post in them.

    As for how this looks:

    1. AMD did a good job of offering single GPU solutions that had powerful hardware at good prices and could drive multiple displays.

    2. AMD then proceeded to push dual GPU solutions and never resolved how those worked with multiple display setups.

    I know those are pretty muddled timelines, but they let you see the accomplishment in what AMD achieved with single GPUs for budget consumers. The mess was in bringing those features across to the multi-GPU configs. Crossfire has always been second fiddle to SLI, but now they are selling premium solutions that can’t do what their budget products can. That is the conundrum.

    Meanwhile, Nvidia gates their tech more deliberately so as not to create conflicts or duplications in their offerings.

    AMD has struggled for many business-related reasons. Their hardware engineers are insanely brilliant, but their driver management, supply chain management, and marketing all muddle those achievements. Nvidia isn’t doing anything amazing, just being consistent and reliable, and they are winning.

      • oldDummy
      • 6 years ago

      ” AMD then proceeded to push dual GPU solutions and never resolved how those worked with multiple display setups.”

      Small nit to pick:

      This is more than a push…

      AMD was forced to utilize dual chips to compete against Nvidia’s high end single chip solutions. Without working drivers this whole concept is bogus. AMD is in a world of pain unless this is cleared up. Since this hasn’t been cleared up yet, one has to assume there are BIG problems with their driver work group.

      Not good, not good at all.

    • Chrispy_
    • 6 years ago

    Standard dirty tactics, as used by everyone everywhere when they’re at a disadvantage.

    1. Find an insignificant or easily fixable problem with a competitor
    2. Make sure your product doesn’t have that one, specific problem
    3. Make a huge fuss over the magnitude and severity of that problem

    Nvidia’s products are [b]completely perfect[/b] though, so it’s all fine...

      • Airmantharp
      • 6 years ago

      Previous generations? No, no they’re not.

      This generation? They’ve set the standard across the board, while laughing at AMD all the way to the bank. And if AMD is claiming to be the better solution when their own products don’t even work right, well, I can forgive Nvidia for flinging some mud in their eye.

      • MFergus
      • 6 years ago

      I don’t think Nvidia ever said their products are perfect. If they did that would be pretty arrogant and wrong.

        • Firestarter
        • 6 years ago

        The Way It’s Meant to be Played

      • Choz
      • 6 years ago

      LG did this in Australia when Samsung, in their infinite wisdom, made a replacement for corporate monitors and, without telling anyone, used a figure-8 power cable to a brick instead of a standard kettle cord.

      LG made a killing that year and Samsung took a big hit.

      On another note, are these crossfire+eyefinity issues restricted to 7-series? I just don’t seem to have them on my 6970s.

    • Goty
    • 6 years ago

    Matters, still isn’t news.

    • Kretschmer
    • 6 years ago

    As the proud owner of a Voodoo5 5500, I’m never again purchasing a graphics card from a firm known to be circling the drain.

    Sorry AMD.

      • oldDummy
      • 6 years ago

      At one time I disliked Nvidia for being ruthless.
      Jen-Hsun Huang/Nvidia were relentless [better word], just ask 3dFX.
      The Jobs of gfx…kinda.
      Don’t know if he’s still involved but he was a very tough taskmaster.
      Takes a tough man to make a tender chicken..

      • Mr. Eco
      • 6 years ago

      This makes no sense. You enjoyed your Voodoo5 back in the day. What more do you want from it, to play Crysis 3?

      A graphics card’s useful life is no more than two years. Then the new generations come with more graphics power and/or less power consumption, in more compact designs.

      The point being – it does not matter that there is no support for Voodoo 5 anymore. Or do you want to still use it somehow?

        • ColeLT1
        • 6 years ago

        My guess is he never got to enjoy his voodoo 5, overpriced and overpromised, late and underperforming with technical issues.

    • jimbo75
    • 6 years ago
      • Klimax
      • 6 years ago

      Purportedly… and I’d be very skeptical until independents like TechReport confirm it without significant issues.

      • chuckula
      • 6 years ago

      HAWAII IS IN NO WAY LATE OR DELAYED. IN FACT, IF IT SHIPS BEFORE THE BEGINNING OF THE 22ND CENTURY THEN IT IS EARLY!

      • maxxcool
      • 6 years ago

      I don’t understand: is it the lava or the spam?

    • eloj
    • 6 years ago

    I find the whole idea of this:

    “For its part, Nvidia called together several of us in the press last week, got us set up to use FCAT with 4K monitors, and pointed us toward some specific issues with their competition. ”

    positively disgusting. The correct response would be to politely decline and say that you will investigate on your own terms. This is literally making you the propaganda arm of one vendor against the other.

    “But the problem is real!” just sounds like weak justifications for engaging in the publicity afforded those who participate in nVidia’s game like good little boys. A favor I’m sure they’ll remember.

    Even if we agree that it’s fair game to write about the actual problems, the fact that you’re allowing nVidia to coordinate the message means you’re helping them in their PR assault. It goes without saying that timing is very important, and multiple “independent” sources reporting at the same time is only helpful to nVidia, wanting maximum impact. It was no “mistake” that none of the articles I saw talked about having been coordinated by nVidia. I suspected as much though.

    Can’t help but notice that any post that says something similar is deep in the negative here. I can only assume that nVidia also got a crew to “moderate” for you.

      • Airmantharp
      • 6 years ago

      If AMD makes a claim that their products are ‘the solution’ when they quite literally aren’t, is Nvidia wrong to point that out?

      I mean, hell, the damn cards don’t work, and AMD is ready to pour more cards on the market to compete with Nvidia’s products essentially using false claims.

      Nvidia could likely sue AMD over this. Providing impartial tools to check AMD’s claims is hardly bad conduct.

        • eloj
        • 6 years ago

        nVidia are of course free to put out a press release of their own. That would be nVidia pointing something out. They should not reach out to have their sock puppets in the press do their dirty work for them.

        This shouldn’t be hard to understand, unless your ethics are seriously warped.

          • Airmantharp
          • 6 years ago

          Oh, it isn’t ethical by my personal standards; I agree with that. But for businesses? Well, I expect quite a bit worse; and in reality, this is Nvidia fighting AMD’s disinformation at the source.

          It’s not like Nvidia can counter AMD’s ‘we’re the best for 4k!’ claims without proving it, you know, and it’s not like TR has to go along with all of AMD or Nvidia’s ‘requests’. Nvidia is far from a spotless company, but they are coming across pretty clean on this issue, for sure, and it’s reasonable for them to defend their work and their record.

      • bwcbiz
      • 6 years ago

      I think you’re being a bit unfair. Yes, nVidia is playing this for all it’s worth: that’s their job. But note that

      [i]"Ryan tells me he was working on this story behind the scenes for a while, talking to both AMD and Nvidia about problems they each had with 4K monitors. You can imagine what happened when these two fierce competitors caught wind of the CrossFire problems."[/i]

      So TR was looking at this issue [u]before[/u] the nVidia media blitz. And just because someone has ulterior motives when they draw attention to a defect in their competition's product doesn't mean the defect doesn't exist. As a journalist/reviewer, once you do your own research to verify the claim, are you supposed to suppress a true story because of the source? I don't think so.

      • brucek2
      • 6 years ago

      So how far should journalists go in not “allowing nVidia to coordinate the message”?

      Do you have a problem when the PR depts and the press work together so reviews for a new product can be available on day one? Do you have a problem when those reviews state the manufacturer’s claims as to the product’s virtues, and then proceed to test them? And aren’t a manufacturer’s stated beliefs about how its products compare to the competition news in themselves (to be reported as claims, not as truth, of course, and hopefully tested)?

      I see nothing wrong here. For every PR outreach that ends up in a story like this, there’s usually lots more that went nowhere. The press’s job is to figure out which stories are of interest to their audience, then add their own individual voice and expertise and verification.

      None of this would have gone anywhere if nvidia didn’t have the truth on its side. The fact that the issue is sufficiently complicated that it takes specialized tools and assistance from the only party that can afford them and is motivated to provide them doesn’t mean the story should be allowed to drop.

        • eloj
        • 6 years ago

        Reviews are a totally different beast, but at the very LEAST the origin of the review copy should be clearly stated, as well as any and all conditions attached to the publication of the review. It’s all about transparency.

        In this particular case I think it would have been proper to a) not engage directly with nVidia re: AMD’s hardware and b) rest on the story and test it on the new generation of hardware, where it’d be actual news.

        It’s not like there’s some rush to protect new prospective buyers of 7990s. There’s pretty much zero public service in promoting this story.

      • Silus
      • 6 years ago

      So yet another double standard from the typical AMD zealots?

      Remember when NVIDIA had problems with their mobile chips? AMD came to the press to talk about something they had no idea about (i.e., they didn’t know all the details, yet thought it was a good idea to talk about it). It was just a smear campaign against NVIDIA, and AMD used TR and other sites to do their disgusting PR work.

      [url]https://techreport.com/news/15707/amd-offers-its-take-on-gpu-packages-failures[/url]

      Didn't see you complain there about how disgusting it was... but for some reason it's disgusting now, when NVIDIA does it. At least NVIDIA is using actual hardware to show AMD's problems in this matter, while AMD did its smear campaign based on their "opinion", without knowing all the facts. But again, double standards ahoy!

    • ptsant
    • 6 years ago

    This was already clear in your review of the beta drivers from Aug. 19 (https://techreport.com/news/25248/new-radeon-drivers-promise-better-performance-frame-pacing-improvements), and AMD has never tried to hide the fact. It’s not like you discovered something new. Furthermore, there is a recent story on this from Sept. 18 (https://techreport.com/news/25388/crossfire-doesnt-play-well-with-4k-displays-eyefinity-setups). Then you come back four days later to make sure we don’t forget how important it is?

    For those who use eyefinity + crossfire, I think you have done enough to highlight the issue. For the rest of us, who are waiting for an exciting new product launch, you should maybe try to find something more interesting…

    PS If you review the new GPUs don’t forget to mention that Crossfire/Eyefinity is broken. In big bold letters. Preferably in the title.

      • Mr. Eco
      • 6 years ago

      [url]http://en.wikipedia.org/wiki/Sensationalism[/url]

    • flip-mode
    • 6 years ago

    In other words, Nvidia has bested AMD again in a way that will never personally affect me but is still an indicator that Nvidia is more obsessed with driver quality than AMD is.

      • clone
      • 6 years ago

      Nvidia’s 3d vision/sli issues disagree with you.

        • Klimax
        • 6 years ago

        Evidence?

          • clone
          • 6 years ago

          [url]http://support.futuremark.com/futuremark/topics/still_sli_problems_on_3dmark_fire_strike[/url]
          [url]http://pixelenemy.com/nvidia-320-49-driver-fixes-battlefield-3-and-assassins-creed-iii-problems/[/url]
          [url]http://www.stereofinland.com/nvidia-3d-vision-doesnt-like-multiple-displays/[/url]
          [url]http://forums.evga.com/tm.aspx?m=1700436#1701152[/url]

            • Klimax
            • 6 years ago

            And you couldn’t find anything recent, could you? For all I know, it may already have been addressed! The youngest is 5 months old, which is outdated by now…

            (Not that I can test things, since I don’t have SLI. Space and money constraints)

            • clone
            • 6 years ago

            lol, 5 months isn’t that long, especially when the “massive number of 3d vision and SLI users” are considered. But you could be right, it’s all fixed and perfect now… it wasn’t for all the years prior, but it’s all fixed now.

            p.s. I found that info in less than 5 minutes of trying.

            • Klimax
            • 6 years ago

            You haven’t specified a timeline (defaulting to “currently”), and your links show multiple issues, related and unrelated, with SLI/Surround: some true bugs in the driver (which got fixed), and some slight breakage after a patch by Microsoft (it seems there were some assumptions on the part of 3DMark, or it was a system patch that changed the auxiliary DirectX API used for debugging and performance measurement by tools like FRAPS).

            You haven’t even quoted which bug you refer to in the case of the driver update, or pointed out how long it existed. For all one can say, it may have been a much shorter timespan than AMD’s issues under discussion.

            (Pity there doesn’t appear to be a follow-up on 3D Vision, and as for the last link, 3D with three panels is one of the most demanding scenarios, so it wouldn’t surprise me if it were not so much a driver problem as a performance one; 3D is still quite a new field in games.)

            BTW: I take it you refer to “[SLI][Surround][GeForce GTX 670]: 2D Surround cannot be enabled when SLI is enabled”. Unless that stood for a long time, it is nowhere near AMD’s issues.

            And frankly, evidence is still missing that a “massive number of 3D Vision and SLI users” were affected. That’s not at all evident or even derivable from your links, and there is still no evidence for “for all the years prior”; the 670 is only a year old. (Random blog posts won’t do, as proper investigation into an issue is often missing. That’s important, as many problems are caused by other things but manifest as problems with the GPU/drivers.)

    • Mr. Eco
    • 6 years ago

    Paid NVidia campaign, disgusting. Throw dirt all over AMD. The issue affects all five people on planet Earth playing at 4K on multi-GPU setups.

    NVidia are afraid of something, like better performance for the price from AMD cards. So they use tech sites wanting to make BIG HEADLINES: AMD CARDS DO NOT WORK.

      • HisDivineOrder
      • 6 years ago

      I’m noticing you’re not denying the fact that the cards don’t work in this increasingly common scenario. The most curious thing is that you’re ignoring the fact that for the last two years AMD has been screaming, “We have 3GB of RAM, we’re built for the future! nVidia’s not! nVidia’s not for the future! We’re a better value because we’re built for higher-resolution displays!”

      Except they’re not, because they can’t even do 4K gaming justice with the dual-card or tri-card setups that such gaming would require. By the looks of things, they’re never going to truly fix this either, as it’s a hardware limitation.

        • Mr. Eco
        • 6 years ago

        It is a fact, and AMD stated it in the requirements. Other facts:
        – on this elite tech site, the issue affects zero people;
        – nVidia uses fact #1 for a marketing campaign, relying on sites to do the dirty work for them.

      • MathMan
      • 6 years ago

      There’s a surprisingly easy way to avoid this kind of smear campaign: don’t give your competitor the tools for one by making sure your products work as intended. See? Simple!

      Nvidia marketing would be negligent if they didn’t suggest to the press what their biggest rival’s issues were.

      And sites like TechReport are supposed to inform us about the latest and greatest in technology. If every website followed your suggestion of silencing the issue, then everybody in the market for such a monitor (no matter how few there are) would be clueless about the problem. Is that what you want?

      The crossfire problems were exposed more than 6 months ago and all AMD have been able to come up with in this long time is a beta driver that only works in limited configurations? Do you think that’s acceptable?

      I love reading reviews about high-end audio and supercars. I don’t have the money to buy them, and neither do the vast majority of other readers. Should those magazines only test iPods and Toyota Corollas?

      • Kougar
      • 6 years ago

      Dual 780s function properly with 4K displays when dual 7970s don’t. It’s as simple as that. I’d say kudos to TR for calling AMD’s marketing bluff.

    • Commander Octavian
    • 6 years ago

    The issue mentioned in this post is not relevant to 99.9…% of people.

      • MFergus
      • 6 years ago

      But it is relevant to 100% of people who have CrossFire, which is what it’s about.

        • Commander Octavian
        • 6 years ago

        This is an issue with CrossFire at 4K resolution. The issue doesn’t extend to other resolutions. In fact, this whole article seems so out of place, being published right before the launch of AMD’s new Hawaii family of graphics cards, that it feels more like Nvidia propaganda.

          • Laykun
          • 6 years ago

          Wrong, this is an issue with crossfire and eyefinity. You should read the article more closely.

      • lycium
      • 6 years ago

      So are 99.9% of pages on Wikipedia. Your point?

      • MathMan
      • 6 years ago

      I don’t have a Titan. I don’t have a 7990. I don’t have a Ferrari. I don’t have many expensive gadgets and things. I love to read about them, and you probably do too. Who cares that it only matters to the 0.1%?

      • brucek2
      • 6 years ago

      By that logic would you prefer that TechReport review only mass market PC products, like the lowest cost laptop and desktop available at CostCo?

      Separately, we have a long history in tech of capabilities that originally started out as exotic and near unobtainable actually becoming mainstream. 4k displays are unusual today but for all we know may be the mainstream panel size within several years. And people making that jump for the first time may very well also find themselves going multi-GPU for the first time.

      Finally, the issue of a major tech company marketing & promoting technologies that it turns out don’t actually work well once you need them is an issue that should be of concern to 100% of people who may be considering purchasing products from them. This current issue may now be “well known” (by enthusiasts reading tech blogs, but not anyone else), although it had existed for months/years before the tech press documented it so cleverly that AMD finally had no choice but to admit the issue. If all those same execs that were OK with marketing a BS technology until they were finally called out on it are still in their same jobs, my working assumption is they are likely to do the same again and may in fact already be concealing another yet-to-be-discovered issue.

        • Commander Octavian
        • 6 years ago

        I would prefer that TechReport not be used in this cheap bickering between competing firms. The issue is indeed not relevant to any of us, and it won’t be relevant anytime soon. 4K is not something I would spend $3,500 on, especially when the available 4K displays using the same panel suffer from many unpleasant shortcomings, such as backlight bleed, horrible luminance and color uniformity, poor contrast, and high input and pixel lag.

          • Waco
          • 6 years ago

          You should read before trolling. This applies to triple-display setups as well and they are a LOT more common.

          • brucek2
          • 6 years ago

          You need to consider the potential circumstances over the life of these cards, not just the reality of today. 4K may cost $3,500 today (or $1,500 for the no-name brand), but remember it was not all that long ago that 1080p was exotic, expensive, and considered by at least some in the tech press to be a needless feature created only for marketing reasons.

          If 4K is found valuable and especially if it becomes the standard in CE television displays, it will become the mass produced panel type just like 1080p is now. This would happen in years not decades. And once it does a multi 4K display set up will not be uncommon in enthusiast circles. And suddenly there will be many more people who will finally call on the graphics cards they bought in part because they could scale up as needed, and who would only then find out that if they chose the wrong brand, they were out of luck.

          Having the tech press apply the screws now over this issue is doing a lot of us a favor, so the working solution will be available when we need it. And also to help us choose the right brand today in case we’re still using those cards in any of our systems when we make the jump.

    • SternHammer
    • 6 years ago

    While I think this article slightly exaggerates the CrossFire issues in question, I just want to say that, while these issues may exist, 4K displays have only recently started rolling out to the mainstream market, and, as inconvenient as it may be for some users, it’s gonna take AMD a little while to solve their CrossFire issues for these upcoming 4K displays.

    While I do agree that AMD needs to expand their software engineering and drivers division and increase their release efficiency, I find it rather unprofessional for a tech site to publish this kind of negative “bashing/harmful marketing” article, apparently seeded by the main competitor, just a few days before the release of the next-gen Hawaii GPUs.

    • tbone8ty
    • 6 years ago

    what about this?

    [url]http://www.brightsideofnews.com/news/2013/9/18/nvidia-launches-amd-has-issues-marketing-offensive-ahead-of-hawaii-launch.aspx#.Ujo-ScnFMsU.twitter[/url]

      • DrCR
      • 6 years ago

      Bump to the top. A must read to keep this all in context.

        • Lans
        • 6 years ago

        Yep, very interesting read.

        So it sounds like what TR is saying is absolutely true for this cycle of “Nvidia ahead / AMD behind”… whereas over the ~5-year span that BSN covers, it’s clear(er) that it has rather been back and forth…

        Thanks for the link!

    • clone
    • 6 years ago

    I’m not trying to downplay how sensationalist this article is, but jesus… after looking at Newegg and NCIX I found 2 displays, an Asus 31in for $3,700 and a Samsung for $36,000 (85in).

    PC Perspective is being just as bad, if not worse, given that they deny Nvidia’s involvement in the process, vs. TR admitting it:

    [quote]Nvidia called together several of us in the press last week, got us set up to use FCAT with 4K monitors, and pointed us toward some specific issues with their competition.[/quote]

    This just sounds sad. Ever since AMD snagged all of the consoles, Nvidia seems to have switched to a far more attack-driven, negative PR style.

      • kc77
      • 6 years ago

      I agree. That section you quoted is problematic. Big time.

        • clone
        • 6 years ago

        It’s all grown so cozy (and yes, I’d be just as uncomfortable if AMD joined the party and we read “AMD called together several of us in the press in order to discuss how to handle it”)… The first response to any call from Nvidia involving crapping on its competition for them should be “go $&*# yourself, we aren’t your shills.” Then they investigate it, and if there is something that can’t be refuted, you publish an objective article that mentions it in context… if it matters at all, which in this case is very far from the truth.

        Nvidia calling together the media to find a way of writing an article that will crap on AMD… I don’t blame Nvidia for that; the media that drinks the Kool-Aid, I can’t say the same for.

        “Here’s why the CrossFire Eyefinity/4K story matters”… it doesn’t, yet. It’s not nearly the case, which is why this tagline had to be used in the first place.

      • cynan
      • 6 years ago

      [quote]I’m not trying to downplay how sensationalist this article is, but jesus… after looking at Newegg and NCIX I found 2 displays, an Asus 31in for $3,700 and a Samsung for $36,000 (85in).[/quote]

      By how it comes across to me, it seems as if you are trying to [i]uplay[/i] how sensationalist this article is. Because really, it’s not all that sensationalist. This blog post is simply summarizing issues encountered first-hand, along with reports of other issues encountered by Ryan at PCPer, that definitely do, or may yet, need to be addressed by AMD in order for customers to be able to achieve the experience promised to them by AMD’s marketing.

      Added to that, as mentioned already numerous times, it upholds AMD’s statements that the CrossFire issues with Eyefinity are in fact different from those previously mostly addressed in the Catalyst 13.8 and 13.10 beta drivers and, in addition to what AMD has stated as of the present, [i]qualifies how these issues differ[/i], which is interesting and novel and therefore apt fodder for a tech news site.

      Does it say that AMD won’t be able to fix these issues? Does it say they are horrible and no one should give them a second consideration? No. That would be sensationalist. If anything, you might be able to fault Damage somewhat for perhaps being a tad pessimistic about AMD’s prospects of fixing these in a future driver release due to bandwidth limitations, as, as far as I know, he is purely speculating about how much of a monkey wrench this will be/is for AMD:

      [quote]The bottom line is that Eyefinity and CrossFire together appear to be a uniquely bad combination. Worse, these problems could be tough to overcome with a driver update because of the hardware bandwidth limitations involved.[/quote]

      But then again, informed speculation is all part of good journalism, if done responsibly. Yes, NVIDIA circling AMD with their FCAT “propaganda machine” seems at least a bit slimy, but at the end of the day, all it is really doing is providing a reproducible means of observing some of AMD’s more stark issues with CrossFire and multi-display gaming. This benefits the reader and potential customer, particularly when AMD has been promising market-leading multi-GPU/display performance, which they simply don’t have at the moment. If/when they do, they are free to provide a demonstration to the contrary and let the tech journalism sites investigate its merits.

      I’ll not comment on the integrity of PCPer’s article, but this blog post, at least for the most part, does nothing but outline the recent events that precipitated it, provide a few comments about issues observed in the author’s own experience and those purportedly encountered by Ryan Shrout, and then end on a mostly hopeful note wishing AMD expedience in getting these issues remedied.

      As a potential customer for a multi-GPU/display gaming setup, would it be preferable just to stick your fingers in your ears and shout “La La La La”?

    • wierdo
    • 6 years ago

    It’s a problem I don’t have to worry about for at least 5-10 years personally, but thanks for keeping companies on their toes, better they work on it now than when these products start making sense for the mainstream segment.

    • PopcornMachine
    • 6 years ago

    I would agree that the story matters because it indicates new problems with CrossFire that weren’t known before. The hardware limitation involved in resolving the issue, and AMD’s marketing without mentioning it, should be made known.

    But I would disagree how serious that problem is. 4k panels at $3,500+ are not the imminent future of gaming. I now have a 2560×1440 lcd, and as one reviewer of 4k gaming noted, he would rather go that route at the moment as the pixel density seems quite sufficient and single video cards can handle it. My plan now is to get the single most powerful card I can afford as it seems the better solution, and it will be the best bang for the buck from AMD or NVIDIA.

    Multi-monitor gaming seemed neat at first, but I’ve decided gaming with bezels is not for me. At this point, 4K is mostly an e-peen option. And NVidia seems desperate to disparage their rival: a surprisingly insecure reaction from a company that should feel rather confident in its position.

    Makes me wonder if they are worried about something we don’t know about.

    • iatacs19
    • 6 years ago

    Thanks for the heads up!

    • bhtooefr
    • 6 years ago

    Mind you, there is one monitor (at least) that the additional hardware gets ridiculously expensive for…

    The IBM T221.

    Here’s the full list of hardware that you need to run a T221 at native resolution and maximum refresh rate, with DisplayPort MST:

    IBM T221 9503-DG5, DGP, or DGM – those are the sub-models that (somewhat) support dual-link DVI
    Either the Cirthix passive T221 dual-link DVI adapter, or two of the very rare IBM active ones (DG5s and DGPs shipped with one, but IIRC Eyefinity gets a bit weird)
    Club 3D Sensevision MST hub
    Two DisplayPort to DL-DVI adapters (they don’t have to be 3D-capable, the T221’s slow refresh rate fits in the 268.5 MHz that most of those adapters are rated for)

    • Laykun
    • 6 years ago

    We bought some FirePro (V8800) cards for work to do a six-screen Eyefinity setup, which is advertised as a feature (for building a CAVE). You’d think they’d have their shit sorted for their high-premium workstation graphics cards, but the drivers didn’t support using multiple un-crossfired cards for doing Eyefinity (pretty standard at the consumer level). So it’s not just their consumer-level cards where they cop out on the drivers. What a waste of time and money that was.

    AMD has successfully turned me away from their hardware with their completely sub-par drivers. The experience just isn’t the same compared to back when I had my Radeon X1900XT (about the time AMD purchased ATi … hmmm). It’s because of this track record that I don’t really expect to see any real change in the future, AMD has always promised things and never really delivered. It’s pretty crap when your users have to request features and fixes in areas that they are supposed to be the experts in.

    I feel like a broken record but ever since I went with my SLi GTX 670 setup (running 3 screen surround) I’ve never really looked back.

      • anubis44
      • 6 years ago

      “I feel like a broken record but ever since I went with my SLi GTX 670 setup (running 3 screen surround) I’ve never really looked back.”

      Which is very funny, since I had to sell my GTX 670 and buy a 7950 because the NVidia drivers wouldn’t work with my 3-monitor setup without a 15-step manual adjustment process that took about 30 minutes and involved plugging in only one monitor at first, then plugging in the second and third monitors at later steps. Every time there was a new driver update, it would forget the settings and I would have to manually go through the 15 steps all over again. With the 7950, I clicked on the Eyefinity setting, clicked which monitor was left and center, and presto! A working 3-monitor Eyefinity setup that retains the settings after any driver updates I run.

      So much for NVidia’s ‘superior’ drivers. It’s just a bunch of propaganda.

        • Laykun
        • 6 years ago

        Try doing CrossFire/multi-card setups. Now, I made a general statement about their drivers, so allow me to clarify: single-card AMD configurations I have no issue with, but once you go to a multi-card, multi-monitor setup, they fall apart. I’m not saying Nvidia is the holy grail of drivers, but in my experience I’ve never had any real show-stoppers, and often I don’t feel the need to even update my drivers, whereas on AMD I did have game-breakers (not just blue screens but also missing functionality and terrible on-screen performance), which made me feel like I had to constantly update my drivers, and I hate that. I’m glad I’ve moved away from the ritual of eagerly awaiting new drivers every month only for them to either not come out or to be a letdown altogether.

        • Sagia
        • 6 years ago

        Either you are too noob to configure it or you worship your ATI god

      • Commander Octavian
      • 6 years ago

      I witnessed multi-monitor problems first-hand on a GTX 780 setup a few weeks ago, mate. A friend of mine with a FirePro W7000 and a 4-way Eyefinity setup doesn’t seem to have the problem you mentioned. It could be a problem with a specific monitor model. Have you tried different models of monitors?

        • chuckula
        • 6 years ago

        Fascinating… you make a post about how this “bug” doesn’t affect anyone and now you are personally witnessing all these systems that only have problems when Nvidia is used… yeah, not buying it. Shill? Astroturfer? Just somebody who can’t separate fantasy from reality? Who knows… and who cares.

        [Edit: Yeah… account was registered about 10 minutes after this story was published. Looks like we can see how AMD is spending its marketing dollars…]

          • Commander Octavian
          • 6 years ago

          Yes, this bug indeed doesn’t affect anyone. This whole post is meant to stain AMD’s name and driver reputation right before their new Hawaii family of GPUs release.
          AMD explicitly stated when they released their frame-pacing beta driver that it only fixed DX10/11 games and only addressed resolutions of 2560×1440 and below. Anyhow, the obvious is obvious, and the purpose of this Hawaii pre-launch post is obvious…

            • Klimax
            • 6 years ago

            As if AMD’s drivers needed any more of a bad name than they already have. (You also missed that this covers a brand-new issue, which looks quite a bit worse…)

            And if they want to shout to the world about 4K and great Eyefinity, then they first need to have things in order; otherwise they mislead customers…

            • Laykun
            • 6 years ago

            Ok, so I don’t count as a person then? I experienced this issue with HD 5xxx and HD 7xxx card setups; this does seem to be a fundamental underlying issue with their driver framework that permeates all card series. Considering that with these setups you also buy mid-to-high-end video cards, you’re paying a large premium, which is in turn profit for AMD; you’d expect those who pay the largest margins to get a bit of service, which is in fact half the reason you buy professional-level video cards like the FirePro.

        • Laykun
        • 6 years ago

        Ah you should read back a bit. From what I can gather you’re talking about single card setups. I’m talking about multi-card setups. And you’ll have to be more specific than “problems”.

    • TwoEars
    • 6 years ago

    4K would be great for work, graphics, and web browsing, since it will make everything crisper.

    But for games? I like to run games at native resolution, to eliminate scaling and the delay associated with it.

    Very, very few people will have the horsepower needed to run today’s and tomorrow’s games at 4K and 60Hz.

    Which is why I suspect I will be holding off on 4K for a very long time.

    Give me 2560×1600@120Hz – that’s what the majority of gamers really want.

      • Farting Bob
      • 6 years ago

      3840×2160 at 60 FPS is 14.93 Gb/s of bandwidth.
      2560×1600 at 120 FPS is 14.75 Gb/s of data and 491M pixels a second. So basically the same GPU horsepower is needed for both.
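
      (For anyone checking the arithmetic, those figures work out if you assume 30 bits per pixel, i.e., 10 bits per channel; that assumption is mine, not stated above.)

      ```python
      # Reproducing the bandwidth figures quoted above.
      # BPP = 30 (10 bits per channel) is an inference; the post doesn't state it.
      BPP = 30
      for w, h, fps in [(3840, 2160, 60), (2560, 1600, 120)]:
          pixels_per_sec = w * h * fps
          gbps = pixels_per_sec * BPP / 1e9
          print(f"{w}x{h} @ {fps} Hz: {pixels_per_sec / 1e6:.0f} Mpx/s, {gbps:.2f} Gb/s")
      ```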

        • Airmantharp
        • 6 years ago

        No idea why this is such a challenge for most people. Got more pixels? Need more fillrate! Well, that and more RAM for certain things, but hey, mostly more fillrate.

        I’m running 1600p now; and I could seriously see the use for a ~40″ 4k monitor. Especially since the 30″ panels can apparently be sold profitably for ~$700; put it in a decent cabinet and give me 60Hz operation over DP with decent blacks, contrast, uniformity and color (after calibration, at least), and you can happily charge me twice as much.

        And I’ll only need two next-gen cards with 8GB+ of RAM each. And RAM is cheap, remember!

      • Chrispy_
      • 6 years ago

      Did you just say that the majority of gamers really want 16:10?

      I guess I’m in the minority for not wanting a screwed aspect ratio and cropped image or black letterboxing then.

        • f0d
        • 6 years ago

        I get none of those things while playing games on my 16:10 monitor.
        Sure, you get that watching movies, but we are talking about gamers and games here, not HTPCs or TV screens.

          • Chrispy_
          • 6 years ago

          Depends on the game:

          Some games crop the aspect ratio to 16:9 max, to avoid unfair advantage (Starcraft II, Diablo III)
          Older games (especially console ports) often support non-16:9 badly with missing assets (eg, Bioshock, to name one of many)
          Pretty much all indie games are designed to run at 16:9. They just don’t have the resources to code assets and OSD to be flexible.

          Sure, as time goes by, more and more games are becoming tolerant of nonstandard aspect ratios. But you do realise that 4K will be driven by the entertainment industry, just as 720p and 1080p were. They’ll be pushing 16:9, because that’s the television ratio, whether you like it or not.

    • Haserath
    • 6 years ago

    Shouldn’t they be creating a CrossFire 2.0 with beyond-4K capability and hardware frame pacing, then?

    Offering a broken feature is terrible…

      • Klimax
      • 6 years ago

      That particular limitation was surprising; I don’t think SLI has any like that. (But that might be related to only one SLI config being allowed.)

    • Ph.D
    • 6 years ago

    And here I was ever hopeful of one day being able to run a 2560×1440 or 2560×1600 monitor with a powerful enough and affordable enough GPU…

    As bad as AMD is doing, I sure don’t want to end up with a single GPU maker having a monopoly. We need some healthy competition.

      • jessterman21
      • 6 years ago

      One day.

      • kalelovil
      • 6 years ago

      These issues only affect display setups above 2560×1600 (those above ~4 megapixels).

      I’ve been quite content with a 2560×1440 monitor and a Radeon HD7950.

    • anotherengineer
    • 6 years ago

    Well, I would expect Nvidia to have fewer issues with 4K monitors, since AMD probably can’t afford a few 4K monitors to do testing with 😉

    • cRock
    • 6 years ago

    Great post; balanced and rooted in facts like journalism should be.

    I’m not running out to buy a 4k setup, but this certainly seems like something I’d want to know beforehand.

      • HisDivineOrder
      • 6 years ago

      I agree, but more than that, I think it’s something anyone investing in multiple cards of the next gen line by AMD will want to know possibly before the end of their (new) cards’ lifespan.

    • puppetworx
    • 6 years ago

    I didn’t hear anybody saying, “Hey! Easy now! Give that there NSA time to respond to our concerns like they promised they would. Let’s leave our investigations at that and not ask any more questions.” Can you imagine? But don’t you dare dig into and objectively expose greater shortcomings of AMD’s products with respect to their own advertising! Not on a technology website, for the love of God!

    • End User
    • 6 years ago

    As someone who’s also running dual GeForce GTX 770s I’m curious about the memory usage you saw at 4K. Are those 2GB or 4GB cards?

      • HisDivineOrder
      • 6 years ago

      Seconded.

    • beck2448
    • 6 years ago

    [quote]But I’m less sympathetic when I think about how AMD has marketed multi-GPU solutions like the Radeon HD 7990 as the best solution for 4K graphics. We’re talking about very expensive products that simply don’t work like they should.[/quote]

    100% right. That is pretty inexcusable.

      • HisDivineOrder
      • 6 years ago

      Been saying this for years. Glad everyone is catching on.

      • Krogoth
      • 6 years ago

      Nvidia is also in the same boat in regards to their 3D Vision and SLI. Blame the marketing drones for over-promising and under-delivering.

      Multi-card solutions have always been inferior to single cards in terms of stability, consistent performance, and other stupid issues. SLI/CF users only tolerate it because it is the only way to handle their stuff at ultra-high settings. SLI/CF users always drop their setups as soon as a single-card solution is able to yield the same level of raw performance.

        • beck2448
        • 6 years ago

        Actually, the testing for the last year or so has shown SLI works MUCH better than CrossFire. They are hardly equivalent, and it is disingenuous to suggest so. Check HardOCP, Tech Report, PC Perspective, Legion Hardware, Hot Hardware, etc.

          • redpriest
          • 6 years ago

          It has its ups and downs:

          [url]http://hardforum.com/showthread.php?t=1765419&highlight=[/url]

          I couldn't get SLI to work with multiple displays at Haswell launch.

        • redpriest
        • 6 years ago

        SLI didn’t work properly with Surround at Haswell launch, and it took at least a couple of months to fix. It was pretty annoying, and while it was clearly a first-world problem that only impacted the 15 of us who had Surround displays with a brand-new CPU/motherboard platform, it didn’t get the multiple-site coverage that this has.

        [url]http://hardforum.com/showthread.php?t=1765419&highlight=[/url]

    • Krogoth
    • 6 years ago

    Wake me up when 4K monitors and displays are obtainable for the masses and have reached critical mass.

      • HisDivineOrder
      • 6 years ago

      Krogoth is not impressed?

      • Meadows
      • 6 years ago

      “…Wake me up, before you go go”

        • HisDivineOrder
        • 6 years ago

          [quote]Wake me up before you go go
          Don't leave me hanging on like a yo yo
          Wake me up before you go go
          I don't want to miss it when you hit that high
          Wake me up before you go go
          'Cause I'm not plannin' on going solo
          Wake me up before you go go
          Take me dancing tonight
          I wanna hit that high, yeah, yeah![/quote]

          -Unnamed Radeon 7970 Crossfire owner.

          • derFunkenstein
          • 6 years ago

          did…did you even change the words?

          edit: that typo changed the whole meaning of the comment.

          • indeego
          • 6 years ago

          Crossfire is so gay.

            • derFunkenstein
            • 6 years ago

            And George Michael. ICWUDT

      • Klimax
      • 6 years ago

      Although I’ll be waiting even longer: wake me up when they come in dimensions smaller than 27″… (High DPI won’t be a problem…)

      • l33t-g4m3r
      • 6 years ago

      It’s just another 2D monitor, only with more pixels than any game in existence can take advantage of, and more than any single video card is capable of driving. 4K is for work, not gaming. On the other hand, the Oculus Rift is something I’d be interested in, because it could add real value to a gaming session. Resolution isn’t even the problem; it’s AA. Once you solve AA, pixel density won’t matter as much.

      • jihadjoe
      • 6 years ago

      Krogoth has successfully won this comment system.

      At first he started off by making his own meme, not being impressed with stuff. Then he occasionally gets impressed and we’re all impressed by that. And now he’s back to not being impressed and again we’re impressed by that.

        • derFunkenstein
        • 6 years ago

          HE NEVER FAILS TO [s]OPPRESS[/s] IMPRESS

          • HisDivineOrder
          • 6 years ago

            HE NEVER FAILS TO [s]OPPRESS[/s] [s]IMPRESS[/s] EXPRESS

            • Meadows
            • 6 years ago

            HE FAILS TO BENCHPRESS

            • Chrispy_
            • 6 years ago

            Aww crap, now you’re instigating cross-meme fertilization with [i]do you even lift bro[/i]… The internet is officially finished; it can be switched off now.

      • sschaem
      • 6 years ago

      With crappy support like this it will just take longer.

      A ~$600 4K display would be here if the experience were flawless; people would be interested in 4K knowing it works, instead of lamenting how crappy the support is.
      The issue is, even people with the $$$$ won’t spend $3K on a 4K PC display knowing it doesn’t work and is half-baked.

      Most gamers don’t care about Adobe RGB and super-wide gamuts. They want 60+ Hz and no stuttering, and AMD is not making this a reality.
      And here, the reality is that 4K panel cost is around $400… what’s missing is HDMI 2.0 to make it usable as a PC monitor.

      The display industry is begging for a reason to ramp up 4K… come on, AMD, why are you putting the brakes on 4K on the desktop?!

        • Krogoth
        • 6 years ago

        Give me source material that takes advantage of 4K displays… That’s right, it’s nothing but demos and screenshots of renderings. Besides, 4K only makes a difference on huge displays (50″ or more); otherwise it goes practically unnoticed on a smaller screen.

        At this point, 4K displays only matter to the professional market, where they are useful for imaging, monitoring equipment, demonstrations, conferences, etc.

          • Deanjo
          • 6 years ago

          [quote]Besides, 4K only makes a difference on huge displays (50″ or more); otherwise it goes practically unnoticed on a smaller screen.[/quote]

          Not exactly true. You have to remember that you sit a heck of a lot closer to a monitor than you do to a TV. Also, one area where 4K will shine is with passive or glassless 3D.

            • Airmantharp
            • 6 years ago

            I’d use a 4K monitor right now, for gaming and for photo editing.

            I just need a pair of video cards that actually work and have enough RAM to handle games developed with the new consoles in mind.

            The Chinese can make the things for next to nothing; they don’t actually cost more than a fraction more to make than same-size panels in other resolutions. It’s just a matter of economies of scale and getting the ASICs in place.

            • Krogoth
            • 6 years ago

            That’s entirely the point.

            You need a large screen to notice the difference between 1080p and 4K without resorting to a magnifying glass.

            3D support != resolution

            • Deanjo
            • 6 years ago

            [quote]3D support != resolution[/quote]

            The detail presented in passive/glassless 3D is directly affected by resolution, as you are effectively halving it to create the effect.

            • Airmantharp
            • 6 years ago

            I was going to mention that one, as I do own one of those TVs (a 55″ LG). I much prefer the passive solution as the active solution does cause some eyestrain, but the loss of resolution is disheartening.

      • Yawn
      • 6 years ago

      I’m just glad I got out of the “wake me” business when SSDs dropped to $1/GB. Wouldn’t want to have the job of waiting to wake Krogoth in time for him to be impressed.

    • Bensam123
    • 6 years ago

    Erm… so you just reiterated what we already knew and AMD has talked about: the fixes for CrossFire and 4K/Eyefinity are not available yet. They’re working on it; it isn’t done though.

    If it the news stories were presented something like ‘AMD still has issues with 4k and Eyefinity’ that’d be different, but it was presented as if we never knew before hand and AMD is terrible for ever letting this past. If we hadn’t known about it, if AMD was denying it, if they weren’t working on it, if it affected tons of users I’d say those last two news snippets would have more cred, but none of the above is true and that’s a lot of IFs.

    It really just seems like you guys were looking for a reason to drum up publicity and bag on AMD. I’m not sure why you felt the need to write a blog post justify those news posts either if they were simply notifying us that the issue still exists. This is essentially a summary of the news posts and the PCPER article.

      • Forge
      • 6 years ago

      How does NeelyCam manage to make a troll post that’s also insightful, while you just come across as negativity about negativity? You and Pettytheft both: do you see the meta, the near-irony, of making a contentless negative post complaining about how a news blurb is (for you) contentless and negative?

        • chuckula
        • 6 years ago

        [quote<]How does Neelycam manage to make a troll post that's also insightful[/quote<] Practice. Years & years of practice.

          • derFunkenstein
          • 6 years ago

          Count Chuckula aspires to that position one day.

        • Bensam123
        • 6 years ago

        The ‘negativity’ here would be constantly downplaying AMD despite their best efforts. I’m sure you can see that, though, since you’re being so objective.

        As for Neely, perhaps there is a bias here, and while he’s toying with a rather serious issue, I’m taking it seriously. You just don’t like the angle I’m coming from.

          • Forge
          • 6 years ago

          I really couldn’t care less what angle you’re approaching from; I have no horse in this race. If you’re painting me as an anti-AMD person, I’m going to laugh in your face. If I have any bias, it’s pro-AMD. Do I need to trot out the onesie that my firstborn wore coming home from the hospital? It says “Daddy’s little Athlon” and has a big AMD logo on it. My wife made it by hand, because I am, or at least was, a huge AMD fan. Apparently I’m just not public enough about my affection for AMD.

          At the BBQ, when I got a grab at the prize cart, I considered what was there, and Scott said “You’ll probably want the memory” (was some nice 1866MHz stuff, I forget how much, 8GB?) and I took the A8-6600K. I have no boards to put it on. I have no need for a fast CPU; I have an aged but vital i5-2500K running at nearly 5GHz that refuses to be overtaken. But I was genuinely excited at the thought of having an AMD system again; it’s been a while, and the APUs fill a niche Intel is neglecting, so I’d have a good reason to go AMD.

          I was never a deeply biased fanboy, though, and I think that’s why you and I are at odds right now. AMD is in a critically bad position and seems to have no interest in correcting that. You seem to feel that attacking anyone who speaks ill of AMD will help, while I think that major changes need to be made, and it may already be too late.

          You believe whatever you like.

            • Bensam123
            • 6 years ago

            You weren’t the one painting them that way. As far as I know, the first post had nothing to do with you.

            • Forge
            • 6 years ago

            I commented because I had a problem with your post. It didn’t have anything to do with me personally until you attacked me personally. Your original post attacked Damage’s neutrality/objectivity, and I take grave issue with that. Damage has been objective and neutral for a very long time, and your attempting to portray him as biased and advancing an agenda offends me deeply.

            Or am I only allowed to comment on comments or articles that affect me directly? In that case, you shouldn’t have commented either.

            • Bensam123
            • 6 years ago

            I didn’t attack you personally, dude; the second post, in response to you, was still talking about the way TR is presenting the information.

            It’s not all about you. This is hardly a personal attack either, so stop crying wolf to try and drum up negativity towards me. It almost seems as if you’re trying to make me look that way for some reason… hmmm…

            You’re welcome to comment on anything you like, just as I am able to comment on the negativity TR is presenting towards AMD with a negative post (to which you seem to object). Just don’t rip my posts out of context and make them ‘all about you.’ And please try not to be a hypocrite; it makes it hard to have a logical, objective discussion.

      • LukeCWM
      • 6 years ago

      This article was different because it clarified what each separate problem is and showed proof for each instead of ambiguously saying there are a variety of problems and showing nothing.

      I found it very informative, and I’m glad Scott wrote it.

        • Bensam123
        • 6 years ago

        The PCPer article did that. Really, the only new information here is the differentiation between runt frames and interleaved frames.

        Putting that aside, they portrayed it as if these are new issues and AMD doesn’t know about and/or isn’t trying to address them, as per the original frame-time and FCAT articles. It just seems like an attention grab.

          • cygnus1
          • 6 years ago

          I didn’t get the impression they were talking about anything new, just clarifying the problems with actual evidence. And frankly, AMD deserves to have this constantly in the news until it fixes the issue OR stops advertising that its products work fine for these purposes when they still don’t. The right thing for AMD to do would be to strip all mention of these capabilities from its marketing and spec sheets, but it’s not doing that.

          Also, at this point I’m worried that some of these issues can’t be fixed in software for the current chips and thus will never be fixed. That would leave a lot of people with hardware that isn’t fit for all the purposes it was sold for, and it could very well lead to a class-action lawsuit. Granted, very few users are ever going to use a current-gen GPU with 4K monitors… but it doesn’t really matter how small a corporate error is; if the class of POTENTIALLY harmed users is big enough, lawyers will pounce.

            • Bensam123
            • 6 years ago

            Yeah, back when the microstuttering first popped up, I questioned whether this was actually a hardware issue; and later this year, when it was announced that the 8xxx series was delayed, I was sure they had delayed it to iron out hardware latency issues with their chips.

            The driver fixes may just be a ‘band-aid’ and not a legitimate hardware fix, and the real fix is what they were/are working on. That would also explain why the band-aid fix took so long to arrive.

            Here you go, BTW: I said the same thing as you and got a -9 for it back in February:

            [url<]https://techreport.com/news/24345/amd-roadmap-radeon-hd-7000-series-to-be-stable-throughout-2013?post=708190[/url<] Also, part of Chuckula’s perpetual BS in that same thread, as a kicker: [url<]https://techreport.com/news/24345/amd-roadmap-radeon-hd-7000-series-to-be-stable-throughout-2013?post=708488[/url<]

            • cygnus1
            • 6 years ago

            Anything is possible. But to me it would seem strange to hold back the 8xxx series way back in February if it still contained this issue, given their handling of the issue in current models. They haven’t bothered to change their marketing at all, so why bother holding back a new model? It’s possible they felt they could fix it with a new revision of silicon, but I’d be surprised if that were the case. My expectation is that it would require a significant enough amount of hardware engineering that it would be overly difficult to incorporate into an almost-completed product, and it would have to go into the next generation.

            As for your minuses back in February, maybe people didn’t like those ideas then, or maybe the way you worded it rubbed people wrong. People are fickle, I guess.

            • Bensam123
            • 6 years ago

            I don’t know; maybe because of all the bad publicity they’re getting for it?

            They have the issue almost completely fixed and they’re STILL getting pretty hardcore flak for it; look at most of the comments, and even the way TR reported this.

            As per my other post in February, they don’t necessarily have to completely fix the issue, which would require incorporating a fix into the design from the ground up; they may have had some easy changes available that would provide a pretty good solution for the time being.

            It’s who is saying it and when it was said. Now it seems more certain, whereas before it seemed less likely.

          • JohnC
          • 6 years ago

          You are free to skip all these “attention grabbing” articles if you find them as such.

          • derFunkenstein
          • 6 years ago

          Oh, so we can’t have any information repeated anywhere on the internet so that more people might find it? I don’t read PCPer, so this was kind of new to me.

      • derFunkenstein
      • 6 years ago

      WAT

        • Bensam123
        • 6 years ago

        windows activation technologies?

    • Pettytheft
    • 6 years ago

    Still don’t care.

    Nice article though.

    • chuckula
    • 6 years ago

    1. While this article does kind of scream “first world problem,” part of AMD’s big marketing push is that their parts are less expensive than Nvidia’s and they can still give you the same or better performance for the money… by going to multi-GPU configurations. If AMD is going to emphasize those configurations as a competitive edge, they are opening themselves up to increased scrutiny when that edge isn’t as sharp as they say it is.

    2. I also doubt that Nvidia’s 4K output is 100% perfect so they aren’t entirely off the hook. However, they do appear to avoid the glaring errors so their own glitches seem minor in comparison.

      • HisDivineOrder
      • 6 years ago

      It really is a shame that AMD has failed so spectacularly that “avoiding the glaring problems” is suddenly success when it should have been a given.

    • cynan
    • 6 years ago

    I wonder if current AMD multi-GPU configs will fare better at 4K once the first single-head 4K displays appear. After all, the HD 7900 cards support DP 1.2, which can do 4K at 60 Hz. I suppose time will tell, if AMD doesn’t fix these Eyefinity/multi-display multi-GPU problems in a reasonable time frame.

    Apparently single-head 4K screens are on the way…

      • cynan
      • 6 years ago

      [quote<]Since today's video output standards can't support 4K resolutions natively, monitors like the Asus PQ321Q use tiling[/quote<] Just to elaborate on the above in response to the quote from the article: I was under the impression that the timing controllers (TCONs) built into the monitor’s control board were the limitation. Currently, there aren’t any readily available that can support 4K at 60Hz, which is why two are used in parallel on current 4K displays (tiled). That would mean it’s not an issue with the video output standards, but with the display’s ability to receive/process the signal. Or is there also a hard limit on DisplayPort 1.2 outputting 4K@60Hz in a single stream? [/Scratches Head]
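
      For what it’s worth, a back-of-envelope check suggests the DP 1.2 link itself has headroom for single-stream 4K at 60Hz. The lane count, HBR2 rate, and 8b/10b overhead below are published DisplayPort 1.2 figures; the reduced-blanking pixel clock is an approximation:

[code<]
# Back-of-envelope: can one DisplayPort 1.2 link carry 3840x2160 at 60 Hz?
LANES = 4
HBR2_GBPS_PER_LANE = 5.4      # raw line rate per lane (HBR2)
EFFICIENCY = 0.8              # 8b/10b encoding overhead

link_gbps = LANES * HBR2_GBPS_PER_LANE * EFFICIENCY   # 17.28 Gbps usable

pixel_clock_hz = 533.25e6     # approx. reduced-blanking timing for 2160p60
video_gbps = pixel_clock_hz * 24 / 1e9                # 24 bpp -> ~12.8 Gbps

print(f"link {link_gbps:.2f} Gbps vs. video {video_gbps:.2f} Gbps "
      f"-> fits: {video_gbps < link_gbps}")
[/code<]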

        • Damage
        • 6 years ago

        Nah, you’re right that the issue is display ASICs right now. I’ve tweaked the article’s wording to correct it. Of course, having a single DP link doesn’t fix CrossFire issues at >2560×1600.

          • cynan
          • 6 years ago

          Because of the need to pass frames over the PCIe bus instead of the CrossFire bridge… So the bandwidth limitation you say may be tough to overcome with drivers refers to the CrossFire bridge? If Nvidia can make it work over the PCIe bus with frame metering, then…

          Maybe they can figure out a way to make 4K@60Hz work with two cards in CrossFire using [url=https://techreport.com/blog/24415/as-the-second-turns-frame-captures-crossfire-and-more?post=712223<]2 bridges at once[/url<]!
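
          On paper, at least, the raw throughput over the bus looks manageable; the harder parts are likely latency and contention during compositing. A rough sketch with theoretical peak numbers (it ignores protocol overhead and the fact that only the second card’s frames make the trip):

[code<]
# Rough cost of shipping finished 4K frames over PCI Express instead of
# the CrossFire bridge. Peak figures only; real transfers contend with
# texture uploads, and frame pacing cares about latency, not just throughput.
W, H, BYTES_PER_PX = 3840, 2160, 4            # one 32-bit RGBA frame
frame_mib = W * H * BYTES_PER_PX / 2**20      # ~31.6 MiB per frame

traffic_gib_s = frame_mib * 60 / 1024         # worst case: 60 frames/s on the bus

PCIE3_X16_GB_S = 15.75                        # theoretical peak, one direction
print(f"{frame_mib:.1f} MiB/frame -> {traffic_gib_s:.2f} GiB/s "
      f"of ~{PCIE3_X16_GB_S} GB/s on PCIe 3.0 x16")
[/code<]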

            • Damage
            • 6 years ago

            Heh, maybe. We’ll have to see. So far, AMD has been very guarded about how its frame pacing implementation works. Hoping to learn more next week.

            • jessterman21
            • 6 years ago

            Looks like a variable frametime cap that updates every few hundred milliseconds to me.
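
            If that guess is right, the mechanism might look something like the sketch below. To be clear, this is pure speculation; AMD hasn’t disclosed how its pacing works (see Damage’s comment above), and render_frame()/present_frame() are hypothetical stand-ins:

[code<]
# Speculative sketch of a "variable frametime cap": pace presents to a
# cap derived from recent frame times, re-deriving the cap every few
# hundred milliseconds. Not AMD's actual implementation.
import time

WINDOW_S = 0.3                # refresh the cap roughly every 300 ms

def paced_loop(render_frame, present_frame):
    cap = 1 / 60              # initial cap: 16.7 ms
    history = []              # frame times seen in the current window
    window_start = last_present = time.perf_counter()
    while True:
        t0 = time.perf_counter()
        frame = render_frame()
        history.append(time.perf_counter() - t0)

        # Hold early frames back so presents land at least `cap` apart.
        wait = cap - (time.perf_counter() - last_present)
        if wait > 0:
            time.sleep(wait)
        present_frame(frame)
        last_present = time.perf_counter()

        # Every few hundred ms, recompute the cap from recent history.
        if last_present - window_start >= WINDOW_S:
            cap = sum(history) / len(history)
            history.clear()
            window_start = last_present
[/code<]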

    • NeelyCam
    • 6 years ago

    Awful.

    When you have no money, it’s hard to make your drivers work.

      • HisDivineOrder
      • 6 years ago

      When you have no money, it’s hard to make your drivers work, because when you have no money, it’s hard to make the ex-employees you laid off a year ago make your drivers work.

      Remember when AMD was laying off huge masses of employees and shrugging it off because they were going to have this big name and that big name show up to lead the remnants of teams? I remember one in particular that Anand plastered a huge photo of when announcing the big move. Then there was that nVidia guy who was hired by AMD, announced, spent less than a day at AMD, and then went screaming back to nVidia.

      AMD’s money problems (the most likely source of all its delays: the 8xxx series, Kaveri) are affecting its entire business line at this point, and the trouble AMD has had fixing even the most basic CrossFire issues (known since early last year!) would make anyone less confident about the long-term future of both AMD and driver support for any product it makes.

      Which in itself hurts AMD’s ability to sell its products. Looks like a death spiral to me.

        • anubis44
        • 6 years ago

        “Looks like a death spiral to me.”

        Oh, give us a break. A death spiral? Could you be any more melodramatic? This is nonsense about AMD, whose gaming credibility has just reversed itself 180 degrees now that they’ve aced all three gaming consoles and are about to release new cards. The revenues from the gaming consoles alone will guarantee AMD’s financials for the next 7 years. I hardly think it’s plausible to raise the spectre of a ‘death spiral’ for this company.

        Dream on.

          • Deanjo
          • 6 years ago

          [quote<]The revenues from the gaming consoles alone will guarantee AMD's financials for the next 7 years.[/quote<] Yeah, because those console revenues skyrocketed nVidia’s P&L for so many years. Dream on.

            • HisDivineOrder
            • 6 years ago

            Exactly.

            AMD practically owned this generation’s console race and it did little to help boost their bottom line. Why would next gen be any different?

            • Spunjji
            • 6 years ago

            They do at least have a full SoC this time, rather than just being the GPU designer.

            But yeah… I’m not sure if that’s a great enough magnitude of change in wallet share to ensure their future.

        • Klimax
        • 6 years ago

        Just a quick question: who was that Nvidia guy who ran back after a day? I don’t remember that… (I could have missed it.)

          • HisDivineOrder
          • 6 years ago

          [url<]http://www.anandtech.com/show/6967/amd-continues-assembling-dream-team-sean-pelletier-from-nvidia-tech-marketing-to-join[/url<] [quote<]Update: I just got word that Sean ended up back at NVIDIA. He sent me a message after making the decision saying that there wasn't anything wrong with AMD, but that the fit simply didn't feel right. [/quote<] One can't really expect the guy to come right out and say, "It was a ghost town" or "It was horrible over there" or whatever because that'd be a death sentence on changing jobs in the future. So he says the generic, "the fit simply didn't feel right." But you don't change back that fast unless something is very, very wrong.

            • Klimax
            • 6 years ago

            I read that bloody article, but before any update; since I closed it before the update came, I didn’t know.

            Fun. (And I don’t think I saw the update anywhere else either. Bad luck…)

            Thanks.

        • Fighterpilot
        • 6 years ago

        Why don’t you STFU? (Just for once.)
        A poor man’s focus-group wannabe who never fails to shill for Nvidia and throw crap at AMD, whether it be over new hardware, drivers, consoles, or CPUs.
        Spend your time at ABT… they adore people like you.

          • Airmantharp
          • 6 years ago

          He actually has a good point, though I do hope that he’s wrong. I agree that AMD has been in a bit of a death spiral, and if they hadn’t taken the prize for this upcoming console generation, I’d be inclined to believe they wouldn’t be pulling out of it.

          But they are showing signs of hope; they’re just not out of the woods yet, by far.

      • jihadjoe
      • 6 years ago

      That’s not an excuse. Nvidia has had excellent drivers since before they had anywhere near the money they have now.

      [quote<]Nvidia's OpenGL drivers are my 'gold standard', and it has been quite a while since I have had to report a problem to them, and even their brand new extensions work as documented the first time I try them. When I have a problem on an Nvidia, I assume that it is my fault. With anyone else's drivers, I assume it is their fault. -- John Carmack, 2002 [/quote<]

        • superjawes
        • 6 years ago

        …that kind of misses the point. It doesn’t matter when AMD’s drivers started having problems. The point is that they have problems NOW, at a point when company resources are more limited than they were before.

        • Deanjo
        • 6 years ago

        And AMD/ATI OpenGL drivers are still viewed as crap. They can be thankful they’re even still around, and that’s due solely to their DX performance.

          • l33t-g4m3r
          • 6 years ago

          ATI’s *dx9* drivers were pretty good up until CCC; then they were OK, with a bloated and buggy control panel, up until GCN; then monthly driver updates disappeared and everything went downhill near-instantaneously. Sure, they’re working on fixing frame pacing, but IMO this feature would already have been finished under the old ATI driver team. ATI used to turn out features just as fast as, if not faster than, Nvidia, but now they can barely catch up, and even their DX performance has suffered. The only thing AMD has going for it now is directly supported games.

          AMD needs to drop all promotions and marketing, fire some management, and double down on the driver team. There is no excuse for this mess, and it wouldn’t exist if the old team were working on the problem.

            • derFunkenstein
            • 6 years ago

            As a former Rage 128 and later a Radeon 8500 Pro owner, I can tell you that no, they were awful before CCC.

            • l33t-g4m3r
            • 6 years ago

            In what way? I never had problems. Maybe with the Rage 128 & 7000, but the 8500 ran just about anything and had “free” AF with bilinear. OpenGL was the biggest problem, but ATI kept making improvements, and there weren’t any OpenGL games that even needed the 8500’s power. IMO, the biggest issue of that era wasn’t ATI’s drivers but the games themselves, plus general instability with the 9x OSes. Hell, by the end of its life, the 8500 could run Doom 3. I wouldn’t call that bad drivers.

            In reality, the biggest issues were caused by Nvidia bribing developers not to support DX 8.1 / PS 1.4 and to use Nvidia’s proprietary extensions instead, but it’s not like there were that many DX8 games around either. Most of the time, the 8500 ran anything you threw at it, and the “free” AF gave it an edge over Nvidia.

            The older cards are what really had the problems, but people who couldn’t let it go have been spreading FUD ever since, even though such issues didn’t exist on the newer hardware, aside from Nvidia’s meddling.

            • clone
            • 6 years ago

            My Radeon 8500 used to “pack up” during prolonged gaming, like the RAM had filled up and couldn’t be cleared… it was weird; the card would run fine, wasn’t running hot, and then suddenly it’d stutter during gameplay.

            I pulled the card, installed another, and waited a couple of months; then I reinstalled the card, updated the drivers, and all the problems were gone.

            Then again, the Nvidia card I had at the time failed to render stars in Freespace 2 unless I was turning. It was a little annoying: flying straight, no stars, just black; hit the afterburner and shazaam, all these stars would appear until I let off the boost.

            • derFunkenstein
            • 6 years ago

            The Rage 128 was just an awful experience. I never used it with Win9x, though; that might have been fine. I used Win2k from the time it came out because it was so stable. However, with a K6-2 450MHz system running a Rage 128, it was crashy. Lots of crashes. I even returned the card to the local PC shop and exchanged it for another; same story. I only had it for about a week before I exchanged the second one for a TNT2, and that was a breeze.

            Aside from the Quake/Quack thing, the 8500 generally lagged on new releases (as most Radeons do today). I bought it at the same time as Neverwinter Nights (which I bought at launch), and at 1024×768 on my CRT monitor I was in the 20-30fps range. Very obviously not 60fps, whereas everyone with a GeForce 3 (heck, even everyone with a GeForce 2 GTS) was running at 60+ FPS all the time. That’s just one game, but it was like that all up and down the line.

            • clone
            • 6 years ago

            Back in the day I was big on Nvidia, because their product worked and looked to be well ahead of the competition. At the time I was using 3dfx, which was on the fast track to disaster; I was making decent money with minimal expenses back then and was constantly upgrading to try and sample everything for sale at the time.

            A year after my first Radeon 8500 (which was a disappointment), I got one back in trade for an upgrade, which was when I discovered to my surprise that it had been fixed via driver updates… since then I’ve been more laid back, far more forgiving of hardware, and less prone to condemning it.

            I miss the early days of desktop 3D gaming… there was a buzz back then that no longer exists today.

            • anubis44
            • 6 years ago

            “I miss the early days of desktop 3D gaming… there was a buzz back then that no longer exists today.”

            Amen, brother. I’ll never forget the first time I fired up the original Unreal with a brand-new 3dfx Voodoo 1 card and beheld 3D-accelerated gaming in a truly immersive title. The responsiveness was utterly fluid, and the detail was mind-blowing. From that point on, I was truly hooked. My only wish now is to see the promise of 3D graphics and all this multicore CPU/GPU power realized in a game where the destructible environment is truly realistic. I want to play in an urban-warfare FPS (say, Stalingrad during WWII) and have the weapons and buildings interact authentically.

            • HisDivineOrder
            • 6 years ago

            Like you said, “there weren’t many DX8 games around,” which is why the few that did come out didn’t support the ATI-exclusive DX 8.1 path. They supported the more generic, more widely available DX 8 spec instead.

            • l33t-g4m3r
            • 6 years ago

            It wasn’t a big issue. The 8500 could still run those games, aside from missing some Nvidia-specific effects in games like Splinter Cell(?). Soft shadows? It wasn’t much different from how Nvidia does things today via PhysX.

            You heard derFunk; this is all sour grapes from the Rage 128 and 7000 generation, which bitter fanboys held against ATI far longer than necessary. Sure, the 8500 sometimes lagged behind with a game here and there, but it wasn’t unplayable. Even if you got 30 fps in Neverwinter, that game didn’t need 60 fps, and nobody stopped you from upgrading your driver the next month when performance increased, or from installing the game updates that Neverwinter did need. I wasn’t a day-1 sucker; I waited until after the expansions, and it ran perfectly fine. Once ATI caught up, which they did within a month or two back then, you had nothing to complain about. Patience is a virtue that some people obviously don’t have, and these complaints are exaggerated, aside from the legacy hardware.

            There were also a lot of tweakable settings back then which, if people actually cared to try them out, did increase performance. I never had any huge issues with the 8500 generation, and the 9800+ DX9 generation is where ATI really started outshining Nvidia. Complainers don’t have a leg to stand on with those cards. ATI even outperformed Nvidia’s FX in OpenGL games like Doom 3.

            Rage 128 != 8500 / 9800. But it does seem like AMD is falling back to early-8500-era driver quality with GCN, which is their own fault, and customers don’t have to buy defective products. If you don’t use the broken features, it would be a tolerable card; otherwise there isn’t any reason to be a masochist.

          • Krogoth
          • 6 years ago

          Where’s the proof?

          If this were pre-2005, I would nod in agreement. Not so much anymore, and let’s not forget that Nvidia’s record isn’t exactly spotless.

          Both GPU companies are guilty of shoddy drivers, broken hardware, and failing to deliver on exaggerated promises (blame marketing for that one, not the engineers).

            • Deanjo
            • 6 years ago

            Ask any developer porting games to OS X or linux and they will tell you flat out how crappy AMD’s openGL support is.

            • Krogoth
            • 6 years ago

            Nvidia isn’t a saint in either area. Besides, OGL on OS X and Linux is geared towards professionals, not gamers.

            *nix and OS X are not the primary platform choices for PC gamers. Game developers focus on consoles and Windows (DirectX); OS X and *nix are just afterthoughts.

            • Deanjo
            • 6 years ago

            Good Lord, that’s a bunch of BS. If OGL were geared towards just professional applications, it would still be at OGL 2.1, as that’s the maximum level of OGL support that 99.9% of those “professional” applications use. In fact, many of those applications still operate at OGL 1.3.

            • Krogoth
            • 6 years ago

            How many PC games use OGL as their primary graphics API? Not too many of them. DirectX rules the PC gaming world, thanks to Microsoft’s massive market share in the desktop PC world, and DirectX is their baby.

            OGL in the gaming world is mostly used in ports between different platforms (mostly consoles/handhelds). The remainder is third-party open-source mods for legacy titles where the original owners released the source code (Freespace Open Source, ZGLDOOM, EDuke32, etc.).

            • Airmantharp
            • 6 years ago

            Outside of id Software, I’m not sure anyone uses OpenGL on the PC for gaming. There’s really not a reason to, either. Krogoth’s got this one, Deanjo :).

            • jihadjoe
            • 6 years ago

            No, it’s because OGL basically stagnated around version 1.5. The 2.x releases brought nothing new, because the Architecture Review Board rejected changes brought forth by other parties like 3DLabs while doing nothing itself. Eventually they decided they were so stagnant that they turned control over to the Khronos Group.

            Meanwhile, Microsoft was heavily pushing DX development. While OGL was basically stuck at bump-mapped textures, DX pushed forward with pixel shaders, cube-map arrays, per-MRT blend modes, tessellation, and other stuff. DX had all of this by 2009. By that point, OGL was way behind, and OGL 4.0 was basically playing catch-up to DX11.

            I mean, OGL was entrenched back when Microsoft didn’t even have DirectX, and since OGL was available for Windows, there should have been no reason for developers to code for DX when OGL was already there, and better. The main reason DX is the standard now is that OGL, being an open standard, eventually fell to the inevitable bureaucracy and in-fighting.

            Sometimes it’s better to have a dictator than a committee.

            • kn00tcn
            • 6 years ago

            i’m a dx fan, but there are TONS of ogl-only pc games, & not all of them are open source or steamplay capable

            how many AAA titles are ogl? not many, sure, but for all the smaller games… if you’re not on unity/unreal/source, if you’re ‘2d’, if you can’t make a custom engine, a lot of the time you end up with ogl-based tools

            then there’s the occasional ‘we added a dx option’ patch that happens to some games (i think osmos has this)

            now even in dx land, dx9 sure has its limits (witcher2 & trackmania2, even crysis2, are a few of the last great quality-pushers), so what really matters is modern+future dx & ogl

            • SternHammer
            • 6 years ago

            Sony’s PS4 runs on Linux and uses OpenGL/CL, and it runs on an AMD hUMA octa-core!
            And here’s a sample of the great work being initiated now as a benchmark for the future:
            [url<]https://www.youtube.com/watch?v=JUkrIIBVPGY[/url<] Check out those amazing physics, and the beautiful fire effects and lighting, and most of all the amazing dragon XD. And that’s just the beginning. gg

            • MFergus
            • 6 years ago

            FreeBSD is not Linux, just so you know, Stern. The PS4 also won’t be using AMD’s PC OpenGL drivers, so this thread has no bearing on the PS4. The PS3 also supported a version of OpenGL, but nobody on consoles used it because it gave worse performance than the lower-level libraries.

        • SternHammer
        • 6 years ago

        2002? Much has changed in the past decade, on both the software front and the hardware front. Your decade-old statement doesn’t really mean anything in the present day.

          • MFergus
          • 6 years ago

          Considering ATI/AMD’s OpenGL drivers still have issues, it is very relevant. I know you’re only getting defensive as a PS4 fan, but as my other post stated, their PC OpenGL drivers have nothing to do with the PS4.

      • sschaem
      • 6 years ago

      Come on! How do you expect AMD to organize lavish events in the middle of the Pacific Ocean if they spend their resources on product development? Clearly you can’t have both…
