Ubisoft comments on Assassin’s Creed DX10.1 controversy – UPDATED

We have been following a brewing controversy over the PC version of Assassin’s Creed and its support for AMD Radeon graphics cards with DirectX 10.1 for some time now. The folks at Rage3D first broke this story by noting some major performance gains in the game on a Radeon HD 3870 X2 with antialiasing enabled after Vista Service Pack 1 is installed—gains of up to 20%. Vista SP1, of course, adds support for DirectX version 10.1, among other things. Rage3D’s Alex Voicu also demonstrated some instances of higher quality antialiasing—some edges were touched that otherwise would not be—with DX10.1. Currently, only Radeon HD 3000-series GPUs are DX10.1-capable, and given AMD’s struggles of late, the positive news about DX10.1 support in a major game seemed like a much-needed ray of hope for the company and for Radeon owners.

After that article, things began to snowball, first with confirmation that Assassin’s Creed did indeed ship with DX10.1 support, and then with Ubisoft’s announcement about a forthcoming patch for the game. The announcement included a rather cryptic explanation of why the DX10.1 code improved performance, but strangely, it also said Ubisoft would be stripping out DX10.1 in the upcoming patch.

In addition to addressing reported glitches, the patch will remove support for DX10.1, since we need to rework its implementation. The performance gains seen by players who are currently playing Assassin’s Creed with a DX10.1 graphics card are in large part due to the fact that our implementation removes a render pass during post-effect which is costly.

This statement raised a whole new set of questions: What exactly is the "costly" render pass that’s being removed in DX10.1? Does it impact image quality or just improve performance? And what are Ubisoft’s motives for removing the DX10.1 code path?

Rage3D posted a follow-up article noting some very slight image quality anomalies with DX10.1, but nothing major. Other sites, including PC Games Hardware in Germany and HardOCP, reproduced Rage3D’s findings about performance increases and minor image quality changes in DX10.1.

Perhaps the DirectX 10.1 code path in Assassin’s Creed needed some work, as Ubisoft claimed, but why remove DX10.1 support rather than fix it?  The rumor mill creaked to life, with folks insinuating Ubisoft decided to nix DX10.1 support in response to pressure from Nvidia after the GPU maker sponsored Assassin’s Creed via its The Way It’s Meant To Be Played program.  Our conversations with multiple credible sources in the industry gave some credence to this scenario, suggesting the marketing partnership with Nvidia may have been a disincentive for Ubisoft to complete its DirectX 10.1 development efforts.

Our next step was to ask Ubisoft some specific questions about DX10.1 support in Assassin’s Creed, in order to better understand what’s happening.  Fortunately, Charles Beauchemin, the tech lead for the Assassin’s Creed development team, was kind enough to answer our questions.  Those questions, and his answers, follow.

TR: First, what is the nature of the "costly" "post-effect" removed in Assassin’s Creed‘s DX10.1 implementation?  Is it related to antialiasing?  Tone mapping?

Beauchemin: The post-effects are used to generate a special look to the game. This means some color correction, glow, and other visual effects that give the unique graphical ambiance to the game. They are also used for game play, like character selection, eagle-eye vision coloring, etc.

TR: Does the removal of this "render pass during post-effect" in the DX10.1 have an impact on image quality in the game?  

Beauchemin: With DirectX 10.1, we are able to re-use an existing buffer to render the post-effects instead of having to render it again with different attributes. However, with the implementation of the retail version, we found a problem that caused the post-effects to fail to render properly.

TR: Is this "render pass during post-effect" somehow made unnecessary by DirectX 10.1?

Beauchemin: The DirectX 10.1 API enables us to re-use one of our depth buffers without having to render it twice, once with AA and once without.

TR: What other image quality and/or performance enhancements does the DX10.1 code path in the game offer?

Beauchemin: There is no visual difference for the gamer. Only the performance is affected.

 

TR: What specific factors led to DX10.1 support’s removal in patch 1?

Beauchemin: Our DX10.1 implementation was not properly done and we didn’t want the users with Vista SP1 and DX10.1-enabled cards to have a bad gaming experience.

TR: Finally, what is the future of DX10.1 support in Assassin’s Creed?  Will it be restored in a future patch for the game? 

Beauchemin: We are currently investigating this situation.

So we have confirmation that the performance gains on Radeons in DirectX 10.1 are indeed legitimate.  The removal of the rendering pass is made possible by DX10.1’s antialiasing improvements and should not affect image quality.  Ubisoft claims it’s pulling DX10.1 support in the patch because of a bug, but is non-committal on whether DX10.1 capability will be restored in a future patch for the game.

The big question now is what happens next.  It’s not hard to surmise that AMD’s developer relations team stands ready to assist Ubisoft with fixing Assassin’s Creed‘s DX10.1 code path as quickly as possible, and that doing so ought to be relatively straightforward, since the game’s developers have said DX10.1 simply allows them to reuse a depth buffer without re-rendering it.
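
For readers curious about the mechanics, here is a minimal, hypothetical sketch of the capability Beauchemin describes, written against the Direct3D 10.1 API. D3D 10.1 allows a multisampled depth buffer to be created with the shader-resource bind flag and read directly during post-processing; under D3D 10.0, a multisampled depth surface cannot be bound as a shader resource, so the depth has to be rendered a second time without antialiasing. The structure and function names below are ours, not Ubisoft’s, and the code is an illustration rather than anything from the game.

// Illustrative sketch only -- not Assassin's Creed code. Creates a depth buffer that
// can be written during the antialiased scene pass and then read, unchanged, by the
// post-effect pass on a Direct3D 10.1 device.
#include <d3d10_1.h>

struct ReusableDepth {
    ID3D10Texture2D*          tex = nullptr;
    ID3D10DepthStencilView*   dsv = nullptr;  // bound while rendering the scene with AA
    ID3D10ShaderResourceView* srv = nullptr;  // bound while running the post-effects
};

HRESULT CreateReusableDepthBuffer(ID3D10Device1* dev, UINT width, UINT height,
                                  UINT samples, ReusableDepth* out)
{
    // Typeless storage lets the same texture be viewed both as a depth target and as a
    // readable resource. On a 10.0 device this bind-flag combination is not allowed for
    // multisampled surfaces, which is why the extra depth-only pass exists there.
    D3D10_TEXTURE2D_DESC td = {};
    td.Width = width;
    td.Height = height;
    td.MipLevels = 1;
    td.ArraySize = 1;
    td.Format = DXGI_FORMAT_R24G8_TYPELESS;
    td.SampleDesc.Count = samples;            // e.g. 4 for 4x MSAA
    td.Usage = D3D10_USAGE_DEFAULT;
    td.BindFlags = D3D10_BIND_DEPTH_STENCIL | D3D10_BIND_SHADER_RESOURCE;

    HRESULT hr = dev->CreateTexture2D(&td, nullptr, &out->tex);
    if (FAILED(hr)) return hr;

    // Depth-stencil view used while drawing the antialiased scene.
    D3D10_DEPTH_STENCIL_VIEW_DESC dsvd = {};
    dsvd.Format = DXGI_FORMAT_D24_UNORM_S8_UINT;
    dsvd.ViewDimension = D3D10_DSV_DIMENSION_TEXTURE2DMS;
    hr = dev->CreateDepthStencilView(out->tex, &dsvd, &out->dsv);
    if (FAILED(hr)) return hr;

    // Shader-resource view used later by the post-effects, reading the same samples.
    D3D10_SHADER_RESOURCE_VIEW_DESC srvd = {};
    srvd.Format = DXGI_FORMAT_R24_UNORM_X8_TYPELESS;
    srvd.ViewDimension = D3D10_SRV_DIMENSION_TEXTURE2DMS;
    return dev->CreateShaderResourceView(out->tex, &srvd, &out->srv);
}

In the post-effect shader, that buffer would then be read per sample (via HLSL’s Texture2DMS Load) instead of sampling a separately rendered, non-antialiased copy of the depth, which is where the eliminated render pass and the performance gain come from.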

One would hope that all parties involved, including Ubisoft and Nvidia, would encourage the Assassin’s Creed development team to complete its DX10.1 development work in a timely fashion—not to abandon it or to delay its completion until Nvidia also has a DX10.1-capable GPU on the market.

After all, Nvidia recently signed on to the PC Gaming Alliance, whose charter involves pushing common standards like DX10.1 and increasing "the number of PCs that can run games really well."  Assassin’s Creed is nothing if not a perfect candidate for assistance on this front: a high-profile console port that’s gaining a reputation for steep hardware requirements and iffy performance on the PC.  How can such an alliance succeed if one of its members is working at cross-purposes with it in a case like this one?  And what would the owner of an nForce-based system with a Radeon graphics card think upon learning that Nvidia’s marketing dollars had served to weaken his gaming experience? 

We’ll be watching to see what happens next.  For our part, the outcome will affect whether and how we use Assassin’s Creed and other Ubisoft and Nvidia "TWIMTBP" titles in our future GPU evaluations.

Update 5/9/08: Since publishing this story, we’ve spoken further with both Nvidia and Ubisoft about some of the issues involved. Nvidia spokesman Ken Brown told us unequivocally that "no money changed hands" as a result of Ubisoft joining Nvidia’s "The Way It’s Meant To Be Played" program, because that program is entirely a co-marketing arrangement. As part of this program, Nvidia may promote Assassin’s Creed for Ubisoft in various ways (including magazine advertising) and may offer engineering resources and development assistance. But, Brown said, the value for Ubisoft comes solely from such activities.

Brown further emphasized that Nvidia’s "TWIMTBP" program has done many good things for PC gaming, helping developers improve image quality, compatibility, performance, and visibility for their games—an assertion that’s tough to dispute, we must admit.

Brown also said Nvidia exerted no influence whatsoever on Ubisoft or the development team with regard to the DirectX 10.1 code path.

Ubisoft spokesman Michael Beadle confirmed to us that Ubisoft’s participation in TWIMTBP was strictly a co-marketing deal, and that Nvidia did not pay Ubisoft any money as a result of the deal.  He seconded Brown’s statement that Nvidia did not influence the decision to remove DirectX 10.1 support from the first patch for Assassin’s Creed. Beadle said the decision was made strictly by the game’s development team.

Still up in the air is the question of whether Ubisoft will restore DX10.1 support to Assassin’s Creed in a future patch. Based on what we’ve heard so far, I’d say that’s not likely to happen. Ubisoft maintains no final decision has been made. Beadle made the further point that users of DirectX 10.1 graphics cards may simply want to avoid applying the patch to the game, if they’re not encountering any problems with it.

Comments closed
    • Lans
    • 11 years ago

    Finally decided to see if there was an end to this story and most of it is covered already. 🙂 I was suspicious of any performance-enhancing bug that did not affect image quality, and it’s good to know it’s actually legitimate, because you can just reuse the depth buffer in DX10.1 instead of doing a second pass.

    I found the Rage3D link was very enlightening as well:
    “Did we find the glitches that everyone has been talking about when making reference to the 10.1 path? In a way - we do have that missing dust that qualifies for that category, albeit we don't know yet if it's simply a driver bug or something wrong with the pathway itself. Other than that, there's the more intense lighting, but that actually seems to show that there's a bug with the DX10 pathway, given the fact that DX9 has the same, more intense, lighting as 10.1, and UBi said that 9 and 10 (and by extension 10.1) should be nearly identical visually (shadowing is different between versions).”

    Let’s remove the DX10 path too? 🙂

    • cloh2083
    • 11 years ago

    I was never a strong follower of either camp (just choosing the better GPUs that are released), but now I’m thinking of giving nVidia the finger and going to get myself some of that red stuff.

    No money changed hands? Only the fools will be convinced. Ever heard of value-in-kind sponsorships? No cash may be transferred, but those kind of sponsors reap benefits in the 6 figure ballpark at least!

    As for Ubisoft:

    Gamer: DX10.1 gives better gaming experience
    Ubi: let’s remove DX10.1 and lower gaming quality while claiming we’re delivering higher quality.
    Gamer: Why remove?
    Ubi: Cos it’s buggy.
    Gamer: How is it buggy?
    Ubi: It just is cos nVidia said so.
    Gamer: Does this affect gaming performance?
    Ubi: We are committed to ensure gamers are given high quality performance in games (but will remove DX10.1 anyway so stfu).

    Shame!

    • format_C
    • 11 years ago

    http://www.pcgameshardware.de/aid,645430/News/Ubisoft_No_Direct_3D_101_support_for_Assassins_Creed_planned/

    • nightmorph
    • 11 years ago

    “. . . pushing common standards like DX10.1 . . .”

    Thanks. Thanks for crapping on the Linux, Mac, and BSD crowds. If you want a "common" standard, stick to OpenGL, yeah? Actually, speaking as a member of the Linux crowd, I found this article rather humorous -- all this uproar over one patch for a single game -- is this actually…

      • MethylONE
      • 11 years ago

      wow, thought we already wore you folks out….

      • TheEmrys
      • 11 years ago

      What percentage of gamers do you suppose runs those systems? Like it or not, MS Windows is the most common platform. DirectX *is* common.

      Moreover, 3870’s can be purchased for ~$150US.

      If you want to game, go for it. But griping about how Linux and Mac don’t “get any” is pointless, trite, and just plain counter-productive. These platforms just aren’t worth the money (for developers) and OpenGL (at this point) can’t do all the things that DirectX 10.x can.

    • FubbHead
    • 11 years ago

    So, they had time and money to work on this DX10.1 implementation and get it working well, and then suddenly can’t find the time and money to do a little polishing of it.

    Granted, I don’t know if they hit a dead end or some such, but… Sorry, I don’t believe it.

    • bogbox
    • 11 years ago

    what’s so hard to understand? Xbox has exclusive games like Halo, and PS3 Metal Gear, etc.
    Nvidia is trying to make more out of the title, and just like they did with Crysis they paid Ubisoft to optimize the code for their cards, but it’s simpler to use DX10.1 than 10, so they did 10.1 (or maybe they were paid by ATI :))

    • wingless
    • 11 years ago

    Why not just add the option to switch from a DX10 to DX10.1 renderer? It would work kind of like how you can select a DX9 or DX10 mode in some games (CoH). Is that too difficult to implement technically?

    Game developers should never cater to one manufacturer or another like this. They should have given the end user a choice.

    • swaaye
    • 11 years ago

    It’s disappointing that there is never a peep from any sort of ATI dev rel.

    • ECH
    • 11 years ago

    This is a great story; however, I believe that an interview with ATI’s developer relations team would cast some light on the situation. We have Ubi and Nvidia’s take on the situation with AC’s DX10.1. Let’s hear what ATI’s developer relations team has to say about this. For example:
    -Are they having difficulty with Ubi to improve on DX10.1 code in AC?
    -Are the problems found a driver issue which can be fixed in a future Cat release?
    Or
    -Are they in fact currently working with Ubi to improve on DX10.1 code for a future release
    -etc.

    Questions like this would shed light on the situation IMO.

      • PetMiceRnice
      • 11 years ago

      I agree, these sorts of questions need to be asked.

      • asdsa
      • 11 years ago

      Hear hear! Why on earth would the nvidia and ubisoft guys freely admit there was foul play? That wouldn’t be very good for their reputation.

        • WaltC
        • 11 years ago

        Why on earth, indeed?…;) nVidia did the same thing with Eidos concerning a benchmark the developers of Tomb Raider: Angel of Darkness included with the game as it was originally shipped. nVidia didn’t like the fact that TR:AoD shipped with a benchmark demonstrating the performance and IQ differences between the “DX8.1 path” and the “DX9 path”–respectively, SM1.x and SM2. R300 was very good at SM2 while nVidia’s products at the time sucked at SM2. At that time, Eidos was very much upfront about the fact that they were removing the benchmark from future shipments of the game because nVidia objected to it and had “asked” them to remove it. There was a big stink about it for a long time.

        The only thing different about this case that I can see is that Ubisoft is being extremely evasive about what it’s doing while Eidos was direct and to the point. Heh…;) I guess nVidia learned something from those years after all, such as how to be more subtle and devious about getting 3rd-party software pulled from distribution when it underscores weaknesses in their products when compared to the competition’s.

        The more things change, the more they stay the same.

    • MrJP
    • 11 years ago

    Thanks for the update, Scott.

    Even if they’re all telling the truth, it still doesn’t explain why Ubisoft decided to completely remove the 10.1 path, rather than just putting in the option to disable it. I suppose that just comes down to time and money. Whatever Nvidia’s role in all of this, Ubisoft doesn’t come out of it looking particularly good, especially given that the high system requirements point to a really lazy port. I’ll leave the rant about cross-platform games for another day.

    • flip-mode
    • 11 years ago

    Who cares if “no money has changed hands”? Money isn’t the only form of payola. Nvidia says “we’ll promote the game if you drop DX10.1.” Ubi says, “sure, no problem.” There you go, no money changed hands, but still pure evil. That’s my flamboyant speculation at least.

    • ECH
    • 11 years ago

    Silus, I am fixed on the fact that it’s “plausible” to fix the problems they believe exist, not remove DX10.1 altogether. The AF issue was fixed in a driver hotfix. The white dots that you referred to could be a driver issue or a game bug. What is “not clear” is how those white dots are produced. However, what is believable is that AMD/ATI’s developer relations team is ready to work with Ubi on fixing Assassin’s Creed’s DX10.1, not removing it.

    • Convert
    • 11 years ago

    “Update 5/9/08:” /proceeds to praise TR in a bubbly manner

    • ECH
    • 11 years ago

    Silus, per the article it’s not hard to believe that AMD’s developer relations team is ready to work with Ubi on fixing Assassin’s Creed’s DX10.1. This alone makes it simple and clear. There is no reason to remove DX10.1. BTW, of course DX10.1 removes the render pass! That’s what 10.1 does. The same effects in 10.1 take 1 pass; in DX10 they take more than 1 pass. The additional rendering pass in DX10 is redundancy, not accuracy. I hope you are not implying it takes excessive passes (overhead) to properly render post-process effects in AC using DX10.1.

      • Silus
      • 11 years ago

      I’m not implying a thing, but you certainly are… You seem to be fixed on some conspiracy theory.
      The developers know their work, and if they say this is what needs to be done, so be it. The only thing I can question is the removal of DX10.1 support completely. If it has problems, fix it. I see no need to remove it completely. It’s the only thing I don’t quite understand.

    • ECH
    • 11 years ago

    Silus #62
    Bugs no less. It could be a ATI driver issue or a bug in the game itself.

    Kraft #75
    Not only do they provide tools, but also:
    -“Support Team”
    -Game Test Lab
    and other advice on how to code the game. The article clearly states:
    -they help developers code games
    -has over 100 people working on the program
    -assigns engineers from their support team to work “in-house” with development teams
    -etc
    It might be useful to read the article again. The TWIMTBP program offers more than just “a slew of useful tools” in exchange for advertisement.
    Once you re-read the whole article again and understand it I won’t hold it against you 🙂

      • Silus
      • 11 years ago

      That’s not so simple and clear, when the developer itself is saying that by not using that additional render path, some problems result from it.

    • elty
    • 11 years ago

    TR: What other image quality and/or performance enhancements does the DX10.1 code path in the game offer?

    Beauchemin: There is no visual difference for the gamer. Only the performance is affected.

    TR: What specific factors led to DX10.1 support’s removal in patch 1?

    Beauchemin: Our DX10.1 implementation was not properly done and we didn’t want the users with Vista SP1 and DX10.1-enabled cards to have a bad gaming experience.

    Improve performance = Bad gaming experience

    more like

    Improve performance on AMD = Bad experience for Nvidia.

    • ECH
    • 11 years ago

    Kraft75 #33
    The TWIMTBP article I linked you to clearly states that they help remove bugs, optimize, and test games. This is far more than slap-on advertising. I suggest you read the entire article again, as you are not comprehending it correctly; the portion I linked you to clearly states “More than just marketing”.

    Silus #39
    That review regarding lowered AF was a result of a bug in the Cat driver that was fixed in the same update used for improving Vantage performance. [H] has not revisited the AF problem using the updated drivers which corrects the AF problem. Also, it’s obvious the game has bugs. However, it’s not a result of post processing.

      • Silus
      • 11 years ago

      I wasn’t referring only to the AF issue, but also those weird white dots that appeared on the screenshots taken with the X2 and Vista SP1.

      • Kraft75
      • 11 years ago

      I acknowledge that nvidia provides the developer with a whole slew of useful tools. I don’t discredit the benefits of using these tools at all. Be that as it may, the transaction of services in exchange for brand exposure is still simple advertisement. I don’t see how you can argue that.

      It might be useful to learn how to use the reply button, but that’s me. I won’t hold it against you. 🙂

    • Krogoth
    • 11 years ago

    This confirms my suspicions when I first heard of the DX10.1 controversy with Assassin’s Creed.

    Ubisoft does not have the balls to make Nvidia look bad.

    Nvidia, here is a hint. Stop releasing the same darn architecture. It is almost two years old! Die-shrinks and some tweaking are fine. There comes a point where you have to move on, though. Just learn from ATI’s own mistake of trying to hold out with the R3xx architecture.

    • Ashbringer
    • 11 years ago


      • BKA
      • 11 years ago

      I thought I read somewhere that nVidia plans to skip DX10 altogether and support DX11?

      I switched to an ATI CF (3870) setup from nVidia SLI (8800GT) before AC came out, but I’m glad I did now. Adjusting to the frequent driver updates is taking some getting used to. I switched because I went from an nVidia 780 chipset to an X48 chipset.

      As people have mentioned, all companies cheat at some time or another to get the upper hand. But it usually comes down to how much of an impact the cheat has on their customer base. Since nVidia has the majority right now, and the AC forums have a lot of threads about problems with nVidia hardware, this will stir the pot a lot more.

      I really should get back to playing AC on PC but surprisingly my PSP has gotten most of my playing time lately.

      • rythex
      • 11 years ago

      way to stick it to the man there. :rolleyes:

    • asdsa
    • 11 years ago

    nvidia is satan. nuf said. 🙂

    • Jigar
    • 11 years ago

    Not convinced with the way this guy has answered the questions. Also, some more questions needed to be asked… Oh, btw, the last answer was full of bull…

    But still good work sir..

    • Fighterpilot
    • 11 years ago

    So when it’s played on NVidia cards there are apparently some bugs if the DX10.1 code is running, but on ATI cards it runs well and faster?
    Given the nonchalant way that this feature was dismissed by the “graphics gurus” here…where does that leave them now?
    Congrats to TR for investing time and effort on this initial investigation.

      • Silus
      • 11 years ago

      No. There are no bugs with NVIDIA cards. Just the fact that NVIDIA cards have no advantage in running in DX10.1, which is no surprise, since they don’t support DX10.1 yet. ATI cards do, but at the cost of some image quality issues, as is demonstrated in the linked [H]ardOCP article.

    • rechicero
    • 11 years ago

    Great work, Scott!!!

    What a pity I’ve just bought an 8800GTS…

      • no51
      • 11 years ago

      I’ll sell you mine so you can have SLI, and I can go pick up an HD48XX?

    • Voldenuit
    • 11 years ago

    Quack.

    You’d think nvidia would have learnt that the community backlash from being a$$holes will hurt its bottom line.

    I went with ati on the Radeon 9700 Pro instead of nvidia because of their cheating shenanigans with drivers.

    Now that they’re doing the same with developers, I’ll probably go R700 instead of G200 for my next upgrade.

    EDIT: Oh, and Ubisoft bowing to this bull$hit? Don’t expect me to buy any games from you for a while, either.

      • Silus
      • 11 years ago

      That makes no sense. ATI was also caught cheating with their 8500 cards. Companies cheat and none is better than the other.

      Not that this is cheating or at least, proved to be cheating. The effects on IQ are noticeable, even if not major and the issue needs fixing, but certainly not removal.

        • Voldenuit
        • 11 years ago

        “That makes no sense. ATI was also caught cheating with their 8500 cards. Companies cheat and none is better than the other.”

        It’s exactly because companies cheat that informed consumers should provide them a tangible disincentive against doing so.

          • Meadows
          • 11 years ago

          If you want a company that doesn’t cheat, buy an S3 videocard. While you’re at it, take a look at where they are now.

          • Silus
          • 11 years ago

          Sure, but by that line of reasoning no one should’ve bought ATI cards since the 8500 days. And no one should’ve bought NVIDIA cards since the FX days.

        • Krogoth
        • 11 years ago

        He already referenced ATI’s attempt at it with “Quack”.

        https://techreport.com/articles.x/3089

          • Silus
          • 11 years ago

          He just said Quack. And Quack was [H]’s name, not Tech Report’s. TR was Quaff. Anyone that didn’t know about it wouldn’t know it by just “Quack”, especially on TR.

          • Voldenuit
          • 11 years ago

          Thank you.

          I am not claiming that either side is a paragon of virtue.

          Every time one company is caught cheating, I am swayed towards the competitor.

          Thus, it is in the company’s best interest not to be caught cheating if they want to sell me a product in that timeframe.

    • VILLAIN_xx
    • 11 years ago

    Tips hat to TR.

    Reader for life, thanks for keeping it real on hot issues like this.

    • indeego
    • 11 years ago

    Has any game shown anything remarkable with DX10 versus 9 alone?

      • NeXus 6
      • 11 years ago

      DX10 is supposed to make coding easier for game developers and not much more. There are some graphics improvements, but we won’t see any major changes until DX11 gets here, assuming there’s anybody left making PC games.

      I think they should have called it DX9.1 but DX10 sounded better and was a way to get gamers to buy Vista. A con for sure.

        • DrDillyBar
        • 11 years ago

        physics in Dx11, wasn’t it? …

    • ECH
    • 11 years ago

    To zqw in post #27:
    Plausibility flew right out the window with the removal of DX10.1 instead of fixing DX10.1. If there is a post-processing problem, no one who’s reviewed this game has mentioned how it affects IQ.

    To Kraft75 in post 31:
    TWIMTBP is more than “slap-on advertising”. But don’t take my word for it:
    http://www.bit-tech.net/bits/2007/06/26/roy_taylor_interview_twimtbp_dx10/3

      • Kraft75
      • 11 years ago

      Well, I stand corrected. I’ll have to investigate some more, I suppose. I’ll start with your link.

      EDIT: So developers get free tools if they ‘participate’ in the program. nVidia pays for tools, in exchange developers slap-on the ‘TWIMTBP’ banner. It is still simple advertising to me.

      • Silus
      • 11 years ago

      No, [H]ardOCP showed the impact on image quality.

    • matnath1
    • 11 years ago

    How the heck does a TWIMTBP’d game NOT get tested on the hardware it’s meant to be played with????

    This is BUNK……..

    Ubi developers got carried away with the performance gains that 10.1 offered by eliminating the extra pass etc etc, and in their excitement forgot to test on GeForce 8800-level hardware…WTF!

    If it looks like a duck, walks like a duck and sounds like a duck it must be a DUCK!

    Keep peeling that ONION, SCOTT, the last layer is right in front of you! Where’s the smoking gun?

      • Kraft75
      • 11 years ago

      One would think that the developer would be programming for the DirectX API, and not programming for specific GPU makers. I don’t see how testing individual cards from a developer’s standpoint would be all that important. Didn’t seem to be a concern for the Crysis people, since no hardware runs their game to full specs yet.

      To me the ‘TWIMTBP’ banner seems little more than slap-on advertising. Correct me if I’m wrong, but I don’t think they go thru any certification to slap on that 2-second advertisement every time you load your game.

      Onions and duck, could be the start of a good recipe!

    • Dposcorp
    • 11 years ago

    This is coming straight out of an Intel/Microsoft playbook.

    Putting pressure on companies / cheating to get ahead.

    Is it any wonder my current quad core is AMD and my next video card purchase will be ATI?

    And yes, I give a crap about getting the most for my money, but I also prefer to support companies that do things right.

    I can live with the few FPS I lose better than I can live with this kinda crap.

    • marvelous
    • 11 years ago

    Nvidia spreading its influence. Money talks. Technology walks.

    • provoko
    • 11 years ago

    TR gets down and dirty.

    Next week they break into Ubisoft’s HQ and discover a patch for DX 10.2.

    • Forge
    • 11 years ago

    If 10.1 is removed and not returned in 1-2 patches, I certainly won’t change my mind and purchase it, even though I’ve got an Nvidia GPU for the time being.

    Letting sponsors dictate technical decisions is just shy of payola. This needs to stop ASAP.

    • Convert
    • 11 years ago

    While I appreciate the effort you guys put into this I am not exactly as bubbly about the conversation with Mr. Beauchemin as others.

    The questions and answers summed up what I have already read, which in and of itself is nice to get a final word on. Still, I would have liked some harder-hitting questions like:

    “Wouldn’t an option to enable/disable this be prudent instead of pulling it out completely?”

    “Did Nvidia’s TWIMTBP have any impact on the decision to pull the feature?”

    We are talking about a rather substantial difference in speed over some visual differences. *adjusts tinfoil hat* His response of: “We are currently investigating this situation.” is pretty weak.

      • Damage
      • 11 years ago

      We are currently investigating this situation.

        • DrDillyBar
        • 11 years ago

        *chuckle*

          • Damage
          • 11 years ago

          Seriously, though, I meant to link Theo’s article at TG Daily, which gets into the money/influence question a little bit and offers some outside developers’ perspectives on the DX10.1 tech questions.

          http://www.tgdaily.com/content/view/37326/98/

          Worth a read.

            • Kraft75
            • 11 years ago

            Popcorn : Check!
            Beer : Check!
            Comfy seat : Check!

            Now the only thing missing is that sweet “Tech Report” hat wear! Love you guys!

            • l33t-g4m3r
            • 11 years ago

            Why I like TR, right here.

            • DrDillyBar
            • 11 years ago

            </bump>

            • Convert
            • 11 years ago

            Thanks Scott, appreciate it.

    • zqw
    • 11 years ago

    Even if they do relent and re-enable the dx10.1 path, the conspiracy theorists will say it’s only because of the bad press. And, I think they’d be right. But, maybe not because of evil manipulations. Maybe dx10.1 was disabled because of simple cost/benefit when they saw that it was darker, and few people have dx10.1 cards. It sounds like it was never enabled/QAed in the original release because SP1 wasn’t out.

    EDIT: That sounded too much like a defense. It’s clear the right thing to do is to fix the dx10.1 path. I was just trying to say money/time is often at odds with the right thing. But, I would have made the same call if in a time pinch and these are two typical images on dx10 vs dx10.1:
    http://rage3d.blackholestorage.com/ac-addendum/pics/4/

    FWIW, “remove a pass” is GPU programmer lingo for “combine passes (for huge speed benefit)” - as you found out. dx10 also has all sorts of performance wins like this on source data, but few places can justify a “real” dx10 engine. dx9 is still the market.

    • FireGryphon
    • 11 years ago

    You guys rock. I applaud your efforts to get to the bottom of this. The honesty and integrity of Tech Report is as solid as ever.

    This certainly looks like foul play on Nvidia’s part. I mean, Ubisoft basically said, “There’s no image quality change, it just goes faster, so we’re taking it away.” What possible pure motive could precipitate that move?

    • SuperSpy
    • 11 years ago

    I’ve said it before, and I’ll say it again, there’s a reason TR is my homepage, and here it is.

    • herothezero
    • 11 years ago

    This is why TR kicks everyone else’s ass.

    Moreover, has anyone actually played this game?

      • NeXus 6
      • 11 years ago

      I started playing it a week ago. It’s somewhat buggy and the performance seems good but not great at 1920×1200 on an 8800GTS. Gameplay seems rather repetitive after doing a few missions. 7/10 for me.

      • Price0331
      • 11 years ago

      Beat it on 360, sucked ass. Made me throw the game case across my room. There is such a repetitiveness to it that it about made me quit it altogether. (And I hate unfinished things)

        • eitje
        • 11 years ago

        made you, huh? i didn’t know games were so interactive these days.

          • ludi
          • 11 years ago

          It’s called the “Wii Reflex” :-))

      • yogibbear
      • 11 years ago

      I’m up to the 9th assassination. Very cool looking game. Plays really smooth. Only just starting to feel repetitive for me now. Which is good, because I’ve basically finished it. The gameplay though is fun/entertaining, but doesn’t really require much skill beyond a bit of timing and getting used to the initial controls (i.e. 30 mins of playing and you are a pro).

      • jehurey
      • 11 years ago

      Played the console versions.

      It really does feel awesome until your tolerance for repetitiveness runs out about two-thirds of the way into the game, and then it becomes an amazing struggle to finish it without exploding in anger.

    • DrDillyBar
    • 11 years ago

    Good form!

    • Pettytheft
    • 11 years ago

    So much drama over such a mediocre game. The whole Jade Raymond thing and now this.

    • ChronoReverse
    • 11 years ago

    This is why I read The Tech Report. You guys get down to the bottom of things =)

    Hopefully they’ll fix up the DX10.1 path and re-release it then. The next gen cards better all be DX10.1. From the look of things, you get better AA for more than free O_o

    • flip-mode
    • 11 years ago

    Wow, just wow. Evil. TR hinted at removing TWIMTBP games from reviews and I think that is genius.

      • jobodaho
      • 11 years ago

      TR is always ready to draw swords when a company is not flying straight, and I applaud their efforts.

      • Price0331
      • 11 years ago

      I think it’s a good idea, this certainly seems like some scandal to me. I’m glad you guys are questioning this as well.

      • ssidbroadcast
      • 11 years ago

      flip-mode is on point.

        • MadManOriginal
        • 11 years ago

        I think we should put YOU on point, preferably for a really dangerous, nay, suicidal, mission, then you’ll stop posting the same damn useless thing every time flip-mode posts.

          • DrDillyBar
          • 11 years ago

          I’ll take point, as long as I get AirMiles per respawn

          • bthylafh
          • 11 years ago

          I endorse this product and/or service.

            • Meadows
            • 11 years ago

            I laf’d.

          • flip-mode
          • 11 years ago

          Don’t be jealous of our friendship.

            • eitje
            • 11 years ago

            more like LOVEship! 😉

            • ssidbroadcast
            • 11 years ago

            Zing… …
