Nvidia highlights PhysX effects in Batman: Arkham Origins

Batman: Arkham Origins is due October 25, and it will have extra eye candy effects for GeForce owners. The game features a number of special effects powered by Nvidia’s PhysX API. You can see some of them on display in the video below.

The PhysX-powered fog looks good, and it definitely fits the mood of the game. The soft shadows, particle-based snow effects, and tessellated footprints are also nice touches. I’m not getting much out of the PhysX-based cloth simulation used to animate Batman’s cape, though. Nvidia’s TXAA demo also falls flat, although that may be in part due to the YouTube video compression.

The video, which was posted by GamesHQMedia, is a perfect illustration of why YouTube can be a poor medium for showing off graphical effects in games. Even with the resolution cranked up to 720p, the footage looks dull and washed out. A lot of the textures lack detail, too, though video compression may not be the culprit there.

Arkham Origins should look better when rendered in real time, and I hope the PC version has higher-resolution textures than the console versions of the game. It would be a shame if implementing vendor-specific effects took precedence over providing a crisper, more detailed world for all. Thanks to Rock, Paper, Shotgun for the tip.

Comments closed
    • GENiEBEN
    • 7 years ago

    All that glimmer and shimmer but he walks on snow and leaves no traces behind @ 1:20

    • Laykun
    • 7 years ago

    Perhaps Batman’s cape should be a separate playable character if it’s so important :P.

    • Buzzard44
    • 7 years ago

    I thought the game looked good…until they turned on PhysX and everything became, as Pink Floyd would put it, Obscured by Clouds.

    • Spotpuff
    • 7 years ago

    I loved the Batman series, but that cape just gets in the way; it actually blocks your view sometimes (specifically in the Joker challenge).

    The Batman Beyond costume alleviated that problem, but yeah.

    • Wildchild
    • 7 years ago

    I cannot tell you how upset I was when I got my first Nvidia card in probably 5-6 years, only to find out that PhysX is a buggy piece of junk. The fact that they market this so much and yet it performs so poorly is a huge slap in the face.

    Please just let PhysX die.

      • Airmantharp
      • 7 years ago

      PhysX, for all intents and purposes, died when Nvidia bought them. But it runs rather well from what I’ve seen, for all the graphical ‘enhancements’ they use it to provide. Sad that it never grew beyond eye candy.

        • Meadows
        • 7 years ago

        It never had to! It never had to grow beyond eye candy. But what truly riles me up is how they trot out effects from 10-15 years ago (physics-simulated cape, smoke, particle snow) and then yell about how it’s “crazy” and how it’s never been possible before.

        Good grief, Nvidia. Cape? We had that 10 years ago. In Max Payne 2. With Havok. [b][i]On a single-core CPU.[/i][/b] I'm not trying to sound like a Krogoth on steroids, but that's how I feel whenever something like this comes along.

          • Airmantharp
          • 7 years ago

          Yup- actually, a good example of in-game physics being integral to gameplay was how Havok was used in the Half-Life 2 series. While that was pretty primitive, I’m looking forward to the point when the entire game world is simulated by physics routines on GPUs/APUs, where every action that takes place is governed by those rules. That’s really what I had expected PhysX to do, based on a number of demos and some expressed developer interest, until Nvidia bought them up; no one really wanted to touch PhysX after that, and that potential disappeared.

          And yeah, it’s kind of stupid to see this stuff paraded around as ‘new’. It doesn’t even look all that great, and it isn’t novel in the least- and it’s in a single-player ‘Batman’ game, which limits its appeal. Hell, I own at least one, but I’ve never really gotten into it.

          • Laykun
          • 7 years ago

          Marketing.

        • Wildchild
        • 7 years ago

        It was hit or miss for me. A lot of times I had to go into the game’s file directory and alter something in order to get PhysX working properly, but that only worked half the time.

        Off topic, but is the founder of Ageia still working for AMD?

    • tbone8ty
    • 7 years ago

    PhysX is not progressive for the gaming community.

    TressFX is.

      • Wildchild
      • 7 years ago

      Not sure why you got a bunch of down votes. Sure, TressFX isn’t really all that impressive, but at least it’s not proprietary crap. It IS a step in the right direction.

        • JustAnEngineer
        • 7 years ago

        The thing about TressFX is that it’s done the [b]right[/b] way. It just works, regardless of what brand of GPU or CPU I'm running.

          • Airmantharp
          • 7 years ago

          Now, if it was used for something other than just hair… 🙂

            • Meadows
            • 7 years ago

            Pubes?

            • destroy.all.monsters
            • 7 years ago

            I was thinking tentacles.

            • Airmantharp
            • 7 years ago

            I could see that being useful…

      • Laykun
      • 7 years ago

      My only real problem with TressFX is that it suffers from the same thing as PhysX. In its current usage in Tomb Raider it’s overdone on purpose, so that you notice it’s there. The problem is that TressFX aims to make things look realistic, but the way they configured the hair properties had the opposite effect. A dirty, unshowered woman walking around in the rain DOES NOT have the perfectly flowing, unclumped, floaty dry hair of a shampoo commercial.

      [url]http://www.youtube.com/watch?v=HvHq4JIcneY[/url] It also seems that the hair isn't affected by wind in this video either.

    • michael_d
    • 7 years ago

    PhysX makes a game much more dynamic and interactive. I can attest to it, having played the Metro games with both a Radeon card and a Titan.

    I do not see a reason to come back to Radeon until they come up with a decent physics solution and driver support.

    • Krogoth
    • 7 years ago

    Not impressed.

    • Srsly_Bro
    • 7 years ago

    OMG DID THEY MAKE SMOKE????

      • destroy.all.monsters
      • 7 years ago

      No, that was Wayne Wang.

    • Meadows
    • 7 years ago

    The year 2013.

    Virtual reality reimagined.

    Introducing: [i][b]smoke[/b][/i]. Be blown away.

      • Diplomacy42
      • 7 years ago

      Not just smoke, but fog and wind-blown snow as well, and that isn’t even beginning to scratch the surface of what this amazing technology can do! Imagine desert-scapes with wind-blown sand, dusty rooms, swamps with real mosquito effects.

      The possibilities are nearly endless. Be blown.

        • derFunkenstein
        • 7 years ago

        “Really? And all the guys like her, huh? That is – that is – that is great. Uh, you mean ‘away,’ though, right? Because, otherwise, it sounds a little different, but, uh, that’s, uh, that’s outstanding. You forgot to say ‘away’ again. But listen, let me call you back in a bit, ok?”

        #ArrestedDevelopment

    • ish718
    • 7 years ago

    Damn, Nvidia! Just give up already with this PhysX crap…

    • trek205
    • 7 years ago

    Ugh, that TXAA is blurry crap. Why do they still promote that as a feature when in every game it just ruins the scene with blur?

    • jessterman21
    • 7 years ago

    This is for everyone – I think I’ve solved the YouTube compression problem… You just have to capture and upload uncompressed video, so that YouTube is the only one compressing it.

    Of course that’s like 5GB for 3 minutes of video, but the results are impressive. Take a look at [url=https://www.youtube.com/user/Betafix/videos]BetaFix's amazing gameplay videos[/url] at 1200p - that's where I got the idea. Here's an example I uploaded: [url=http://youtu.be/av64jEegc2U]playing Warface co-op[/url] for anyone who's interested.

      • Airmantharp
      • 7 years ago

      Can’t watch your videos from here, but it’d be nice to know what format YouTube/Facebook/et al. actually expect so that no transcoding is necessary. Facebook especially butchers videos to the extreme.

      • LukeCWM
      • 7 years ago

      I don’t mean to be disagreeable, but your YouTube video and BetaFix’s videos still look compressed. Clearly it is YouTube’s fault. But, then again, they are storing these videos for free.

      Sometimes people forget how good games can look with no compression. For example, even League of Legends with the settings maxed at 1080p looks simply stunning compared to YouTube clips of the latest and greatest games.

      I’d really love to see Tomb Raider with the settings maxed out at 1440p at 60+ FPS, but I simply don’t have the hardware for it.

      Perhaps, as Jordan suggested, game producers should put game trailers up on torrent hosting sites. With minimal compression. And keep them short, of course. Or perhaps direct downloads of minimally compressed game trailers? Or some paid video hosting service that actually does the video justice?

      • jihadjoe
      • 7 years ago

      I think you can still compress the video; just make sure to use a lossless algorithm like huffyuv or zlib so lossy compression only happens once.
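
      For illustration, here is a minimal sketch of that capture-once, encode-losslessly workflow in Python, assuming ffmpeg is installed and on the PATH; the file names and the choice of huffyuv are placeholders rather than anything from the thread:

          # Re-encode raw gameplay footage with the lossless huffyuv codec so the
          # only lossy compression pass is the one YouTube applies on upload.
          # Hypothetical file names; requires ffmpeg to be available on the PATH.
          import subprocess

          def encode_lossless(src: str, dst: str) -> None:
              """Re-encode src to dst with lossless huffyuv video, copying the audio untouched."""
              subprocess.run(
                  ["ffmpeg", "-i", src, "-c:v", "huffyuv", "-c:a", "copy", dst],
                  check=True,
              )

          if __name__ == "__main__":
              encode_lossless("gameplay_raw.avi", "gameplay_lossless.avi")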

    • Chrispy_
    • 7 years ago

    As always, PhysX is overused to the point that it looks fake; every indoor scene looks like there’s a dry-ice malfunction on a movie set.

    Never mind, Nvidia – you didn’t want the console contracts, so this will be the last AAA game* that uses your vendor-locked PhysX instead of OpenCL.

    * – Actually, I’m sure there will be more, but I don’t care much for PhysX; I disable it on my Nvidia machine because the non-PhysX implementations look more realistic and usually run much better too.

      • Airmantharp
      • 7 years ago

      This is, really, all Nvidia’s fault. PhysX could have been so very much more than extra eye candy- it had the potential to become the foundation of in-game geometry processing that gets incorporated into AI routines and sound maps before being sent off to the GPU for rendering, whether or not it runs on an Nvidia GPU or any GPU at all.

      They set us back at least five years, if not a decade, in this regard.

        • Deanjo
        • 7 years ago

        How is it Nvidia’s fault that no other company puts the effort into what you want? Again, NOTHING is stopping others from putting that effort into it.

          • Airmantharp
          • 7 years ago

          Mostly because they bought up the most promising solution and then locked it down- PhysX ran great on AMD GPUs at the time, and would run even better on them now at similar price points across the board, given their greater focus on compute logic.

          So Nvidia could have made it an open solution, or just invested in PhysX, instead of taking them over completely and then attempting to use the technology as a competitive advantage. That didn’t work at all, really- a few games use PhysX for eye candy, but literally none use it for anything important, because the technology would have to be able to run on every platform that they ship their games on- and Nvidia made that impossible.

            • Deanjo
            • 7 years ago

            [quote]Mostly because they bought up the most promising solution and then locked it down[/quote]

            Newsflash: PhysX was locked down before Nvidia even got their hands on it. PhysX was locked down to Ageia PPUs.

            [quote]PhysX ran great on AMD GPUs at the time[/quote]

            On an unlicensed product. At one point Nvidia was willing to license it to ATI; ATI deemed it not desirable. That’s ATI’s decision, not Nvidia’s. If Nvidia all of a sudden started building chipsets again for AMD CPUs or incorporating Eyefinity without a license, you can be damn sure AMD would be raising a stink.

            [url]http://www.extremetech.com/computing/82264-why-wont-ati-support-cuda-and-physx[/url]

            [quote]I spoke with Roy Taylor, Nvidia’s VP of Content Business Development, and he says his phone hasn’t even rung to discuss the issue. “If Richard Huddy wants to call me up, that’s a call I’d love to take,” he said.[/quote]

            PhysX has been licensed to many others not requiring Nvidia hardware. If AMD wanted it, they could have licensed it as well, like everyone else does.

            [quote]So Nvidia could have made it an open solution, or just invested in PhysX, instead of taking them over completely and then attempting to use the technology as a competitive advantage. That didn’t work at all, really- a few games use PhysX for eye candy, but literally none use it for anything important, because the technology would have to be able to run on every platform that they ship their games on- and Nvidia made that impossible.[/quote]

            Or AMD/ATI could have licensed it like all the others. Want PhysX? Get a licensed product. Hell, PhysX 3 has SSE and multithreading support (and is very cross-platform friendly, being available on Sony/MS/Nintendo consoles, OS X, Linux, Android, and iOS). If AMD/ATI wanted to go the open route, then that option is available to them. They decided not to license PhysX and to go the OpenCL route. Unfortunately, even with OpenCL support, they still have not invested a lot into developing a physics library to accompany it. Intel even showed Havok running on ATI and Nvidia hardware but decided to kill it. And BTW, Havok came out way before PhysX and was used in a hell of a lot more games. Intel could have easily opened that up, but alas no, you decide to single out Nvidia because they invested in something, continued to develop it, and licensed it out to people who wanted to license it.

            • Airmantharp
            • 7 years ago

            Havok was actually used in games- but it wasn’t a hardware accelerated solution, though it likely could have been.

            Most of my complaint is that PhysX was bought by Nvidia; again, it ran on everything up until that point.

            And the real point is that it’s going to take a dedicated physics routine in DirectX to get any real forward motion, and then, it’s going to take years of developer support in order to make physics actually integral to game design.

            PhysX was just another company, with a not so terribly unique product, but they were as close as anyone had gotten to really putting physics in games not just for eye candy but to really improve the experience significantly. Selling out to/being bought by Nvidia killed that inertia, just like Intel killed Havok’s inertia by gobbling them up.

            So, the ball is in Microsoft’s court to make everyone play nice long enough to get something useful done.

            • Deanjo
            • 7 years ago

            [quote]Havok was actually used in games- but it wasn’t a hardware accelerated solution, though it likely could have been.[/quote]

            It is a known fact that it could have been, as it was already proven by running it on ATI and Nvidia cards (aka Havok FX).

            [quote]Most of my complaint is that PhysX was bought by Nvidia; again, it ran on everything up until that point.[/quote]

            It is absolutely no different: PhysX can run off the CPU, as could Ageia’s, but running it on their proprietary hardware was where it absolutely shined. If you were looking for a CPU-based library, then Havok was right there and gave you better performance running from the CPU.

            [quote]PhysX was just another company, with a not so terribly unique product, but they were as close as anyone had gotten to really putting physics in games not just for eye candy but to really improve the experience significantly. Selling out to/being bought by Nvidia killed that inertia, just like Intel killed Havok’s inertia by gobbling them up.[/quote]

            Again, Havok was there and could have done the exact same thing.

            [quote]So, the ball is in Microsoft’s court to make everyone play nice long enough to get something useful done.[/quote]

            This really kills me: bitching about vendor lock-in, then waiting for the #1 proprietary vendor to save the day. Do you think MS is going to bring forth an API that runs on OS X, Linux, PlayStation, Wii, Android, etc.? We are still waiting for DX to be brought out in a non-proprietary form. 17 years and counting……

            • Airmantharp
            • 7 years ago

            I’m not saying that a physics implementation in DirectX is the end- I’m saying that it’s the beginning. Getting the hooks built into driver models, with real hardware logic to back it all, means that once physics support is corralled into every GPU in order to support DirectX, it can easily be ported to other systems, like the ones you mention above.

            But you have to get the vendors all going in the same direction at the same time, and only MS can really do that. They’ve already proven that they’re not going to play nice with each other on their own.

            • Deanjo
            • 7 years ago

            [quote]I’m not saying that a physics implementation in DirectX is the end- I’m saying that it’s the beginning. Getting the hooks built into driver models, with real hardware logic to back it all, means that once physics support is corralled into every GPU in order to support DirectX, it can easily be ported to other systems, like the ones you mention above.[/quote]

            The “hooks and real hardware logic” are already there, and have been for a while, through GPGPU. MS went their own route with DirectCompute instead of OpenCL. If you are looking for dedicated hardware logic for just physics, then you are going backwards from where the industry is heading. MS only does things that benefit Windows and doesn’t give a crap about anything else. Imagine how much further along we would be had MS decided to support OpenGL and OpenCL. If you want to talk about a company that has a history of holding universal solutions back, it is MS in spades.

            • Airmantharp
            • 7 years ago

            OpenGL has been following DirectX in features for the better part of a decade- and OpenCL, while getting broad attention, is still not getting much more than token support from the industry. Not that DirectCompute is even getting that; CUDA is still where it’s at if you’re not mining Bitcoins.

            By ‘hooks and real hardware logic’ I’m mostly referring to tuning the GPU architectures to ensure efficient physics processing AND the ability to do it smoothly using inputs from a game engine, and then being able to return results back to the game engine, not just using them as part of the rendering process (which is all PhysX has been used for since their acquisition by Nvidia, a shame). I expect that’s going to take more than just a driver update to get right, and I’m offended that that process hasn’t been worked out yet.

            • l33t-g4m3r
            • 7 years ago

            PhysX was always a scam. That’s why Nvidia bought it. It used crippled x86 code on the CPU and limited cores from DAY ONE. The ONLY difference is that Ageia crippled games less than Nvidia to increase its non-existent userbase. PhysX works on a shareware/demo model, where you can try out the basic features but have to pay for the full version. This is how PhysX always worked. People who claim otherwise don’t know what they’re talking about.

            [url]http://www.pcgameshardware.com/aid,703571/AMD-vs-Nvidia-Physx-said-to-lack-proper-CPU-optimization/News/[/url]

            [quote]I have been a member of the PhysX team, first with AEGIA, and then with NVIDIA, and I can honestly say that since the merger with NVIDIA there have been no changes to the SDK code which purposely reduces the software performance of PhysX or its use of CPU multi-cores.[/quote]

            [quote]Our PhysX SDK API is designed such that thread control is done explicitly by the application developer, not by the SDK functions themselves. One of the best examples is 3DMarkVantage which can use 12 threads while running in software-only PhysX. This can easily be tested by anyone with a multi-core CPU system and a PhysX-capable GeForce GPU. This level of multi-core support and programming methodology has not changed since day one.[/quote]

            A lot of the older links about PhysX have disappeared or gotten harder to find, but PhysX works exactly the same now as it did then. Game developers were left in charge of CPU cores, as that wasn’t an automatic feature. This allows TWIMTBP/Ageia-sponsored developers to limit cores and shift blame away from the API. Not that it matters, as the CPU path was still being crippled with legacy x86 code. I am sick of shills claiming Ageia was different, as there have been countless investigative articles and direct admissions from employees proving otherwise. PhysX is a scam, and it always was. Developers don’t need PhysX to implement effects, but it is harder to code effects from scratch. The only benefit here is ease of use, and that’s how Nvidia gets this crap into all these games. It’s a quick and easy hack job to throw in a couple of sponsored particle effects from an editor, instead of manually programming them.

        • PixelArmy
        • 7 years ago

        It is used for more than eye candy by virtue of being integrated into so many game engines as their physics engine. But “real” physics isn’t the most exciting thing to market.

        For example, I don’t have to tell you how many games use Unreal Engine 3… and it uses PhysX as its physics engine… You did know that, right? [url]http://www.unrealengine.com/en/features/physics/[/url] That is why, for example, you can’t actually turn PhysX off in BL2, and why a Gaming Evolved title like BioShock Infinite has PhysX listed on one of the splash screens, which you can’t toggle.

    • Goty
    • 7 years ago

    [quote]Nvidia’s TXAA demo also falls flat, although that may be in part due to the YouTube video compression.[/quote]

    TXAA probably looks better in comparison on YouTube than it would in a screenshot, since the compression will make the non-TXAA images look blurry, too.

      • jessterman21
      • 7 years ago

      No, TXAA pretty much turns textures to mud. You have cinema-quality antialiasing and it’s amazing, but the effect on textures is like what you see on surfaces viewed at oblique angles without anisotropic filtering turned on, except it’s every surface, viewed at any angle or distance…

        • Airmantharp
        • 7 years ago

        Few AA routines actually improve fidelity in every way- in the end, it’s going to take a conglomeration of tightly coupled techniques to really eliminate aliasing, along with higher resolution displays. I look forward to what developers are able to do on these consoles, as they should look much, much better at 1080p than the previous gen ever did at 720p.

          • jessterman21
          • 7 years ago

          [quote]Few AA routines actually improve fidelity in every way- in the end, it's going to take a conglomeration of tightly coupled techniques to really eliminate aliasing, along with higher resolution displays[/quote]

          I wonder if that's why quite a few devs are putting supersampling as an option in their games lately.

            • Airmantharp
            • 7 years ago

            Even super-sampling can kill fidelity- but developers can account for that these days, unlike when 3dfx originally implemented it at the driver level. Still, it’s going to take a number of very targeted routines to actually clean up aliasing without adversely impacting some other aspect of graphics fidelity or really tanking performance. None of this ‘apply FXAA/MSAA/SSAA, done’ crap; it’s really got to be done per-pixel, based on what that pixel is representing and what the pixels around it are representing.

            • Laykun
            • 7 years ago

            Please elaborate on how Super-sampling (the act of rendering at a higher resolution and scaling down to native) kills fidelity.
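
            (For reference, the downscale Laykun describes is just a resolve filter; below is a minimal NumPy sketch of a 2x supersample resolve using a plain box filter, with random data standing in for a rendered frame. Whether that averaging step softens detail is exactly the point under debate, and a box filter is only one possible choice.)

                # Average each 2x2 block of a supersampled frame down to one
                # native-resolution pixel (a simple box-filter resolve).
                import numpy as np

                def box_downsample_2x(frame: np.ndarray) -> np.ndarray:
                    """Collapse a (2H, 2W, C) supersampled frame to (H, W, C) by averaging 2x2 blocks."""
                    h, w, c = frame.shape
                    return frame.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

                if __name__ == "__main__":
                    hi_res = np.random.rand(2160, 3840, 3)  # stand-in for a 2x-supersampled 1080p frame
                    native = box_downsample_2x(hi_res)      # -> shape (1080, 1920, 3)
                    print(native.shape)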

            • l33t-g4m3r
            • 7 years ago

            Probably confusing real AA with Quincunx. ATI and 3dfx did AA right, whereas Nvidia couldn’t figure AA out until DX10/11.

            I prefer using a low level of MSAA+FXAA, which probably is as good as it’s gonna get for perf/iq. TXAA looks too blurry to bother with.

          • Chrispy_
          • 7 years ago

          MSAA was almost the holy grail – find the jaggies ([i]just[/i] the jaggies) and smooth them. The only downside was the edges of transparent textures. Adaptive multisampling was supposed to identify only transparent textures and blend those textures alone - and that would be the final piece of the image-quality puzzle, but it turned out that performance nosedived 🙁

            • Airmantharp
            • 7 years ago

            And MSAA worked great, for the most part like you say, up until everything went to shaders. That’s why AA has to be both integral to the game engine, with support from the APIs and GPUs, and really, really targeted to accomplish its goal without killing performance.

            • jessterman21
            • 7 years ago

            I’m actually okay with most polygonal aliasing. It’s the transparent and alpha tex stuff that I can’t stand aliasing on – which is why I’m an advocate of FXAA and SMAA. Oh, and someone do something about specular aliasing without making the textures run together. Eyugh.

            • Airmantharp
            • 7 years ago

            I don’t run AA on BF3- just FXAA at medium, awful as that implementation is. Other games, maybe 2x or 4x, depending, as long as the performance hit isn’t there; this is the price I pay for running 2GB cards at 1600p. Not that I’m complaining, of course.

            • Laykun
            • 7 years ago

            Shaders did not kill the ability to do full screen MSAA. What are you talking about? MSAA works fine with shaders so long as you’re forward rendering your content. And if you defer the rendering and want MSAA, you need support from more modern APIs that give you multi-sampled textures to render into (not supported in DX9).

    • LukeCWM
    • 7 years ago

    That much fog looks rather annoying, and Tomb Raider does all the fog and wind and snow particles just fine without exclusive, proprietary technology.

      • Airmantharp
      • 7 years ago

      And it’s equally sad that we’ve had this technology for nearly a decade now, yet Nvidia’s ‘marketing move’ essentially killed real GPU physics for over five years. I mean, seriously, none of this crap is hard, and it would run equally well (or better!) on AMD GPUs.

      Imagine if Nvidia hadn’t acted anti-competitively- we’d have far better looking games that require even more GPU power, and that would mean more profits for both Nvidia and AMD. We’d be happily paying more to upgrade sooner or get faster hardware to be able to play games with effects like these turned on.

        • Goty
        • 7 years ago

        Most of it actually runs pretty well on CPUs, too, with the exception of some fluid simulations.

          • jihadjoe
          • 7 years ago

          Would have been a great way to make use of all those extra cores.

        • Deanjo
        • 7 years ago

          Nothing is stopping AMD or another company from pushing an open physics solution that could run on any GPU. Why does it have to be Nvidia that does the work for everyone? Why doesn’t Intel push and continue to develop Havok FX? Why isn’t AMD pushing for adoption of libraries like Bullet? You can hardly blame Nvidia for keeping PhysX exclusive to their own products. The rest of the players in the graphics game are not doing anything towards it, so why should Nvidia worry whether it runs on the competition’s hardware or not?

          • Airmantharp
          • 7 years ago

          Really, it has to come from Microsoft, who has to get support for it from AMD, Nvidia, and to a lesser extent Intel. Even if it is just OpenCL in DirectX, that’s what it’s going to take for developers to actually use it, I think.

            • Zizy
            • 7 years ago

            You mean DirectCompute, which is part of DirectX? OpenCL is CUDA’s competitor for GPGPU, and I don’t think it is really relevant here, but there is an OpenGL equivalent of DirectCompute.
            There are also open GPU physics libraries, but there doesn’t seem to be too much interest in these. I guess most developers prefer to hammer the GPU with graphics and use AI/physics to keep the CPU busy.

            • Airmantharp
            • 7 years ago

            Actually, I don’t- DirectCompute isn’t strictly related to physics, though it can be used for it. I’m looking for a Havok- or PhysX-type framework from Microsoft that is agnostic to GPU, CPU, APU, whatever- something that everyone can get behind, because if they don’t bring it, we’re stuck with different outfits pushing different proprietary solutions.

            Note that what I’m looking for would actually require hardware support, not just driver support, but it would result in broader physics processing capabilities throughout the industry- see what happened with DX10 and unified shaders. That flexibility got pushed to every other platform, even though it debuted on and was primarily used for Windows games initially.

          • Vaughn
          • 7 years ago

          If you want to play with PhysX, do what I did: I added a 650 SC to my rig alongside my primary 7970 GHz Edition, and now I get the best of both worlds.

          Complaining about the politics of what Nvidia is doing is fine, and I agree, but it won’t change anything at this point in time…..

            • Deanjo
            • 7 years ago

            Don’t need to do that, the Titan handles it all just fine.

            • Vaughn
            • 7 years ago

            Ya, but the Titan costs $1000.

            7970 GHz: I paid $400.

            650 SC: I paid $130.

            I will take $530 vs. $1000.

            Heck, I could CrossFire the Radeons and still come in under $1000.

            Titan = bad value, friend!

            • Deanjo
            • 7 years ago

            [quote]Titan = bad value, friend![/quote]

            Not in my case; it was purchased for CUDA development, which in turn paid for both Titans in a month’s time. I didn’t buy the Titans for gaming; they were bought for development.

            • destroy.all.monsters
            • 7 years ago

            How did you get past the lockout? Some kind of hack? Nvidia’s been blocking PhysX in mixed systems for years, AFAIK.

          • Antimatter
          • 7 years ago

          AMD has been pushing open physics solutions. Tomb Raider uses DirectCompute, and AMD has a partnership with Bullet Physics, both of which are available on Nvidia GPUs.

          [url]http://www.amd.com/us/press-releases/Pages/amd-announces-new-levels-of-realism-2009sept30.aspx[/url]

            • Deanjo
            • 7 years ago

            Yup, and that’s about the last time AMD contributed anything to Bullet. Back in 2009/2010, before the cuts happened.

            • destroy.all.monsters
            • 7 years ago

            Given the depth of the cuts at AMD I’m always surprised they get much done at all. They likely won’t return to putting engineers on it again.
