Remedy shows off quad-core gaming at IDF

The company behind the Max Payne game series has shown off its upcoming title, Alan Wake, running on a system with an Intel Kentsfield quad-core processor overclocked to 3.73GHz at IDF. The video is now up on YouTube, and it looks impressive to say the least. In the video, Remedy’s Markus Mäki says a game like Alan Wake “simply couldn’t be done on a single-core processor.” The game’s large outdoor environments transition pretty seamlessly between clear, sunny skies and different weather conditions like rain, wind, and fog, as well as night and day cycles. The latter part of the video also shows a tornado tearing apart some buildings and tossing debris around with full physics simulation, with Mäki adding that the game dedicates an entire processor core to physics computations alone. The end result actually doesn’t look that far off from the CellFactor demo Ageia uses to show off its PhysX physics processing unit. Thanks to TR reader Pjer for the heads-up.

Comments closed
    • Cannyone
    • 13 years ago

    Ok so, YES it is an impressive demo. I’m so glad Intel was able to find someone to slap something together for them on fairly short notice. They seem to have got their money’s worth.

    My first question is: Did they have to overclock to 3.73 GHz in order for this “demo” to run smoothly? How well would it run on a Quad core, where the actual core speed was somewhere near 2 GHz? My point is that I thought they didn’t want to have to use highly clocked CPUs, and that was the reason for multiple cores.

    See I don’t care how they want to hype this stuff. I can’t afford a $1000+ CPU. If games are going to require physics processing, and Quad cores are going to be expensive, then I might become interested in an Ageia PPU. I would also be very interested in software that could turn one of my old “X1000 series” ATI cards into a general purpose processor. But I’m not going to rush out and buy a Quad Core.

    And I suspect that companies, who want to sell games, won’t produce games like this for quite some time. Well, unless Intel bankrolls the entire project… which wouldn’t be anything new. Still, it’s a nice idea. And Intel is obviously trying to maintain an image of “pushing ahead” and being innovative. But it’s NOT going to be even close to “mainstream” even by the end of 2007, and by 2008 they’ll have different types of cores integrated on the same die.

    That means that “Quad Core” is nothing more than a passing fancy.

    • Anomymous Gerbil
    • 13 years ago

    Haha, we start to see evidence of multi-threaded game programming, and yet the skeptics are out saying “it can’t be done!”. Why can’t people just accept that game programmers will pretty quickly get the hang of multi-threaded programming, and that multi-core CPUs will process these applications quickly?

    Why the abject fear of a multi-core world?

      • Krogoth
      • 13 years ago

      We are not saying that it cannot be done. On the contrary, we know that parallelism will be the talk of the near future. The problem is that the average joe and gamer do not understand that parallelism has inherent hardware and software limitations. They think that dual-core = 2x CPU GHz or quad-core = 4x CPU GHz! We know quite well that this simply isn’t the case. A 90% parallelized program (which is very optimistic) starts to see significant diminishing returns at 6-8 cores. A more realistic 50-60% parallelized program only sees benefits from using two cores, and anything beyond that scales next to nothing.
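      The diminishing-returns claim above is exactly what Amdahl’s law predicts: the serial fraction of a program caps the overall speedup no matter how many cores you add. A quick sanity check (the function name and the printed table are mine, purely illustrative):

      ```python
      def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
          """Amdahl's law: overall speedup when only part of a program parallelizes."""
          return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

      for cores in (2, 4, 8, 16):
          print(f"{cores:2d} cores: "
                f"90% parallel -> {amdahl_speedup(0.9, cores):.2f}x, "
                f"50% parallel -> {amdahl_speedup(0.5, cores):.2f}x")
      ```

      A 90%-parallel program gets roughly 4.7x from 8 cores and can never exceed 10x, while a 50%-parallel one tops out below 2x, which matches the comment’s numbers.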

        • echo_seven
        • 13 years ago

        At the same time, though, game developers have the ability to /[

        • Anomymous Gerbil
        • 13 years ago

        That’s incredibly short-sighted. Are you really suggesting that it is beyond the wit of game developers to develop apps that can make reasonable use of more than two threads/cores? And the whole argument that “the cores won’t get used at even 90% (or even 50-60%) therefore this is a waste” is a complete furphy – you’re already assuming that programmers are limited to using two busy threads simultaneously, which is obviously incorrect. Will 4 or 8 cores get used at 90%? Who knows, probably not for the time being… but will we see benefits when programmers start to utilise several cores running at even 50-60%? Of course.

        And not to forget that we’re only talking about games – multi-cores will obviously be useful for the many users who run many apps simultaneously. Will everyone need this tech tomorrow? No. But why get upset about advances in tech that will definitely benefit some of us now, and probably everyone in a year or three? As hardware and software leapfrogs each other, PCs become capable of things we never dreamed of yesterday, or things we dreamed of but assumed would never happen. To suggest just a few of trivial game-related examples – realistic AI for many computer-generated opponents, or much better CPU-generated physics, or real-time ray-tracing in games with the help of fast video cards?

        Speaking of which, it’s the same when new power-hungry video cards come out; so many posters here get their knickers in a twist about it, completely forgetting that (i) they don’t have to buy those cards, and more importantly that (ii) their current air-cooled low-power 7600GT b[

          • Krogoth
          • 13 years ago

          Short-sighted, eh? More like making sense out of the hype and marketing nonsense. In a nutshell, the primary hurdle for increasing CPU and single-user application performance through parallelism will end up being a software problem rather than a hardware problem.

          Parallelism’s biggest ticket is simply the ability to multi-task more smoothly if you are not memory- and I/O-limited. Multi-user environments are an obvious advantage, but that only happens with servers, not desktops.

    • alex666
    • 13 years ago

    Was this all dx9c?

      • slot_one
      • 13 years ago

      I’m wondering the same thing.

    • kigmatzomat
    • 13 years ago

    I think Ageia missed their mark by a couple of years. If they’d introduced their product before the X2, it might have had a good shot at getting adoption. Now it’s really just a proof of concept of how multiprocessor support for their, or other, physics APIs is good.

    This also dovetails well with that discussion the other day about diminishing returns for multi-threaded apps. I think that there will be a push to modularize code into sub-apps, especially anything that runs continuously. I suppose that is parallelization at a macro level. Physics runs continuously using streams of data from other apps, and a lack of data doesn’t impact the need for the physics engine to operate. By the same token, some AI could be independent, using lag time when the “game” thread is bogging down to run additional logic trees on the next action.
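    The “physics as a continuously running sub-app fed by streams of data” idea can be sketched as a queue-fed worker thread that keeps ticking whether or not new data arrives. Everything here (the names, the toy one-line integration step) is illustrative, not code from any shipping engine:

    ```python
    import queue
    import threading
    import time

    state_updates: queue.Queue = queue.Queue()   # "game" thread streams deltas here
    positions: dict = {}                         # state owned by the physics sub-app

    def physics_loop(stop: threading.Event) -> None:
        """Consume world-state deltas; a lack of data doesn't stall the loop."""
        while not stop.is_set():
            try:
                obj, delta = state_updates.get(timeout=0.01)
                positions[obj] = positions.get(obj, 0.0) + delta
            except queue.Empty:
                pass  # no new input this tick; a real engine would still integrate

    stop = threading.Event()
    worker = threading.Thread(target=physics_loop, args=(stop,), daemon=True)
    worker.start()

    # The game thread pushes updates and never waits on the physics thread.
    state_updates.put(("crate", 2.5))
    state_updates.put(("crate", 1.0))

    deadline = time.time() + 2.0
    while positions.get("crate") != 3.5 and time.time() < deadline:
        time.sleep(0.01)
    stop.set()
    worker.join()
    print(positions["crate"])  # 3.5
    ```

    The queue decouples the two sides exactly as the comment describes: the producer can bog down or go silent and the physics loop keeps running on whatever state it already has.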

    • Krogoth
    • 13 years ago

    (Yawn) It is nothing more than a PR stunt, like the Conroe preview.

    It still does not prove why you would even need a quad-core CPU. The lack of a graph showing the load balance is a red flag for the skeptic in me.

    The more important question these days is: can developers deliver a multi-threaded game that has actual gameplay, not overhyped marketing nonsense and eye candy?

      • Retiky
      • 13 years ago

      I second that.

      • indeego
      • 13 years ago

      me over the last 3 years:

      1. holy sh*t that looks awesome!
      2. /me looks at game synopsis
      3. /me continues working, sighs <.<

        • Vrock
        • 13 years ago

        Agreed. I long for the day when shooting at things on a computer was actually fun.

    • DragonFli
    • 13 years ago

    This is the kind of stuff that made me excited for multicore in the first place: load balancing! You have 4 cores and divide the work four ways: 1 core for extraneous processes (Windows API, graphics drivers, DirectX, that kind of thing), 1 core for the game itself, 1 core for physics, and the last core for AI! Perfect balance!

      • StashTheVampede
      • 13 years ago

      Which thread gets screwed when someone only has 4 cores? How about when the 8 core chip is in the box, where do these threads go?

        • DragonFli
        • 13 years ago

        Good question! Something nifty those programmers can hopefully figure out?

        • kigmatzomat
        • 13 years ago

        Same thing that happens now; threads queue up. Likely there’ll be something more complicated at play, such as the ability for entire threads to be disabled, just like GPU settings. It’ll be a pull-down for “# threads” right next to “#x AAF” in the game configs. More than likely, it will auto-configure based on the developer’s testing when it does a hardware detect. If you only have 2 cores, then you can have either the physics option or the “accelerated” AI option, but not both, for instance.

        This will obsolete a whole lot of high-Ghz systems once dual-core becomes a necessity.
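        The auto-configuration described above could be as simple as keying optional subsystems off the detected core count. A hypothetical sketch (the function and feature names are mine, not from any real game’s config code):

        ```python
        import os

        def pick_feature_set(cores=None):
            """Enable optional subsystems based on how many cores we detect."""
            if cores is None:
                cores = os.cpu_count() or 1
            if cores >= 4:
                return {"dedicated_physics", "accelerated_ai"}
            if cores >= 2:
                # Per the comment above: physics OR AI on two cores, not both.
                return {"dedicated_physics"}
            return set()

        print(pick_feature_set(2))  # {'dedicated_physics'}
        print(pick_feature_set())   # whatever this machine reports
        ```

        A real game would presumably expose the same decision as the pull-down menu the comment describes, with the detected value as the default.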

    • Inkedsphynx
    • 13 years ago

    Another underlying point to make is that it seems these chips will still have good overclockability. Hitting a 1.07GHz OC seems pretty damn decent, given that it’ll be occurring on 4 cores.

    • Inkedsphynx
    • 13 years ago

    There you go people, the future of gaming. Who was it that just said in another thread it’d take games 5 years to even utilize 2 cores? And already we’ve got games in development that will utilize 4 cores. Never underestimate technology.

      • tsoulier
      • 13 years ago

      I would bet that with a good GPU, a dual core could run that just as well.

        • Inkedsphynx
        • 13 years ago

        Given that they specifically state it’s designed to use 4 cores, I sincerely doubt that. Especially since they mention an Nvidia video solution, and given the specs of the processor itself, they’ve got to have at least a single 7950GX2, if not a pair of them, in there. Either way, I doubt dropping 2 cores would yield similar performance if they are in fact utilizing all 4 to begin with.

          • Jigar
          • 13 years ago

          Agreed with your comments. My patience is paying me well; I am patiently waiting for the K8L quad core, and then I will decide which one to go for. If K8L doesn’t perform as well as Kentsfield, then I guess Kentsfield would be my choice. But waiting for K8L means a drop in Kentsfield’s price, as it would be 6 months old by that time.

            • tsoulier
            • 13 years ago

            That’s right, Intel tells the truth.

          • Retiky
          • 13 years ago

          Specifically designed to use the threads if they’re available, not designed to run only on quad core. It would make no sense for a game developer to release a game that runs only on quad-core machines. I saw no evidence in that video that it’s native to quad core. The demo could have been running at 120fps for all we know. Remember, he said “…not possible on a single core…”

          That being said, this looks really impressive. I would use it as exhibit ‘a’ in the testimony against the PS3 having a monopoly on “…a truly new and different…” gaming experience. I’m tired of people misjudging the potential power of current hardware like the 360 or even dual-core machines. I’m willing to bet that this game can run playably on the highest eye-candy settings with the X6800. The quad-core part of the video is mostly PR.

            • Inkedsphynx
            • 13 years ago

            Whether native or not, if they have processes running on 4 cores using more than 50% on every core, you WILL lose performance by cutting 2 cores out, because the other two can’t keep up with the slack.

            Furthermore, if this program truly is running in full parallel (or some extent of it), cutting 2 cores out halves your performance, because now 2 cores are running 2 threads, finishing, and then grabbing the other 2 threads.

            There’s obviously no way for us to know how well it’d run on any given machine, but I think the biggest point to be made here is that it CAN utilize 4 cores, which is a MAJOR step in the right direction.

            • poulpy
            • 13 years ago

            Yeah right, no numbers given whatsoever on a game demo that looks outrageously like a PR stunt? Give me a break…

            We don’t know how busy the CPUs/cores are, we don’t know what kind of physics is used, and the only statement is that it wouldn’t be doable on a single core. Which, for all we know, could be as true as “Skype needs the power of an Intel dual core to handle a large conference call.”

            And even if this PR was true that doesn’t prove anything about the game industry as a whole.

            IMO a dual core with an integrated PPU would be far more efficient at gaming than a quad core. There’s only so much you can compute in parallel, and PPUs leave CPUs in the dust for those dedicated tasks (and they can do real physics, not just blowing stuff around).

            • Retiky
            • 13 years ago

            http://www.techreport.com/onearticle.x/10877

        • ew
        • 13 years ago

        I’ll bet that with a fast enough single core it would run just as well too. That is, of course, unless they’ve hard-coded something to stop working if only one core is detected. But these sorts of arbitrary limitations can be fixed with a simple patch. (See http://slashdot.org/articles/06/03/04/1430243.shtml for an example)

      • Patrickr
      • 13 years ago

      Shazbot!

      I miss Tribes. Tribes 1 was great, but after I got rid of my Voodoo 2 it didn’t run right in OGL. Tribes 2 was a blast, but had quite a few problems, and like you said was pretty rough on the hardware at the time.

      I remember playing Tribes 1 on dial-up, when only a select few had cable/DSL. LPBs (low-ping bastards) could always chain gun and snipe better than anyone else.

        • Gungir
        • 13 years ago

        I still play Tribes: Vengeance when I have the time. It’s one of the most visually interesting, if not most detailed, first person shooters ever created, and the playstyle is incredibly unique.

    • alex666
    • 13 years ago

    Amazing video. Definitely check it out, as it’s the future of games and God knows how many other potential programs, e.g., mapping programs.

    • maxxcool
    • 13 years ago

    😀 like i said in the “how many cores thread”

    2 cores for gaming, 2 for physics (ok and maybe AI)

    Go team Quad!

      • StashTheVampede
      • 13 years ago

      Eventually, we’ll get some dedicated GPU cores. 2 for CPU, 2 for GPU, 2 for physics, 2 for sound, etc.

        Then we’ll look at other chip designs (like the Amiga or Cell) and understand why they went with separate and dedicated chips.

        • maxxcool
        • 13 years ago

        I have been slapped around here for some time because I have been saying we would go to dedicated cores and remove the load from the slow FPUs in the typical processor.

        Glad someone else agrees…

          • StashTheVampede
          • 13 years ago

          The largest question really is: how will it all connect? I’m in favor of AMD’s approach: more sockets on the board. Slap in your needed chip (and associated RAM) and go off to do your cool stuff. PCI-E is fine, but I’d rather ditch the PCB and slap something right into a socket.

        • Beomagi
        • 13 years ago

        GPUs are already segmented. The idea of a multi-core GPU is more like something the Voodoo 6000 and ATI Xpert 98 did, or even some recent Nvidia designs, where they used software to combine 2 or more cores. I don’t expect to hear about a multi-core GPU on a die, because such a term should not exist. You get 16-24 pipelines in top GPUs today, and the load balancing within the core is better handled on the outside – i.e., using 2 GPU dies linked by software for AFR/SLI/checkerboard/split etc. Graphics has been processed in parallel for ages because it’s simple to do so.

    • Beomagi
    • 13 years ago

    WONDERFUL!

    And CellFactor STILL doesn’t show off Ageia’s product. It shows off what a CPU can do, SAYING they use Ageia, NOT telling you Ageia’s product is exclusively needed for ooze and cloth.
