First Unreal Engine 4 videos now available

Last month, we saw the first screenshots of Epic’s first Unreal Engine 4 demo. Sadly, the full demo was kept from public eyes, so a handful of still shots was all we got.

Well, that’s been rectified. On its GeForce.com website, Nvidia has posted both the UE4 Elemental video, from which the screenshots originated, and another video that guides viewers through the engine’s individual features (including additions to make game developers’ lives easier). The videos are downloadable in 1080p format. Be warned though: they do make for a hefty download, weighing in at 1.4GB when put together.

If you don’t mind a slight step down in quality, the folks at GameTrailers have posted 720p, watermarked versions of the videos on YouTube:

Pretty impressive stuff. It was hard to tell from the screenshots, but the Elemental demo has loads of real-time destruction and liquid simulation. The real-time global illumination is striking, too. I imagine both features require a fairly quick GPU, however—in the latter half of the second video, the FPS counter seems to drop below the 30-FPS mark in at least a couple of places.

By the way, Nvidia’s UE4 coverage on GeForce.com includes an interview with Epic founder and tech guru Tim Sweeney. In the interview, Sweeney reveals that Nvidia’s PhysX API was used for physics collisions in the Elemental demo. He also confirms that, with UE4, he seeks to "define the next generation of graphics capabilities that are achievable with DirectX 11 PC’s and future consoles."

Comments closed
    • HisDivineOrder
    • 7 years ago

    Probably best to watch the 720p versions since that’s what new consoles will be rendering games like this at, if at all. I mean, do you think that Sony or MS have the cojones to release consoles with the hardware to pump those demos (GF680-class hardware) at 1080p? No, not at the prices they’ll cost, not even next year.

    So we’ll have a case like we have now where the hardware CAN go 720p, but it’s often rendered in the hardware at sub-720p and then uprez’ed after the fact. Except now we’ll at least have min HD (720p) uprez’ed to 1080p. And called 1080p.

    I suppose it’s progress, but best to go ahead and see the content the way it’s going to be when it shows up on consoles (if you’re a console gamer). Watch it at 720p and imagine yourself weakly watching an HDTV while slouching on a couch with a controller, screaming like a lil’ girl, and creaming your pants. Or you can imagine yourself sitting upright in a chair stoically watching a monitor with a keyboard and mouse in front of you, silently resolved to destroy all enemies like the vengeful/valiant hero you are, and in unstained pants at 1080p+.

    The choice is yours.

      • SPOOFE
      • 7 years ago

      Oh yes. Stoic. That’s what you’re doing spending time playing video games. You’re not “having fun” or “relaxing” or even merely “wasting your time”. You’re being “stoic”.

      God, I never thought you fanatical PC losers could get this pathetic.

      • Duck
      • 7 years ago

      Consoles at 720p ??? Ha! More like 240p internal resolution.

    • walkman
    • 7 years ago

    Wow, I thought the demo was great. Especially the way the lava flows generated light from within, the swirling snowstorm, and the way the knight made a realistic stomp when walking. The thing I was waiting to see was a human character.

    • cjb110
    • 7 years ago

    From a technical point of view, that on-the-fly code editing was seriously impressive. OK, it's a very modular setup, which will help with the dynamic loading and unloading of the code, and he only altered one small part. But to switch like that once the code's compiled, with no delay, is very cool.
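    For the curious, the dynamic load-and-unload cjb110 describes comes down to building gameplay code as a shared library that the editor can drop and re-open after a recompile. The C++ sketch below is only a rough illustration of that general idea, not Unreal Engine 4's actual hot-reload code; the "./libgamelogic.so" path and the "game_tick" entry point are made-up placeholders.

        // Illustrative only: load a freshly rebuilt module, call into it, unload it.
        #include <dlfcn.h>   // POSIX dynamic loading: dlopen/dlsym/dlclose
        #include <cstdio>

        int main() {
            // Open the recompiled gameplay module (hypothetical path).
            void* module = dlopen("./libgamelogic.so", RTLD_NOW);
            if (!module) {
                std::fprintf(stderr, "load failed: %s\n", dlerror());
                return 1;
            }

            // Look up an entry point exported by the module (hypothetical name).
            using TickFn = void (*)(float);
            auto tick = reinterpret_cast<TickFn>(dlsym(module, "game_tick"));
            if (tick)
                tick(0.016f);  // run one frame's worth of the new gameplay code

            // Unload so the next recompile can be picked up the same way.
            dlclose(module);
            return 0;
        }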

    • Arclight
    • 7 years ago

    Meh, not impressed by the demo. That’s not to say that the new engine is bad, rather the artwork… that lava demon looked not so detailed. The destruction felt too artificial, didn’t like it.

    • PrincipalSkinner
    • 7 years ago

    So when the games using this tech get released, current or even next gen cards might be too feeble to run them.
    Are we going to have to wait for 14nm or w/e generation of cards?
    P.S. I’m getting too old waiting.

    • DeadOfKnight
    • 7 years ago

    I think the Samaritan demo actually looked better…at least I found it more technically impressive.

      • Arclight
      • 7 years ago

      Yes, that demo looked stunning. This looks bad, really. If I were to only see this, I’d think that the engine used for Crysis 1 in 2007 looked a lot better.

      • lethal
      • 7 years ago

      Samaritan was running with a triple-SLI configuration though. This is running with just one card.

        • Narishma
        • 7 years ago

        When it was first shown, yes. But a few months later it was shown again running on a single GTX 680.

          • khands
          • 7 years ago

          Technically the image fidelity was way less on the 680 too though.

    • indeego
    • 7 years ago

    Please kill lens flare. Are we honestly supposed to be playing the game through the eyes of a crappy camera or immersed as the character?

      • Meadows
      • 7 years ago

      An option is an option. I have no issues with lens flare and other cinematic effects, just as I don’t have an issue immersing myself when watching a movie.

      The video hasn’t established that this is a first-person game concept and hasn’t mentioned that lens flare cannot be turned off, so the effect is fine by me.

      • Bensam123
      • 7 years ago

      Yes sir… behold the crappiness that is post processing. I never understood lens flare, still don’t. The same can be said about depth of field. HDR makes moderate sense, but only in specific scenes and not caked all over the world.

        • Meadows
        • 7 years ago

        DOF is a good effect when not exaggerated.

          • lilbuddhaman
          • 7 years ago

          I.e.: DOF is a good effect when, one week after a game’s release, some guy releases an FXAA injector that fixes the way-overtuned effects the devs decided on for the game.

          • Bensam123
          • 7 years ago

          You do depth of field naturally no matter what you look at, even if it’s a computer screen. It’s redundant.

            • Meadows
            • 7 years ago

            Do not confuse depth of field with defocus. A computer screen is (roughly) at a uniform distance to you, so if you shift your eyes to a corner and can’t read what’s in the centre, that’s not DOF. I’ve heard this false argument too many times already.

            • Bensam123
            • 7 years ago

            I think you’re attempting to create a false dichotomy here. Depth of field is where it takes focus off of everything around you and focuses directly where it wants you to. That’s what your eyes do naturally. If you’re aiming down the sight of a gun you won’t be focusing on anything around it. The game is merely taking away your freedom to look away from it. It’s redundant, like I said.

            And no, you haven’t heard that argument so many times because I’m one of the few people that argues against such post processing and uses that argument.

    • Meadows
    • 7 years ago

    Well done, Epic. Colour! Wehehell dooone! At first, I was like "what the hell, Epic, again?! Again with the godforsaken crap GRAY AND THE PASTEL NEONS BLOOMING?" But then I noticed the lava actually used a saturated colour. I am so amazed, I think I'll cry. Well done, Epic, you developed colour!

    • Bensam123
    • 7 years ago

    Oh PhysX, how neglected and crippled you’ve become. All because Nvidia stole you away from your true masters and turned you into their little pet.

      • l33t-g4m3r
      • 7 years ago

      What? Dx10+PhysX is the future. Dx11 is just a gimmick. /Prime1

      edit: added /prime1 to clarify sarcasm.

        • Meadows
        • 7 years ago

        Up until now I was under the genuine impression that you’re an irredeemable idiot, but now I just think you’re trolling.

          • derFunkenstein
          • 7 years ago

          Can he be both?

        • Bensam123
        • 7 years ago

        I don’t understand…

          • l33t-g4m3r
          • 7 years ago

          Exactly. PhysX was a gimmick before Nvidia touched it, even though Nvidia turned it into their own pet lock-in project. There have been enough investigative articles, which you'd know if you'd actually followed up on it. PhysX doesn't use multithreading or any CPU optimization on purpose, and Nvidia gets deniability by leaving the controls in the devs' hands, which they happen to bribe through the TWIMTBP program. Blaming this on Nvidia is a no-go, since it's been admitted nothing has been changed since Ageia in the CPU code path. Ageia designed PhysX to be a scam, whereas competition like Havok was not. Look at how well PhysX runs on consoles and mobile hardware, then compare that to the PC. I knew it was a scam from day one, and that Nvidia would quit supporting it as soon as they had DX11 hardware, and they did. Fermi did better than anyone expected with tessellation, so that became the new gimmick. E.g., Crysis 2. PhysX is dead. It's probably a complete mess of code, and it doesn't even perform well accelerated by Nvidia's cards. Nvidia wants higher benchmark numbers, and coders probably don't want to mess with unoptimized garbage, so PhysX is slowly being put out to pasture. Don't bother shilling if you're a fan, it's dead. Instead, do something constructive and move on to the alternatives. That would save everyone, including Nvidia. They apparently don't want to actively support the garbage bloatware, and a superior alternative would help them save face instead of being forced to publicly admit failure with a monopoly. Making PhysX open source would help, much like Creative switched to OpenAL, but that was too late. That or giving it to MS for DX12. Hopefully Nvidia doesn't make the same mistake, but I don't think they actually care if PhysX suicides. At the moment it's still useful enough to keep on the back burner, just in case.

            • Bensam123
            • 7 years ago

            No, it wasn’t. I was here for the original review of Ageia’s PPU; I know full well what I’m talking about.

            I’m not going to bother reading further than the first two sentences, since you don’t seem capable of organizing your thoughts enough to put them into paragraphs.

            • l33t-g4m3r
            • 7 years ago

            "I know full well what I'm talking about."

            No you don't, you overzealous and out-of-touch fanboy. Everything I've said is directly backed up by articles, both investigative pieces and direct interviews. Nvidia changed nothing, and PhysX was always a scam. Bloody ridiculous that I have to google it for you, but I will find you the quotes in a few minutes, provided they're still around.

            http://www.tomshardware.com/news/nvidia-physx-amd-gpu-multicore,9481.html

            "I have been a member of the PhysX team, first with AEGIA, and then with Nvidia, and I can honestly say that since the merger with Nvidia there have been no changes to the SDK code which purposely reduces the software performance of PhysX or its use of CPU multi-cores. Our PhysX SDK API is designed such that thread control is done explicitly by the application developer, not by the SDK functions themselves. One of the best examples is 3DMarkVantage which can use 12 threads while running in software-only PhysX. This can easily be tested by anyone with a multi-core CPU system and a PhysX-capable GeForce GPU. This level of multi-core support and programming methodology has not changed since day one."

            They admittedly pass the buck to the developers to support multithreading, which provides plausible deniability, while secretly paying them off not to enable it in games.

            http://www.realworldtech.com/page.cfm?ArticleID=RWT070510142143&p=4

            "The truth is that there is no technical reason for PhysX to be using x87 code. PhysX uses x87 because Ageia and now Nvidia want it that way. Nvidia already has PhysX running on consoles using the AltiVec extensions for PPC, which are very similar to SSE. It would probably take about a day or two to get PhysX to emit modern packed SSE2 code, and several weeks for compatibility testing. In fact for backwards compatibility, PhysX could select at install time whether to use an SSE2 version or an x87 version – just in case the elusive gamer with a Pentium Overdrive decides to try it. But both Ageia and Nvidia use PhysX to highlight the advantages of their hardware over the CPU for physics calculations. In Nvidia's case, they are also using PhysX to differentiate with AMD's GPUs. The sole purpose of PhysX is a competitive differentiator to make Nvidia's hardware look good and sell more GPUs. Part of that is making sure that Nvidia GPUs looks a lot better than the CPU, since that is what they claim in their marketing. Using x87 definitely makes the GPU look better, since the CPU will perform worse than if the code were properly generated to use packed SSE instructions."

            It's a scam, has been from day one. That includes Ageia, and Nvidia bought them out so they'd have a unique proprietary lock-in checkbox feature, which is totally unnecessary to use if you want physics. All modern PhysX titles are TWIMTBP-sponsored games. Nobody is using PhysX otherwise.

            Also, TR has mentioned things about PhysX too:

            https://techreport.com/discussions.x/19216

            "games implement PhysX using only a single thread, leaving additional cores and hardware threads on today's fastest CPUs sitting idle. single-threaded PhysX code could be roughly twice as fast as it is with very little extra effort. Between the lack of multithreading and the predominance of x87 instructions, the PC version of Nvidia's PhysX middleware would seem to be, at best, extremely poorly optimized, and at worst, made slow through willful neglect. The PhysX logo is intended as a selling point for games taking full advantage of Nvidia hardware, but it now may take on a stronger meaning: intentionally slow on everything else."

            "I was here for the original review of Ageia's PPU"

            Apparently that's the only thing you were here for, otherwise you'd know better.

            P.S. My original comment about dx10+PhysX was an imitation of Prime1, and Meadows does have a faulty sarcasm detector, among other things related to common sense. I retroactively added /Prime1 for the slow people.

            • Bensam123
            • 7 years ago

            So, the first article is from Tom’s Hardware, after the Ageia merger. Nvidia is claiming that they didn’t cripple PhysX in order to further promote their graphics cards and hinder the competition (AMD). Hmmm… I wonder who I’m going to believe here.

            Give me an article about this from BEFORE the Ageia merger.

            PhysX was created with the ability to offload all the physics operations to software, parallelized across however many CPU cores are available. That’s why the PPUs didn’t sell well, in addition to there being no games that took advantage of PhysX beyond eye candy. A normal CPU could very easily handle the physics and beat the performance of a PPU at the time. As a matter of fact, Ageia used that offloading as one of their selling points.

            If there were a mystical switch that simply needed to be turned on to make it work, then Nvidia was the one that put it there in the first place.

            The second article, once again, is from two years after the Ageia merger. The third article is just an analysis of the second one.

            See the trend here? The results and the complications came AFTER the Ageia merger and after Nvidia started pushing PhysX and how awesome it is on their GPUs. It’s entirely possible for Nvidia to recompile the code in a different way. In other words, we’re back to Nvidia causing shenanigans.
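            A side note on the "thread control is done explicitly by the application developer" line quoted above: in that model, the game or engine decides how many worker threads to spin up and how to split the simulation across them, rather than the physics SDK doing it automatically. The C++ sketch below is only a hedged illustration of that pattern; PhysicsIsland, step_island(), and step_world() are hypothetical stand-ins, not real PhysX API.

                // Illustrative only: the application, not the SDK, owns the threading.
                #include <algorithm>
                #include <cstddef>
                #include <thread>
                #include <vector>

                struct PhysicsIsland { /* bodies, joints, contacts... */ };

                // Step one independent group of interacting bodies (stub).
                void step_island(PhysicsIsland& island, float dt) {
                    (void)island;
                    (void)dt;
                }

                // The engine picks the worker count and hands out islands itself.
                void step_world(std::vector<PhysicsIsland>& islands, float dt) {
                    const unsigned workers =
                        std::max(1u, std::thread::hardware_concurrency());
                    std::vector<std::thread> pool;
                    for (unsigned w = 0; w < workers; ++w) {
                        pool.emplace_back([&islands, dt, w, workers] {
                            // Each worker takes every workers-th island.
                            for (std::size_t i = w; i < islands.size(); i += workers)
                                step_island(islands[i], dt);
                        });
                    }
                    for (auto& t : pool) t.join();
                }

                int main() {
                    std::vector<PhysicsIsland> islands(64);
                    step_world(islands, 1.0f / 60.0f);  // one 60 Hz physics tick
                    return 0;
                }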

    • Duck
    • 7 years ago

    Meh. UT99 still looks better to me.

    This is just a giant display of dick waving from Epic and NVIDIA.

      • Deanjo
      • 7 years ago

      If Ruby had a dick to wave I’m sure she would be doing it too.

      • crabjokeman
      • 7 years ago

      Duck is not impressed? Wait a minute…

        • Meadows
        • 7 years ago

        The coarse language version of Krogoth?

          • derFunkenstein
          • 7 years ago

          Now with slightly improved grammar and spelling!

            • Duck
            • 7 years ago

            🙂

      • rrr
      • 7 years ago

      Well, UT3 did suck in terms of gameplay compared to UT99, so you do have some kind of point.

    • homerdog
    • 7 years ago

    It would be sweet if they’d release a demo that we could play around with. They don’t have any hardware that we (technically) don’t have access to.

    • PenGun
    • 7 years ago

    They are giving us pretty good bandwidth on those downloads. I’m getting my 3 MB/s pretty well constantly.

    The 1080p ones are 911MB and 3.2GB.

    Whew, I have the smaller one … impressive.

      • cynan
      • 7 years ago

      Oh yeah? Well I also have a 28 Mbit connection.. so.. so.. So THERE!!

        • PenGun
        • 7 years ago

        Oh noes … beaten by 4 Mb … what will I do now?

          • cynan
          • 7 years ago

          Be oh so very envious? 🙂

          I was trying to rib you for sounding like you were bragging about your connection speed (though you weren’t) – but ended up making myself look like that douche instead.

          Or maybe I did it on purpose, out of altruism, to misdirect attention away from you? You’re welcome.

    • glacius555
    • 7 years ago

    The previously linked Final Fantasy video was much better…

      • cynan
      • 7 years ago

      I was thinking that too. I was much more impressed with the second video, showing the engine tool set and all the stuff you can do in real time, than with the actual demo itself.

    • kamikaziechameleon
    • 7 years ago

    UE is coming from pretty far behind; they’ve been very focused on poly counts and texture mojo. Thing is, that stuff isn’t what makes a game feel good. I’ll believe it when I see it in a game from these guys.
