So I'm in Montreal, where Nvidia is hosting a two-day event about... gaming? Something? It's apparently a big deal, but whatever's happening in here is also shrouded in secrecy--although we know it's not the introduction of a new GPU. So there's that.
Anyhow, we're free to report on everything going on for the next few hours as it happens, so I figured I'd start up a live blog and see what happens. Be advised that this is just day one, and I understand day two may prove more interesting on the hardware front.
As usual, we pretty much lack any technical guile when it comes to live blogs, so you'll have to hit refresh to see any updates as they happen.
Ok, time to get started.
Nvidia's Tony Tamasi is here to kick things off. He's talking about the massive size of the PC gaming market. Says there are 600 million PC gamers worldwide. There are billions of dollars in gaming.
Now he's introducing Nvidia's GameWorks program. Goal of it is to enable game developers. About 300 VFX engineers and artists at Nvidia work on it. The product of their work is the GameWorks library. Will be showing you a bunch of games that integrate it. Will demo some of the world-class tools used to do these things.
Nvidia's goal is to create a great gaming experience.
"We invest more in working with game developers than anyone on the planet."
Ten or so games we've worked with: Hawken, F2P mech combat game with PhysX. Borderlands 2 from Gearbox. Planetside 2, squad vs. squad. Project Cars, fantastic, realistic visuals, probably the most advanced car racing game on the PC. War Thunder, great flight sim game, integrating our WaveWorks tech for water sim. Everquest Next, really ground-breaking ideas about MMOs. Star Citizen, truly stunning visuals in a flight sim running at 4K.
Batman: Arkham Origins, has a "panoply" of amazing Nvidia technology. Ohh, words.
Assassin's Creed Black Flag. Who doesn't want to be a pirate? (Oh, we have those on the PC. Yarr.)
Watch Dogs. Working with them to delay.... err, no. Working to add Nvidia technology, I mean.
GameWorks is Nvidia's developer platform. Is Nvidia's way to enable developers.
Three core pillars:
People, about 300, many of them Ph.D.s, world-class experts in their domains. Develop algorithms for amazing visuals and gameplay, while giving an artistic feel to it.
GameWorks library. We'll walk you through that in quite a bit of detail.
Developer tools. We have the world's best tool platform for graphics and game development.
What's in the GameWorks library? Six main things.
VisualFX SDK: tools and tech for complex effects, like WaveWorks.
Core SDK. Foundation tech for using GeForce and other Nvidia platform features.
Game compute library. Analogous to graphics library but for GPU compute.
Optix. SDK for building ray-tracing engines.
Let's dive into developer tools. We cover all the major platforms and APIs, integrated and stand-alone tools.
Demo time! Here's FaceWorks. Digital "Ira" -- virtual dude's head.
Ohh, just the eyeballs.
He's showing how you can debug shaders.
And now digital Ira is making faces. Eerily... realistic. Running on Logan SoC in single-digit watts!
Now we're gonna debug shaders on the Logan devkit in Linux. The tools work the same in Linux as in Windows.
"When SteamOS ships, we'll have tools that support SteamOS."
Dude, I am live blogging a debugging session. How did my life come to this?
Next piece: PhysX. Most popular physics SDK on the planet. Something like 500 games ship with it. Core system for game engines including Unreal Engine and Unity.
Covers a wide variety of simulation domains. We're gonna show you a little example of how some of those effects are authored. Inside of Unreal Engine, right inside of the editor.
Physics particle emitter on the ground. Have integrated turbulence into a number of games, including Batman.
Now he's like... developing a game, right in our faces.
Adding a "jet" to move the smoke from the particle emitter around.
He's tossing a ball around through the smoke and the smoke parts, flows around the ball realistically.
Tamasi: This is built right into Unreal Engine, so you can just do it.
One of the challenges of all physics simulation is the core algorithms can be complicated to combine and integrate. One of the things we're announcing today is called Flex. A unified GPU physics system. Allows different effects to influence each other. A unified solver. Great for parallelization across GPUs. Technology will be integrated into PhysX some time next year.
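For the curious, here's the "unified solver" idea in toy form. This is my own sketch, not Nvidia's Flex API: everything, rigid, cloth, fluid, becomes particles plus constraints, so one position-based solver loop handles all of them and the effects can interact.

```python
# A minimal position-based-dynamics sketch of a unified particle solver
# (an illustration of the concept, not the actual Flex implementation).

def solve_distance(p, q, rest):
    """Project two equal-mass particles so they sit `rest` apart."""
    dx = [q[i] - p[i] for i in range(3)]
    dist = sum(d * d for d in dx) ** 0.5
    if dist == 0.0:
        return p, q
    corr = 0.5 * (dist - rest) / dist
    p2 = [p[i] + corr * dx[i] for i in range(3)]
    q2 = [q[i] - corr * dx[i] for i in range(3)]
    return p2, q2

def step(particles, constraints, iterations=10):
    """One unified solver pass: iterate all constraints, whatever they model.
    Rigid bodies, cloth, and fluids would just contribute different
    constraint types to the same list."""
    for _ in range(iterations):
        for (a, b, rest) in constraints:
            particles[a], particles[b] = solve_distance(
                particles[a], particles[b], rest)
    return particles

# Two particles stretched apart by a "rigid" link of rest length 1.0
pts = {0: [0.0, 0.0, 0.0], 1: [2.0, 0.0, 0.0]}
pts = step(pts, [(0, 1, 1.0)])
```

Because every effect speaks the same particles-and-constraints language, coupling water to cloth or rigids is just more entries in the constraint list, which is the point Tamasi is making.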
Demo. This is still research, so it's a little programmer-arty, but you should get the idea.
Combining a rigid-body sim with a fluid sim. Water knocks down blocks, and the blocks collide with each other. In the new Flex system, they just all kind of work. Now adding cloth + fluid.
Because it's all unified, we can just build these things and play.
Now showing a squishy frog. Realistic deformable rubber is usually hard to do.
Ooh, and they've made water balloons. "It all just kinda works." You've got balloons, there's water, they bounce around, they break.
Now we're combining water, fluid, soft bodies... rubber duckies in the tub!
So... Flex is coming to PhysX next year.
Optix can do interactive ray tracing, ambient occlusion, procedural surfaces, and light baking. It's used as part of many game developers' core pipelines.
Bungie is using it to calculate ambient obscurance for lighting to be pre-baked into its next game.
Now for a demo of a tool for using Nvidia's horizon-based ambient occlusion (HBAO) technique.
HBAO algorithm can be tweaked to increase realism. For instance, increasing shadow radius to allow larger objects to occlude more.
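Here's a toy 1-D version of the horizon idea he's tweaking. My sketch, not Nvidia's HBAO shader: you walk outward from a point on a heightfield tracking the highest elevation angle seen, and a larger search radius lets bigger, more distant objects occlude more, exactly the knob being demoed.

```python
import math

# Toy horizon-based occlusion on a 1-D heightfield (illustration only,
# not the real HBAO screen-space algorithm).

def horizon_occlusion(heights, x, radius):
    """Fraction of the hemisphere blocked by the highest horizon angle
    found within `radius` texels of position x (0 = open sky)."""
    best = 0.0
    for direction in (-1, 1):
        for d in range(1, radius + 1):
            i = x + direction * d
            if 0 <= i < len(heights):
                angle = math.atan2(heights[i] - heights[x], d)
                best = max(best, angle)
    return best / (math.pi / 2)

# Flat ground with a tall wall six texels away: a small radius misses
# the wall entirely; a larger one picks up its occlusion.
field = [0.0] * 10 + [5.0]
small = horizon_occlusion(field, 4, 3)   # wall outside the radius
large = horizon_occlusion(field, 4, 8)   # wall now contributes
```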
Developers can just grab a trace from their games, try out HBAO in this tool, and see if it's going to work for them.
Our HBAO is the highest-performance ambient occlusion and the highest quality. Best in class on both vectors.
Another thing we're working on is contact-hardening shadows. Next level of shadow detail. Hard edges close to light sources, softer farther away.
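The geometry behind that hardening effect is simple similar triangles. This is a back-of-envelope sketch in the spirit of percentage-closer soft shadows, not Nvidia's actual implementation: the penumbra grows with the gap between occluder and receiver, so a shadow is razor-sharp right at the contact point and blurry far from it.

```python
# Penumbra-width estimate behind contact-hardening shadows
# (PCSS-style illustration; assumed names, not a real API).

def penumbra_width(light_size, d_occluder, d_receiver):
    """Similar-triangles estimate. Both distances are measured from
    the area light; the receiver is behind the occluder."""
    return light_size * (d_receiver - d_occluder) / d_occluder

# Receiver just behind the occluder: nearly hard edge
near = penumbra_width(light_size=1.0, d_occluder=10.0, d_receiver=10.5)
# Receiver far behind the occluder: wide, soft edge
far = penumbra_width(light_size=1.0, d_occluder=10.0, d_receiver=30.0)
```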
Our shadowing and HBAO are fastest not just on GeForce, but also on Radeon. Game devs love it.
Combine multiple effects. Turn-key solutions to integrate lots of cutting-edge techniques together.
These are also multi-platform. We'll deliver on any platform that makes sense.
One of the new effects: GI Works for global illumination. Real-time simulation of GI.
Devs usually bake GI into their worlds, but it isn't dynamic. Often difficult to have updates happen. We've been working on a real-time GI library. One benefit is artists don't have to spend so much time cheating. Light will bounce, so it just works.
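To see why bounced light "just works," here's a toy one-bounce calculation. Again my own illustration, not the GI Works algorithm: a patch that sits outside the lamp's direct cone still ends up lit, because light reflected off a directly lit patch reaches it.

```python
# Toy one-bounce global illumination over a few surface patches
# (illustration of the bounced-light idea only; all values made up).

def one_bounce(direct, albedo, form):
    """direct[i]: light arriving straight from the lamp at patch i.
    albedo[i]: fraction of incoming light patch i re-emits.
    form[i][j]: fraction of patch i's bounced light reaching patch j."""
    n = len(direct)
    bounced = [sum(direct[i] * albedo[i] * form[i][j] for i in range(n))
               for j in range(n)]
    return [direct[j] + bounced[j] for j in range(n)]

direct = [1.0, 0.0]            # patch B sits outside the light's cone
albedo = [0.8, 0.5]
form = [[0.0, 0.5],            # half of A's bounced light reaches B
        [0.5, 0.0]]
lit = one_bounce(direct, albedo, form)
```

With baked GI, that second value would have to be painted in by hand with fake fill lights; computing the bounce in real time is what makes it track a moving light.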
Demo time. We're in a virtual museum. Dude is turning the GI sim on and off. Showing how you get dynamic specular from light sources. Area light sources. Includes really high-quality ambient occlusion. As a light moves, even parts of the room outside of its direct "cone of influence" change because the light bounces.
Should see the GI Works library integrated into games beginning next year. GPU-based, cross-platform. Could author your game entirely with GI, not have to place hundreds of fake lights.
GI solves a bunch of problems with light and shadow. The algorithms are complex, but they simplify many things for game devs.
Another one: FlameWorks.
Flame and smoke are classically hard, too. Typical game effects aren't really correct from a volume perspective.
FlameWorks does volumetric effects much like in film, but done in real time. Multi-grid solver. Stochastic shadows and scattering.
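To give a feel for what "volumetric" means here (my toy illustration, not FlameWorks code): the sim lives on a grid of density cells, and each step spreads and transports values between neighbors. The multi-grid solver Tamasi mentions is what makes solving equations on grids like this fast enough for real time.

```python
# One explicit diffusion step on a 1-D density grid -- a bare-bones taste
# of the grid operations inside a volumetric smoke/flame sim (illustration
# only; real solvers are 3-D and use multi-grid pressure solves).

def diffuse(grid, rate):
    """Spread density toward neighboring cells (clamped at the ends)."""
    n = len(grid)
    out = grid[:]
    for i in range(n):
        left = grid[max(i - 1, 0)]
        right = grid[min(i + 1, n - 1)]
        out[i] = grid[i] + rate * (left + right - 2.0 * grid[i])
    return out

smoke = [0.0, 0.0, 1.0, 0.0, 0.0]   # a puff of density in the middle cell
smoke = diffuse(smoke, 0.25)        # the puff spreads to its neighbors
```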
Demo! Dragon is breathing flame, and flame is fluidly flowing around solid objects, acts as a light source. Really quite good.
Will see this in games soon, too. Already started integrating into core engines.
Folks, I am not making this up. Anand just showed up next to me carrying three different tablets to review. I think he'll probably finish one as I'm blogging here.
Tamasi is gonna wrap up his talk on GameWorks. Summing up what it is, lots of acronyms.
So that's the first part of today. The warm-up. We're gonna take a 15-minute break and see some games using this tech.
Ok, coffee time!
All right, folks. The next bit of this is going to be game developers making presentations, showing videos and demos. I think we'll wrap up this blog and resume tomorrow, when there are more announcements of new technology planned. I expect some truly interesting news tomorrow, so check back in the morning.