I'm here at day two of Nvidia's Montreal 2013 event. Nvidia CEO Jen-Hsun Huang is on stage now. He stepped up there early, catching me off guard a bit. We're starting our live blog while things are already underway. Follow along as we stumble through, and be sure to hit "refresh" to see updates, since our live blogging tech is weak.
Jen-Hsun is introducing some new stuff! The first one is called GameStream. Our belief is you should be able to enjoy video games the way you do video content today. On different devices. We've created a technology for remote graphics. It's rendered, compressed, and streamed to you so fast, you barely notice it's far away. You've seen us demonstrating pieces of it already. Whether it's streaming from GRID to Shield or from GeForce to Shield, we use GameStream.
Want to show you a few examples. Streaming from a PC to Shield.
Single best game machine that we all have is our PC. On the other hand, my favorite display is my 84" Sony 4K TV. Would be incredible if I could get graphics from my PC with Titan inside to my 84" Sony 4K TV.
Ok, we're gonna get a demo of streaming a PC game to Shield. Taking some time to explain. Jen-Hsun is talking about low latency and high quality video, how hard it was to get this right.
Interesting, so the Shield is plugged in to the TV and they're using a wireless Bluetooth controller connected to the Shield. So this is a little unexpected, given that Shield *is* a controller.
They're running Arkham Origins to the Shield over a wired GigE connection.
"The beautiful thing is you don't have to move your PC."
Yeah, it does look really good. Hard to see any artifacts at all.
This is a weird fit, putting Shield into "game console" mode, and GigE is a big requirement. Still, the tech does look to be solid. Would like to see it working over 802.11ac.
We want everyone to try it, so if you get a GTX 770, 780, or Titan, it will come with three games plus $100 off of Shield.
GTX 660 or 760 comes with two games and $50 off of Shield.
We'll launch this on Oct. 28, the same time the Shield game console mode goes live. There will be an OTA update for Shield and an update to GeForce Experience to make GameStream happen.
The next piece of technology I'd like to announce is ShadowPlay. If you want to record yourself while gaming, you don't want to affect your gaming experience by taxing the CPU. ShadowPlay uses the encoder in Kepler GPUs to record your gaming sessions. The "tax" is so low it can be always on, 1080p, auto-records the last 20 minutes if you'd like. Wonderful way to share with your friends. Also coming on October 28th.
What's really cool is combining ShadowPlay with GameStream and integrating to Twitch.tv. GeForce Experience streams directly to Twitch, easy integration.
Nvidia's Tom Petersen is gonna demo it. Feature is coming before the end of the year.
I think there's about to be a game on screen. Probably in a window.
Ah, here's the UI. You can set up where your webcam will be overlaid in the picture. Nifty! You can pick the resolution. Will upload to Twitch at 60 FPS.
Pulling up Splinter Cell: Blacklist. Fraps shows we're getting 100-120 FPS. Turning on ShadowPlay.
Still 120 FPS while streaming. So CPU-GPU load is negligible while streaming to Twitch.
There is a 2-3 second lag between the game and the Twitch stream, but they look very, very similar in quality.
Ok, demo's done. Time to talk 4K. Jen-Hsun's back.
Talking about 4K having, you know, a lot more pixels.
Later, in the back, have a look at 4K Surround. 24M pixels rendered silky-smooth.
To make this happen, GPU has to have enough horsepower and SLI has to work perfectly.
As you know, getting multiple GPUs to run in a synchronized way requires good technology. We've made sure SLI gives not just high frame rates but silky smoothness.
At the end of the day, the gamer's experience comes down to three things: stutter, lag, and tearing.
Those are as frustrating for a PC gamer as foot fungus or acne. You'd like them to go away.
These things have been the bane of our existence for as long as we remember.
More brain cells have been dedicated to eliminating these three conditions in PC gaming than any other.
Trouble is that monitor and GPU operate independently, autonomously. Monitor sample rate is different from the source rate, so you will see stutter and lag. If you try to overcome it, you will see tearing.
In a perfect world, the GPU would render into one buffer while the display scans out from another. When the scan is complete, it flips and scans out from the other completed buffer.
In this example, we're drawing faster than the refresh rate. In a perfect world, we're always ready for the scanout, so every 16 ms, we see a new frame. This is your flight sim experience, this is your game console experience. You could have so much performance, you have every single frame at 60Hz.
But reality is frame rates are highly variable.
Every scene is a little bit different. Could be dramatically different. In this example from BF3 on the GTX 760, it varies from 60Hz to 35Hz.
GPU has to wait until the monitor is done scanning all of its vertical lines before it flips to a new frame. If the frame draw takes longer than 16 ms, you have to wait until the next vertical sync. There are times where you effectively scan the same frame onto the monitor twice.
This is a fundamental problem where the frame rate is different than the sample rate of the monitor. That's stutter or, in TV talk, judder.
Triple buffering helps, but stutter and lag still happen to anyone who has vsync on. Only way to avoid it is to make sure you're rendering at some incredible rate so that the minimum rate stays above 60Hz. But that's really unlikely, and undesirable since we'd like content to get better, higher quality.
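The vsync behavior Jen-Hsun is describing is easy to model. Here's a toy Python sketch (the frame draw times are invented for illustration, not measured from any real game) of a double-buffered pipeline with vsync on at 60Hz: when a frame misses a refresh deadline, the previous frame gets scanned out again, and that repeat is the stutter.

```python
REFRESH_MS = 1000 / 60  # fixed 60Hz monitor refresh interval (~16.7 ms)

def scanned_frames(draw_times_ms, refreshes):
    """Frame index shown at each refresh tick with vsync on."""
    shown = []
    next_frame = 0
    ready_at = draw_times_ms[0]  # frame 0 finishes rendering at this time
    current = -1                 # -1 means nothing on screen yet
    for k in range(1, refreshes + 1):
        t = k * REFRESH_MS
        if next_frame < len(draw_times_ms) and ready_at <= t:
            current = next_frame  # back buffer is done: flip at this vsync
            next_frame += 1
            if next_frame < len(draw_times_ms):
                # the next frame starts only after the flip frees the buffer
                ready_at = t + draw_times_ms[next_frame]
        # else: the old frame is scanned out again -- that's the stutter
        shown.append(current)
    return shown

# Fast 10 ms frames: a new frame at every refresh, perfectly smooth.
print(scanned_frames([10, 10, 10, 10], 4))   # -> [0, 1, 2, 3]
# One slow 25 ms frame misses its deadline, so frame 0 is shown twice.
print(scanned_frames([10, 25, 10, 10], 5))   # -> [0, 0, 1, 2, 3]
```

The repeated index in the second run is exactly the "same frame scanned twice" case from the slide.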
This is a problem that needs to be solved at a fundamental level.
Gamers are smart. They want to win. So they turn off vsync and update as soon as the frame is complete.
However, the monitor can be scanning when a frame update happens. You'll end up scanning part of the last frame and part of a new frame. Creates an artifact called tearing.
Sometimes, you're just tearing all over the place. Lag is lower, but lots of tearing.
So the tradeoff is vsync with stutter and lag, or vsync off with tearing.
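A rough sketch of why vsync-off tears, with made-up numbers rather than anything from the demo: scanout walks down the screen's lines over one refresh interval, so a flip that lands mid-scan splits the screen between the old and new frames at whatever line the scan has reached.

```python
REFRESH_MS = 1000 / 60  # one full scan of the screen takes ~16.7 ms
LINES = 1080            # vertical lines scanned per refresh

def tear_line(flip_time_ms):
    """Scanline the tear lands on when a flip happens at flip_time_ms."""
    into_refresh = flip_time_ms % REFRESH_MS        # how far into the scan we are
    return round(into_refresh / REFRESH_MS * LINES)

# A flip exactly on a refresh boundary leaves no visible tear...
print(tear_line(2 * REFRESH_MS))        # -> 0
# ...but a flip 10 ms into a scan tears 60% of the way down the screen.
print(tear_line(2 * REFRESH_MS + 10))   # -> 648
```

With frame times bouncing around, the tear line lands somewhere different every frame, which is the "tearing all over the place" case.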
The problems come because the GPU and monitor are independent of each other.
For the last several years, our engineers have been working on trying to solve this problem. Only possible to solve by creating a new GPU, new monitor tech, and software that integrates the two.
Nothing frustrates us more than building the world's most powerful GPU and the sucker is tearing all over the place. Something fundamentally wrong about that. We had a fantastic team of engineers working to solve the problem.
Today, we'd like to announce one of the most important works we've ever done for computer graphics: Nvidia G-Sync.
It's an end-to-end architecture that starts with a Kepler GPU and ends with a brand-new technology, the G-Sync module, which goes into a gaming monitor, replaces the scaler, and achieves some magic we're gonna show you. The results will bring you so much joy in watching smooth, buttery graphics.
This is one of those things you have to experience.
Will be integrated into the top gaming monitors around the world. Asus, BenQ, Philips, ViewSonic have already signed up.
Here's what G-Sync does. Instead of the monitor controlling the timing with a fixed 60Hz refresh, we transfer the timing to the GPU. We shouldn't update the monitor until the frame is done drawing. And as soon as it is done drawing, we should update the monitor as soon as we can.
Logic is simple, but implementation is complicated. LCD monitors are complicated at the electronics level and at the molecular level.
Our engineers have figured out how to drive the monitors so the color continues to be vibrant and beautiful, so that the color and gamma are correct as timing fluctuates.
If we have an incredibly high frame rate, we drive the screen at the max refresh rate, as fast as it can be driven. That's now 144Hz. Lag drops to 2-3ms, stutter is eliminated. You're completely asynchronous to the frame buffer, and tearing is fundamentally gone.
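Put another way, here's a toy model of that behavior (draw times are invented for illustration, not anything Nvidia showed): with G-Sync, the refresh clock follows the GPU. Each frame is scanned out the moment it's done, with the panel's 144Hz maximum as the only floor on the interval between refreshes.

```python
MAX_HZ = 144
MIN_INTERVAL_MS = 1000 / MAX_HZ  # panel can't refresh faster than ~6.9 ms apart

def refresh_times(draw_times_ms):
    """Times at which a G-Sync-style monitor would refresh: one per frame."""
    times = []
    t = 0.0
    for draw in draw_times_ms:
        # wait for the frame, but never refresh sooner than the panel allows
        t += max(draw, MIN_INTERVAL_MS)
        times.append(round(t, 2))
    return times

# A slow 28.6 ms frame (35 fps) and a 16.7 ms frame (60 fps) each appear
# exactly once, the instant they finish: no repeats, no tearing. The 5 ms
# frame is clamped to the panel's 144Hz minimum interval.
print(refresh_times([28.6, 16.7, 5.0]))   # -> [28.6, 45.3, 52.24]
```

Contrast with the fixed-refresh case above: there's no deadline to miss, so a frame rate wandering between 35 and 60 fps never produces a repeated or torn frame.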
The "wow" is gonna come when you see the monitors in operation. They look like normal monitors. However, when you go into 3D mode, magic has to happen.
Welp, I've seen it, and it really, really works. I'm gonna need a new monitor.
Incredibly smooth animation without tearing. There's no way really to convey with conventional videos or the like. You have to see it for yourself. The crazy thing is that 40 FPS can look quite smooth.
Jen-Hsun: this is such a big deal, we invited three people to come talk about this. The first person was a juvie. As a kid, he was put in juvie school. He and I shared that in common, only I was put there accidentally. I had a bad lawyer. In his case, it was justified. The second person wrote ZZT. The third person, if you look up his name on Google, you'll get all the most beautiful actresses in the world as the result. I don't know why.
The reason we've invited these three people is that they happen to be some of the most important contributors to modern 3D graphics.
If not for their work, our contribution wouldn't have been possible. They're true artists, amazing technologists. Their contribution to the world is now completely unquestionable. The three of them have never appeared in the same place at once. This is the first time they've all fallen in love with a technology simultaneously and wanted to share the stage to talk about it.
Out come John Carmack, Tim Sweeney, and Johan Andersson.
Tim on G-Sync: "It's really nice." G-Sync gives all games a chance to make smooth animation. Don't have to target 60 FPS to get smooth animation. It's one of those experiences that's better without being easy to quantify, like using an iPhone the first time. Game devs won't have to assure they're hitting 60 FPS. Anywhere between 30 and hundreds of FPS, the game will continue to operate smoothly.
Johan: It really changes the perception quite a bit, and I was really surprised by that. This completely changes the perception of the picture, a continuous stream of frames that your mind interprets as a smooth picture.
We design games for a specific budget, but games now are massive experiences with very different environments. Impossible to render at 60 FPS all of the time. This, we don't have to worry about it as much, makes it much easier to create the experiences you want.
John: For years, the PC space has had a willful blind spot for the artifacts of tearing and stutter. We spend a lot of time on subtle effects, but have this egregious tear line running down the screen. We made the compromise to be 60Hz on our last title, but had to leave 20% or more of GPU power on the table. To be able to use all of the power you've got without suffering the stuttering or tearing is such a better experience. And once you go across 90-120 Hz, and enabling low-persistence modes, you'll see additional benefits. Only way to get games to higher frame rates is by having this incremental display tech. Frame rates can creep up, and it will make sense to have some crazy SLI system. It will be awesome.
Jen-Hsun: We have one more announcement to make, and then we'll have these guys come back and you can bombard them with questions.
If this is not one of the more important things we've ever done, I would be shocked. GPUs, work with game devs have contributed to advancing games in a big way, but this is one of those innovations that only comes along every so often. I'm super-excited about that.
Well, you can't really do an Nvidia press conference without graphics cards. I think I would disappoint you tremendously if we didn't announce a new GPU.
Today, we're announcing the Nvidia GTX 780 Ti. This is our new high-end enthusiast GPU. Reviewers will be getting this soon. Will be on shelf mid-November. Performance is outstanding and the power is low. It is cool, quiet, and fast.
When you measure it, I think you'll agree, this is the best GPU that's ever been built.
(Hmm. So I guess it's like Titan++? GK110 with all units enabled, perhaps?)
Yesterday, we announced GameWorks. Flex unified physics solver. World's first global illumination real-time lighting system called GI Works. FlameWorks.
This morning, we announced GameStream, which disconnects the computer from the display. It's essentially Nvidia's AirPlay. Very low latency, very fast gaming.
Also announced ShadowPlay. Available October 28. Streaming with very little tax on the system.
Lastly, we announced the revolutionary G-Sync. Changed the synchronization from the monitor to being controlled by the GPU. Created a G-Sync monitor, adopted by all the major gaming monitor companies. They'll be available in Q1 of next year.
And lastly lastly, the GTX 780 Ti. Our brand-new enthusiast GPU. Not only is it fast, but it's also quiet and low power. Works wonderfully in SLI driving three 4K displays in Surround mode.
Hope you got a sense of our perspective. How we think about enabling better gaming experiences. Thanks.
And we're done! I'm gonna go stalk Carmack. And maybe Sweeney. Know any good Canadian lawyers?