Google Stadia promises low lag with ‘negative latency’

Not every game requires split-second, frame-perfect response time. The games that do, however, make a good case against cloud-based gaming. Google wants to change all that with Google Stadia and something it’s calling “negative latency.”

Talking to Edge, Stadia’s VP of engineering, Majd Bakar, said that Stadia will be more responsive than in-home gaming solutions.

“Ultimately, we think in a year or two, we’ll have games that are running faster and feel more responsive in the cloud than they do locally, regardless of how powerful the local machine is,” Bakar said.

Bakar explained negative latency (which sounds like fuzzy math or one of those other shady terms) as a buffer of predicted latency between the server and the player. The server can then do things like run the game at a super-fast framerate or predict a player’s button presses.
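To make the idea concrete, here’s a toy sketch of speculative input prediction, the general technique Bakar seems to be describing. This is an illustration, not Google’s actual implementation: the server simulates the next frame with a *guessed* input, and if the guess matches the input that actually arrives, the speculative frame ships with no added wait; if not, the server falls back to re-simulating with the real input.

```python
from dataclasses import dataclass

@dataclass
class GameState:
    position: int = 0

def simulate(state: GameState, button: str) -> GameState:
    """One frame of a trivial game: 'right' moves the player forward."""
    delta = 1 if button == "right" else 0
    return GameState(position=state.position + delta)

def predicted_input(history: list) -> str:
    """Naive predictor: assume the player repeats their last input."""
    return history[-1] if history else "none"

def server_frame(state: GameState, history: list, real_input: str):
    guess = predicted_input(history)
    speculative = simulate(state, guess)       # rendered before the real input arrives
    if guess == real_input:
        return speculative, True               # prediction hit: no added wait
    return simulate(state, real_input), False  # miss: discard and re-simulate

state, history = GameState(), []
for pressed in ["right", "right", "right", "left"]:
    state, hit = server_frame(state, history, pressed)
    history.append(pressed)
    print(pressed, state.position, "hit" if hit else "rollback")
```

The interesting part is the miss path: a real streaming service would have to hide that re-simulation inside its latency buffer, which is presumably where the “buffer of predicted latency” framing comes from.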

According to Edge, Bakar said that with these techniques, a game could feel more responsive in the cloud than a console game running locally at 30fps with a wireless controller.

Let’s pick through that.

Negative latency is a silly term

But this isn’t as ridiculous as the “negative latency” term makes it sound. A game running at 30fps using a wireless controller inherently has a lot more latency than even a 60fps game or something running at 120fps. A game running at 30fps updates 30 times per second instead of 120; there’s much less input for Google to account for and more space in which to do it. And while the latency a wireless controller introduces is generally imperceptible, it stacks up with all that other latency.
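The arithmetic behind that comparison is simple enough to write down. The numbers below are illustrative assumptions, not measurements: one full frame time as the worst-case wait before an input can affect the next displayed frame, plus a ballpark figure for Bluetooth-class controller delay.

```python
# Back-of-the-envelope latency budgets. Illustrative numbers only:
# an input sampled just after a frame starts waits roughly one full
# frame time before it can affect the next displayed frame, and a
# wireless controller adds a few milliseconds on top.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

WIRELESS_CONTROLLER_MS = 8.0  # assumed Bluetooth-class polling delay

for fps in (30, 60, 120):
    worst_case = frame_time_ms(fps) + WIRELESS_CONTROLLER_MS
    print(f"{fps:>3} fps: frame time {frame_time_ms(fps):5.1f} ms, "
          f"worst-case input-to-next-frame {worst_case:5.1f} ms")
```

At 30fps with a wireless controller that worst case lands around 41ms before rendering even starts, which is the slack Google is proposing to hide its network round trip inside.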

Just putting those qualifiers on this idea makes the stars Google is reaching for much closer; matching the experience of a StarCraft II or Street Fighter V pro player on a 144Hz display with a wired controller is a different thing altogether. Those players have taken measures to minimize latency in play. An average gamer, on the other hand, might be playing on a game console or an older computer with more limited hardware.

The idea of Google predicting my button presses makes me uncomfortable, but I believe it’s possible; Google has the vast experience with machine learning and artificial intelligence necessary to do exactly that. It already knows that when I search for “sink grind-y thing” because I can’t remember the right word that I mean “garbage disposal.” Single-player games are much more predictable than multiplayer, too.
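Prediction doesn’t even require heavy machine learning to get started. A minimal sketch, and only a sketch of the principle rather than anything Google has described: a first-order Markov model that counts which button most often follows each button, exploiting exactly the regularity that makes single-player input streams predictable.

```python
from collections import Counter, defaultdict

class InputPredictor:
    """Predict the next button as the one that has most often
    followed the current button (first-order Markov model)."""

    def __init__(self):
        self.follows = defaultdict(Counter)  # button -> counts of next buttons
        self.last = None

    def observe(self, button: str):
        if self.last is not None:
            self.follows[self.last][button] += 1
        self.last = button

    def predict(self) -> str:
        counts = self.follows.get(self.last)
        if not counts:
            return "none"  # no history for this button yet
        return counts.most_common(1)[0][0]

p = InputPredictor()
for b in ["jump", "run", "jump", "run", "jump"]:
    p.observe(b)
print(p.predict())  # prints "run": it has always followed "jump" so far
```

A production system would presumably model much longer input histories and game context, but the payoff is the same: the better the predictor, the more often the speculative path wins.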

How this works in practice remains to be seen, and it’s possible it might even stink at first. But if Google sticks with the technology, it’ll likely get better and better. Google Stadia is launching in November, so we’ll be able to see how well it works soon.

Ztemp

You cannot break the laws of physics, Jim!

DeadOfKnight

Sounds similar to RetroArch’s run-ahead mode for emulation.

grimdanfango

Nonsense or not, the actual technology is still a solution in search of a problem.

People into high-fidelity game experiences have gaming hardware. People who aren’t just play smartphone games on the phones they already have.

Casual gamers might be fine with Stadia… but they won’t be interested in it.

Who exactly is going to bother with this? Did a whole new bizarrely uncatered-for demographic spring up out of nowhere since OnLive flopped?

The lost cat

I’d suspect a lot of people are tired of buying storage, and endless patching, and not being able to play with whatever hardware they have on them when they have time to kill…
I plan to try it myself, though I’m not preordering.

psuedonymous

Just like Microsoft’s old Project DeLorean lag-compensation technique. It has the key failing that the end client needs to receive all the frame variants before selecting the one to display, which means the client can potentially analyse the incoming frames, select the preferable one (e.g. the hit rather than the miss), and synthesise the input for it to return to the server as the ‘real’ input. Google’s claim that this will be ‘faster than local rendering’ is pure bullshit. Not only are the techniques they describe just as applicable to local rendering as remote rendering, they are already implemented…

brucek2

If there’s anything real to this tech, it could just as easily be implemented on a local machine too, thus allowing local to retain its advantage.

Heiwashin

Has anyone pointed out that OnLive tried this years ago and failed miserably? Latency hasn’t improved that much since then, and the predictive shit is smoke and mirrors; if they actually tried to use predictive input in high-accuracy, high-speed games, the experience would be more frustrating than the latency.

Vaughn

Unless Google plans to roll out fiber to everyone in the USA, I will believe this when I see it.


Bob

Um, isn’t predictive button pressing just a form of cheating?
I mean, unless you control the whole game engine and all clients, that is pretty much just cheating glossed over.

chuckula

Just have the bots replace the humans and make the game controllers be like those door-close buttons on elevators that don’t actually do anything!

Latency problem solved.

Wirko

Elevators! You’re on to something here.
Elevators with negative latency

willmore

It actually sounds pretty reasonable to me. Google has proven to be very good at using neural nets to predict behaviors based on past behaviors. Gaming, with its limited set of controls and nicely time-quantized frame-to-frame logic, seems like a very reasonable application for it. Yes, calling it negative latency is moronic, but I’m willing to overlook that. Let’s say they can easily generate 120 frames/s but only need to deliver 30 of them. They could easily generate four possible outcome frames (what if they kept the controls as they are? What if they did the most likely change?…

XaiaX

“They could easily generate four possible outcome frames” — this tech exists and has been used for retro game emulation, but it only works there because it’s effectively doing branch prediction: following the possibilities down each path and discarding the counterfactual ones. That’s a lot easier to do when your entire system is, say, an NES with 256KB of memory than when you’re playing some modern game with an irreversible game state. The big challenge here isn’t going to be the neural network modeling and the control prediction; it’ll be altering the game engines to be able to have…

Krogoth

“Anti-Lag”: I liked it better when it was called gaming under the influence.

Meadows

It’s 20+ year old tech, except back in Q3A it was called “time nudge” and didn’t try to guess ghost button presses.
