Why are games so bad at animating eyes?

I will admit, I don’t personally get a strong “Uncanny Valley” feeling of unease from watching most computer generated animation. I’m usually paying attention to the characters’ mouth movements or how their hair or fur looks — you know, nerd shit.

Still, if you’ve ever really looked closely at a game character’s face, there’s usually no way to mistake it for a real live human face. Even the best efforts (The Last of Us, pictured above, comes to mind) always fall short in some respect. But why?

Over at Fastco Design, Mark Wilson explores some of the manifold reasons that human eyes remain a tall order for videogame animators. Some of it is pretty intuitive — we pay more attention to ‘realism’ in human faces than in other parts of the body — but a lot of it comes down to sheer processing power, and to the fact that we don’t really know why eyes do some of the things they do.

Hollywood studios, which can spend several hours rendering a single frame of a film, can use a technique called ray tracing. Ray tracing simulates real light passing through planes and bouncing around objects, essentially duplicating the physics of how light interacts with objects in our physical world. But video games don’t have several hours to render a frame, since players demand 30 to 60 frames per second for it to feel smooth. This means the Unreal Engine has just 16 ms to visualize everything in a frame.
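That 16 ms figure falls straight out of the frame rate. As a quick back-of-the-envelope check (my own illustration, not from the article):

```python
# At N frames per second, an engine has 1000 / N milliseconds
# to render everything in a single frame.
def frame_budget_ms(fps: float) -> float:
    return 1000.0 / fps

print(frame_budget_ms(30))  # ≈ 33.3 ms per frame
print(frame_budget_ms(60))  # ≈ 16.7 ms — the "16 ms" figure quoted above
```

Compare that with the several *hours* a film studio can spend ray tracing one frame, and the gap becomes obvious.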

When characters are removed from their carefully pre-choreographed contexts, the true limits of our understanding of eye logic float to the surface. For instance, we can make several saccades — or eye movements — voluntarily and involuntarily each second, for reasons that can be hard for scientists to quantify. Suddenly, for a character to be convincing on screen, they need to calculate and perform this confusing, visual logic for all other humans to judge.
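To get a feel for what even the simplest "eye logic" might involve, here’s a toy sketch — entirely my own invention, not Epic’s actual approach — that schedules random saccades at roughly human-plausible intervals of a few per second:

```python
import random

# Toy saccade scheduler: human eyes make a few rapid movements
# (saccades) per second at irregular-looking intervals. This sketch
# picks a new gaze shift every 200-600 ms of simulated time.
def simulate_saccades(duration_s: float, seed: int = 0) -> list[float]:
    """Return simulated timestamps (in seconds) of each saccade."""
    rng = random.Random(seed)
    t, times = 0.0, []
    while True:
        t += rng.uniform(0.2, 0.6)  # gap until the next saccade
        if t >= duration_s:
            return times
        times.append(round(t, 3))

print(simulate_saccades(2.0))  # a handful of saccades over two seconds
```

Even this crude version needs tuning by hand; a believable character also has to decide *where* to look and why, which is exactly the part science can’t fully explain yet.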

“You need an AI eyeball. You need an AI to do saccading with the correct speed,” says [Epic Games senior programmer Brian] Karis. “Even if you’re not going to speak to a character, having it sit in a chair, acting like a human, shifting its weight, looking like a believable passive person, that’s a challenge!”

I don’t know about you, but I’ve never thought about videogame characters as having their own eyeball AI. There are other complicating factors as well, like the fact our eyes are technically several transparent layers all refracting light through one another, and also made of gross goo milk stuff. (OK, that part’s a paraphrase.)

Head on over to Fastco Design for the full article. Wilson’s writing is very accessible and conversational, and it’s a great Friday read!

(h/t Gamasutra.)

Disclosure: Zam’s parent company Tencent owns a majority share of Epic Games, employees of which are quoted in the article. Epic Games has no control over our editorial.