Advanced

This page covers more advanced topics in designing for social avatars: gaze filters, facial expressions, and a fallback gaze control system.


Gaze Filters

When creating eye behavior for player-controlled avatars in VR, it’s important to make the eyes behave naturally. Here we take some of nature’s rules and apply them to the gaze signal to get as close to real eye behavior as possible.

As highly social animals, humans need to be able to quickly and easily identify people and their moods by sight, which means we have evolved a heightened sensitivity to facial features and expressions. Eye movements that do not follow “the rules” are immediately noticeable, so it is worth spending some extra time filtering out those Uncanny Valley candidates as early as possible. Here are some suggestions on how to use those “rules” to filter the gaze signal and create a more robust and natural-looking avatar eye control system.

Divergence/Convergence

Real-life eyes converge when viewing close-up objects (or when making silly faces), but they almost never diverge. We can use this fact to filter out divergence, strengthening the signal and hence the realism. If the gaze directions of the left and right eyes diverge, combine them and use the average for both eyes.
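
A minimal sketch of this filter, assuming each eye’s gaze has been reduced to a horizontal angle (“yaw”) in degrees where positive values mean the eye looks towards the wearer’s right; the function name and conventions are illustrative, not from any particular SDK.

```python
def filter_divergence(left_yaw: float, right_yaw: float) -> tuple[float, float]:
    """Eyes may converge (left eye rotated further right than the right eye),
    but they should never diverge. If the signal diverges, fall back to the
    averaged horizontal angle for both eyes."""
    if left_yaw < right_yaw:              # outward-opening angle: divergence
        average = 0.5 * (left_yaw + right_yaw)
        return average, average
    return left_yaw, right_yaw            # convergence or parallel: keep as-is


print(filter_divergence(-2.0, 3.0))       # diverging -> (0.5, 0.5)
print(filter_divergence(4.0, -1.0))       # converging -> unchanged
```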

Combined Vertical Axis

Eyes essentially never point in different directions on the vertical axis; you rarely see someone with one eye looking up and the other looking down. This physiological characteristic can be used to our advantage by combining the left and right eyes’ vertical components, delivering a more robust and realistic result.
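
A minimal sketch, assuming each eye’s gaze is a (yaw, pitch) pair in degrees; the names are illustrative.

```python
def combine_vertical(left: tuple[float, float],
                     right: tuple[float, float]) -> tuple[tuple[float, float], tuple[float, float]]:
    """Eyes do not look up and down independently, so share one averaged
    pitch between them while keeping each eye's own yaw."""
    shared_pitch = 0.5 * (left[1] + right[1])
    return (left[0], shared_pitch), (right[0], shared_pitch)


# Both eyes end up with a pitch of 6.0 degrees.
print(combine_vertical((3.0, 5.0), (-2.0, 7.0)))
```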

Smoothing

Jittering eyes can appear extremely unnerving. A simple averaging filter can be used to great effect, bearing in mind that too much smoothing can make the eyes look sleepy or unwell. You might also consider increasing the amount of smoothing at larger gaze angles, since signal noise tends to increase there.
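
A minimal smoothing sketch using exponential averaging, with the blend factor lowered at larger gaze angles where the signal tends to be noisier. The class name and constants are illustrative starting points, not tuned values.

```python
import math


class GazeSmoother:
    def __init__(self, base_alpha: float = 0.5, min_alpha: float = 0.15):
        self.base_alpha = base_alpha      # per-frame blend factor near the center
        self.min_alpha = min_alpha        # blend factor at extreme angles
        self.smoothed = (0.0, 0.0)        # last smoothed (yaw, pitch) in degrees

    def update(self, yaw: float, pitch: float) -> tuple[float, float]:
        # Scale the blend factor down as the gaze moves away from center,
        # so noisy wide-angle samples contribute less each frame.
        eccentricity = min(math.hypot(yaw, pitch) / 35.0, 1.0)
        alpha = self.base_alpha + (self.min_alpha - self.base_alpha) * eccentricity
        old_yaw, old_pitch = self.smoothed
        self.smoothed = (old_yaw + alpha * (yaw - old_yaw),
                         old_pitch + alpha * (pitch - old_pitch))
        return self.smoothed


smoother = GazeSmoother()
for sample in [(10.0, 2.0), (11.5, 1.5), (30.0, 3.0)]:
    print(smoother.update(*sample))
```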

Clamping

A typical pair of human eyes rarely looks up more than 25 degrees or down more than 30 degrees, nor further than 35 degrees to the left or right. This offers another small filtering opportunity for avatar eye control. Angular clamping is useful by itself, but even more so for avatar models designed with larger eyeballs and smaller eye openings.
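
A minimal clamping sketch using the rough limits above (25 degrees up, 30 degrees down, 35 degrees left/right); the sign conventions (positive yaw = right, positive pitch = up) are assumptions.

```python
def clamp_gaze(yaw: float, pitch: float) -> tuple[float, float]:
    """Keep the gaze angles within a plausible human range (degrees)."""
    yaw = max(-35.0, min(35.0, yaw))      # no further than 35 degrees left/right
    pitch = max(-30.0, min(25.0, pitch))  # no more than 30 degrees down, 25 up
    return yaw, pitch


print(clamp_gaze(42.0, -31.0))   # -> (35.0, -30.0)
print(clamp_gaze(10.0, 5.0))     # already in range -> unchanged
```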

Cross Eyed Correction

On the subject of avatar model design, a model can appear cross-eyed (or boss-eyed) even when it actually is not; adding an option to adjust for this is good design practice.
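
One way to provide such an option is a small, artist-tunable outward yaw offset applied per eye; the parameter name and default below are purely illustrative.

```python
def correct_cross_eye(left_yaw: float, right_yaw: float,
                      outward_offset_deg: float = 1.5) -> tuple[float, float]:
    """Rotate each eye slightly outward (left eye further left, right eye
    further right) by a per-model tuning value, counteracting models that
    read as cross-eyed."""
    return left_yaw - outward_offset_deg, right_yaw + outward_offset_deg


print(correct_cross_eye(0.0, 0.0))   # -> (-1.5, 1.5)
```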


Facial Expressions

It’s not only the eyes that bring an avatar to life; facial movements also play a significant role. Although we cannot accurately predict facial expressions from eye direction alone, we can link eye motion to subtle facial expressions to make avatars feel more alive.

Feature Transitions

Simple facial feature transitions can be used to bring a face to life, for example moving the eyebrows and the mouth.

It’s important not to overdo facial expressions based on the user’s eye movements. Users might be unaware of the facial expressions their avatars are making, and those expressions might not reflect their real expressions and mood.
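
As one example of such a transition, the sketch below maps upward gaze to a modest eyebrow raise; the mapping, range, and cap are illustrative assumptions, and the cap keeps the expression deliberately subtle.

```python
def eyebrow_raise_from_gaze(pitch_deg: float,
                            max_pitch: float = 25.0,
                            max_weight: float = 0.3) -> float:
    """Map upward gaze (0..max_pitch degrees) to an eyebrow-raise weight,
    capped well below 1.0 so the expression stays subtle. Downward gaze
    leaves the brows at rest."""
    if pitch_deg <= 0.0:
        return 0.0
    return min(pitch_deg / max_pitch, 1.0) * max_weight


print(eyebrow_raise_from_gaze(12.5))   # halfway up -> 0.15
print(eyebrow_raise_from_gaze(-5.0))   # looking down -> 0.0
```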

Blend Shapes

Using Blend Shapes (or morph targets) with eased animation curves allows for more complex and convincing expressions.
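
A minimal sketch of easing a blend shape weight toward a target with a smoothstep curve rather than a linear ramp; the function names and the frame loop are illustrative.

```python
def smoothstep(t: float) -> float:
    """Classic ease-in/ease-out curve on the 0..1 interval."""
    t = max(0.0, min(1.0, t))
    return t * t * (3.0 - 2.0 * t)


def blend_shape_weight(start: float, target: float, progress: float) -> float:
    """Interpolate a blend shape (morph target) weight with eased progress."""
    return start + (target - start) * smoothstep(progress)


# Ramp a hypothetical "browRaise" shape from 0 to 1 over ten steps.
for step in range(11):
    print(round(blend_shape_weight(0.0, 1.0, step / 10.0), 3))
```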

Micro Expressions

Real living faces are rarely motionless; facial muscles are constantly shifting and adjusting very slightly. These movements are known as “micro expressions”. Adding this kind of subtle random motion can add an extra touch of realism.
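
A minimal micro-expression sketch: tiny, slowly drifting random offsets layered on top of the base weights of a few facial blend shapes. The shape names, amplitude, and drift rate are illustrative assumptions.

```python
import random


class MicroExpressions:
    def __init__(self, amplitude: float = 0.03, drift: float = 0.2, seed: int = 0):
        self.amplitude = amplitude        # maximum offset per shape (0..1 weights)
        self.drift = drift                # how far an offset may wander per update
        self.rng = random.Random(seed)
        self.offsets = {"browRaise": 0.0, "mouthCorner": 0.0, "squint": 0.0}

    def update(self) -> dict[str, float]:
        for name, value in self.offsets.items():
            value += self.rng.uniform(-self.drift, self.drift) * self.amplitude
            # Keep each offset tiny so the motion stays barely perceptible.
            self.offsets[name] = max(-self.amplitude, min(self.amplitude, value))
        return dict(self.offsets)


micro = MicroExpressions()
for _ in range(3):
    print(micro.update())
```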

Fallback Gaze Control System

Although eye tracking might be a killer feature in social applications, you don’t want users without an eye tracker, or the few whose eyes cannot be tracked, to suffer unnecessarily. It is therefore worth building in some kind of fallback system. There are off-the-shelf solutions that can add some form of eye movement in such cases; they can also add random blinking, micro and macro saccades, and facial micro expressions.
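
If no off-the-shelf solution fits, a simple fallback can be built by hand. The sketch below, with purely illustrative timings and amplitudes, generates occasional blinks and small random saccades whenever no eye tracking signal is available.

```python
import random


class FallbackGaze:
    def __init__(self, seed: int = 0):
        self.rng = random.Random(seed)
        self.yaw = 0.0                    # current gaze angles in degrees
        self.pitch = 0.0
        self.time = 0.0                   # seconds since start
        self.next_saccade = 0.0
        self.next_blink = 0.0

    def update(self, dt: float) -> tuple[float, float, bool]:
        """Advance the simulation by dt seconds; returns (yaw, pitch, is_blinking)."""
        self.time += dt
        if self.time >= self.next_saccade:
            # Jump to a new small random gaze target every 0.5-3 seconds.
            self.yaw = self.rng.uniform(-10.0, 10.0)
            self.pitch = self.rng.uniform(-5.0, 5.0)
            self.next_saccade = self.time + self.rng.uniform(0.5, 3.0)
        blinking = False
        if self.time >= self.next_blink:
            # Blink for a single update every 2-6 seconds in this simple sketch.
            blinking = True
            self.next_blink = self.time + self.rng.uniform(2.0, 6.0)
        return self.yaw, self.pitch, blinking


fallback = FallbackGaze()
for _ in range(5):
    print(fallback.update(0.5))
```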