On this page, you can read about the basics of designing eye behavior for your avatars and NPCs:
Reflect the User’s Eyes
When creating eye behavior for player-controlled avatars in VR, it’s important to make the eyes behave naturally.
Consider how the eyes move in real life: they cannot move independently on the vertical axis, and the avatar should reflect this. If you allow the avatar's eyes to move independently on the vertical axis, the avatar may appear to have lazy eyes. Our eyes can, however, move independently on the horizontal axis, for example when converging on an object up close. This should be reflected in the avatar's eyes.
Make sure the eye movements are smooth in order to create stable-looking eyes. This is especially important when the user is looking at wide angles, where the tracking data is less reliable.
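The two rules above can be sketched as a small post-processing step on raw per-eye gaze angles. This is an illustrative example, not any specific SDK's API: the (pitch, yaw) tuples, the averaging of the vertical angle, and the smoothing factor are all assumptions you would tune for your own setup.

```python
def stabilize_eyes(left, right, prev, alpha=0.3):
    """Post-process raw per-eye gaze angles given as (pitch, yaw) in degrees.

    - Shares a single vertical (pitch) angle between both eyes, since real
      eyes cannot move independently on the vertical axis.
    - Keeps the horizontal (yaw) angles independent, so the eyes can still
      converge on objects up close.
    - Applies exponential smoothing toward the previous frame's output to
      keep the eyes stable, which matters most at wide angles where the
      tracking data is less reliable.
    """
    shared_pitch = (left[0] + right[0]) / 2.0  # one vertical angle for both eyes

    def smooth(prev_angle, target):
        return tuple(p + alpha * (t - p) for p, t in zip(prev_angle, target))

    return (smooth(prev[0], (shared_pitch, left[1])),
            smooth(prev[1], (shared_pitch, right[1])))

# Example: noisy input where the two eyes disagree on the vertical axis.
prev = ((0.0, 0.0), (0.0, 0.0))
left, right = stabilize_eyes((10.0, -3.0), (6.0, 3.0), prev)
```

Feeding each frame's output back in as `prev` on the next frame gives a simple low-pass filter; a lower `alpha` means more smoothing at the cost of slightly laggier eyes.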
Clearly Visible Eye Movements
Eye size, avatar distance, and headset resolution all play a part in how easy or difficult it is to see another avatar's eye movements.
When working with realistic-looking avatars, it's hard to increase the size of the avatar's eyes without breaking the realistic look, so consider scaling the environment so that users are close enough to see each other's eyes clearly.
Using more cartoony avatars with bigger eyes can also make eye movements easier to see at a distance.
Avoid scenarios where users are stuck too far away from each other to see each other's eyes.
Contrast the Eyes
In order for users to be able to see where an avatar is looking, it’s important to have a strong contrast between the white part of the eye (the sclera) and the iris/pupil.
This is inherent for realistic avatars, but it's important to consider when designing cartoony avatars that might not necessarily have a sclera. Having big eyes with a clear border around them also makes them more visible from a distance.
Make Avatars Look Alive
When our eyes move, several other parts of our face also move. For example, when you look upwards, small muscle movements happen in your forehead and eyebrows.
Using eye tracking, it is possible to simulate many of these facial movements.
Simulating facial movements with eye tracking makes the avatar look more alive and, at the same time, reduces the risk of an uncanny feeling when creating more realistic avatars.
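One simple way to drive such secondary motion is to map the vertical gaze angle to an eyebrow-raise weight (for example, a blendshape weight in your engine of choice). The sketch below is a minimal illustration; the threshold values and the 0..1 weight convention are assumptions, not part of any particular SDK.

```python
def eyebrow_raise_weight(gaze_pitch_deg, start_deg=5.0, full_deg=25.0):
    """Map upward gaze pitch (degrees) to a 0..1 eyebrow-raise weight.

    Below start_deg the brows stay at rest; between start_deg and full_deg
    the weight ramps up linearly; above full_deg the brows are fully raised.
    The thresholds are illustrative tuning values.
    """
    t = (gaze_pitch_deg - start_deg) / (full_deg - start_deg)
    return max(0.0, min(1.0, t))  # clamp to the valid weight range
```

The same pattern extends to other couplings, such as slightly lowering the eyelids when the user looks down.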
Enable Gaze Aware NPCs
If your non-player characters (NPCs) are aware of where the user looks, they can become more sophisticated and life-like.
The behavior of gaze-aware NPCs can be as simple or complex as you wish. Here are some suggestions:
- React to the user’s gaze by looking back
- Express varying levels of human behavior, such as aggressiveness, shyness, or paranoia
- Track where the user is looking to get a glimpse of what the user is interested in, for example by changing the conversation to whatever the user is paying attention to
Don’t make the NPCs react to your gaze instantly; rather, have them react after a short time threshold using the Dwell Activation state. This makes them appear more life-like, since they seem to take a short moment to “perceive” that you are looking at them before they react.
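A dwell threshold like this can be implemented as a small per-NPC timer updated every frame. The sketch below is a generic, engine-agnostic illustration; the class name, the 0.4-second default, and the frame-based `update` loop are assumptions, not a specific SDK's Dwell Activation API.

```python
class GazeDwell:
    """Track how long the user's gaze has rested on an NPC and trigger a
    reaction only after a dwell threshold, instead of instantly."""

    def __init__(self, dwell_time=0.4):
        self.dwell_time = dwell_time  # seconds the gaze must rest on the NPC
        self.elapsed = 0.0
        self.activated = False

    def update(self, is_gazed_at, dt):
        """Call once per frame; returns True on the frame the NPC should react."""
        if not is_gazed_at:
            self.elapsed = 0.0       # gaze left the NPC: reset the timer
            self.activated = False
            return False
        self.elapsed += dt
        if not self.activated and self.elapsed >= self.dwell_time:
            self.activated = True    # fire the reaction exactly once
            return True
        return False
```

Because `activated` latches until the gaze leaves the NPC, the reaction fires once per glance rather than every frame the threshold is exceeded.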