On this page, you can read about the basics of designing hand-eye coordination interactions.

Picking Up Objects

When picking up an object, you likely look at it before reaching out to grab it. You can use this behavior to simplify the pick-up process to: look at the object and click.

This works well and feels nice, and it also removes the physical restriction of having to be close enough to an object to pick it up.

Give the user time to perceive that they are actually picking up the object. One way to do this is by making the object fly to the user's hand.
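The fly-to-hand motion can be sketched as a per-frame lerp toward the hand. The snippet below is a minimal, engine-agnostic illustration, assuming a 90 Hz frame loop and plain position tuples; the function names and the `speed` parameter are hypothetical, not any particular engine's API:

```python
def lerp(a, b, t):
    """Linear interpolation between two 3D points."""
    return tuple(a[i] + (b[i] - a[i]) * t for i in range(3))

def fly_to_hand(object_pos, hand_pos, speed=8.0, dt=1 / 90):
    """Move the picked-up object one step toward the hand.

    Returns the new position; call once per frame until the object arrives.
    A higher speed shortens the flight but gives the user less time to
    perceive the pick-up.
    """
    t = min(1.0, speed * dt)
    return lerp(object_pos, hand_pos, t)

# Simulate the object flying from 2 m away to the hand over ~60 frames.
pos, hand = (0.0, 0.0, 2.0), (0.0, 1.2, 0.3)
for _ in range(60):
    pos = fly_to_hand(pos, hand)
```

Because each step covers a fixed fraction of the remaining distance, the object starts fast and eases in as it reaches the hand, which reads naturally to the user.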

Consider differentiating objects that can be picked up with gaze from other static objects, for example with a distinct color, and giving them their own visual feedback when looked at.


Throwing Objects
By using the fact that we naturally look at a target before throwing, we can adjust the trajectory of the throw to be more accurate.

This helps the user achieve their goal of hitting the targets and reduces the frustration of throwing in VR.
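One common way to implement this assist is to blend the release velocity's direction toward the gazed target while preserving the throw's speed. The sketch below is a minimal illustration under that assumption; the `assist` parameter and function names are hypothetical:

```python
import math

def normalize(v):
    """Return the unit-length version of a 3D vector."""
    m = math.sqrt(sum(c * c for c in v))
    return tuple(c / m for c in v)

def assisted_throw_velocity(release_velocity, hand_pos, gaze_target, assist=0.5):
    """Blend the raw throw direction toward the gaze target.

    assist=0 keeps the purely physical throw; assist=1 aims straight at
    the target. The throw's speed (magnitude) is preserved either way.
    """
    speed = math.sqrt(sum(c * c for c in release_velocity))
    thrown_dir = normalize(release_velocity)
    target_dir = normalize(tuple(gaze_target[i] - hand_pos[i] for i in range(3)))
    blended = normalize(tuple(
        (1 - assist) * thrown_dir[i] + assist * target_dir[i] for i in range(3)
    ))
    return tuple(speed * c for c in blended)
```

Tuning `assist` directly controls how forgiving the throw feels, which ties into the difficulty balance discussed below.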

Throwing in VR can be frustrating because you cannot feel the weight of an object, and you are not holding the actual object but rather the controller. This makes it harder to judge how hard to throw and often leads to missed targets.

Consider the balance between always hitting the target and rarely hitting it. Always hitting the target makes you feel powerful and in control, but removes much of the skill involved. Rarely hitting the target is frustrating, but offers a learning curve where you can improve.

A scenario somewhere in between is usually best: a challenge for the user, but not an impossible one. It all boils down to the use case. Sometimes you want to empower the user and give them superpowers, and sometimes you want to minimize frustration and make the experience more realistic.

The throw should match the user's expectations of its behavior. For example, a thrown stone should follow the laws of physics rather than act like a guided missile.


Telekinesis
Manipulating objects from a distance using gaze and telekinesis lets the user quickly arrange the environment and its objects to their preference. This includes things like opening doors and drawers, as well as moving objects around in the environment.

Selecting objects with a pointer instead of gaze can be tricky. With gaze selection it "just works," and it's quicker.

Pointer selection also becomes tiring after a while, and it can be hard to select objects at certain angles, such as an object on the floor directly in front of you.
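A simple model of gaze selection is to pick the selectable object whose direction lies closest to the gaze ray, within some angular tolerance. The sketch below assumes position tuples and a hypothetical `max_angle_deg` threshold; it is not a specific eye-tracking SDK's API:

```python
import math

def gaze_select(eye_pos, gaze_dir, objects, max_angle_deg=5.0):
    """Return the name of the object closest to the gaze ray, or None.

    Objects within max_angle_deg of the gaze direction are candidates;
    the one with the smallest angular offset wins.
    """
    def normalize(v):
        m = math.sqrt(sum(c * c for c in v))
        return tuple(c / m for c in v)

    gaze_dir = normalize(gaze_dir)
    best, best_angle = None, math.radians(max_angle_deg)
    for name, pos in objects.items():
        to_obj = normalize(tuple(pos[i] - eye_pos[i] for i in range(3)))
        dot = max(-1.0, min(1.0, sum(gaze_dir[i] * to_obj[i] for i in range(3))))
        angle = math.acos(dot)
        if angle < best_angle:
            best, best_angle = name, angle
    return best
```

Using an angular tolerance rather than an exact ray hit is what makes gaze selection feel like it "just works," since eye-tracking data is inherently a little noisy.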

By driving the object's movement with a telekinesis lerp whose behavior depends on controller velocity and rotation, the experience feels fluid for the user.

Manipulating objects in a zero-gravity environment with air drag makes the experience feel even better.
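These two ideas can be combined in one per-frame update: lerp the held object's velocity toward the controller's velocity, apply drag, and skip gravity while the object is held. The snippet is a minimal sketch assuming a 90 Hz loop; the `lerp_rate` and `drag` parameters are hypothetical tuning values:

```python
def telekinesis_step(obj_vel, controller_vel, lerp_rate=6.0, drag=0.5, dt=1 / 90):
    """Advance the held object's velocity one frame.

    The object's velocity lerps toward the controller's velocity, then
    air drag damps the result; no gravity is applied while it is held.
    """
    t = min(1.0, lerp_rate * dt)
    return tuple(
        (obj_vel[i] + (controller_vel[i] - obj_vel[i]) * t) * (1.0 - drag * dt)
        for i in range(3)
    )
```

The lerp makes the object lag slightly behind the controller, and the drag keeps it from drifting when the controller stops, which together give the motion its fluid feel.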

It's a good idea to make the telekinesis sensitivity dependent on distance, making it less sensitive when objects are closer to the user.
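A simple way to express this is to scale sensitivity linearly between a near and a far distance, clamped at both ends. All of the parameter names and default values below are hypothetical tuning choices, not values from any engine:

```python
def telekinesis_sensitivity(distance, near=0.5, far=5.0, min_s=0.2, max_s=1.0):
    """Scale telekinesis sensitivity with distance to the object.

    Close objects get low sensitivity for fine control; far objects get
    high sensitivity for coarse moves, clamped between min_s and max_s.
    """
    t = (distance - near) / (far - near)
    t = max(0.0, min(1.0, t))
    return min_s + (max_s - min_s) * t
```

This keeps nearby objects precise to position while still letting the user fling distant objects across the room with small controller movements.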