On this page, you can read about the basics of designing your Hand-Eye Coordination interactions:
Picking Up Objects
When picking up an object, you typically look at it before reaching out to grab it. You can use this to simplify the pick-up interaction to: look at the object and click.
This works well and feels natural, and it also removes the physical restriction of having to be close enough to an object to pick it up.
Give the user time to perceive that they are actually picking up the object. One way to do this is to make the object fly to the user's hand.
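The fly-to-hand effect can be sketched as a short eased interpolation from the object's position to the hand. A minimal example; the smoothstep easing and the 0.3-second duration are illustrative assumptions, not values from any particular engine:

```python
def fly_to_hand(obj_pos, hand_pos, t, duration=0.3):
    """Interpolate an object's position toward the hand over `duration` seconds.

    Smoothstep easing makes the object accelerate and then decelerate,
    which reads as "flying" to the hand rather than teleporting.
    """
    s = min(t / duration, 1.0)      # normalized time in [0, 1]
    s = s * s * (3.0 - 2.0 * s)     # smoothstep easing
    return tuple(a + (b - a) * s for a, b in zip(obj_pos, hand_pos))
```

Called once per frame with the elapsed time since the pick-up click, the object reaches the hand exactly when `t` reaches `duration`.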
Consider differentiating objects that can be picked up with gaze from other static objects by giving them a distinct color, and present dedicated visual feedback when they are looked at.
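The gaze-highlight-click loop above can be sketched as follows. This is a simplified model: the `pickable` flag, the name-based gaze test, and the function names are illustrative assumptions standing in for an engine's raycast and input APIs:

```python
from dataclasses import dataclass

@dataclass
class Item:
    name: str
    pickable: bool
    highlighted: bool = False

def update_gaze(items, gazed_item_name):
    """Highlight only the pickable item the user is currently looking at."""
    for item in items:
        item.highlighted = item.pickable and item.name == gazed_item_name

def try_pick_up(items, gazed_item_name, click_pressed):
    """Return the picked item if the user clicks while gazing at a pickable one."""
    if not click_pressed:
        return None
    for item in items:
        if item.pickable and item.name == gazed_item_name:
            return item
    return None
```

Keeping the highlight logic separate from the pick-up logic means the user always sees what a click would grab before committing to it.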
Throwing
Throwing in VR can be frustrating because you cannot feel the weight of an object, and you are holding the controller rather than the object itself. This makes it harder to judge how hard to throw and often leads to missed targets.
Because we naturally look at a target before throwing, we can use the gaze point to adjust the trajectory of the throw and make it more accurate. This helps users hit their targets and reduces the frustration of throwing in VR.
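One way to sketch this gaze assist: at release, compute the initial velocity that would carry the object to the gaze target under gravity, then blend the controller-measured velocity toward it. This is not any particular engine's API; the flight-time estimate and the `assist` weight are assumptions you would tune:

```python
GRAVITY = (0.0, -9.81, 0.0)

def ideal_velocity(origin, target, flight_time):
    """Initial velocity carrying a projectile from origin to target in
    `flight_time` seconds under gravity (from p = o + v*t + 0.5*g*t^2)."""
    return tuple(
        (t - o) / flight_time - 0.5 * g * flight_time
        for o, t, g in zip(origin, target, GRAVITY)
    )

def assisted_throw(raw_velocity, origin, gaze_target, flight_time, assist=0.5):
    """Blend the controller-measured velocity toward the gaze-corrected one.

    assist = 0.0 -> pure physics (throws miss as often as the user does),
    assist = 1.0 -> every throw lands on the gaze target.
    """
    ideal = ideal_velocity(origin, gaze_target, flight_time)
    return tuple(r + (i - r) * assist for r, i in zip(raw_velocity, ideal))
```

The `assist` weight is the knob for the power-versus-skill balance discussed below: raising it makes hits more reliable, lowering it preserves the challenge.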
Consider the balance between always hitting the target and rarely hitting it. Always hitting makes the user feel powerful and in control but removes much of the skill involved. Rarely hitting is frustrating, but it creates a learning curve along which the user can improve.
A scenario somewhere in between is usually optimal: a challenge for the user, but not an impossible one. Ultimately, it comes down to the use case. Sometimes you want to empower the user and give them superpowers, and sometimes you want to minimize frustration and make the experience more realistic.
The throw should match the user's expectations of how the object behaves. For example, a thrown stone should follow the laws of physics rather than act like a guided missile.
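One way to keep the throw plausible is to apply any gaze correction only once, at the moment of release, and then let plain gravity take over. A sketch under that assumption; the timestep and integrator choice are illustrative:

```python
GRAVITY = (0.0, -9.81, 0.0)

def simulate_flight(origin, velocity, duration, dt=0.01):
    """Integrate a released object under gravity alone (semi-implicit Euler).

    There is no per-frame homing toward the target: any gaze-based
    correction happens once at release, so the flight stays ballistic
    and the object behaves the way a thrown stone is expected to.
    """
    pos, vel = list(origin), list(velocity)
    for _ in range(int(round(duration / dt))):
        for i in range(3):
            vel[i] += GRAVITY[i] * dt   # gravity is the only force applied
            pos[i] += vel[i] * dt
    return tuple(pos)
```

Steering the object mid-flight would guarantee hits, but it breaks the expectation set by the object's appearance; correcting only the release velocity keeps the arc believable.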