Motion Tracking, the process of digitising your movements for use in computer software, is a key component of VR systems. Without VR motion tracking, you would find yourself restricted in the virtual world, unable to look around, move, and explore.
Being able to engage and interact with the virtual world the moment you step into a CAVE or put on your VR headset – without being reminded of the real world – is crucial to the creation of a truly immersive experience.
To understand how an object is able to move in three dimensional space, we need to look at the concept of six degrees of freedom (6DoF), which refers to the freedom of movement of a rigid body in 3D space.
Essentially, the body is free to move forwards or backwards, up or down, and left or right along three perpendicular axes – that's 3DoF. Combine this with rotation around each of those axes (roll, pitch, and yaw) and you have the full 6DoF.
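As a minimal sketch (in Python, purely for illustration, not any particular engine's API), a 6DoF pose can be modelled as three positions plus three rotations. Even a single yaw rotation changes what "moving forward" means:

```python
import math
from dataclasses import dataclass

@dataclass
class Pose6DoF:
    # Translation along the three perpendicular axes (3DoF)
    x: float = 0.0  # left/right
    y: float = 0.0  # up/down
    z: float = 0.0  # forwards/backwards
    # Rotation around those same axes (the other 3DoF), in radians
    pitch: float = 0.0  # around the left/right axis
    yaw: float = 0.0    # around the up/down axis
    roll: float = 0.0   # around the forwards/backwards axis

    def step_forward(self, distance: float) -> None:
        """Move 'forward' in whatever direction the body currently faces (yaw only)."""
        self.x += distance * math.sin(self.yaw)
        self.z += distance * math.cos(self.yaw)

pose = Pose6DoF()
pose.yaw = math.pi / 2   # turn 90 degrees
pose.step_forward(1.0)   # one unit 'forward' now moves along x, not z
```

This is why tracking rotation matters as much as position: without yaw, the same "step forward" command would always move the user along the same world axis, regardless of where they are looking.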
The virtual world must mimic the movements that we do in the real world, like using our hands, moving our heads, and moving around a room, but the degrees of immersion vary depending on the application:
For some applications, like digital prototyping for the automotive industry, tracking is necessary and will either make or break your state of immersion.
In other cases, a Virtual Reality or simulation experience might need a more fixed approach, e.g. a flight simulator where the person is seated in a cockpit using a joystick.
There is already a range of software and technology for making the most of exploring a virtual environment, so let's look at the two main types of movement tracking and how they work: optical and non-optical tracking.
Optical tracking is where an imaging device is used to track the body motion of an individual. The person being tracked either holds handheld controllers or wears an HMD (Head Mounted Display) fitted with trackers, or wears optical markers placed on certain parts of the body. More advanced options can also use sound waves or magnetic fields.
To track the movements of the user's point of view in a VR CAVE, a number of tracking cameras send signals to adjust the images seen by the wearer as they move around the VR environment.
To maintain immersion in the VR environment, the tracking of the VR glasses needs to be highly accurate. Leading manufacturers like ART or Vicon specialise in this, ensuring that the user's viewpoint in the 3D environment behaves the same way it would in the real world. Any delay or latency caused by inaccurate tracking would create a disconnect between the two.
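To illustrate the principle behind camera-based tracking (a simplified 2D sketch, not any particular manufacturer's method), consider two cameras on a known baseline that each report a bearing angle to a marker; the marker sits where the two sight lines intersect:

```python
import math

def triangulate_2d(baseline: float, angle_a: float, angle_b: float) -> tuple:
    """Locate a marker from bearing angles seen by two cameras.

    Camera A sits at (0, 0) and camera B at (baseline, 0); each angle is
    measured from the baseline towards the marker, in radians.
    """
    ta, tb = math.tan(angle_a), math.tan(angle_b)
    # Sight lines: y = x * ta (from A) and y = (baseline - x) * tb (from B)
    x = baseline * tb / (ta + tb)
    y = x * ta
    return x, y

# A marker 1 m in front of the midpoint of cameras 2 m apart:
# both cameras see it at 45 degrees from the baseline.
x, y = triangulate_2d(2.0, math.radians(45), math.radians(45))
# → approximately (1.0, 1.0)
```

Real systems do this in 3D with several cameras and many markers per rigid body, which is also how they recover orientation as well as position, but the underlying geometry is the same.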
Non-optical tracking makes use of microelectromechanical sensors (MEMS) that are installed in hardware or attached to the body to measure and track movements. These are typically gyroscopes, magnetometers, and accelerometers.
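As an illustrative sketch of how such sensors are combined (a basic complementary filter, not any specific product's algorithm): the gyroscope gives smooth short-term rotation rates but drifts over time, while the accelerometer provides a noisy but drift-free reference from gravity:

```python
import math

def accel_to_pitch(ax: float, ay: float, az: float) -> float:
    """Estimate pitch from the gravity vector measured by the accelerometer."""
    return math.atan2(-ax, math.sqrt(ay * ay + az * az))

def complementary_filter(pitch: float, gyro_rate: float,
                         accel_pitch: float, dt: float,
                         alpha: float = 0.98) -> float:
    """One update step: trust the integrated gyro short-term (alpha),
    and the accelerometer reference long-term (1 - alpha)."""
    return alpha * (pitch + gyro_rate * dt) + (1 - alpha) * accel_pitch

# Simulated stationary headset: the gyro reports a small constant bias,
# while the accelerometer correctly sees only gravity (pitch = 0).
pitch = 0.0
for _ in range(100):
    pitch = complementary_filter(pitch, gyro_rate=0.01,
                                 accel_pitch=0.0, dt=0.01)
```

Integrating the biased gyro alone would drift without bound; blending in the accelerometer keeps the estimate anchored near the true value, which is the essence of inertial sensor fusion.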
These optical and non-optical methods are likely to remain the basis of motion tracking systems for the foreseeable future. But what if, in years to come, you didn't need to move a muscle in the real world, and could move your virtual body freely using nothing but your mind?
Even today the development of direct brain-machine interfaces is fairly advanced. In neuroprosthetics, for example, it’s already possible for quadriplegic individuals to operate robotic arms to accomplish complex tasks. Using targeted muscle reinnervation, the truncated nerve endings of amputees can be rerouted so that robotic prosthetic limbs can read signals from them and move the way natural limbs do.
At present these control methods require invasive surgery, but advances in sensor technology and further miniaturisation of electronics promise exciting times ahead.
Have a VR/AR Project in mind? Discuss your requirements with Antycip Simulation.