Learn the Ins and Outs of Testing AR/VR
Every day there are more and more examples of Augmented Reality (AR) and Virtual Reality (VR) being incorporated into the mainstream. Gone are the days when Tron was pure fantasy. We are seconds away from completely immersing ourselves in a new universe from the comfort of our couch, and from driving cars that intelligently display data directly on the windshield.
If you don't work directly with AR or VR, they are probably very cool technologies that you've heard of and may even use regularly, but how much do you really know about them? Let's start with the difference between the two:
AR takes the real scene a person is actually in and adds to it. Think about the car in the paragraph above: the driver looks through the windshield at the real road ahead, while weather, traffic, and music information is projected within their view. More playful examples are Pokémon Go or Snapchat filters.
This is where Tron makes its comeback. VR immerses you in a completely new environment, one that is most certainly not really there. It eliminates any trace of the real world while you are inside it and disappears when you're done.
These two technologies, while closely related, perform opposite optical feats. AR projects images in front of your eyes to make them seem like they are really there; VR projects images into your eyes to make it seem like YOU are really there. What AR and VR have in common is that they need to work perfectly in order to convince the user that what they are seeing is real.
Well, for something to work flawlessly, it needs to be tested flawlessly. So how does one test something that isn't real?
1. Test the Core of the Device
First things first: check that the heart of your system works. Ultimately, the first piece of the pie is the projector, which transmits an image so lifelike it transports a person to a false reality. There is no way to convince someone they are really walking on a tightrope if the colours they are looking at are off or the image is fuzzy.
The optical assembly (the placement of the lenses and their integration with the projector) must be executed flawlessly and thoroughly validated. In AR/VR, Liquid Crystal on Silicon (LCoS) projectors are commonly used for their high resolution and minimal space between pixels. Pixel edges are also much smoother than on other projection devices, making images look more natural. LCoS is the perfect mix between Liquid Crystal Display (LCD) projectors and Digital Light Processing (DLP) projectors: liquid crystal chips backed by a mirror block or pass light with the liquid crystals while the mirrored backing keeps the panel reflective, letting the projector deliver all three primary colours. The small form factor also allows LCoS to be more easily integrated into another system, including compact ones like a headset or glasses.
The LCoS needs to function flawlessly, both before and after integration; otherwise the user's new reality becomes a blur of fuzzy colours.
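One common way to catch a fuzzy image automatically is a focus metric such as the variance of a Laplacian filter over a captured test pattern. The sketch below is purely illustrative (the threshold and the tiny test images are assumptions, not real production values) but shows the idea: a crisp pattern scores high, a washed-out one scores low.

```python
# Hypothetical sketch of a pass/fail sharpness check on a captured projector
# test pattern. The threshold of 10.0 is an illustrative assumption.

def laplacian_variance(img):
    """Variance of a 4-neighbour Laplacian over a 2-D grayscale image
    (list of rows of pixel values)."""
    h, w = len(img), len(img[0])
    responses = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (img[y - 1][x] + img[y + 1][x] +
                   img[y][x - 1] + img[y][x + 1] - 4 * img[y][x])
            responses.append(lap)
    mean = sum(responses) / len(responses)
    return sum((r - mean) ** 2 for r in responses) / len(responses)

def is_sharp(img, threshold=10.0):
    """A blurry image has weak edges and a low Laplacian variance."""
    return laplacian_variance(img) >= threshold

# A crisp checkerboard has strong edges; a flat grey field has none.
checkerboard = [[255 if (x + y) % 2 == 0 else 0 for x in range(8)]
                for y in range(8)]
flat_grey = [[128] * 8 for _ in range(8)]
print(is_sharp(checkerboard), is_sharp(flat_grey))  # True False
```

In a real station the image would come from a measurement camera aimed at the projector output, and the threshold would be set from known-good units rather than picked by hand.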
2. VR Device Calibration
The image being projected also needs to make sense. In VR, multiple projectors are aimed right at your eyes to plunge you into a brand-new environment. To do this, sets of projectors are built into the device you are wearing to show you something that isn't there, which means they need to work in sync, 100% of the time. If you have used a VR device before, or even just set up a new sound system at home, you have likely already calibrated a system to accommodate the environment it is in. The same must be done for the components embedded inside the device before worrying about external considerations. Through calibration, the VR device can compensate for any manufacturing offsets within a camera and sync up with the Inertial Measurement Unit (IMU).
We can't forget that AR/VR devices aren't meant to be passively watched; they are designed to be interacted with. The image being projected needs to be ready for you to turn your head quickly, swing your arms, or scan through to the next screen without touching a thing. An AR/VR system isn't fully tested until it's calibrated.
An IMU is the sensor that captures the orientation and movement of a body and communicates it back to the projector.
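One concrete piece of that sync check can be sketched in code. The snippet below is a simplified assumption of how a validation step might confirm that every camera frame has a sufficiently recent IMU sample; the 2 ms tolerance and the sample rates are illustrative, not real device specs.

```python
# Illustrative sketch (not an actual test procedure): check that camera frame
# timestamps and IMU samples stay synchronized within an assumed tolerance.

def max_sync_error(frame_ts, imu_ts):
    """For each camera frame timestamp, find the nearest IMU sample and
    return the worst-case gap in seconds."""
    return max(min(abs(f - i) for i in imu_ts) for f in frame_ts)

def is_synchronized(frame_ts, imu_ts, tolerance_s=0.002):
    """Pass if every frame has an IMU sample within tolerance (assumed 2 ms)."""
    return max_sync_error(frame_ts, imu_ts) <= tolerance_s

# Camera at ~60 Hz, IMU at ~1 kHz: every frame should have a nearby sample.
camera = [n / 60.0 for n in range(10)]
imu = [n / 1000.0 for n in range(200)]
print(is_synchronized(camera, imu))       # dense IMU stream covers all frames
print(is_synchronized(camera, imu[:50]))  # truncated stream leaves frames uncovered
```

A real calibration routine would also estimate and remove a constant time offset between the two streams, not just flag the mismatch.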
3. Visor Optical Characterization
Remember that windshield from earlier, the one in smart cars that displays data during the drive to make it safer? That was a lie. It's not really the windshield doing that, at least not yet. Today in AR, this data appears on a visor, also known as a see-through helmet-mounted display (HMD).
This part of testing and validation is crucial to AR when you consider the risks. A visor needs to keep the data clear and legible, make it appear far enough away to be read comfortably by the human eye, and avoid obstructing the view in any way. Consider a military pilot: the data being received is invaluable, and the view needs to remain clear.
Factors to consider in visor optical characterization include the curve of the glass and its light transmission, reflection, and scattering characteristics.
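To make those factors concrete, here is a minimal sketch of how photometer readings on a visor sample might be turned into transmittance and reflectance figures and checked against limits. The acceptance numbers (80% transmission, 5% scattered/absorbed) are made-up assumptions for illustration, not real specifications.

```python
# Hedged sketch: split incident light measured on a visor sample into
# transmitted, reflected, and "lost" (absorbed/scattered) fractions.
# All limits below are illustrative assumptions.

def characterize_visor(incident, transmitted, reflected):
    """Compute optical fractions from raw photometer intensities."""
    t = transmitted / incident
    r = reflected / incident
    lost = 1.0 - t - r  # whatever was neither transmitted nor reflected
    return {"transmittance": t, "reflectance": r, "lost": lost}

def passes_spec(result, min_transmittance=0.80, max_lost=0.05):
    """Assumed limits: at least 80% through, under 5% scattered/absorbed."""
    return (result["transmittance"] >= min_transmittance
            and result["lost"] <= max_lost)

good = characterize_visor(incident=1000.0, transmitted=850.0, reflected=120.0)
hazy = characterize_visor(incident=1000.0, transmitted=700.0, reflected=120.0)
print(passes_spec(good), passes_spec(hazy))  # True False
```

A real characterization would measure these quantities across the curved surface and at multiple wavelengths and angles, since a visor that passes at the centre can still distort at the edges.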
4. Calibration Jig
And back to calibration. When dealing with a technology as dynamic as AR/VR, there are too many moving pieces to take any risks with quality. At the end of the day, the components making up the system are the bread, but test is the butter. It is just as important to have a calibration jig to validate the test system(s) themselves; otherwise, as they say: garbage in, garbage out.
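The "test the tester" idea can be sketched simply: measure a golden reference unit whose true values are certified, and flag any channel where the test system has drifted before trusting it on production parts. The channel names, values, and 2% tolerance below are hypothetical.

```python
# Conceptual sketch of validating a test system with a calibration jig:
# measure a golden reference and compare against its certified values.
# Channels, values, and the 2% tolerance are illustrative assumptions.

def validate_test_system(measured, reference, tolerance=0.02):
    """Return the list of channels whose relative error against the golden
    reference exceeds the allowed tolerance."""
    drifting = []
    for channel, true_value in reference.items():
        error = abs(measured[channel] - true_value) / true_value
        if error > tolerance:
            drifting.append(channel)
    return drifting

golden = {"luminance_nits": 500.0, "contrast": 1000.0, "white_x": 0.3127}
healthy = {"luminance_nits": 503.0, "contrast": 995.0, "white_x": 0.3130}
drifted = {"luminance_nits": 540.0, "contrast": 995.0, "white_x": 0.3130}

print(validate_test_system(healthy, golden))  # [] -> jig check passes
print(validate_test_system(drifted, golden))  # ['luminance_nits']
```

Run before each shift or production lot, a check like this catches a drifting measurement system before it starts passing bad units or failing good ones.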
Ready or not, AR and VR will become more and more present in our day-to-day lives. By knowing how to test these technologies the right way, the trip into a new reality will be a much smoother ride.
To learn more about AR/VR testing or to speak with an optics expert, please contact Averna.
There’s Much More Optical Testing to See!
Learn the 7 best practices for flawless camera assembly through active alignment in our free eBook!