Immersive Technology Is All About the Tech
Or Is It?
(Source: ImageFlow/Shutterstock.com)
Whether you are talking about extended reality (XR), spatial computing, holograms, or volumetric data, people tend to have expectations as to what the experience should be. The first time I put on a VR headset a few years ago, I'm not sure what I was expecting, but I do know I was disappointed. Let’s face it: Even today, we're not quite at “Avatar”-level immersion.
Still, people have not given up on immersive technology’s promise, and they are counting on engineers to deliver the goods. Almost every industry you can think of is optimistic about the benefits immersive technology will bring, making the market for this technology as expansive as an engineer's imagination. Yes, delivering on those expectations is a challenge, but then, that's the life of an engineer: solving challenges. So what particular challenge are we dealing with?
Creating a Truly Immersive Experience
To create a truly immersive experience, you need sensors, machine vision, 3D scanning, power management, and hundreds of different components in the user device itself. In some cases, like volumetric video, hundreds of sensors and cameras surround the action. So, is technology itself the challenge?
Dr. Robert Crockett, co-founder of HaptX and professor of biomedical engineering at Cal Poly, says not exactly. Instead, he thinks the engineers who will do well in the immersive technology space are the systems thinkers. With readily available, high-quality, off-the-shelf components, engineers can focus on how all those components work together as a system.
That means putting much of your effort into predesign work, such as thinking about the requirements and what's really important to the customer—then laying out the system and identifying where it could fail. For HaptX, that meant lots and lots of prototypes to help think through the engineering challenge of putting all the pieces together. Then, lots and lots of testing and iteration until they felt they had a reliable system. As their name suggests, that system is a glove that adds a realistic "touch" experience to virtual reality.
That authentic experience is what you should strive for in terms of performance. Yes, everything you need is likely widely available, but choosing which parts to use will often depend on the application.
For example, when we think about a realistic visual experience in VR, does that mean you need to supply the highest resolution available for your VR headset? Well, it depends.
A virtual reality game or an engineering walkthrough of a digital twin may not need the highest resolution. In those cases, thanks to The Law of Closure—the gestalt law that says if there is a small-enough break in an object, we will perceive the thing as continuing in a smooth pattern—lower resolution may be just fine. But if we're being wheeled into an operating room to undergo intricate brain surgery, it's probably better if our doctors practiced on the most realistic virtual cranium possible. Yet, even if high resolution isn't necessary, it could be the thing that helps gain market share.
In many cases, resolution is not as important as latency. Motion-to-photon latency, the delay between a head movement and the display updating to reflect it, is one of the barriers engineers are trying to break. Ideally, you want the user's spatial experience to be indistinguishable from reality. To do that, experts say you need a delay of under 15 ms. Anything above that is not only frustrating for the user; depending on the use case, it can also cause nausea or real danger. If you cannot resolve problematic latency issues, your hardware will likely find its way to the landfill.
While off-the-shelf sensor systems are not quite at that 15 ms mark yet, we continue to see incremental improvements. Oculus decided not to wait for component manufacturers to figure it out and built a sensor that supports sampling rates up to 1,000 Hz with just a 2 ms delay. They've proven it can be done, so hopefully, theirs is the rising tide that lifts all boats.
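It can help to treat that 15 ms target as a budget spent across the whole motion-to-photon pipeline. Here is a minimal sketch of that accounting; every stage timing below is an invented placeholder for illustration, not a measurement of any real headset:

```python
# Illustrative motion-to-photon latency budget. All per-stage figures
# are hypothetical placeholders, not measurements of a real device.
BUDGET_MS = 15.0  # the sub-15 ms target cited above

pipeline_ms = {
    "imu_sampling": 1.0,      # motion sensor reports a new pose
    "sensor_fusion": 2.0,     # combine IMU and camera tracking data
    "render": 8.0,            # GPU renders the frame for the new pose
    "display_scanout": 5.5,   # panel refresh pushes photons to the eye
}

total_ms = sum(pipeline_ms.values())
print(f"total motion-to-photon latency: {total_ms:.1f} ms")
for stage, ms in pipeline_ms.items():
    print(f"  {stage}: {ms:.1f} ms ({ms / total_ms:.0%} of total)")

if total_ms > BUDGET_MS:
    print(f"over budget by {total_ms - BUDGET_MS:.1f} ms; "
          "shave a stage or compensate, e.g., with late reprojection")
```

With these invented numbers the pipeline overshoots the budget, which illustrates the point: no single stage is slow, but the stages add up, so the whole system has to be designed against the budget.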
As Crockett said, there are hundreds of components going into these systems, and you have very little space in which to work. So, for example, if you are working with Extended Reality (XR), what embedded vision interface you choose is an important decision, but you'll want to keep it simple because you're going to have a minuscule amount of room for cabling.
While USB 3.0 is an excellent multipurpose interface with a solid 5 Gbit/s of bandwidth, and its plug-and-play nature will likely reduce costs and development time, its large connectors and rigid cabling take up precious room. That could be why the MIPI CSI-2 (Mobile Industry Processor Interface Camera Serial Interface 2) standard is one of the most common interfaces used in head-mounted VR devices. CSI-2's bandwidth, up to around 6 Gbit/s on a typical multi-lane link, also exceeds USB 3.0's. Finally, because CSI-2 connects camera sensors directly to the SoC's dedicated image-processing hardware, it consumes few CPU resources.
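A quick back-of-the-envelope calculation shows how the interface choice interacts with the sensor setup. The camera parameters here are hypothetical, chosen only to illustrate the check; real devices also lose some capacity to protocol overhead, and lane counts vary:

```python
# Back-of-the-envelope check: does a raw camera stream fit an interface?
# Camera parameters below are hypothetical, for illustration only.
def stream_gbps(width, height, bits_per_pixel, fps, n_cameras=1):
    """Raw (uncompressed) bandwidth of a sensor stream in Gbit/s."""
    return width * height * bits_per_pixel * fps * n_cameras / 1e9

usb3_gbps = 5.0   # USB 3.0 signaling rate (before protocol overhead)
csi2_gbps = 6.0   # example CSI-2 link: four lanes at 1.5 Gbit/s each

# e.g., two 1280x960 tracking cameras, 10-bit raw output, 90 fps
need = stream_gbps(1280, 960, 10, 90, n_cameras=2)
print(f"required: {need:.2f} Gbit/s")
for name, cap in [("USB 3.0", usb3_gbps), ("MIPI CSI-2", csi2_gbps)]:
    print(f"{name}: {'fits' if need <= cap else 'too slow'} ({cap} Gbit/s)")
```

Either interface handles this hypothetical load, which is exactly the point Crockett makes: when multiple components clear the bar on raw performance, the decision turns on system-level concerns like cabling, cost, and CPU load.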
Power regulation is also something you will need to pay attention to when making devices that users wear. Increasingly higher screen resolutions and refresh rates lead to high display power consumption. Oculus tells us on their developer site that a governor process on their device "monitors an internal temperature sensor and tries to take corrective action when the temperature rises above certain levels to prevent malfunctioning or scalding surface temperatures. This corrective action consists of lowering clock rates." Oculus’s advice is that no amount of optimization is ever wasted, as "every operation drains the battery and heats the device."
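The governor behavior Oculus describes can be pictured as a simple feedback loop: step the clocks down as temperature rises, and back up as the device cools. This is only an illustrative sketch of the idea (the thresholds, clock levels, and temperatures are invented), not Oculus's actual implementation:

```python
# Sketch of a thermal governor: lower clock rates when an internal
# temperature sensor reads hot, restore them once it cools.
# All thresholds and clock levels are invented for illustration.
CLOCK_LEVELS_MHZ = [400, 300, 200]   # index 0 = fastest
THROTTLE_AT_C = 65.0                 # step down above this temperature
RECOVER_AT_C = 55.0                  # step back up below this temperature

def govern(level: int, temp_c: float) -> int:
    """Return the new clock-level index given the current temperature."""
    if temp_c > THROTTLE_AT_C and level < len(CLOCK_LEVELS_MHZ) - 1:
        return level + 1   # running hot: lower the clock rate
    if temp_c < RECOVER_AT_C and level > 0:
        return level - 1   # cooled off: restore performance
    return level

level = 0
for temp in [60.0, 68.0, 70.0, 58.0, 50.0]:  # simulated sensor readings
    level = govern(level, temp)
    print(f"{temp:.0f} C -> {CLOCK_LEVELS_MHZ[level]} MHz")
```

The takeaway for developers is the one Oculus gives: since throttling trades performance for temperature, every cycle your code avoids spending is headroom the governor never has to claw back.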
The Missing Piece: Sensations
While all these features are important to the immersive experience, one key piece is still missing: haptics. Haptics brings us full circle: a believable interaction with a computer ultimately depends on being able to feel it. Haptics simulates the sensations someone would experience when interacting with a physical object in the real world. That's why your game controller rumbles when you take damage and your phone vibrates when you get a text message. That vibration is achieved through eccentric rotating mass (ERM) actuators, linear resonant actuators (LRAs), and even piezoelectric actuators.
However, rumble and vibration are what Crockett calls symbolic haptics. They are just a cue. Engineers strive for an authentic sensation, or as close to one as virtual can get, which he calls natural haptics.
Natural haptics makes the user feel they are holding the real thing. For example, force feedback, the feeling of force on the wheel in an automobile simulation, is closer to natural haptics. But what do you do when you are not physically touching an object, but it's essential that you feel like you are?
When Jake Rubin, Crockett's partner, came to him with his vision of creating natural haptics, they started picking at the problem and came to a realization: Realism in the hands was crucial to true immersion, and it would also be the most difficult to achieve.
Why focus on creating a realistic sensation of something on your chest if humans do most daily activities with their hands? It makes sense to crack that code first, then move on to other haptic experiences.
To demonstrate the reliance on touch, Crockett used the example of an aircraft simulator. Pilots must flip many switches when getting a plane off the ground, staying airborne, and, ultimately, landing. They have their eyes focused on multiple gauges and panels, and they typically do not look up to flip a switch. Instead, they know where it is through muscle memory. Building muscle memory is a common use case for immersive technology, and in this case, natural haptics is needed. But, just as you need to put the virtual switch in exactly the right spot, you also want the flip to feel exactly like the real thing. To say this isn't easy to pull off is an understatement.
HaptX decided to use pneumatics for their haptic feedback. Each glove has more than 130 tiny balloons, which they call tactors, that inflate with air and push into your skin. Those tactors replicate the exact pattern you would feel when touching, say, a switch in a cockpit. You also want the user to feel they are touching something solid. To physically stop the user from pushing their hand right through the object, they've placed a series of tendons on the back of the glove. That ability to control the hand and to deliver the correct pattern with the proper displacement into the skin gives the user a natural feeling of interaction with an object that doesn't exist.
Overcoming the Engineering Challenge
Even Crockett admits that no engineer in their right mind would devise a system that required hundreds of tiny individual tubes, each leading to a balloon and each connected to a set of proportional control valves. It was an incredibly tough engineering challenge that has taken almost a decade to solve. Nevertheless, they eventually came up with a practical system that didn't involve any exotic technology.
Their solution is an excellent example of that systems thinking mentioned earlier, and that is why a company like HaptX looks for engineers who understand systems and think at a high level. That is, they think not just about what works, but about what works together to form a complete design that is reliable, robust, and meets the requirements.
If you are getting the sense that immersive technology involves more than just good engineering skills, you may be right. It is certainly helpful to understand computer science, optics, and even human perception. But many experts believe an even broader skill set is needed.
A working knowledge of psychology, physiology, biology, kinesiology (let's just say a lot of 'ologies') will come into play when designing for user-specific experiences. That’s when things will get tricky, because no two realities are the same. Age, gender, body type, nationality, health, economic status, abilities, and more inform our lived experience.
To deliver that inclusive experience, universal design principles should be considered when creating the hardware and devices that enable it. Originally a concept applied to architecture, universal design has spread into almost every discipline. It means designing to accommodate everyone, regardless of age, size, ability, or cognitive experience.
In a paper presented at a workshop on mixed reality and accessibility, Microsoft researchers posited that “considering accessibility as a core part of a system's iterative design process is valuable not only for the more than a billion people worldwide who have some type of disability but for all users since everyone experiences situational disabilities dependent on their context.”
An example the researchers gave of a device that is ripe for improvement is the head-mounted display. Not only are many of them quite heavy, but they are often not ideal for people who wear eyeglasses, hearing aids, or cochlear implants. Then there is the range of motion and dexterity required just to tighten the headset. These factors can shut out a large number of potential users.
Conclusion
Engineers are in a position to make immersive technology more inclusive simply by incorporating universal design into their systems thinking. Immersive technology is at the critical point many technologies reach before becoming game-changers: It is approaching maturity, but adoption is not yet widespread. Users still have hope that their experience will match their expectations. That makes now the perfect time to build accessibility into the technology.