From Unmanned Systems Magazine: VR training offers cost savings, safety benefits

When Amir Rubin was first exposed to virtual reality in 1994, the graphics were primitive at best. The device, if one could call it that, was large, cumbersome, and involved an unruly nest of cables and wires.

Still, the engineer and former soldier could not help but react enthusiastically to the potential of the technology he witnessed. 

“It was very complex, and lower quality than anything [most people today] have seen,” says Rubin.

Nevertheless, he describes himself as “hooked” as soon as he put the headset on. 

As CEO and founder of the Los Gatos, California-based Sixense Enterprises Inc., Rubin has helped shepherd VR into the mainstream. The weighty systems of two decades ago have given way to devices roughly the size of a good-quality cell phone, which outperform them in every conceivable manner. 

Companies run by Rubin and other entrepreneurs continue to advance VR technology, even as existing systems become well ensconced in workplace-training scenarios. Doctors are learning how to diagnose and treat maladies. Surgeons can “operate” on virtual “patients” with the same tactile feel they would get in real operating rooms. Military members can learn how to throw hand grenades without fear of killing or injuring comrades if they do it wrong. Welders can learn to weld, painters to paint.

Besides the obvious safety benefits, VR training technologies offer considerable cost savings. By relying on them, users can spend less on physical structures, materials and liability insurance. Government entities, such as the armed services, can conduct exercises and classes with participants spread across the globe rather than under one roof.

At Chicago-based ImmersiveTouch Inc., engineers have built a VR system that lets surgeons, ophthalmologists and other medical specialists practice their skills in scenarios as close to real as possible.

The company’s founder, Pat Bannerjee, established the business in 2005 after spending the prior decade immersed in the possibilities of VR and its applications to medicine. Bannerjee, who holds a doctorate in engineering, found that VR training has been shown to improve the outcomes of procedures and to shorten operating times.

“Surgeons are more confident,” says Jay Bannerjee, the founder’s son, who now serves as ImmersiveTouch’s chief operating officer and president. 

“Imagine the headset and goggles in 3-D. It’s not just a computer screen. You’re immersed in that patient’s body. You can look at things you’ve never seen before,” Jay Bannerjee says. “We’re now unlocking the true potential of CT [computerized tomography] and MRI [magnetic resonance imaging] scans in the VR goggles.”

This way, when surgeons plan patients’ surgeries, they can better understand what needs to be done, Bannerjee says.  

ImmersiveTouch’s system replicates haptic feedback: the surgeon-in-training who wears the goggles actually senses what the scalpel is doing, whether it is hitting skin, puncturing bone, probing the mushy structure of the brain or tracing the curvature of the spine.

The system also can emulate the properties of blood and bodily fluid motion. It can trigger adverse scenarios, to which users must respond spontaneously. 
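The haptic feedback Bannerjee describes can be pictured with a common, generic rendering technique: the reaction force grows with how far the virtual tool has pushed into tissue, with stiffness chosen per tissue type. The sketch below is illustrative only, not ImmersiveTouch’s actual implementation; the stiffness values and function names are assumptions.

```python
import numpy as np

# Hypothetical per-tissue stiffness values (N/m): bone feels rigid, brain soft.
TISSUE_STIFFNESS = {"skin": 400.0, "bone": 3000.0, "brain": 60.0}

def haptic_force(tool_tip, surface_point, surface_normal, tissue):
    """Return the 3-D reaction force to send back to the haptic stylus."""
    # Penetration depth: how far the tool tip sits below the tissue surface,
    # measured along the outward surface normal (positive means inside).
    depth = np.dot(surface_point - tool_tip, surface_normal)
    if depth <= 0.0:
        return np.zeros(3)                 # not touching: no force
    k = TISSUE_STIFFNESS[tissue]
    return k * depth * surface_normal      # push the tool back out along the normal

# Example: a scalpel tip 2 mm into "skin" whose outward normal points up (+z).
force = haptic_force(np.array([0.0, 0.0, -0.002]),
                     np.array([0.0, 0.0, 0.0]),
                     np.array([0.0, 0.0, 1.0]),
                     "skin")
print(force)   # roughly 0.8 N pushing the stylus back up
```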

“This is no longer an idea. It’s a reality right now — a manifestation of 15 years of our founder’s vision,” Bannerjee says. “Every hospital is going to have this technology within five years or less.”

As VR expands and evolves, so too do the companies that develop it. Intel’s work in the field, for example, has progressed to the point that the company can no longer be pigeonholed solely as a semiconductor manufacturer.

“We want to branch out into every end of the spectrum – CPUs [central processing units], modems inside of mobile devices, using our new Optane technology,” says Raj Puran, Intel’s director of XR, or extended realities. 

The job title, Puran says, is a catchall that encompasses everything involving virtual and immersive reality. Intel’s Optane memory modules, he says, significantly increase processing speed and capacity for unmanned vehicle operation, photo telemetry, lidar scanning and other sophisticated operations.

When paired with RealSense cameras, Optane-equipped drones can sense objects in front of them and take active steps to avoid collisions.
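A minimal sketch of that kind of depth-based avoidance appears below, assuming a generic depth camera that yields per-pixel distances in meters. It is illustrative only, not Intel’s actual obstacle-avoidance pipeline; the threshold and the command_stop and command_climb hooks are assumptions.

```python
import numpy as np

STOP_DISTANCE_M = 2.0   # assumed safety threshold in meters

def nearest_obstacle_m(depth_frame: np.ndarray) -> float:
    """Return the closest valid depth reading in the central field of view."""
    h, w = depth_frame.shape
    center = depth_frame[h // 3 : 2 * h // 3, w // 3 : 2 * w // 3]
    valid = center[center > 0.0]          # zero typically means "no reading"
    return float(valid.min()) if valid.size else np.inf

def avoidance_step(depth_frame, command_stop, command_climb):
    """Very simple policy: brake, then climb, when something is too close."""
    distance = nearest_obstacle_m(depth_frame)
    if distance < STOP_DISTANCE_M:
        command_stop()    # halt forward motion first
        command_climb()   # then gain altitude to clear the obstacle
    return distance
```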

“You’re able to do sensing and tracking to see where you are, with the headset on, closed off from the real world,” Puran says.

From a training perspective, Intel incorporates technology similar to that found in electronic gaming, but in higher fidelity and with a greater set of functions. 

“If you’re training a surgeon [to handle] a neurological disorder, you’re not going to want some goofy looking thing,” Puran says. “You want a level of [three-dimensional] realism, and all the fidelity to make you think you’re working on the real thing.”

Intel provides its VR customers, Puran says, with the building blocks — system, tools, technology, software and headsets — for training-module developers to produce the most robust platform possible. Like ImmersiveTouch, the company has garnered considerable interest from the medical community. The company’s Los Angeles- and Israel-based offices have partnered with the University of California Los Angeles medical center to produce systems to help future doctors train to evaluate patients’ conditions. In time, Puran says, the team’s life sciences technicians intend to broaden the market for the resulting product. 

While the push to expand its VR customer base continues, Intel is using its products for a more immediate result: providing guidance to its employees. 

“We’re taking charge of how our factory personnel are trained in electrical safety,” Puran says. 

The old modality involves putting employees in front of a computer screen and allowing them to move through a self-paced training regimen. Once they complete the process, they take a test, get certified and move on. 

“We found we needed to do something more hands-on, which would give students more of a sense of association,” Puran says. 

Under the new protocol, the AR system can simulate spontaneous, hazardous situations to which trainees must respond properly or face an appropriate range of consequences. It has reaped benefits for the company’s safety record and should do the same for future customers, Puran believes.

“Imagine how that would play in the oil and gas industry,” Puran says. “If we are in a refinery and have a blowout on a drilling rig, how do you simulate that?”

Simulations on rigs now tend to be rehearsed situations. Under the new approach, the virtual space looks like a real catastrophic event. Outcomes are built into the program, and participants do not know when or where bad situations could arise.

“You engrain in the student [what to do] at a rapid rate that you wouldn’t get if you were using a textbook or PowerPoint. This causes a better level of retention. This is why we firmly believe VR training is so important for many industries,” Puran says.

Intel also is working closely with law enforcement and other first responders, military personnel, and pilots to establish improved training scenarios.

At Bell Helicopter Textron Inc., in Fort Worth, Texas, Intel has set up its VR-NUC 8 systems, comparable to those used by gamers, to give prospective pilots advanced preparation in instrumentation procedures, start-ups, preflight and other aspects of operations before they ever set foot in a real cockpit. 

Again, Puran says, the system has been shown to increase confidence levels among users, who can practice on it until they are completely comfortable. The cost, about $1,500 per kit, is a fraction of that of traditional simulators, Puran says. The same systems will find use in training drone pilots, he adds.

SimSpray uses virtual reality for a variety of training applications. Photo: SimSpray

Teaching drones
 
Going beyond training drone pilots, the aeronautics and astronautics department at the Massachusetts Institute of Technology (MIT) is making inroads toward teaching the drones themselves to identify and avoid potential hazards. 

Success depends on more factors than simply teaching an unmanned aircraft to identify and avoid something in its path, says Sertac Karaman, an associate professor in the department. An expert in robotics and control theory, Karaman is particularly interested in developing control plans for drones that would allow them to make sophisticated flight-path adjustments while traveling at very high speeds.

The aerodynamic drag that occurs during flight, for example, creates phenomena that are difficult to simulate. In quadrotor UAS, the behavior of the electronics can vary significantly during maneuvers. During turns, the four motors run at different rates. The chemistry of the batteries, too, changes during fast flight.

“A tight maneuver will pull a lot of current from a battery, and dip the voltage for a second,” Karaman says. “This is all extremely hard to model.” 
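A deliberately simple way to picture that dip is to treat the battery as an ideal source behind an internal resistance, so the terminal voltage sags by the current times that resistance. The sketch below uses assumed pack values and is only a first-order illustration; the chemistry- and temperature-dependent effects Karaman alludes to are exactly what this model leaves out.

```python
def terminal_voltage(v_oc: float, r_internal_ohm: float, current_a: float) -> float:
    """First-order battery model: terminal voltage V = V_oc - I * R_internal."""
    return v_oc - current_a * r_internal_ohm

# Assumed values: a 4S lithium-polymer pack (~16.8 V full) with 0.02 ohm internal
# resistance sags by about 1.2 V during a 60 A burst in a tight turn.
print(terminal_voltage(16.8, 0.02, 60.0))   # -> 15.6
```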

Karaman and his team approach the problem from several angles. Flight tests are conducted in empty “motion capture” rooms, employing more or less the same techniques filmmakers use when recording actors in sensor-lined suits whose movements later animate video game characters. Virtually “furnished,” supercomputer-generated imagery is loaded onto the drones, enabling them to “hallucinate” virtual environments, Karaman says.

“If a ‘collision’ happens, there’s really no collision. But you can learn from that experience,” Karaman says. 
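One way to picture the idea: the drone flies through an empty room, but its real, motion-captured pose is checked against purely virtual obstacle geometry, so a “collision” costs nothing physically yet still produces a training signal. The sketch below is illustrative only, not MIT’s actual software; the obstacle layout and penalty value are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Box:
    """Axis-aligned virtual obstacle, in room coordinates (meters)."""
    x_min: float; x_max: float
    y_min: float; y_max: float
    z_min: float; z_max: float

    def contains(self, x: float, y: float, z: float) -> bool:
        return (self.x_min <= x <= self.x_max and
                self.y_min <= y <= self.y_max and
                self.z_min <= z <= self.z_max)

# A virtual "wall" that does not exist in the physical room.
VIRTUAL_OBSTACLES = [Box(4.0, 5.0, 0.0, 12.0, 0.0, 3.0)]

def collision_penalty(pose_xyz) -> float:
    """Return a negative reward if the real pose enters any virtual obstacle."""
    x, y, z = pose_xyz
    hit = any(b.contains(x, y, z) for b in VIRTUAL_OBSTACLES)
    return -100.0 if hit else 0.0   # learn from the "crash" without crashing

print(collision_penalty((4.5, 6.0, 1.5)))   # -> -100.0 (virtual collision)
print(collision_penalty((1.0, 6.0, 1.5)))   # -> 0.0 (clear)
```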

He and his team share an interest in performance enhancement with the drone-racing community. The relatively new sport serves as a showcase for updated platforms. More important to Karaman, his work involves pitting pilotless drones against those with human operators.

“You’ll see a drone going around an empty room, 40 feet by 40 feet [in dimension], at 15 miles per hour. It can go from one side to the other in less than a second,” Karaman says. 

In the first of 10 races between human-piloted and autonomous drones, flesh and blood prevails. 

“But by 10 laps, the computer will be better,” Karaman says. 

With each repetition, he says, human pilots actually become dizzy and less precise. 

In the near future, Karaman says, his team intends to release a “cool drone-racing simulator, open sourced.”
 
‘Believe it is real’
 
The 1994 project that gave Amir Rubin his first look at VR was part of a collaborative effort between the U.S. Navy and Israeli defense forces. Rubin parlayed the knowledge gained from that experience, which involved a submersible combat simulator, into a company that now produces advanced, multimillion-dollar systems for a broad array of operations.

Since then, Sixense has applied VR to projects ranging from baseball bats for Louisville Slugger and golf balls for Callaway to medical training and the safe operation of industrial spray-paint and welding equipment. Militaries are using it to teach new recruits to throw hand grenades well before they get to a real training range.

“I was a soldier, unfortunately injured [while serving]. If I’m training in a VR environment – if I throw the grenade a few milliseconds too late, or too hard, [we] can simulate the injuries to soldiers and damage around you,” Rubin says. “You can do it again and again until you pass. And all the data is collected as to how well you did.”
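The grenade drill Rubin describes, with its release timing, throw strength and collected performance data, might look something like the sketch below. The thresholds and field names are assumptions for illustration, not Sixense’s actual parameters.

```python
import json, time

RELEASE_WINDOW_S = (1.2, 1.5)   # assumed acceptable release window after arming
MAX_SAFE_SPEED_MS = 18.0        # assumed upper bound on throw speed

def score_throw(release_time_s: float, release_speed_ms: float) -> dict:
    """Classify a simulated throw and return a record for the training log."""
    too_late = release_time_s > RELEASE_WINDOW_S[1]
    too_hard = release_speed_ms > MAX_SAFE_SPEED_MS
    return {
        "timestamp": time.time(),
        "release_time_s": release_time_s,
        "release_speed_ms": release_speed_ms,
        "outcome": "fail" if (too_late or too_hard) else "pass",
        "simulated_hazard": too_late or too_hard,   # would trigger the injury replay
    }

# A trainee repeats the drill until the log shows a pass.
print(json.dumps(score_throw(1.6, 12.0), indent=2))   # late release -> "fail"
```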

With the exponentially higher resolution and lighter packaging of today’s devices, Rubin says, operators can quickly forget they are wearing VR equipment and focus upon the tasks at hand. 

Welders who make mistakes during VR training do not get burned. Meanwhile, they learn much of what they need to know before they pick up live torches and begin real training.

In health care, Sixense has developed a simulator that teaches brain surgeons how to remove blood clots in less than five minutes. 

“It’s an ingenious, brilliant vacuum cleaner basically, that can move all the way through the brain to the tiniest spot and suck out the blood clot,” Rubin says.

Penumbra Inc., the California-based company that developed the device, wanted more than a VR trainer for surgeons. The company asked Sixense to assist with continuing instruction that would help care providers support patients afterward through physical and cognitive therapy regimens.

Patients (or other users) who wear the devices, which are roughly comparable in size and capability to an iPhone 8, are free of cables. Instead, they wear small sensors where they are needed: shoulder blades, elbows, hands or the waist.

“You can be any avatar you want — human, robot, Spider-Man. The important thing is it’s completely transparent to the user,” Rubin says. “Reaction from patients has been amazing. For me, this is a huge improvement in how technology can help people.”

As VR has improved to the point of ubiquity, Rubin envisions further enhancements to augmented reality. 

“In VR, you bring a user-trainee into a completely synthetic and virtual world. You’re being represented by a computer-generated body,” Rubin says. 

With AR, a user puts on a headset and still sees the real world, Rubin says.

“You go to a location and see a specific object you need to weld together. You hold in your hand a live torch that fires when you press the button. But when you do, the fire that comes out is virtual,” Rubin says. 

The training, thus, creates muscle memory. For it to work, however, “The AR has a much higher bar of technical requirements,” Rubin says.

The hand, torch and virtual fire must work exactly like they would on the job. 

“You have to believe it is real,” Rubin says. 

Above and below: ImmersiveTouch's technology allows surgeons and other medical professionals to practice in a virtual world. Photos: ImmersiveTouch
