
Immersive Virtual Robotics Will Soon Allow Us To Directly Control Robotic Behemoths

As a general rule, industrial robots aren't even remotely safe to be around. It's not their fault, really. With very few exceptions, such machines have to be cordoned off from the comparatively weak, fragile, and fleshy humans, lest they accidentally rip one of their co-workers to shreds. Because of how dangerous these mammoth machines tend to be, operators and designers are generally forced to make use of a ridiculous array of hardware, software, and sensors in order to ensure nothing goes wrong.

What if there was a better way? Researchers at Johns Hopkins' Computational and Interactive Robotics Laboratory strongly believe that there is.  The answer, they claim, lies not in the physical world, but the virtual. 

For some time now, they've been hard at work on an Immersive Virtual Robotics Environment (IVRE) that will allow users to interact directly with the big bots without fear of death, dismemberment, or disembowelment.

"IVRE is a natural immersive virtual environment that enables a user to instruct, collaborate and otherwise interact with a robotic system either in simulation or in real-time via a virtual proxy," reads the website. "The user has access to a wide range of virtual tools, for manipulating the robot, displaying information and interacting with the environment. Topics include virtual robot instruction, monitoring, and user-guided perception. Our goal is to allow for interaction with remote systems or systems that either are in hazardous surroundings, or are unsafe for physical interaction with a human."

In other words, the developers are looking well beyond the factory floor with this one. The technology could, for instance, allow a human to step into the shoes of a robot proxy as it explores environments far too dangerous for living creatures to set foot in, such as an earthquake site or a warzone. Beyond letting workers interact safely with big, scary industrial robots, such an implementation could, with proper 3D mapping software, enable us to carry out rescue missions or military operations without risking human life.

On the consumer side of things, this could be big business for home robotics, particularly with affordable virtual reality hardware like the Oculus Rift just over the horizon. Imagine it: you're sitting on your couch, and suddenly you've a hankering for a bag of chips. Whereas before you might have had to go and get them yourself (or bellow at someone else to do it for you), now you'll simply be able to pop on a headset and direct the robot wherever you want it to go.

On second thought... I'm not really sure that's a good thing. Aren't we lazy enough as it is?

Anyway, IVRE offers augmented reality features in addition to its VR functionality. Users can access information about a particular robot, the environment around it, and what the robot is trying to do. This could be extended to a wide range of applications, not the least of which would be making challenges like DARPA's robotics competitions more accessible to people without formal robotics training.
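As a rough illustration of the kind of augmented-reality overlay described above, here's a small hypothetical sketch that takes a robot's reported state and formats it as text a headset could render beside the machine. The fields and names are assumptions for the sake of the example, not IVRE's actual interface.

# Hypothetical AR status overlay: turn a robot's reported state into
# the text labels a headset would draw next to the machine.
from dataclasses import dataclass


@dataclass
class RobotStatus:
    name: str
    current_task: str
    battery_pct: float
    joint_temps_c: list[float]


def build_overlay(status: RobotStatus) -> str:
    hottest = max(status.joint_temps_c)
    lines = [
        status.name,
        f"Task: {status.current_task}",
        f"Battery: {status.battery_pct:.0f}%",
        f"Hottest joint: {hottest:.1f} C",
    ]
    # Flag anything the operator should check before approaching the robot.
    if hottest > 70.0:
        lines.append("WARNING: joint overheating")
    return "\n".join(lines)


print(build_overlay(RobotStatus(
    name="Arm-01",
    current_task="Pick and place",
    battery_pct=82.0,
    joint_temps_c=[41.2, 55.7, 73.4],
)))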

Virtual reality is awesome. Robots are awesome. When the two are combined, how can one not be excited? Johns Hopkins' robotics lab is onto something incredible here, and it's not just the manufacturing sector that's going to benefit from their invention - it's pretty much everyone who'll ever use a robot. You can read more about the system here.