
We're Going To Need Better AI To Truly Explore The Depths of Space

Seated around a room scattered with laptops and computer monitors, the team responsible for operating a Mars rover is playing a waiting game. They've uplinked a day's worth of commands to the semi-autonomous rover, among them an order to collect and analyze a particular rock sample. That signal takes a full fifteen minutes just to reach the rover, so after sending the commands, the team won't know the results for half an hour or more. It hardly seems an ideal way of doing things, does it?
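For a sense of scale, that delay is nothing more than light-travel time. The back-of-the-envelope check below uses a hypothetical Earth-Mars separation of 270 million kilometers, chosen to match the fifteen-minute figure; the actual distance varies between roughly 55 and 400 million kilometers as the planets orbit.

    # One-way signal delay is simply distance divided by the speed of light.
    SPEED_OF_LIGHT_KM_S = 299_792.458
    distance_km = 270e6  # hypothetical Earth-Mars separation, in km

    one_way_min = distance_km / SPEED_OF_LIGHT_KM_S / 60
    print(f"one-way delay: {one_way_min:.1f} min")           # ~15.0 min
    print(f"command round trip: {2 * one_way_min:.1f} min")  # ~30.0 min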

Unfortunately, it's currently the only option. Communication with deep-space probes has always had to account for delays: the farther from Earth a probe travels, the more complicated and difficult keeping in touch becomes. To operate effectively outside the bounds of our solar system, space probes are going to need a far greater degree of autonomy than they have today.

They're going to need to be able to think.

Hosted by the German Aerospace Center (DLR), the SpaceBot Cup is designed to find a robot capable of displaying this kind of autonomy. Taking place in a multi-purpose building near Bonn, the competition pits ten teams from German universities and the German robotics industry against one another as they show off the latest advances in robotics. Each team is tasked with sending its robot into an artificially constructed alien landscape, where the bot must use every ounce of intelligence at its disposal to track down a glass of water, pick it up, and move it to another location.

One of the most promising entries is Lauron, a six-legged walking robot developed by the FZI Research Center for Information Technology and modeled on the Indian stick insect. Its six flexible legs carry it over rough terrain, and one is equipped with a grasping claw that lets it manipulate objects. The robot is further outfitted with a cargo bracket beneath its midsection and a rotating laser scanner that allows it to build a three-dimensional image of its environment.

To interpret that environment, Lauron is also equipped with a set of eyes similar to Microsoft's Kinect depth camera. The robot, its team explains, views the world as a collection of points and pixels, which it translates into three-dimensional images. It uses these images both to orient itself and to recognize important objects.
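To make that idea concrete, here is a minimal sketch of how a depth image becomes a 3-D point cloud under a standard pinhole camera model. The intrinsics (FX, FY, CX, CY) are hypothetical placeholder values, not Lauron's actual calibration, and the robot's real perception pipeline is certainly far more involved.

    import numpy as np

    # Hypothetical intrinsics for a Kinect-style depth sensor; real values
    # come from the device's calibration.
    FX, FY = 525.0, 525.0   # focal lengths in pixels
    CX, CY = 319.5, 239.5   # principal point (image center)

    def depth_to_point_cloud(depth):
        """Convert a depth image (meters, shape HxW) into an Nx3 point cloud.

        Each pixel (u, v) with depth z back-projects to a 3-D point:
        x = (u - cx) * z / fx,  y = (v - cy) * z / fy.
        """
        h, w = depth.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        x = (u - CX) * depth / FX
        y = (v - CY) * depth / FY
        points = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
        return points[points[:, 2] > 0]  # drop pixels with no depth reading

    # Example: a fake 480x640 frame with everything two meters away.
    cloud = depth_to_point_cloud(np.full((480, 640), 2.0))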

Lauron was just one of many promising entries in the competition, all of which managed to transport the glass quite easily once it was located. Transportation, in fact, was the easy part; the most difficult task any of the teams faced was figuring out how to program their entries to recognize the glass as an objective.

"It's often the things that we think are the easiest things, such as moving around and finding objects," noted Jacobs University's Andrea Birk, "which end up being the most challenging!" 

As you can see, we're making a great deal of progress, but we've still got a long way to go before any deep-space robot can truly be considered autonomous. Until that day comes, we're just going to have to keep playing the waiting game. Look on the bright side, though: more often than not, the stuff we discover is well worth the wait.

Comments
Nov 26, 2013
by Anonymous

Contrary to your story's clear suggestion, the rover teams do not stop and wait for one command's (or small batch of commands') results before sending the next. Instead, we plan a whole day at a time, sometimes multiple days -- and that works only because the rovers have a pretty good amount of autonomy on board already.

Yes, more autonomy is always better, but please don't misstate what's there right now in order to make your case.

(How do I know this about the rover teams? Because I drove the MER and MSL rovers for nine years, that's how.)

Dec 10, 2013
by Tech Light

My mistake. I'll update the article to reflect that.