
Intel's Oculus Robots Are All About Playing Games

Have you ever wondered how touch screens are rated and scored for accuracy? 

In Intel's case, it turns out the process actually involves video games. Ladies and gentlemen, say hello to the Oculus. This little robotic arm is designed to test the accuracy of touch screens on mobile devices by playing a number of different mobile games such as Cut the Rope. Each gaming session is recorded by a Hollywood-level production camera designed by Red that captures video at 300 frames per second with a resolution above HD. 

This video is then analyzed by software within the robot, which examines how quickly and accurately the device reacted. A number of different factors are taken into account, including keyboard responsiveness, screen scrolling during navigation, and touch gestures. Once all the data has been gathered, the 'bot then sets to work determining how responsive a human being might find a particular model. 
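To give a rough sense of what that frame-by-frame analysis might involve, here's a simplified sketch - not Intel's actual software; the frame data, threshold, and helper names are invented for illustration - that estimates touch-to-response latency from 300 fps footage:

```python
# Simplified, hypothetical sketch of latency measurement from high-frame-rate
# video. Assumes 300 fps footage (as mentioned in the article) and uses a
# naive per-frame pixel-difference check to spot the screen's first response.
import numpy as np

FPS = 300  # frame rate of the Red camera footage

def first_change_frame(frames, start, threshold=10.0):
    """Return the index of the first frame after `start` whose mean pixel
    difference from the previous frame exceeds `threshold`."""
    for i in range(start + 1, len(frames)):
        diff = np.abs(frames[i].astype(float) - frames[i - 1].astype(float)).mean()
        if diff > threshold:
            return i
    return None

def touch_latency_ms(frames, touch_frame):
    """Estimate latency in milliseconds between the frame where the robot's
    finger contacts the screen and the first visible on-screen response."""
    response_frame = first_change_frame(frames, touch_frame)
    if response_frame is None:
        return None
    return (response_frame - touch_frame) / FPS * 1000.0

# Synthetic example: the screen "responds" four frames after the touch,
# which at 300 fps works out to roughly 13 ms.
frames = [np.zeros((10, 10), dtype=np.uint8) for _ in range(30)]
for f in frames[14:]:
    f[:] = 255
print(touch_latency_ms(frames, touch_frame=10))  # ~13.3 ms
```

The real system obviously works on far richer footage and covers scrolling, gestures, and typing as well, but the basic arithmetic - counting frames between cause and effect and dividing by the frame rate - gives a feel for how 300 frames per second can resolve differences of just a few milliseconds.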

In order to accomplish this, Intel analyzes the data based on a series of cognitive psychology experiments carried out by its Interaction and Experience Group. These tests, designed to study the relationship between humans and computers, are utilized by the Oculus to give each screen a score between one and five. Now, these scores aren't really meant for consumers.

According to User Experience Manager Matt Dunford, the tests are intended for the company's partners working on Intel-based touch-screen devices, helping them improve speed, accuracy, and overall user experience. The tests have, Dunford explains, already proved invaluable. "We can predict precisely whether a machine will give people a good experience, and give them numbers to say what areas need improving." 

That's really cool and all, but it does seem a little much, doesn't it?  Why not just go with a more traditional approach? Why not have focus groups or user experience experts assess the devices?

According to Intel, that doesn't always offer a specific enough indication of what needs to be tweaked in a particular touch screen. Humans aren't necessarily equipped to analyze each device to the level Intel requires; according to Intel engineer Eddie Raleigh, a good touch screen has no more than ten milliseconds of delay. A screen with twelve milliseconds of delay might feel sluggish to a human user, who might be unable to explain precisely why.
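To make those numbers concrete, here's a hypothetical mapping from measured latency to the kind of one-to-five score described earlier. Intel hasn't published its scoring model, so the cut-offs below are purely illustrative and hinge only on Raleigh's ten-millisecond remark:

```python
# Illustrative only: Intel's actual 1-to-5 scoring model is not public.
# The breakpoints below are hypothetical, anchored to the ~10 ms figure
# Eddie Raleigh cites as the threshold for a good touch screen.
def responsiveness_score(latency_ms: float) -> int:
    """Map measured touch latency (ms) to a coarse 1-5 score."""
    if latency_ms <= 10:
        return 5   # within Raleigh's "good" threshold
    if latency_ms <= 12:
        return 4   # measurably slower; may feel subtly sluggish
    if latency_ms <= 20:
        return 3
    if latency_ms <= 40:
        return 2
    return 1

print(responsiveness_score(9.5))   # 5
print(responsiveness_score(12.0))  # 4
```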

Furthermore, additional tests carried out by Intel have demonstrated that how a user perceives quality varies significantly based on how they use a device. People using a stylus will unconsciously raise their standards, while children generally expect a quicker response from touch screens than adults do. While setting up such tests might prove tedious with human subjects, they're quite easy to manage with the Oculus.

"We can mimic a first-time user who is being slower or someone hopped up on caffeine and going really fast," says Raleigh. 

Currently, Intel has three Oculus robots at work, and is completing a fourth. The robot, which can be used on any touch-screen device, has a decidedly humble origin: the arm was originally intended for chip fabrication, where it would move silicon wafers around. It's also extremely likely that it's not the first of its kind - more an improvement on what's already out there, one equipped with the ability to factor in how humans actually perceive touch screens. 

"The Samsungs, LGs, and Apples all have these kinds of things, but they don't talk about it because they don't want their competitors knowing," explains Sauce Labs co-founder and CTO Jason Huggins. Huggins is trying to make access to such machines more open, maintaining that they could be invaluable in helping app developers polish their software. To that end, he's created an open-source Robot known as Tapster, which - although it's significantly less accurate - is available at a fraction of the price of an Oculus unit. 

"If I can make a robot that can actually test apps, I suspect there's going to be a serious market," notes Huggins. "We have to think about this because software is not trapped inside a computer behind a keyboard and mouse anymore. You've got phones, tablets, Tesla's 17-inch touch screen, Google Glass, and Leap Motion, where there's no touching at all. These things depend on people having eyeballs and fingers, so we have to create a robotic version of that."