
DARPA's Looking To Develop Processors That Can Learn

I'll be honest: there have been some pretty impressive advances in computing since I was born (and naturally before that, as well). We've watched the world go from bytes to gigabytes, seen previously impossible hand-held communication devices, and seen computers shrink to the size of credit cards. We've borne witness to the birth and proliferation of a global information network the likes of which the world has never before seen. We've seen the world changing and evolving before our eyes, and it's going to keep doing so.

One thing a lot of folks seem to be fascinated with - myself included, admittedly - is the creation of artificial life; the development of machine intelligence capable of closely mimicking our own. Though we've made great strides to that end, there's a limit to how far we're going to be able to go. It's not the software that's holding us back in this regard - it's the hardware. See, no matter how advanced we make our AI, eventually it's going to run into a brick wall when it outstrips its processor.

DARPA - short for the Defense Advanced Research Projects Agency - is looking to change that, and to push computers beyond anything they've ever been capable of. Over the weekend, the organization put out an RFI (Request For Information) on a concept known as a "Cortical Processor." Essentially, it's a piece of technology that mimics the growth of neural pathways in the human brain; it develops new connections as it gathers new information - a computer which quite literally "learns."
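To make that "learning in the wiring" idea a bit more concrete, here's a minimal sketch in Python. It uses a classic Hebbian-style update rule - connections between units that fire together get stronger - purely as an illustration; the RFI doesn't specify an algorithm, so none of the names or numbers below come from DARPA.

```python
import numpy as np

# A toy "learning in the wiring" sketch: connection weights that strengthen
# as correlated activity is observed. This is a classic Hebbian update rule
# used purely for illustration - it is not DARPA's or IBM's actual design.

class TinyHebbianLayer:
    def __init__(self, n_inputs, n_outputs, learning_rate=0.01):
        self.weights = np.zeros((n_outputs, n_inputs))  # the "wiring" starts out blank
        self.learning_rate = learning_rate

    def respond(self, x):
        # Output activity is just a weighted sum of input activity.
        return self.weights @ x

    def learn(self, x):
        # Hebbian rule: connections between co-active units get stronger.
        y = self.respond(x)
        self.weights += self.learning_rate * np.outer(y, x)

# Show the layer the same input pattern over and over: the connections that
# carry it grow stronger, so the "knowledge" ends up in the wiring itself.
layer = TinyHebbianLayer(n_inputs=4, n_outputs=2)
pattern = np.array([1.0, 0.0, 1.0, 0.0])
layer.weights[0] = 0.1 * pattern   # seed a faint initial response
for _ in range(50):
    layer.learn(pattern)
print(layer.weights.round(2))      # the connections matching the pattern have grown
```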

It sounds like a pipe dream. It sounds like an impossible technology. It sounds like the exposition of a science fiction novel. I assure you, it's very, very real.

We might as well start preparing for the inevitable arrival of our robot overlords now, eh? 

The field of study involving learning chips is known as cognitive computing. IBM is undoubtedly one of the top dogs of this fledgling industry - in recent years, its engineers have been working on a long-term, DARPA-funded project known as SyNAPSE. One of the most recent developments of this project is a new breed of prototype chip capable of operating on external stimuli like touch, sight, or sound. This will be an enormous boon for monitoring systems - a supercomputer designed with these chips could feasibly tap into vast, global networks of sensors, sifting through a veritable sea of data in order to seek out patterns it's learned to recognize.
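For a sense of what "sifting a sea of sensor data for learned patterns" looks like in the most stripped-down terms, here's a small Python sketch. It slides a previously learned template over a long, noisy stream of readings and flags the spots that resemble it; the template, noise levels, and threshold are all invented for illustration and have nothing to do with SyNAPSE's actual hardware.

```python
import numpy as np

# Toy pattern-hunting over a sensor stream: slide a learned template across a
# long run of noisy readings and flag the windows that strongly resemble it.
# Everything here (template, noise, threshold) is made up for illustration.

rng = np.random.default_rng(0)

learned_pattern = 2.0 * np.sin(np.linspace(0.0, np.pi, 25))  # the shape the system "knows"
stream = rng.normal(0.0, 0.2, size=1000)                     # mostly noise...
for start in (200, 650):                                     # ...with the pattern buried twice
    stream[start:start + len(learned_pattern)] += learned_pattern

def find_matches(stream, pattern, threshold=0.85):
    """Return indices where the stream locally resembles the learned pattern."""
    p = (pattern - pattern.mean()) / pattern.std()
    hits = []
    for i in range(len(stream) - len(pattern)):
        window = stream[i:i + len(pattern)]
        w = (window - window.mean()) / (window.std() + 1e-9)
        score = float(np.dot(w, p)) / len(pattern)           # normalized correlation
        if score > threshold:
            hits.append(i)
    return hits

print(find_matches(stream, learned_pattern))  # indices clustered around 200 and 650
```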

Such technology could well be the long-awaited answer to the Big Data problem (though to suggest that this is the only use for the technology would needlessly devalue it).

Of course, we're still a long haul away from anything truly world-changing. IBM's ultimate goal is to design computers which incorporate ten billion digital neurons and one hundred trillion synaptic connections - it wants to quite literally build a "brain in a box." At the moment, the prototype chips have only two hundred fifty-six neurons and a few tens of thousands of programmable synapses. To push that evolution along, IBM recently unveiled a new software ecosystem custom-tooled for programming cognitive chips.

We've seen a lot of world-changing technologies in the past few decades. Electricity. Radio. Television. The Internet. Smartphones...the list goes on. If what IBM is working on pans out, we might well be adding yet one more item to that list. These learning machines could very well end up changing the world, taking our society in heretofore unseen directions and unlocking possibilities of which we've hardly even dared to dream. 

Via News Observer