How Computers Just Got More Human

Scientists at IBM have created a new computer chip that mimics the processes in a human brain. Dan Lyons on how this breakthrough could produce smarter, more efficient computers.

As computers go, it’s pretty hard to beat the human brain. That three-pound lump of gray matter contains 100 billion neurons and 100 trillion synapses, or connections. It operates as a massively parallel computer, yet it uses just a tiny bit of energy. To recreate this machine in silicon you’d have to build a supercomputer that weighs hundreds of tons, fills a football field and requires so much electricity that it would take down the power grid. Oh, and it would probably cost you a few billion dollars.

Yet for an elite band of computer scientists the quest to reverse-engineer the brain represents the Mount Everest of computing, the biggest challenge out there. And in recent weeks a team of scientists at IBM has taken a small but significant step forward.

The IBMers announced that they’d managed to create an actual silicon processor that mimics the way the human brain works and can even learn, sort of. Next they’re going to start trying to hook these chips together to form computers that will be radically different from today’s machines. “What we have created are the seeds of an entirely new computing architecture,” says Dharmendra Modha, who oversees IBM’s Cognitive Computing effort, which produced this breakthrough.

The new machines won’t replace the kind of computers we use today, Modha says, but rather will complement them and enable scientists to solve different problems for which traditional computers are not well suited. “Today’s computers will be with us in perpetuity. We’re going after other kinds of tasks,” he says.

Today’s computers are “left brain” machines, Modha says. They’re great at crunching lots of numbers really fast and analyzing data. But they’re lousy at “right brain” tasks like pattern recognition. A new kind of computer could do things like predict tsunamis by tracking millions of sensors strung around the globe.

Huge obstacles exist, of course. The chips that IBM has made still use traditional semiconductor materials. That’s fine for a prototype, but ultimately IBM will need to design new chips, using new materials, that draw far less power. Another issue is software: nobody has experience writing programs that can operate in parallel across billions of cores.

Finally, nobody really knows how the brain works. So these guys are trying to copy something that remains at some level a mystery, even to Modha’s 100-member team, which consists of computer scientists, psychiatrists, psychologists, neuroscientists, mathematicians, physicists, and nanotechnologists. “Mother Nature has a way of keeping us humble,” Modha says.

Modha believes that in about a decade his team will produce a computer that will contain about 10 billion neurons (about half the number contained in the cerebral cortex) with 100 trillion synapses and will use only a kilowatt of power, which would be far more efficient than today’s computers.

The idea of building a human brain in silicon fires up the imaginations of people in the singularity movement, who believe that by the middle of this century machines will become as intelligent as humans and achieve self-awareness. Humans, meanwhile, will become more like machines, incorporating ever more digital technology into their brains and bodies.

But Modha insists his work has nothing to do with such fanciful visions. “We’re not trying to build a brain,” he says. “We are simply trying to draw inspiration from the brain to create a new generation of computing systems which will behave more like the brain.”

Nevertheless, the work that he and his team are doing could one day provide us with computers that could solve some of the biggest questions that scientists so far still can’t answer—about ourselves, our world, our universe. “The tree of human knowledge keeps growing,” Modha says. “This is one more branch.”