IBM's Watson recently showed that supercomputers can play the game show "Jeopardy" as well as or better than humans. The one advantage humans still held over software simulations of learning and recall was that the human analog nervous system is smaller and consumes far less power than the room full of high-performance servers Watson required. Now, by using low-power hardware emulations of human neural networks instead of software simulations on high-performance supercomputers, Massachusetts Institute of Technology researchers predict that microchip-sized artificial brains will learn and recall even faster than humans.
MIT's brain-like microchip uses more than 400 transistors and other support circuits to emulate the ionic fluids that cause real brains to learn. (Source: MIT)
In this age of computer-aided design (CAD), software simulations have become the first approximation for any smart system. Which parts of a design--if any--to implement as a hardware emulation often depends on the results of those software simulations. By identifying the most frequently executed routines and estimating how much a hardware emulation would speed them up, designers can evaluate whether the extra cost of accelerators or custom application-specific integrated circuits (ASICs) is justified. Emulators outperform simulators by recreating an analog function in another medium. With this new work, the researchers use electronic charge in place of chemical ions.
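The accelerate-or-not calculation described above is commonly framed with Amdahl's law. A minimal sketch, with made-up profiling numbers (the 80% fraction and 50x local speedup are illustrative assumptions, not figures from the MIT work):

```python
# Hypothetical sketch: estimating the payoff of moving hot routines
# into a hardware emulator. Uses Amdahl's law; all numbers are made up.

def overall_speedup(accelerated_fraction: float, local_speedup: float) -> float:
    """Amdahl's law: overall speedup when only part of the workload
    runs faster (the rest still executes at the original speed)."""
    return 1.0 / ((1.0 - accelerated_fraction)
                  + accelerated_fraction / local_speedup)

# Suppose profiling shows 80% of simulation time is spent in synapse
# updates, and a hardware emulation runs that portion 50x faster.
print(round(overall_speedup(0.80, 50.0), 2))  # prints 4.63
```

The takeaway of the formula is that the unaccelerated remainder dominates: even a 50x accelerator yields under 5x overall if it covers only 80% of the runtime.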
To speed up simulations of human learning and recall, Chi-Sang Poon, a principal research scientist in the Harvard-MIT Division of Health Sciences and Technology, and associates have created a hardware emulation of the brain's learning element--the synapse. Brain cells are connected by synapses that strengthen when voltage spikes trigger learning and atrophy when the absence of spikes causes forgetting. Poon claims his electronic synapses emulate both learning and forgetting even faster than in humans, by virtue of using electronic charges in wires as opposed to chemical ions in neurotransmitter channels.
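The strengthen-on-spikes, atrophy-without-them behavior described above can be sketched as a simple weight-update rule. This is a generic illustrative model, not the circuit MIT built; the learning and decay rates are arbitrary assumptions:

```python
# Hypothetical sketch of the learn/forget dynamic described above:
# a synaptic weight that grows toward a ceiling when voltage spikes
# arrive, and decays (forgets) when they do not. Rates are illustrative.

def update_weight(w: float, spike: bool,
                  learn_rate: float = 0.1,
                  decay: float = 0.02,
                  w_max: float = 1.0) -> float:
    if spike:
        return w + learn_rate * (w_max - w)  # learning: grow toward w_max
    return w * (1.0 - decay)                 # forgetting: slow atrophy

w = 0.5
for spike in [True, True, True]:   # a burst of spikes strengthens it
    w = update_weight(w, spike)
for spike in [False, False]:       # silence lets it fade
    w = update_weight(w, spike)
```

The asymmetry (fast growth, slow decay) mirrors the biological intuition that learning is event-driven while forgetting is gradual.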
"We wanted to mimic brain functions realistically, by capturing the intracellular processes that are ion-channel-based, not just the voltage spikes," said Poon. "Our model now captures all the ionic processes going on inside a synapse."
MIT's artificial synapse could eventually become a circuit element in a neural prosthetic--such as an artificial retina that restores vision--and may one day be replicated across very-large-scale integrated circuits (VLSIs), where millions of them could emulate whole brain regions, such as the pattern-recognition circuitry of the visual cortex. Today, however, the MIT researchers are still perfecting a single synapse, which so far requires about 400 transistors. Poon did the research with fellow MIT professor Mark Bear, University of Texas professor Harel Shouval and former MIT postdoctoral researcher Guy Rachmuth.