IBM Unveils SyNAPSE Chip That Mimics the Human Brain

By Darryl K. Taft  |  Posted 2014-08-07

DARPA has funded the project since 2008 with approximately $53 million through Phases 0, 1, 2, and 3 of the program. Current collaborators include Cornell Tech and iniLabs.

“It is an astonishing achievement to leverage a process traditionally used for commercially available, low-power mobile devices to deliver a chip that emulates the human brain by processing extreme amounts of sensory information with very little power,” said Shawn Han, vice president of Foundry Marketing at Samsung Electronics, in a statement. “This is a huge architectural breakthrough that is essential as the industry moves toward the next-generation cloud and big-data processing.”
The chip's event-driven circuit elements use the asynchronous design methodology developed at Cornell Tech and refined in collaboration with IBM since 2008.

“After years of collaboration with IBM, we are now a step closer to building a computer similar to our brain,” said Rajit Manohar, a professor at Cornell Tech.

The combination of cutting-edge process technology, a hybrid asynchronous-synchronous design methodology, and a new architecture has led to a power density of 20 mW/cm², nearly four orders of magnitude lower than that of today’s microprocessors.
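
To put that figure in perspective, here is a rough back-of-the-envelope check. The 20 mW/cm² value comes from the announcement; the power density assumed below for a conventional server-class processor (on the order of 100 W/cm²) is a typical ballpark figure, not a number from IBM, so treat the result as an illustration only.

```python
# Back-of-the-envelope check of the "nearly four orders of magnitude" claim.
# The 20 mW/cm^2 figure comes from the article; the conventional-CPU power
# density below is an assumed typical value, not from IBM's announcement.
import math

synapse_chip_density = 20e-3        # W/cm^2 (SyNAPSE chip, per the article)
conventional_cpu_density = 100.0    # W/cm^2 (assumed typical modern CPU)

ratio = conventional_cpu_density / synapse_chip_density
print(f"Power-density ratio: {ratio:,.0f}x "
      f"(~{math.log10(ratio):.1f} orders of magnitude)")
# Expected output: Power-density ratio: 5,000x (~3.7 orders of magnitude)
```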

The new chip is a component of an end-to-end vertically integrated ecosystem spanning a chip simulator, neuroscience data, supercomputing, neuron specification, programming paradigm, algorithms and applications, and prototype design models. The ecosystem supports all aspects of the programming cycle.

To promote this fundamentally different technological capability, IBM has designed a novel teaching curriculum for universities, customers, partners, and IBM employees.

IBM says this ecosystem signals a shift toward moving computation closer to the data, taking in varied kinds of sensory data, analyzing and integrating real-time information in a context-dependent way, and dealing with the ambiguity found in complex, real-world environments.

Meanwhile, looking to the future, IBM is working on integrating multi-sensory neurosynaptic processing into mobile devices constrained by power, volume, and speed; integrating novel event-driven sensors with the chip; accelerating real-time multimedia cloud services with neurosynaptic systems; and building neurosynaptic supercomputers by tiling multiple chips on a board, creating systems that would eventually scale to one hundred trillion synapses and beyond.
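
To give a sense of the scale that tiling implies, a hedged sketch follows: it assumes 256 million programmable synapses per chip, the figure IBM has published for this chip elsewhere; that number does not appear in the passage above and should be treated as an assumption here.

```python
# Rough scaling estimate for a tiled neurosynaptic system.
# The 256-million-synapses-per-chip figure is IBM's published spec for the
# SyNAPSE chip; it is assumed here rather than taken from this article.
synapses_per_chip = 256_000_000
target_synapses = 100_000_000_000_000   # one hundred trillion

chips_needed = target_synapses / synapses_per_chip
print(f"Chips required to reach {target_synapses:.0e} synapses: "
      f"~{chips_needed:,.0f}")
# Expected output: Chips required to reach 1e+14 synapses: ~390,625
```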

Building on previously demonstrated neurosynaptic cores with on-chip, online learning, IBM said it envisions building learning systems that adapt in real-world settings. While today’s hardware is fabricated using a modern CMOS process, the underlying architecture is poised to exploit advances in future memory, 3D integration, logic, and sensor technologies to deliver even lower power consumption, a denser package, and faster speed, IBM said.
