IBM Research Creates Programming Model to Mimic Human Brain Power
IBM is presenting these innovations this week at the International Joint Conference on Neural Networks in Dallas, Texas. IBM notes that modern computing systems were designed decades ago for sequential processing according to a predefined program. Although they are fast and precise "number crunchers," traditionally designed computers become constrained by power and size, and lose effectiveness, when applied to real-time processing of noisy, analog, voluminous big data. In contrast, the brain, which operates comparatively slowly and at low precision, excels at tasks such as recognizing, interpreting and acting upon patterns, while consuming about as much power as a 20-watt light bulb and occupying the volume of a two-liter bottle, IBM said. As IBM's Modha told Forbes, "Think of today's computers as left brained and SyNAPSE as right brained." The left brain is associated with analytical, logical and objective thinking; SyNAPSE, by analogy, is meant to be more intuitive, thoughtful and subjective, characteristics associated with the right brain.
In August 2011, IBM demonstrated a building block of a novel brain-inspired chip architecture based on a scalable, interconnected, configurable network of "neurosynaptic cores." Each core brings memory ("synapses"), processors ("neurons") and communication ("axons") into close proximity and executes activity in an event-driven fashion. These chips serve as a platform for emulating and extending the brain's ability to respond to biological sensors and to analyze vast amounts of data from many sources at once, IBM said. Systems built from these chips could bring the real-time capture and analysis of various types of data closer to the point of collection, IBM said. They would gather not only symbolic data (fixed text or digital information) but also sub-symbolic data, which is sensory based and whose values change continuously.
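To illustrate the event-driven idea, here is a minimal, hypothetical sketch in Python of a single "neurosynaptic core." It is a drastic simplification, not IBM's actual design: a weight matrix stands in for synapses, simple integrate-and-fire units stand in for neurons, and a queue of axon spike events stands in for communication. The names `Core`, `deliver` and `run` are invented for this example; the point is that computation happens only when a spike arrives, rather than on every tick of a predefined program.

```python
from collections import deque

class Core:
    """Toy event-driven core: synapses (memory), neurons (processing)."""

    def __init__(self, weights, threshold=1.0):
        self.weights = weights                    # weights[axon][neuron]: synapse strengths
        self.threshold = threshold                # firing threshold shared by all neurons
        self.potential = [0.0] * len(weights[0])  # membrane potential per neuron

    def deliver(self, axon):
        """Deliver one incoming spike on an axon; return indices of neurons that fire."""
        fired = []
        for n, w in enumerate(self.weights[axon]):
            self.potential[n] += w
            if self.potential[n] >= self.threshold:
                fired.append(n)
                self.potential[n] = 0.0           # reset after firing
        return fired

def run(core, input_spikes):
    """Process a queue of axon spike events; no work is done between events."""
    events = deque(input_spikes)
    out = []
    while events:
        out.extend(core.deliver(events.popleft()))
    return out

# Usage: a 2-axon, 2-neuron core. Neither neuron fires on the first spike;
# both cross threshold after the second.
core = Core([[0.6, 0.2], [0.6, 0.9]])
print(run(core, [0, 1]))  # → [0, 1]
```

In a design like this, silence is free: axons that carry no spikes trigger no memory reads and no arithmetic, which is one intuition behind the low power consumption IBM describes.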
This raw data reflects activity of every kind in the world, including commerce, social interaction, logistics, location, movement and environmental conditions. For example, IBM said, the human eye sifts through more than a terabyte of data per day. Emulating the visual cortex, low-power, lightweight eyeglasses designed to help the visually impaired could be outfitted with multiple video and auditory sensors that capture and analyze this optical flow of data. These sensors would gather and interpret large volumes of data to signal how many individuals are ahead of the user, the distance to an upcoming curb, the number of vehicles in a given intersection, and the height of a ceiling or the length of a crosswalk. Like a guide dog, the sub-symbolic data perceived by the glasses would allow users to plot the safest pathway through a room or outdoor setting and help them navigate the environment via embedded speakers or earbuds. This same technology, at increasing levels of scale, could form sensory-based data input capabilities and on-board analytics for automobiles, medical imagers, health care devices, smartphones, cameras and robots, IBM said.
Having completed Phase 0, Phase 1 and Phase 2, IBM and its collaborators (Cornell University and iniLabs) have recently been awarded $12 million in new funding from DARPA for Phase 3 of the SyNAPSE project, bringing cumulative funding to $53 million.