    IBM Develops Artificial Neurons to Speed Up Cognitive Computing

    By Darryl K. Taft - August 4, 2016

      IBM research scientists have advanced the company’s effort to extend cognitive computing with a breakthrough that could lead to the development of neuromorphic computers.

      Neuromorphic computing, or brain-inspired computing, is the use of computing technology built to perform like the neuro-biological architectures in the human nervous system. A team of scientists at IBM Research in Zurich has developed technology that imitates the way neurons spike, such as when a person touches something sharp or very hot.

      In a blog post on the new discovery, IBM research scientist Manuel Le Gallo said IBM has developed artificial neurons that can be used to detect patterns and discover correlations in big data, with power budgets and at densities comparable to those seen in biology.

      Le Gallo co-authored a paper titled “Stochastic phase-change neurons,” which appeared this week in the journal Nature Nanotechnology. It is the second scientific breakthrough IBM has published in that journal this week: earlier, the company published a paper on its effort to build new lab-on-a-chip technology to help fight cancer and other diseases.

      Le Gallo said the artificial neurons are built to mimic what a biological neuron does, though they won’t have exactly the same functionality. Still, he said, they come close enough to achieve computation similar to that of the brain.

      He also noted that typically, artificial neurons are built with standard complementary metal oxide semiconductor (CMOS)-based circuits, which is the stuff most of today’s computers are made of. However, IBM is using non-CMOS devices, such as phase-change devices, to reproduce similar functionality at lower power consumption and increased areal density, Le Gallo said.
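As a rough illustration of that idea, the following sketch models a single neuron whose "membrane potential" is the conductance of a phase-change cell that partially crystallizes with each input pulse and is re-amorphized when it fires. Every name and parameter here is an assumed toy value for illustration, not the device physics from the Nature Nanotechnology paper:

```python
import random

random.seed(1)

def run_neuron(input_rate, steps=1000):
    """Count spikes of one toy stochastic phase-change neuron."""
    conductance = 0.0                          # state of the phase-change cell
    threshold = random.gauss(1.0, 0.1)         # per-reset device variability
    spikes = 0
    for _ in range(steps):
        if random.random() < input_rate:       # an input pulse arrives
            conductance += 0.1                 # partial crystallization
        if conductance >= threshold:           # the neuron fires
            spikes += 1
            conductance = 0.0                  # reset by re-amorphization
            threshold = random.gauss(1.0, 0.1) # fresh stochastic threshold
    return spikes

# The firing rate tracks the input rate: stronger input, more spikes.
low, high = run_neuron(0.1), run_neuron(0.5)
print(low, high)
```

The randomized threshold after each reset stands in for the inherent variability of the phase-change material, which is what makes the neurons stochastic rather than deterministic.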

      The goal is to imitate the computational power of a massive amount of neurons to accelerate cognitive computing for analyzing things such as the explosion of information coming from the internet of things (IoT) and other sources of big data.

      In its paper, the IBM Research team demonstrated how the neurons could detect correlations from multiple streams of events.

      “Events could be, for example, Twitter data, weather data or sensory data collected by the internet of things,” Le Gallo said. “Assume that you have multiple streams of binary events and you want to find which streams are temporally correlated; for example, when the 1s come concurrently. We show in the paper how we could do this discrimination using just one neuron connected to multiple plastic synapses receiving the events.”
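The discrimination Le Gallo describes can be pictured with a small simulation. This is a hedged sketch: the coincidence-detecting neuron, the Hebbian-style learning rule, and all parameters below are illustrative assumptions, not the phase-change dynamics or synapse model used in the paper. Streams 0-2 share a hidden common driver (their 1s tend to come concurrently), streams 3-5 fire independently, and the plastic synapses of the correlated streams end up with the strongest weights:

```python
import random

random.seed(0)

N_CORR, N_UNCORR = 3, 3   # correlated vs. independent event streams
STEPS = 2000
THRESHOLD = 1.2           # weighted coincidences needed for a spike
LEARN_RATE = 0.01

weights = [0.5] * (N_CORR + N_UNCORR)   # plastic synapse strengths

for _ in range(STEPS):
    # Correlated streams fire together via a common driver.
    common = random.random() < 0.2
    events = [common or random.random() < 0.05 for _ in range(N_CORR)]
    events += [random.random() < 0.2 for _ in range(N_UNCORR)]

    drive = sum(w * e for w, e in zip(weights, events))
    if drive >= THRESHOLD:              # the single neuron spikes
        for i, active in enumerate(events):
            # Strengthen synapses active at the spike, weaken the rest.
            delta = LEARN_RATE if active else -LEARN_RATE
            weights[i] = min(1.0, max(0.0, weights[i] + delta))

correlated = sum(weights[:N_CORR]) / N_CORR
uncorrelated = sum(weights[N_CORR:]) / N_UNCORR
print(f"correlated streams:   {correlated:.2f}")
print(f"uncorrelated streams: {uncorrelated:.2f}")
```

Because the neuron only spikes when several inputs arrive at once, spikes mostly coincide with the common driver, so the update rule pushes the correlated synapses toward their maximum weight and the independent ones toward zero, revealing which streams are correlated.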

      Le Gallo said neuromorphic computing is simply more efficient than conventional computing because computing and storage are co-located in a neural network. In conventional computing, memory and logic are separate. To perform a computation, you must first access the memory, fetch the data and transfer it to the logic unit, which performs the computation, he said.

      “And whenever you get a result, you have to send it back to the memory,” said Le Gallo. And this process goes back and forth continuously. “So if you’re dealing with huge amounts of data, it will become a real problem.”

      However, with computing and storage co-located in a neural network, “You don’t have to establish communication between logic and memory; you just have to make appropriate connections between the different neurons,” he noted. “That’s the main reason we think our approach will be more efficient, especially for processing large amounts of data.”

      IBM’s artificial neurons consist of phase-change materials, including germanium antimony.

      “We have been researching phase-change materials for memory applications for over a decade, and our progress in the past 24 months has been remarkable,” IBM Fellow Evangelos Eleftheriou said in a statement.

      Eleftheriou added that the new memory techniques demonstrate the capabilities of phase-change-based artificial neurons, “which can perform various computational primitives such as data-correlation detection and unsupervised learning at high speeds using very little energy.”
