
IBM Creates Artificial Neurons from Phase Change Memory for Cognitive Computing


A team of scientists at IBM Research in Zurich has created an artificial version of neurons that uses phase-change materials to store and process data. These phase-change-based artificial neurons can be used to detect patterns and discover correlations in big data and unsupervised machine learning applications.

The results of the decade-long research into using phase-change materials for memory applications were recently published in the journal Nature Nanotechnology. The research team is led by Evangelos Eleftheriou.

The development of energy-efficient, ultra-dense integrated neuromorphic technologies for cognitive computing applications is getting a lot of attention. This technology, a foundation for event-based computation, could lead to extremely dense neuromorphic computing systems (computers inspired by the efficiency of the human brain) with co-located memory and processing units, speeding up cognitive computing and the analysis of IoT big data.

Deep learning, a rapidly developing branch of artificial intelligence, is inspired by biological brains and how they are put together.

The team applied a series of electrical pulses to the artificial neurons, which resulted in the progressive crystallization of the phase-change material, ultimately causing the neuron to fire, mimicking the “integrate-and-fire” property of biological neurons. The researchers organized hundreds of artificial neurons into populations and used them to represent fast and complex signals. Moreover, the artificial neurons have been shown to sustain billions of switching cycles, which corresponds to multiple years of operation at an update frequency of 100 Hz. The energy required for each neuron update was less than five picojoules, and the average power was less than 120 microwatts; for comparison, a 60-watt lightbulb consumes 60 million microwatts.
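To make the “integrate-and-fire” behavior concrete, here is a minimal, illustrative sketch in Python; it is not IBM's device model. Each pulse nudges an internal state (standing in for the degree of crystallization of the phase-change cell) toward a threshold, at which point the neuron fires and is reset, as if the cell were melted back to its amorphous phase. The threshold, step size, and noise values are assumptions chosen for illustration.

```python
# Illustrative integrate-and-fire sketch, not IBM's device physics.
import random

class PhaseChangeNeuronSketch:
    def __init__(self, threshold=1.0, step=0.1, noise=0.02):
        self.state = 0.0            # stand-in for the crystalline fraction
        self.threshold = threshold  # firing threshold (assumed)
        self.step = step            # crystallization per pulse (assumed)
        self.noise = noise          # per-pulse variability (assumed); the real
                                    # devices derive stochasticity from the
                                    # physics of crystallization

    def pulse(self):
        """Apply one electrical pulse; return True if the neuron fires."""
        self.state += self.step + random.gauss(0.0, self.noise)
        if self.state >= self.threshold:
            self.state = 0.0        # reset: melt back to the amorphous phase
            return True
        return False

neuron = PhaseChangeNeuronSketch()
fired_at = [i for i in range(30) if neuron.pulse()]
print("fired on pulses:", fired_at)  # roughly every 10 pulses, with jitter
```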

In this interview, research team member Manuel Le Gallo (IBM Research scientist) explained what makes neuromorphic computing more efficient than conventional computing:

In conventional computing, we have a separate memory and logic unit. Whenever you want to perform a computation, you must first access the memory, obtain your data, and transfer it to the logic unit, which performs the computation. And whenever you get a result, you have to send it back to the memory. This process goes back and forth continuously, so if you’re dealing with huge amounts of data, it becomes a real problem.

In a neural network, computing and storage are co-located. You don’t have to establish communication between logic and memory; you just have to make appropriate connections between the different neurons. That’s the main reason we think our approach will be more efficient, especially for processing large amounts of data.
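As a hedged illustration of the bottleneck Le Gallo describes, the toy model below simply counts memory-bus transfers under each architecture. The transfer counts, and the assumption that an in-memory update needs only a final read-out, are simplifications for illustration, not measurements of real hardware.

```python
# Toy accounting of data movement, not real hardware numbers.

def von_neumann_transfers(n_values, updates):
    # Each update fetches every value to the logic unit and
    # writes the result back: two bus crossings per value.
    return updates * n_values * 2

def in_memory_transfers(n_values, updates):
    # Updates happen where the data is stored; only the final
    # read-out crosses the bus (assumed for this sketch).
    return n_values

n, rounds = 1_000_000, 100
print(von_neumann_transfers(n, rounds))  # 200,000,000 bus transfers
print(in_memory_transfers(n, rounds))    #   1,000,000 bus transfers
```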

This technique has many use cases in the big data and machine learning areas. For example, in the Internet of Things (IoT), sensors can collect and analyze large volumes of weather data at the edge for faster forecasts. The artificial neurons could be used to detect patterns in financial transactions to find discrepancies, or to use data from social media to discover new cultural trends in real time. Large populations of these high-speed, low-energy nano-scale neurons could also be used in neuromorphic coprocessors with co-located memory and processing units.

The next step is to experiment with linking the neurons into networks that could be attached to sensors and tuned to detect different types of IoT data, such as unusual temperatures in factory machinery, electrical rhythms in a patient’s heart, or specific types of trades in financial markets.

Bigger versions of these networks could be baked onto standard computer chips, offering a fast, frugal co-processor designed to excel at pattern-recognition tasks like speech recognition or face recognition, now performed by slower, less efficient software running on standard circuitry. The goal is to shrink the conceptual gap between artificial brains and real ones.

InfoQ spoke with Manuel Le Gallo about the research project and the next areas of focus.

InfoQ: Can you briefly explain to our readers your research project and the recent breakthrough in creating artificial neurons that store data?

Manuel Le Gallo: We have imitated the integrate-and-fire functionality of a neuron using phase-change materials. This functionality is the foundation for event-based computation and, in principle, is similar to how our brain triggers a response to external stimuli, for example when we touch something hot.

These artificial neurons can process data on their own or can be organized into large sets (populations) in which their collective computational power is used, similarly to the way the neurons in the brain function. This technology could lead to the development of neuromorphic computers with highly co-located memory and processing units to speed up cognitive computing and analyze data from the Internet of Things.

InfoQ: How can the artificial neurons help with big data and machine learning use cases?

Le Gallo: Artificial neurons and synapses are computationally very powerful; already a single artificial neuron can be used to detect patterns and discover correlations in real-time streams of event-based data. For example, in the Internet of Things, sensors can collect and analyze large volumes of weather data at the edge for faster forecasts. Artificial neurons could also detect patterns in financial transactions to find discrepancies, or use data from social media to discover emerging cultural trends in real time. Large populations of these high-speed, low-energy nano-scale artificial neurons could also be used in neuromorphic coprocessors with co-located memory and processing units.
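As a hedged sketch of the correlation-detection idea mentioned above: a single integrate-and-fire neuron fed by many binary event streams fires far more often when the streams are correlated (events pile up within the same time step) than when they are independent. The event rates, leak, and threshold below are illustrative assumptions, not the experimental setup from the paper.

```python
# Sketch: one integrate-and-fire neuron as a correlation detector.
import random

def make_streams(n_streams, n_steps, correlation):
    # Correlated streams copy a shared driver with the given probability;
    # otherwise they emit independent events at a base rate of 0.1.
    common = [random.random() < 0.1 for _ in range(n_steps)]
    return [
        [c if random.random() < correlation else (random.random() < 0.1)
         for c in common]
        for _ in range(n_streams)
    ]

def count_fires(streams, threshold=8.0, leak=0.5):
    potential, fires = 0.0, 0
    for t in range(len(streams[0])):
        potential = potential * leak + sum(s[t] for s in streams)
        if potential >= threshold:
            potential, fires = 0.0, fires + 1  # fire and reset
    return fires

random.seed(0)
correlated = make_streams(20, 1000, correlation=0.8)
independent = make_streams(20, 1000, correlation=0.0)
print(count_fires(correlated), count_fires(independent))
# The correlated streams should trigger many more firing events.
```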

InfoQ: How do neuromorphic technologies compare to traditional approaches in terms of energy efficiency and cost effectiveness?

Le Gallo: Today's computers are based on the von Neumann architecture, in which the computing and the memory units are physically separated. This architecture is highly inefficient for the data-centric nature of cognitive computing, in which large amounts of data must be transferred between memory and the computational units at high speeds. To build efficient cognitive computers, we need to transition to non-von Neumann architectures in which memory and logic coexist in some form.

Neuromorphic computing is a very promising approach to non-von Neumann computing that draws inspiration from the inner workings of the biological brain. By mimicking the way biological neurons function, energy and areal costs can be drastically decreased for complex computational tasks such as pattern recognition, feature extraction and mining of data in noisy environments.

InfoQ: Can you discuss any constraints of using artificial neurons to store and process data?

Le Gallo: As we wrote in our paper in Nature Nanotechnology:

In comparison to the generic spike-based neuron models used in neuroscience, as well as CMOS-based neuronal circuitry and higher-level platforms for the implementation of neuronal networks, the functionality of phase-change neurons has certain limitations. As most of the stochastic properties arise directly from the physics of crystallization, the degree to which the stochastic response, the neuronal parameters and the membrane potential dynamics can be tuned with minimal circuitry requirements is limited. In particular, high-dimensional parameter adaptation and homeostasis, as well as comparably fast recurrent dynamics such as (nonlinear) leakage, might require further dedicated logic and electronic components. An alternative approach is to use the memristive device as an additive source of stochasticity in conventional circuitry. Also, realizations on extremely small technological nodes may exhibit increased effects of fixed-pattern as well as temporal noise due to device variability and nanoscale physical effects, which might be disadvantageous for algorithms in which stochasticity must be tightly controlled.

InfoQ: What is the future of the research project? What are the next areas of focus?

Le Gallo: Phase-change neurons can be seamlessly combined with phase-change synapses, as our follow-up research demonstrates (Tuma et al., IEEE Electron Device Letters and Pantazi et al., IOP Nanotechnology). This enables the creation of dense neuromorphic systems and goes even beyond phase-change memristive devices. The work is complementary to results obtained on other memristor-based synapses, memristor-based neurites and other memelements, and in combination with those will contribute to reducing the number of active elements and increasing the density of massively parallel computing systems.

 
