
Artificial neurons are millions of times faster than ours


Creating artificial neurons that are more efficient than human neurons… the idea is not new. But researchers at MIT have taken it to a new level: they claim to have created an artificial neural network capable of working a million times faster than our biological neurons.

This feat was reportedly achieved using an “analog” neural network. But what is the point of creating artificial neurons? To understand this, we first need to revisit the concept of a “neural network”. According to the definition given by the Brain Research Consortium, neurons can be seen as the “basic working units” of the brain. They are specialized cells that send information to other nerve cells, depending on their area of specialization. They usually consist of:

  • A dendrite, which receives the nerve signal
  • A soma, the cell body, which decodes it
  • An axon, which transmits it

These neurons are interconnected by synapses, which link the axon of one neuron to the dendrite of another. They communicate via electrical signals called “action potentials”, which trigger the release of neurotransmitters. The latter are “chemical messengers” responsible for crossing the synapses to transmit the information. Together, this forms a natural neural network.

An artificial neural network is commonly associated with the field referred to as “artificial intelligence”. It is actually a system that is “fed” with large amounts of data to “learn” and extract logical connections in view of a given objective. These learning methods are inspired by the activity of biological neurons, which is why we speak of an “artificial neural network”.

A learning system inspired by biological neurons

In practice, the data is spread over an artificial “grid” of neurons, usually virtual ones. These are in fact points in a network connected by computer code (synapses, in a way). This network therefore handles incoming information, training data and outgoing information.

In both cases, we see a “learning” phenomenon involving data processing. In our (biological) brains, connections between neurons, called synapses, are strengthened or weakened by experience and learning. In an artificial neural network, the principle is somewhat similar: connections between points in the network are weighted according to the processing of large amounts of data. That’s why we talk about deep learning.
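To make this idea concrete, here is a minimal sketch in Python; it is a generic illustration, not the MIT system, and the numbers are arbitrary. A single artificial “neuron” combines its inputs through weighted connections, and a simple gradient-descent step strengthens or weakens those weights so that the output moves toward a target, the artificial analogue of the “strengthening or weakening” described above.

```python
# Minimal sketch (not the MIT system): one artificial "neuron" whose
# connection weights are nudged by gradient descent during "learning".
import numpy as np

rng = np.random.default_rng(0)
weights = rng.normal(size=3)          # connection strengths (the "synapses")
inputs = np.array([0.5, -1.0, 2.0])   # signals arriving on the input side
target = 1.0                          # desired output for this toy example
learning_rate = 0.1

for step in range(100):
    output = np.tanh(weights @ inputs)         # weighted sum + activation
    error = output - target
    # Gradient of the squared error with respect to each weight
    grad = 2 * error * (1 - output**2) * inputs
    weights -= learning_rate * grad            # strengthen/weaken connections

print(f"final output: {np.tanh(weights @ inputs):.3f} (target {target})")
```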

The innovation presented here by the scientists is a neural network that performs these calculations very quickly and with low energy requirements. To achieve this, they explain, the network is not based on digital circuits but on analog ones. So let’s go back over the difference between analog and digital.

Analog and digital are two different processes. Both allow data, such as audio, pictures or video, to be transported and stored. Analog systems appeared at the dawn of electricity; digital, on the other hand, appeared with the computer. In an analog system, the basic principle is to reproduce the signal to be recorded in the same form.

Digital and Analog

For example, analog television worked on this principle. The image to be retransmitted is converted into electrical signals, called “video signals”, characterized by their frequency, i.e. the number of oscillations per second. These electrical signals are then retransmitted via an electromagnetic wave whose amplitude follows that of the original signal. The transmitted signal is thus a kind of “reproduction” of the original one.

In digital, the signal to be recorded is converted into a sequence of 0s and 1s. The amplitudes are therefore no longer reproduced, but encoded at the source and decoded on arrival. This is what changed with the transition to digital television.
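As a toy illustration of this difference (not taken from the article, and with arbitrary values), the following Python snippet samples a continuous sine wave and quantizes each sample into 3 bits, turning an “analog” signal into a stream of 0s and 1s:

```python
# Toy illustration: turning a continuous ("analog") signal into bits.
# A sine wave is sampled, quantized to 8 levels, and encoded as 0s and 1s.
import numpy as np

t = np.linspace(0, 1, 16)                 # 16 sample instants over one second
analog = np.sin(2 * np.pi * 2 * t)        # 2 Hz "analog" signal, amplitude -1..1

levels = 8                                # 3-bit quantization: 8 possible amplitudes
quantized = np.round((analog + 1) / 2 * (levels - 1)).astype(int)
bits = [format(int(q), "03b") for q in quantized]

print("analog samples :", np.round(analog, 2))
print("digital stream :", "".join(bits))
```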

In digital, we therefore get a signal with only two amplitudes, instead of the infinity of amplitudes in analog. So far, artificial neural networks have mostly worked on the digital principle: network weights are programmed using learning algorithms, and calculations are performed on sequences of 0s and 1s. By using an analog system, however, the MIT scientists were able to build a neural network that is, according to them, much faster and more efficient than a human one. A million times faster, to be exact.

In an analog deep learning system, what matters is not the transmission of data in the form of 0s and 1s, but rather “the increase and decrease of the electrical conductance of protonic resistors”, which is what implements machine learning, reads the MIT press release. Conductance is defined as the ability to let current pass (the inverse of resistance). “Conductance is controlled by the movement of protons. To increase the conductance, more protons are pushed into the channel of the resistor, while to decrease it, protons are removed. This is accomplished by using an electrolyte (similar to the one in a battery) that conducts protons but blocks electrons.”

Electrical resistance is a physical property of a material that opposes the flow of current in a circuit. A component with this property, a resistor, is used to control the passage of electrons in an electrical circuit. In the present case, it is a key element because it regulates the movement of protons.
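To picture the principle, here is a deliberately simplified toy model in Python. The class name and all the numbers are invented for illustration and do not reflect MIT’s actual device physics; the sketch only captures the idea that pushing protons into the channel raises the conductance (the stored “weight”) and pulling them out lowers it.

```python
# Toy model (not MIT's device physics): an analog "weight" stored as the
# conductance of a protonic resistor. Pushing protons into the channel
# raises conductance; pulling them out lowers it. Conductance G = 1 / R.
class ProtonicResistor:
    def __init__(self, protons=100, delta_g_per_proton=1e-6):
        self.protons = protons
        self.delta_g = delta_g_per_proton    # conductance change per proton (arbitrary units)

    @property
    def conductance(self):
        return self.protons * self.delta_g   # illustrative value, in siemens

    def pulse(self, n_protons):
        """Apply a voltage pulse that inserts (+) or removes (-) protons."""
        self.protons = max(0, self.protons + n_protons)

r = ProtonicResistor()
print(f"initial weight (conductance): {r.conductance:.2e} S")
r.pulse(+50)      # potentiation: push protons in, conductance rises
print(f"after potentiation:           {r.conductance:.2e} S")
r.pulse(-120)     # depression: pull protons out, conductance falls
print(f"after depression:             {r.conductance:.2e} S")
```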

A strong resistance to electrical pulses

Why does this process allow the neural network to run faster? “First, computation is performed in memory, so large loads of data are not transferred back and forth between memory and a processor”, the scientists explain. “Analog processors also work in parallel. As the size of the matrix increases, an analog processor does not need more time to perform the new operations, because all the calculations occur simultaneously.”
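This parallelism can be pictured as a crossbar of resistors: the weights are stored as conductances, the inputs are applied as voltages on the row wires, and every output current appears at once on its column wire. The Python sketch below simulates this digitally, so it only illustrates the principle, not the hardware; the matrix sizes and values are arbitrary.

```python
# Sketch of in-memory analog computation (simulated digitally here):
# weights live in a crossbar as conductances G, inputs are applied as
# voltages V, and each output current I_j = sum_i V_i * G_ij appears
# simultaneously on its column wire -- a whole matrix-vector product at once.
import numpy as np

rng = np.random.default_rng(1)
G = rng.uniform(0.0, 1.0, size=(4, 3))   # 4x3 crossbar of conductances (the "weights")
V = np.array([0.2, -0.5, 0.8, 0.1])      # input voltages on the 4 row wires

I = V @ G                                # Ohm's law + Kirchhoff's current law per column
print("output currents (one per column):", np.round(I, 3))
```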

The speeds achieved are measured in nanoseconds. This is possible because the scientists used a specific material: inorganic phosphosilicate glass (PSG), similar to the material found in desiccant bags. It is a good proton conductor because it contains nanometer-sized pores through which protons can travel, while also being able to withstand strong pulsed voltages. According to the scientists, this robustness is essential, because it allows higher voltages to be applied and therefore higher speeds to be reached.

The action potential of biological cells rises and falls on a timescale of milliseconds, because the voltage difference of about 0.1 volt is constrained by the stability of water, explains senior author Ju Li, Battelle Energy Alliance Professor of Nuclear Science and Engineering and professor of materials science and engineering: “Here, we apply up to 10 volts across a special solid glass film of nanoscale thickness that conducts protons, without permanently damaging it. And the stronger the field, the faster the ionic devices.”
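A quick back-of-the-envelope check, using only the timescales mentioned here (milliseconds for biological action potentials, nanoseconds for the analog devices), shows where the “million times faster” figure comes from:

```latex
\frac{t_{\text{biological}}}{t_{\text{analog}}}
  \approx \frac{10^{-3}\ \text{s (millisecond)}}{10^{-9}\ \text{s (nanosecond)}}
  = 10^{6} \quad \text{(about one million)}
```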

The scientists believe the system could be re-engineered for high-volume manufacturing, and they have high hopes for this development: “Once you have an analog processor, you will no longer be training the networks everyone else is working on, but networks of unprecedented complexity that no one else can afford, thereby surpassing anything previously possible. In other words, this is not a faster car, it’s a spaceship”, adds Murat Onen, lead author and postdoctoral researcher at MIT.

Source: Science
