ICE Seminar « Analogue hardware for energy efficient AI »

Dec. 3rd 2020, 2 pm
Online event (see link below)


Neural network computing, popularized more recently as Artificial Intelligence, is a concept dating back to the 1940s and 1950s. After decades of a chaotic scientific trajectory alternating between extremely enthusiastic expectations and dramatic disappointments, it has finally demonstrated a more than convincing potential, and even brilliant demonstrations, over the last ten years. This was achieved through the many successes of Deep Learning approaches, made technically possible once modern digital computers crossed a certain computing-power threshold. Nearly all confirmed achievements of artificial intelligence to date have, however, been obtained with conventional computing machines, whose principles were established by Turing in the 1930s and technically implemented by von Neumann in the 1940s. AI is thus currently being explored essentially through these conventional Turing–von Neumann machines, with the still unsolved drawback of very poor computing efficiency. This inefficiency is mainly explained by the fact that digital computers are used to simulate neural networks: there is currently a severe lack of dedicated hardware that could make AI not only very powerful but also energy efficient, so that autonomous devices (requiring no network access) could broadly deploy AI for citizens.

We will illustrate one attempt at such dedicated hardware: a demonstration of neural network processing based on the concept of Reservoir Computing, implemented with a photonic nonlinear delay dynamical system.
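The Reservoir Computing idea mentioned above can be sketched in software as a small echo state network: a fixed, randomly connected recurrent layer is driven by the input signal, and only a linear readout would be trained on the resulting states. The sketch below is a minimal illustration of that principle; all parameter values and names are illustrative assumptions, not details of the speaker's photonic implementation, which realizes the reservoir physically through a nonlinear delay dynamical system rather than a simulated weight matrix.

```python
import math
import random

random.seed(0)

N = 50       # reservoir size (illustrative)
LEAK = 0.3   # leak rate of the node update (illustrative)
SCALE = 0.9  # rough scaling of recurrent weights to keep dynamics stable

# Fixed random weights: input -> reservoir and reservoir -> reservoir.
# In Reservoir Computing these are never trained.
w_in = [random.uniform(-0.5, 0.5) for _ in range(N)]
w_res = [[random.uniform(-1.0, 1.0) * SCALE / math.sqrt(N) for _ in range(N)]
         for _ in range(N)]

def step(state, u):
    """One reservoir update: leaky integration of a tanh nonlinearity
    driven by the recurrent state and the scalar input u."""
    new_state = []
    for i in range(N):
        drive = w_in[i] * u + sum(w_res[i][j] * state[j] for j in range(N))
        new_state.append((1 - LEAK) * state[i] + LEAK * math.tanh(drive))
    return new_state

# Drive the reservoir with a simple input sequence. A full setup would
# collect these states and fit a linear readout (e.g. least squares).
state = [0.0] * N
for t in range(100):
    state = step(state, math.sin(0.2 * t))
```

The key design point, shared by the photonic approach, is that training touches only the readout layer, which keeps learning cheap while the fixed nonlinear dynamics provide the rich high-dimensional transformation of the input.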

Laurent Larger has explored many facets of nonlinear delay dynamical systems, applying some of their properties to various applications in photonics: chaotic motion was used to develop new physical-layer cryptographic schemes for optical telecommunications, periodic oscillations can achieve extremely low phase-noise levels to improve radar resolution, and the infinite-dimensional phase space of delay systems is a powerful feature for mimicking neural network processing concepts as they were observed in the brain.

Laurent Larger is currently director of the FEMTO-ST institute (CNRS), an honorary member of the Institut universitaire de France (Junior 2007), and a Fellow of the Optical Society of America.