Eveline and Tim’s paper, “Hardware implementation of backpropagation using progressive gradient descent for in situ training of multilayer neural networks,” is now out in Science Advances.

The highly predictable tuning characteristics of organic EC-RAM now allow us to train multilayer artificial neural networks directly in hardware.

First authors Eveline van Doremaele and Tim Stevens found a way to avoid the energy-inefficient digital storage of the partial derivatives of the weights by sequentially updating each layer of the network directly in hardware.
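To give a flavor of what “sequentially updating each layer” means, here is a minimal software sketch of layer-by-layer weight updates on a tiny two-layer network. This is purely illustrative: the network shape, task, and learning rate are assumptions, and it is a conventional software analog, not the authors’ actual circuit-level EC-RAM scheme.

```python
# Illustrative sketch: instead of first computing and storing the gradient of
# every layer and only then applying them (as a digital optimizer would),
# each layer's weights are updated in place as soon as its error signal is
# known. Network shape (2-2-1), task (logical AND), and hyperparameters are
# hypothetical choices for this sketch.
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Tiny 2-2-1 network with plain Python lists as "weight memory".
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
b1 = [0.0, 0.0]
w2 = [random.uniform(-1, 1) for _ in range(2)]
b2 = 0.0
lr = 0.5

data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]  # AND

for epoch in range(5000):
    for x, t in data:
        # forward pass
        h = [sigmoid(w1[j][0] * x[0] + w1[j][1] * x[1] + b1[j]) for j in range(2)]
        y = sigmoid(w2[0] * h[0] + w2[1] * h[1] + b2)

        # output-layer error signal
        delta_out = (y - t) * y * (1 - y)

        # hidden error signals are taken from the *current* output weights
        # before those weights are overwritten
        delta_h = [delta_out * w2[j] * h[j] * (1 - h[j]) for j in range(2)]

        # update the output layer immediately, in place ...
        for j in range(2):
            w2[j] -= lr * delta_out * h[j]
        b2 -= lr * delta_out

        # ... then the hidden layer, so no full gradient vector is ever
        # stored for a deferred update step
        for j in range(2):
            w1[j][0] -= lr * delta_h[j] * x[0]
            w1[j][1] -= lr * delta_h[j] * x[1]
            b1[j] -= lr * delta_h[j]

def predict(x):
    h = [sigmoid(w1[j][0] * x[0] + w1[j][1] * x[1] + b1[j]) for j in range(2)]
    return sigmoid(w2[0] * h[0] + w2[1] * h[1] + b2)
```

In the hardware setting, the in-place updates above correspond to directly nudging the conductance of the EC-RAM devices, which is what removes the need to hold the gradients in digital memory.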

This novel progressive gradient descent algorithm paves the way for highly efficient training of advanced artificial intelligence computing systems. Many thanks to co-lead Marco Fattori, the other Eindhoven University of Technology authors and collaborators, and to the EHCI – Eindhoven Hendrik Casimir Institute for funding.

Find the open access paper here: https://www.science.org/doi/10.1126/sciadv.ado8999 

And find the news article on the TU/e website here: https://www.tue.nl/en/news-and-events/news-overview/12-07-2024-neural-network-training-made-easy-with-smart-hardware 

An organic neuromorphic chip based on EC-RAM devices used for in situ hardware training of neural networks, built by Tim Stevens and Eveline van Doremaele in the groups of Yoeri van de Burgt (Mechanical Engineering) and Marco Fattori (Electrical Engineering), TU Eindhoven. Photo credit: Bart van Overbeeke.
Categories: Publications
