
DDPS | Learning paradigms for neural networks: The locally backpropagated forward-forward algorithm 

Inside Livermore Lab

• DDPS Talk date: June 14, 2024
• Speaker: Fabio Giampaolo (University of Naples Federico II, scholar.google...)
• Description: Backpropagation is the most widely used method for training Neural Networks. It has proven its effectiveness across a wide array of contexts, facilitating the efficient optimization of deep learning models. However, it exhibits certain weaknesses in specific scenarios that must be addressed to broaden the applicability of AI strategies in real-world situations. This is especially true in the integration of Deep Learning (DL) strategies within complex frameworks that deal with physics-related problems. Challenges such as the incorporation of non-differentiable components within neural architectures, or the implementation of distributed learning on heterogeneous devices, are just a few examples of the hurdles faced by researchers in the field. Inspired by one of the recent works of Geoffrey Hinton, the Locally Backpropagated Forward-Forward training strategy is a novel approach that merges the effectiveness of backpropagation with the appealing attributes of the Forward-Forward algorithm. This combination aims to provide a viable solution in contexts where traditional methods show limitations (a minimal code sketch of the underlying layer-local idea follows the speaker bio below).
• Bio: Fabio Giampaolo is a Research Fellow in Computer Science and Artificial Intelligence and a member of the MODAL Research Group at the University of Naples Federico II, where he received his Ph.D. in Mathematics and Applications. A member of the Editorial Board of the Springer journal Neural Computing and Applications, he has co-authored articles in international journals and conference papers on both applied and methodological research in the Deep Learning field. His research interests include Machine Learning and Deep Learning, with a particular focus on the dynamics of learning. This exploration encompasses a wide range of topics, including the development of innovative neural network architectures, the investigation of the efficiency of new learning algorithms, and the application of these methodologies to solve complex real-world problems.
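The description names the ingredients but not the mechanics, so the sketch below illustrates the layer-local idea from Hinton's Forward-Forward algorithm that the talk builds on: each block is trained with its own "goodness" objective on positive and negative examples, gradients are backpropagated only within that block, and its detached output feeds the next block. This is a minimal illustrative sketch in PyTorch, not the speaker's implementation; the block structure, goodness function, and threshold value are assumptions made here for exposition.

```python
# Minimal sketch of layer-local, Forward-Forward-style training (illustrative
# assumptions, not the speaker's implementation). Each block has its own
# optimizer; gradients flow only inside the block, and its output is detached
# before being passed on, so no end-to-end gradient chain is built.
import torch
import torch.nn as nn

class LocalBlock(nn.Module):
    def __init__(self, dim_in, dim_out, threshold=2.0, lr=1e-3):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim_in, dim_out), nn.ReLU())
        self.threshold = threshold  # goodness threshold (assumed value)
        self.opt = torch.optim.Adam(self.net.parameters(), lr=lr)

    def goodness(self, h):
        # "Goodness" = mean squared activation, as in the Forward-Forward paper.
        return h.pow(2).mean(dim=1)

    def train_step(self, x_pos, x_neg):
        h_pos, h_neg = self.net(x_pos), self.net(x_neg)
        # Push goodness above the threshold for positive data, below it for negative.
        loss = torch.log1p(torch.exp(torch.cat([
            self.threshold - self.goodness(h_pos),
            self.goodness(h_neg) - self.threshold,
        ]))).mean()
        self.opt.zero_grad()
        loss.backward()   # gradients stay inside this block
        self.opt.step()
        # Detach so the next block receives plain activations, not a graph.
        return h_pos.detach(), h_neg.detach()

# Toy usage: two blocks trained greedily, one after the other, on random data.
blocks = [LocalBlock(784, 256), LocalBlock(256, 256)]
x_pos, x_neg = torch.randn(32, 784), torch.randn(32, 784)
for block in blocks:
    x_pos, x_neg = block.train_step(x_pos, x_neg)
```

In the locally backpropagated variant discussed in the talk, each such block would presumably contain several layers, so ordinary backpropagation is still used inside a block while the blocks themselves remain decoupled from one another.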
DDPS webinar: www.librom.net...
💻 LLNL News: www.llnl.gov/news
📲 Instagram: / livermore_lab
🤳 Facebook: / livermore.lab
🐤 Twitter: / livermore_lab
About LLNL: Lawrence Livermore National Laboratory has a mission of strengthening the United States’ security through development and application of world-class science and technology to: 1) enhance the nation’s defense, 2) reduce the global threat from terrorism and weapons of mass destruction, and 3) respond with vision, quality, integrity and technical excellence to scientific issues of national importance. Learn more about LLNL: www.llnl.gov/.
IM release number is: LLNL-VIDEO-865970

Published: Sep 28, 2024
