Bobak Kiani: On the hardness of learning under symmetries 

One world theoretical machine learning

Speaker: Bobak Kiani
Date: 29 May 2024
Title: On the hardness of learning under symmetries
Abstract: We study the problem of learning equivariant neural networks via gradient descent. The incorporation of known symmetries ("equivariance") into neural nets has empirically improved the performance of learning pipelines, in domains ranging from biology to computer vision. However, a rich yet separate line of learning-theoretic research has demonstrated that actually learning shallow, fully connected (i.e. non-symmetric) networks has exponential complexity in the correlational statistical query (CSQ) model, a framework encompassing gradient descent. In this work, we ask: are known problem symmetries sufficient to alleviate the fundamental hardness of learning neural nets with gradient descent? We answer this question in the negative. In particular, we give lower bounds for shallow graph neural networks, convolutional networks, invariant polynomials, and frame-averaged networks for permutation subgroups, all of which scale either superpolynomially or exponentially in the relevant input dimension. Therefore, in spite of the significant inductive bias imparted via symmetry, actually learning the complete classes of functions represented by equivariant neural networks via gradient descent remains hard.
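
To make the abstract's central notion concrete, here is a minimal NumPy sketch of a permutation-equivariant linear layer in the Deep Sets form (Zaheer et al.). This is an illustration of what "equivariance" means, not a construction from the talk; the function name and the weight matrices `lam` and `gam` are hypothetical. The assertion checks the defining property f(PX) = P f(X) for a random row permutation P.

```python
import numpy as np

# Permutation-equivariant linear layer in the Deep Sets form:
#   f(X) = X @ lam + (1/n) * 1 1^T X @ gam
# Every linear map that commutes with all row permutations of X
# can be written this way.
def permutation_equivariant_layer(X, lam, gam):
    n = X.shape[0]
    pooled = X.mean(axis=0, keepdims=True)  # (1, d_in), invariant to row order
    return X @ lam + np.ones((n, 1)) @ (pooled @ gam)

rng = np.random.default_rng(0)
n, d_in, d_out = 5, 3, 4
X = rng.normal(size=(n, d_in))
lam = rng.normal(size=(d_in, d_out))
gam = rng.normal(size=(d_in, d_out))

P = rng.permutation(n)                              # random row permutation
assert np.allclose(
    permutation_equivariant_layer(X, lam, gam)[P],  # P f(X)
    permutation_equivariant_layer(X[P], lam, gam),  # f(P X)
)
print("f(PX) == P f(X): the layer is permutation-equivariant")
```

The talk's results concern the hardness of learning whole classes of such symmetry-constrained networks (graph, convolutional, frame-averaged) via gradient descent, despite the inductive bias the constraint provides.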

Published: 27 Aug 2024
