
An Introduction to PAC-Bayes 

MLG Reading Group
Views: 4.1K

Speakers: Andrew Foong, David Burt, Javier Antoran
Abstract:
PAC-Bayes is a frequentist framework for obtaining generalisation error bounds. It has been used to derive learning algorithms, provide explanations for generalisation in deep learning, and form connections between Bayesian and frequentist inference. This reading group will cover a broad introduction to PAC bounds, the proof ideas in PAC-Bayes, and a discussion of some recent applications.
Suggested reading:
Computing Nonvacuous Generalization Bounds for Deep (Stochastic) Neural Networks with Many More Parameters than Training Data: arxiv.org/abs/1703.11008
PAC-Bayesian Theory Meets Bayesian Inference: arxiv.org/abs/1605.08636
Learning under Model Misspecification: Applications to Variational and Ensemble Methods: arxiv.org/abs/1912.08335
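The abstract refers to generalisation error bounds. As a rough illustration of what such a bound looks like, here is a minimal sketch of a McAllester-style PAC-Bayes bound: with probability at least 1 − δ over the sample, the true risk of a posterior Q is bounded by its empirical risk plus a complexity term involving KL(Q‖P). The exact constants vary between papers; this follows one common form and the function name is ours.

```python
import math

def mcallester_bound(emp_risk, kl, n, delta):
    """McAllester-style PAC-Bayes bound (one common form):
    with probability >= 1 - delta over an i.i.d. sample of size n,
        L(Q) <= L_hat(Q) + sqrt((KL(Q||P) + ln(2*sqrt(n)/delta)) / (2n)).
    emp_risk: empirical risk L_hat(Q); kl: KL(Q||P); n: sample size."""
    complexity = (kl + math.log(2 * math.sqrt(n) / delta)) / (2 * n)
    return emp_risk + math.sqrt(complexity)

# Example: a posterior with empirical risk 0.1 and KL(Q||P) = 10
# on 10,000 samples, at confidence 95%.
bound = mcallester_bound(emp_risk=0.1, kl=10.0, n=10000, delta=0.05)
```

Note how the bound tightens as n grows and loosens as the posterior moves further from the prior (larger KL), which is the trade-off the suggested readings exploit.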

Published: 5 May 2021

Comments: 7
@wowtbcmagepvp 2 years ago
The clarification about what’s random at
@Noah-jz3gt
The explanation is a bit unclear, but the slides contain enough information for me to understand the whole derivation of the PAC-Bayes bound. Thanks!
@vtrandal 1 year ago
What is PAC? PAC is an abbreviation for ... what?