
Eric J. Ma - An Attempt At Demystifying Bayesian Deep Learning 

PyData
161K subscribers · 69K views
Published: Sep 26, 2024

Comments: 14
@ThierryAZALBERT 10 months ago
Great video to develop a simple mind model of neural networks. Bonus : frequentist vs. Bayesian made simple! Great work Eric!
@mherkhachatryan666 2 years ago
Love the charisma, enthusiasm put in this talk well done!
@cnaccio 2 years ago
Huge win for my personal understanding on this topic. I wish every talk was given in this format. Thanks!
@harshraj22_ 2 years ago
1:00 Intro to Linear and Logistic Regression, Neural Nets
9:40 Going Bayesian
14:32 Implementation Using PyMC3
24:27 Q&A
@HeduAI 1 year ago
Excellent talk! Thank you!
@BigDudeSuperstar 2 years ago
Incredible talk, well done!
@bracodescanner 6 months ago
I understand the benefit of modelling aleatoric uncertainty, e.g. to be able to deal with heteroscedastic noise. However, why do we need to model epistemic uncertainty? The best prediction after all, lies in the middle of the final distribution. If you sample from the distribution, you will lose accuracy. So is uncertainty only useful for certain applications to determine different behaviour based on high uncertainty? For example: If uncertainty is high, drive slower?
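The decomposition behind this question can be sketched numerically. The following is a minimal NumPy illustration (not code from the talk, and the numbers are made up): given hypothetical posterior draws of a model's predictive mean and noise variance at a single input, the law of total variance splits total predictive uncertainty into an aleatoric part (average noise variance) and an epistemic part (spread of the predictive means across posterior draws).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical posterior draws at one input point (illustrative values only):
# each draw pairs a predictive mean with a noise variance.
mu_draws = rng.normal(loc=2.0, scale=0.5, size=1000)  # spread across draws = epistemic
sigma2_draws = np.full(1000, 0.25)                    # observation noise   = aleatoric

# Law of total variance: Var[y] = E[sigma^2] + Var[mu]
aleatoric = sigma2_draws.mean()
epistemic = mu_draws.var()
total = aleatoric + epistemic

print(f"aleatoric={aleatoric:.3f}, epistemic={epistemic:.3f}, total={total:.3f}")
```

The posterior mean is still the single best point prediction; the epistemic term tells you how much that prediction could move if you had more data, which is what makes it useful for out-of-distribution detection, active learning, or behaviour switching of the "if uncertain, drive slower" kind the comment describes.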
@catchenal 2 years ago
The other presentation Eric mentions is that of Nicole Carlson: Turning PyMC3 into scikit learn ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-zGRnirbHWJ8.html
@suzystar3 10 months ago
Thank you so much! This has helped me a lot with my project and really helped me understand both deep learning and Bayesian deep learning much better. I really appreciate it!
@MiKenning 2 years ago
Was he referring to TensorFlow when he denigrated an unnamed company for its non-Pythonic API? The new TensorFlow is much better!
@cherubin7th 2 years ago
Great explanation!
@sdsa007 1 year ago
great energy! and nice philosophical wrap-up!
@vtrandal 1 year ago
Point #1 is wrong. You left out activations.
@bonob0123 5 months ago
The tanh and ReLU nonlinearities are the activations. He is not wrong. You are wrong. Learn to be humble.
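The point this reply makes can be shown in a few lines. Below is a hedged NumPy sketch (names and sizes are illustrative, not from the talk): ReLU is the activation sitting between the two linear layers, and without it the composition of two linear maps would collapse back into a single linear map.

```python
import numpy as np

def relu(x):
    # ReLU activation: the nonlinearity between the linear layers
    return np.maximum(0.0, x)

def two_layer_net(x, W1, b1, W2, b2):
    # linear -> activation -> linear; dropping relu() here would
    # reduce the whole network to one linear transformation
    h = relu(x @ W1 + b1)
    return h @ W2 + b2

rng = np.random.default_rng(42)
x = rng.normal(size=(4, 3))                    # 4 samples, 3 features
W1, b1 = rng.normal(size=(3, 5)), np.zeros(5)  # hidden layer of width 5
W2, b2 = rng.normal(size=(5, 1)), np.zeros(1)  # scalar output
print(two_layer_net(x, W1, b1, W2, b2).shape)  # (4, 1)
```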