
Uncertainty Quantification and Deep Learning | Elise Jennings, Argonne National Laboratory

Argonne Meetings, Webinars, and Lectures
14K views

Published: 29 Sep 2024

Comments: 16
@EigenA · 1 year ago
Love how she handled the questions in the middle of the presentation. Great work on the research too!
@jijie133 · 1 year ago
Me too.
@nickrhee7178 · 8 months ago
I guess the size of the uncertainty will depend on the dropout rate. How can I determine the optimal dropout rate?
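On the dropout-rate question: with Monte Carlo dropout, the predictive spread does depend on the dropout rate, which is usually treated as a hyperparameter tuned on held-out data (or learned, as in concrete dropout). Below is a minimal sketch, assuming a PyTorch model with dropout layers; the architecture, rate, and data are placeholders, not from the talk.

```python
# Monte Carlo dropout sketch: keep dropout stochastic at test time and use
# the spread of repeated forward passes as an uncertainty estimate.
import torch
import torch.nn as nn

def mc_dropout_predict(model: nn.Module, x: torch.Tensor, n_samples: int = 100):
    model.train()  # leave dropout layers active during inference
    with torch.no_grad():
        preds = torch.stack([model(x) for _ in range(n_samples)])
    return preds.mean(dim=0), preds.std(dim=0)  # predictive mean and spread

# Placeholder model: the reported uncertainty will change with p,
# so p is typically chosen by validation log-likelihood or calibration.
model = nn.Sequential(nn.Linear(1, 64), nn.ReLU(), nn.Dropout(p=0.2), nn.Linear(64, 1))
mean, std = mc_dropout_predict(model, torch.randn(8, 1))
```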
@jiongwang7645 · 7 months ago
At around 10:00, last line: it should be an integration over theta, correct?
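For reference, the quantity in question is the posterior predictive distribution, which does marginalize over the weights; this is the standard form, stated here for context rather than copied from the slide:

$$ p(y^* \mid x^*, \mathcal{D}) = \int p(y^* \mid x^*, \theta)\, p(\theta \mid \mathcal{D})\, d\theta $$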
@a2002 · 2 years ago
Great presentation. Can we get a copy of the code or the GitHub link? Thank you
@corentink3887 · 3 years ago
Good presentation, do we have access to the code?
@alexandterfst6532 · 3 years ago
That was an excellent explanation.
@saderick52 · 10 months ago
I feel there is a big gap between the lecture and the audience. Variational inference is a pretty complicated process by itself. It's difficult to introduce BNNs without talking about how variational inference works.
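For context, variational inference for a Bayesian neural network fits an approximate posterior q_φ(θ) to p(θ | D) by maximizing the evidence lower bound; this is the generic objective, not a formula taken from the talk:

$$ \mathcal{L}(\phi) = \mathbb{E}_{q_\phi(\theta)}\big[\log p(\mathcal{D} \mid \theta)\big] - \mathrm{KL}\big(q_\phi(\theta) \,\|\, p(\theta)\big) $$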
@masisgroupmarinesoftintell3299 · 3 years ago
How do you interpret the uncertainties in prediction?
@michaelsprinzl9045 · 2 years ago
"How do you parameterize a distribution?" Answer: "Like you parameterize every distribution." OK, I got it.
@ivotavares6576 · 2 years ago
This was a really interesting presentation!!!
@KarriemPerry · 1 year ago
Outstanding presentation!!
@charilaosmylonas5046 · 3 years ago
13:31 - It's a really interesting mistake that she mixes up "Laplace" (the distribution she actually wanted to say) with the Poisson distribution! It has to do with PDEs: the Laplace PDE is the homogeneous version of the Poisson PDE! hehe (I could easily make the same mistake)
@siddhantrai7529 · 2 years ago
Hi Charilaos, could you please describe how L1 corresponds to Poisson, as she mentioned, and how Laplace, as you mentioned, is a correction of that? I can see why L2 and the normal distribution make sense, but for L1 I feel a bit clueless. I would really appreciate your guidance on this. Thank you
@charilaosmylonas5046 · 2 years ago
@@siddhantrai7529 Check any reference on "Bayesian interpretation of regularization" - I answered before but the comment seems to have disappeared for some reason! Also, note the dependence on exp(-|X|^2) in the Gaussian PDF and the dependence on exp(-|X|^1) in the Laplace (not Poisson!) PDF. There is a "Poisson" distribution, but it's not relevant to L1 regularization! She made an honest mistake because of the connection between Poisson and Laplace in the diffusion PDEs. (There is also a Poisson and a Laplace PDE - that's what my comment was about!)
@siddhantrai7529 · 2 years ago
@@charilaosmylonas5046 Thank you for the reply, it makes sense now. I will definitely look into the "Bayesian interpretation of regularization" as you mentioned. Thanks again. 😁😁
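To make the connection discussed in this thread concrete, here is the standard MAP argument (a general derivation, not taken from the talk). With likelihood p(D | θ) and prior p(θ), the MAP estimate is

$$ \hat{\theta}_{\mathrm{MAP}} = \arg\max_{\theta} \big[ \log p(\mathcal{D} \mid \theta) + \log p(\theta) \big]. $$

A Gaussian prior $p(\theta) \propto \exp(-\|\theta\|_2^2 / 2\sigma^2)$ contributes a term proportional to $-\|\theta\|_2^2$, i.e. an L2 (weight-decay) penalty, while a Laplace prior $p(\theta) \propto \exp(-\|\theta\|_1 / b)$ contributes a term proportional to $-\|\theta\|_1$, i.e. an L1 penalty.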