
The Probability Monad 

Compose Conference

Tikhon Jelvis
C◦mp◦se :: Conference
www.composeconference.org/2017/
May 18, 2017
Probability distributions form a monad, giving us a lightweight, surprisingly simple probabilistic language embedded in Haskell. We can write stochastic models as normal Haskell programs and then interpret them either exhaustively or by random sampling.
I’ll give an in-depth explanation of how a simple discrete probability distribution monad works, along with real-world examples from my work on supply chain optimization at Target. This simple probability monad has been a great fit for the stochastic optimization problems we’re facing at Target, where different solution methods require random sampling (simulation-based optimization) or the entire distribution (policy iteration, linear programming). As a bonus, this’ll give you a brief primer on supply chain optimization.
I’ll also talk about the very real performance shortcomings of this approach, which we’ve mostly managed to dodge at Target.
Finally, I’ll introduce some recent research that defines a free-monad based distribution type that can take advantage of cutting-edge research on probabilistic programming. This approach lets us work with continuous distributions and Bayesian conditioning, and lets us deploy modern sampling and probabilistic inference algorithms with performance comparable to dedicated probabilistic programming languages like Anglican.
This talk primarily draws on two papers:
- ‘Probabilistic Functional Programming in Haskell’ by Martin Erwig and Steve Kollmansberger, which describes the discrete probability monad
- ‘Practical Probabilistic Programming with Monads’ by Adam Scibior, Zoubin Ghahramani, and Andrew D. Gordon, which describes how to extend the basic monadic approach with modern probabilistic programming techniques
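
For a rough idea of what the discrete distribution monad from the first paper looks like, here is a minimal sketch assuming the weighted-list representation it describes; the names Dist, uniform, die, and probability are illustrative rather than the exact API from the talk.

-- A distribution is a list of outcomes paired with their probabilities.
newtype Dist a = Dist { runDist :: [(a, Rational)] }

instance Functor Dist where
  fmap f (Dist xs) = Dist [(f x, p) | (x, p) <- xs]

instance Applicative Dist where
  pure x              = Dist [(x, 1)]
  Dist fs <*> Dist xs = Dist [(f x, p * q) | (f, p) <- fs, (x, q) <- xs]

instance Monad Dist where
  Dist xs >>= f = Dist [(y, p * q) | (x, p) <- xs, (y, q) <- runDist (f x)]

-- A uniform distribution over a finite list of outcomes.
uniform :: [a] -> Dist a
uniform xs = Dist [(x, 1 / fromIntegral (length xs)) | x <- xs]

die :: Dist Int
die = uniform [1..6]

-- Exhaustive interpretation: the probability that a predicate holds.
probability :: (a -> Bool) -> Dist a -> Rational
probability holds (Dist xs) = sum [p | (x, p) <- xs, holds x]

Interpreting the same model by random sampling instead of exhaustively amounts to swapping this list-based representation for one that draws outcomes from a random number generator, which is the flexibility the abstract alludes to.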

Science

Published: 18 Sep 2017

Comments: 7
@ashwinrao3721 · 6 years ago
Beautifully explained, Tikhon!
@rnavarro50 · 4 years ago
What an amazing talk! Thanks a lot Tikhon!
@mateja176 · 5 years ago
He spits out words faster than an auctioneer
@S3thc0n · 6 years ago
I was having quite some trouble trying to calculate the conjunction of two probabilities that depend on the same underlying random outcomes. Vastly simplified: when two dice are thrown, what is the chance that the higher of the two is above 3 and the lower of the two is below 4? Do you deal with anything like that? I imagine one would lift the sort, but it remains somewhat unclear how this actually plays out.
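
With a weighted-list distribution monad like the illustrative Dist sketched above, that joint event can be written directly: both dice are bound once and both conditions are checked against the same pair of outcomes, so no separate rule for combining the two probabilities is needed.

twoDice :: Dist (Int, Int)
twoDice = do
  a <- die
  b <- die
  pure (a, b)

-- P(the higher die is above 3 AND the lower die is below 4)
answer :: Rational
answer = probability (\(a, b) -> max a b > 3 && min a b < 4) twoDice
-- works out to 18/36 = 1/2 for two fair six-sided dice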
@S3thc0n · 6 years ago
Could the choice to normalize (in the exhaustive version, with only types that are equatable) be made automatically by a runtime if the compiler can be taught that addition and multiplication are commutative?
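
For context, normalization in the exhaustive setting usually means collapsing equal outcomes by summing their weights, which is why it needs an Eq (in practice Ord) constraint that the Monad instance itself cannot demand; a sketch in terms of the illustrative Dist above:

import qualified Data.Map.Strict as Map

-- Merge duplicate outcomes, summing their probabilities.
normalize :: Ord a => Dist a -> Dist a
normalize (Dist xs) = Dist (Map.toList (Map.fromListWith (+) xs))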
@solid8403 · 6 years ago
I think this talk is a bit much for me. Lost me at 'Distribution of distributions'. I just want to learn how to write a Haskell program with aeson, mongodb, scotty and some State monad transformer stuff with logging and error handling
@ancbi · 4 years ago
Let me help explain. A distribution over a discrete set of outcomes can be drawn as a tree of height 1, with the leaves labelled with each outcome and its weight. A distribution of distributions can then be drawn as a tree of height 2.
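
In code, that flattening of a height-2 tree into a height-1 tree is the monad’s join: each outer branch’s weight is multiplied into the weights of its leaves. A sketch in terms of the illustrative Dist above:

flatten :: Dist (Dist a) -> Dist a
flatten (Dist branches) =
  Dist [(x, p * q) | (Dist leaves, p) <- branches, (x, q) <- leaves]
-- equivalent to join, i.e. (>>= id), for this monad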
Up next
A Categorical View of Computational Effects
1:12:41
22K views
Brian Beckman: Don't fear the Monad
1:07:10
397K views
George Wilson - The Extended Functor Family
21:57
15K views
"Propositions as Types" by Philip Wadler
42:43
125K views
Monad Transformer State - Michael Snoyman
33:46
13K views
What is a Monad? - Computerphile
21:50
595K views