
Week 4: Counting Uncertain Ways with Lagrange 

Bill
In this episode we go over the basics of a measure of information uncertainty called entropy, derive a useful measure from log likelihoods, and compare forecast models for two local rivals. Algebraic geometry raises its head here. No calculus, just arithmetic and a little algebra. Even the logarithm is replaced by a simple polynomial: H(p) = (5/2)(p - p^2), our measure of information entropy.

Published: 23 Oct 2024
