In this episode we cover the basics of measuring information uncertainty, known as entropy, derive a useful measure from log likelihoods, and compare some forecast models for two local rivals. Algebraic geometry raises its head here. No calculus, just arithmetic and a little algebra. Even log() is replaced by the simple polynomial approximation H(p) = (5/2)(p - p^2), our measure of information entropy.
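As a quick sketch of the idea above, the snippet below compares the exact binary entropy (in nats) against the episode's polynomial stand-in (5/2)(p - p^2). The function names and the grid of probabilities are my own illustration, not from the episode:

```python
import math

def entropy_exact(p):
    """Exact binary entropy in nats: H(p) = -p ln p - (1-p) ln(1-p)."""
    if p in (0.0, 1.0):
        return 0.0  # limit value: 0 * log(0) -> 0
    return -p * math.log(p) - (1 - p) * math.log(1 - p)

def entropy_poly(p):
    """The polynomial stand-in from the episode: (5/2)(p - p^2)."""
    return 2.5 * (p - p * p)

# Compare the two across a few probabilities (hypothetical grid)
for p in [0.0, 0.1, 0.25, 0.5, 0.75, 0.9, 1.0]:
    print(f"p={p:.2f}  exact={entropy_exact(p):.4f}  poly={entropy_poly(p):.4f}")
```

Both curves are zero at p = 0 and p = 1 and peak at p = 1/2, which is why a single downward parabola can stand in for the log-based formula with no calculus required.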
23 Oct 2024