
Detecting Anomalies Using Statistical Distances | SciPy 2018 | Charles Masson 

Enthought
67K subscribers
35K views

Statistical distances are distances between distributions or data samples and are used in a variety of machine learning applications. In this talk, we will show how we use SciPy's statistical distance functions, some of which we recently contributed, to design powerful and production-ready anomaly detection algorithms. With visual illustrations, we will describe the inner workings and the properties of a few common statistical distances and explain what makes them convenient to use, yet powerful enough to solve various problems. We will also show real-life applications and concrete examples of the anomalous patterns that such algorithms are able to detect in performance-monitoring and business-metric time series.
See the full SciPy 2018 playlist here: • SciPy 2018: Scientific...
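The kind of detector the abstract describes can be sketched with SciPy's statistical distance functions. This is my own minimal illustration, not code from the talk; the window sizes, the choice of `wasserstein_distance`, and the threshold value are all assumptions made for the example.

```python
# Minimal sketch: flag a window of a time series as anomalous when its
# empirical distribution drifts far from a reference window.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
reference = rng.normal(loc=0.0, scale=1.0, size=1000)    # baseline behavior
normal_win = rng.normal(loc=0.0, scale=1.0, size=200)    # same distribution
shifted_win = rng.normal(loc=3.0, scale=1.0, size=200)   # anomalous shift

def is_anomalous(window, reference, threshold=0.5):
    """Flag the window if its 1-Wasserstein distance to the reference
    sample exceeds a (hand-picked, illustrative) threshold."""
    return stats.wasserstein_distance(window, reference) > threshold

print(is_anomalous(normal_win, reference))   # False: distributions match
print(is_anomalous(shifted_win, reference))  # True: mean has shifted
```

The Wasserstein distance is convenient here because it compares raw samples directly (no binning) and grows smoothly with the size of the shift, unlike a bare hypothesis-test p-value.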

Science

Published: 14 Jul 2024

Comments: 17
@arshamafsardeir2692 • 2 years ago
The best explanation of statistical distances that I have found. Easy and nice explanation of the Kolmogorov-Smirnov distance, the Wasserstein distance, and KL divergence.
@mathman2170 • 2 years ago
Love it when a talk presents the material in a carefully developed, logical manner. Merci!
@MauricioSalazare • 6 years ago
Well done! Nice explanation!
@kimmupfumira3417 • 2 years ago
Great explanation! Easy to digest.
@nomcognom2332 • 6 years ago
Good!
@112ffhgffg12 • 2 years ago
Thanks
@minesinitiativesrussie1778 • 5 years ago
You're the best, little bro! I didn't understand a thing, but it's still classy!
@jamesmckeown4743 • 4 years ago
17:13 there should be a negative in the definition of KL
@Mayur7Garg • 3 years ago
I think whether there is a negative sign should depend on whether you are minimizing or maximizing it. By definition, distances are always positive.
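On the sign question discussed above: SciPy exposes KL divergence as `scipy.stats.entropy(p, q)`, which computes the sum of `p * log(p / q)`. Individual terms where `p < q` are negative, but the total is always non-negative (Gibbs' inequality), so no leading minus sign is needed. A small sketch with made-up distributions `p` and `q`:

```python
import numpy as np
from scipy.stats import entropy

# Two illustrative discrete distributions (values chosen arbitrarily).
p = np.array([0.1, 0.4, 0.5])
q = np.array([0.3, 0.4, 0.3])

# KL(p || q) = sum(p * log(p / q)); natural log by default in SciPy.
kl = entropy(p, q)
print(kl)  # ≈ 0.1456, and always >= 0
```

Note that KL divergence is not symmetric (`entropy(p, q) != entropy(q, p)` in general), which is one reason the talk treats it separately from true metrics like the Wasserstein distance.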
@nmertsch8725 • 5 years ago
This is a great presentation! Is there a reason why you did not commit the nth Wasserstein distance to SciPy?
@canmetan670 • 4 years ago
docs.scipy.org/doc/scipy/reference/generated/scipy.stats.wasserstein_distance.html As of this date, the latest stable version of SciPy on pip is 1.3.1. This has allegedly been available since 1.0.0.
@TheBjjninja • 4 years ago
6:15 We should either reject or fail to reject H0, I believe, instead of "accept H0".
@harry8175ritchie • 4 years ago
AKA accept. I think it depends on where you learn statistics. My professors always said accept and reject.
@mikhaeldito • 4 years ago
Semantically, "accepting H0" and "failing to reject H0" may sound the same, but they are not! The p-value is a measure of the probability of our data assuming that the null hypothesis (such as no difference between two groups) is true. So it is a measure of evidence against the null, not in favour of the null. This is why we have statistical tests of no difference, or similarity, called equivalence tests.
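The decision wording debated in this thread can be made concrete with `scipy.stats.ks_2samp`, the two-sample Kolmogorov-Smirnov test shown in the talk. This is my own sketch; the sample sizes, distributions, and significance level are arbitrary choices for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
a = rng.normal(0, 1, 500)
b = rng.normal(0, 1, 500)    # drawn from the same distribution as a
c = rng.uniform(-1, 1, 500)  # drawn from a clearly different distribution

stat_same, p_same = stats.ks_2samp(a, b)
stat_diff, p_diff = stats.ks_2samp(a, c)

alpha = 0.05
if p_diff < alpha:
    print("reject H0: a and c likely come from different distributions")
if p_same >= alpha:
    print("fail to reject H0: no evidence that a and b differ")
```

A large p-value only means the test found no evidence of a difference; as the comment above notes, demonstrating similarity would require an equivalence test rather than a failed rejection.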
@joelwillis2043 • 3 years ago
@@harry8175ritchie AKA NO. You can't conclude your assumption based on your assumption. This is like logic 101. HARD FAIL GO DRIVE A TRUCK FOR A LIVING.
@harry8175ritchie • 3 years ago
Not the way to handle it buddy.
@jesuse4691 • 4 years ago
G