Protecting Your Machine Learning Against Drift: An Introduction
[EuroPython 2021 - Talk - 2021-07-29 - Parrot [Data Science]]
[Online]
By Oliver Cobb
Deployed machine learning models can fail spectacularly in response to seemingly benign changes to the underlying process being modelled. Concerningly, when labels are not available, as is often the case in deployment settings, this failure can occur silently and go unnoticed.
This talk is a practical introduction to drift detection, the discipline focused on detecting such changes. We will start by building an understanding of how drift can occur, why it pays to detect it and how it can be detected in a principled manner. We will then discuss the practicalities and challenges of detecting it as quickly as possible in machine learning deployment settings where high-dimensional, unlabelled data arrives continuously. We will finish by demonstrating how the theory can be put into practice using the alibi-detect Python library.
There are no hard prerequisites for understanding this talk, although background knowledge of machine learning and statistical hypothesis testing may be useful.
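To illustrate the core idea (a sketch of the principle, not the alibi-detect API itself): drift detection on unlabelled data can be framed as a two-sample hypothesis test between a reference window drawn at training time and a window of recent deployment data. The minimal example below, assuming only NumPy is available, computes a two-sample Kolmogorov-Smirnov statistic and estimates a p-value by permutation.

```python
import numpy as np

def ks_statistic(a, b):
    # Maximum absolute distance between the two empirical CDFs,
    # evaluated at every point in the pooled sample.
    data = np.concatenate([a, b])
    cdf_a = np.searchsorted(np.sort(a), data, side="right") / len(a)
    cdf_b = np.searchsorted(np.sort(b), data, side="right") / len(b)
    return np.max(np.abs(cdf_a - cdf_b))

def drift_detected(ref, new, p_val=0.05, n_perm=1000, seed=0):
    # Permutation test: repeatedly reshuffle the pooled sample to
    # estimate the null distribution of the KS statistic, then
    # flag drift if the observed statistic is unusually large.
    rng = np.random.default_rng(seed)
    observed = ks_statistic(ref, new)
    pooled = np.concatenate([ref, new])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        if ks_statistic(pooled[:len(ref)], pooled[len(ref):]) >= observed:
            count += 1
    return (count + 1) / (n_perm + 1) < p_val

rng = np.random.default_rng(42)
ref = rng.normal(0.0, 1.0, size=500)      # training-time distribution
shifted = rng.normal(1.0, 1.0, size=500)  # deployment data whose mean has drifted

print(drift_detected(ref, shifted))
```

In practice, alibi-detect packages this kind of test (and kernel-based alternatives suited to high-dimensional data) behind detector classes, so the deployment code only supplies the reference data and a stream of incoming batches.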
License: This video is licensed under the CC BY-NC-SA 4.0 license: creativecommons.org/licenses/...
Please see our speaker release agreement for details: ep2021.europython.eu/events/s...
6 Aug 2024