Generalization and Overfitting 

Computational Thinking
By fitting sufficiently complex functions, we can match the training data perfectly, with zero loss. In this video, we learn to split the data into a training set and an evaluation set. If the training loss is much lower than the evaluation loss, our function is too complicated and we are overfitting to the training data. Cross-validation lets us check that the model works just as well on unseen data.
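The video's own code is not shown here; the idea above can be sketched as follows, with hypothetical toy data and NumPy polynomial fits standing in for the "complex functions". An over-complex fit nearly memorizes the training points, so its training loss drops far below its evaluation loss, and k-fold cross-validation averages the evaluation loss over several held-out folds:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: noisy samples of an underlying quadratic.
x = rng.uniform(-1, 1, 40)
y = x**2 + rng.normal(0, 0.1, 40)

# Separate the data into training and evaluation sets.
x_train, y_train = x[:30], y[:30]
x_eval, y_eval = x[30:], y[30:]

def mse(coeffs, xs, ys):
    """Mean squared error of a fitted polynomial on (xs, ys)."""
    return float(np.mean((np.polyval(coeffs, xs) - ys) ** 2))

# A degree-15 polynomial has far more freedom than the data warrants,
# while the simpler degree-2 fit matches the true curve; compare
# training vs. evaluation loss for both.
for degree in (2, 15):
    coeffs = np.polyfit(x_train, y_train, degree)
    print(f"degree {degree:2d}: "
          f"train={mse(coeffs, x_train, y_train):.4f} "
          f"eval={mse(coeffs, x_eval, y_eval):.4f}")

# k-fold cross-validation: average the evaluation loss over k held-out
# folds, so every point is used for evaluation exactly once.
def cv_mse(xs, ys, degree, k=5):
    losses = []
    for fold in np.array_split(np.arange(len(xs)), k):
        train_mask = np.ones(len(xs), dtype=bool)
        train_mask[fold] = False  # hold this fold out for evaluation
        coeffs = np.polyfit(xs[train_mask], ys[train_mask], degree)
        losses.append(mse(coeffs, xs[fold], ys[fold]))
    return float(np.mean(losses))
```

A large gap between training and evaluation loss (or a cross-validated loss well above the training loss) is the signal that the function class is too complicated.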

Published: 17 Sep 2024

Comments: 3

@insidecode (a year ago): Nice and clean!

@pratyushbhattarai5632 (a year ago): thanks

@kukuruzayevhenii8764 (a year ago): nice :)