
7.4 Boosting and AdaBoost (L07: Ensemble Methods) 

Sebastian Raschka
37K subscribers
9K views
Sebastian's books: sebastianrasch...
This video discusses the general concept behind boosting -- one of the model ensembling approaches in machine learning. Then, it goes over an early boosting algorithm and approach called adaptive boosting (AdaBoost), which boosts weak learners (i.e., decision tree stumps) to strong classifiers.
-------
This video is part of my Introduction to Machine Learning course.
Next video: • 7.5 Gradient Boosting ...
The complete playlist: • Intro to Machine Learn...
A handy overview page with links to the materials: sebastianrasch...
-------
If you want to be notified about future videos, please consider subscribing to my channel: / sebastianraschka

Published: Sep 18, 2024

Comments: 17
@kairiannah · 1 year ago
Thank you for these ML videos! I will buy your book to support your work
@SebastianRaschka · 1 year ago
Thanks a lot @كاي بيدرام
@dragosmanailoiu9544 · 3 years ago
Thank you for the clarifications! I just finished this part in your book, and thank you as well for the extra stacking lecture.
@mahmoudsalhab3007 · 3 years ago
Thanks for these super helpful and amazing tutorials! Can't wait for the rest of the course ♥!
@ayushdudedon · 5 months ago
Hi Sebastian, I liked the videos and the level of detail. One thing I noticed about the function used to calculate alpha: it will also give high importance (in magnitude) to a weak learner with a high error rate; the only difference is that the output gets a negative sign, which leads to choosing the other class (assuming binary classification with classes in {-1, 1}). In the video, at 27:08, you mentioned that a classifier with high error is not important for the prediction. Let me know if I am missing something.
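For reference, the classifier weight being discussed is alpha = 0.5 * ln((1 - err) / err). A quick sketch (my own illustrative numbers, not from the video) shows how its sign flips around err = 0.5, which is the behavior the comment describes:

```python
import math

def classifier_weight(error):
    # AdaBoost's alpha: large and positive for small error,
    # exactly 0 at error = 0.5, negative for error > 0.5
    return 0.5 * math.log((1 - error) / error)

print(classifier_weight(0.1))  # positive: the learner's vote is trusted
print(classifier_weight(0.5))  # 0.0: no influence on the ensemble
print(classifier_weight(0.9))  # negative: the vote is effectively inverted
```

So a learner with error 0.9 contributes with the same magnitude as one with error 0.1, just with its vote flipped, which is exactly the point the comment raises.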
@newbie8051 · 11 months ago
Simple and complete explanation, thanks prof !!!
@noureddineadjir · 1 month ago
Thank you for the details. You haven't used the validation set.
@sinanstat · 3 years ago
This looks like a great method. However, during the iterative weighting process there are many corrections. Wouldn't that result in an overfitting issue?
@SebastianRaschka · 3 years ago
In practice, I would say it does not necessarily overfit more than other algorithms. It's actually better than most non-ensemble classifiers, but that might be something to look into on a selection of datasets. I think the reason it doesn't suffer from overfitting that much in practice is that (a) the decision trees are still just 1 level deep, and (b) you consider the ensemble of decision trees from the different rounds instead of just the last tree.
@sinanstat · 3 years ago
@SebastianRaschka It makes sense, thank you very much for the explanation!
@negarmahdavi4330 · 5 months ago
26:17: Misclassified instances gain higher weights, so the next classifier is more likely to classify them correctly. I think what you said was the opposite of this, which doesn't seem correct.
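To make the direction of the update concrete, here is a tiny worked round with made-up numbers (labels in {-1, +1}, uniform starting weights): the misclassified sample's weight rises while the correctly classified ones shrink.

```python
import numpy as np

y    = np.array([ 1,  1, -1, -1])   # true labels
pred = np.array([ 1, -1, -1, -1])   # stump predictions: sample 1 is wrong
w    = np.full(4, 0.25)             # uniform initial weights

err   = w[pred != y].sum()               # weighted error = 0.25
alpha = 0.5 * np.log((1 - err) / err)    # classifier weight (positive here)
w = w * np.exp(-alpha * y * pred)        # mistakes are scaled up
w = w / w.sum()                          # renormalize to sum to 1
print(w)  # the misclassified sample's weight grows from 0.25 to 0.5
```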
@konstantinostzaferis5318 · 1 year ago
Sebastian, I have a question. I am following this course while reading your book (Machine Learning with PyTorch ...). In your book you code a perceptron model in Python. Do we need to know the code behind these algorithms, like the ID3 tree or the AdaBoost code? Do we need to go into the anaconda3 libraries, search for the algorithms, and actually understand the code behind them? Or is it sufficient to only know how to call them from the scikit-learn library? I am asking because I suppose that to become a machine learning engineer you have to know the code behind the algorithms and be able to code them yourself, or am I completely wrong?
@RahulPrajapati-jg4dg · 3 years ago
Sir, could you please mention here the link to the PDFs you reference in the lectures?
@SebastianRaschka · 3 years ago
I need to add all the links to the PDFs to the video descriptions some time. For now, all the lecture slides can be found here: sebastianraschka.com/pdf/lecture-notes/stat451fs20/
@liammartin6793 · 3 years ago
What is meant by a model being expensive?
@SebastianRaschka · 3 years ago
Good question. Here, I meant that it is computationally expensive, i.e., it takes a long time to run and/or requires more computational resources than other simpler models.
@shanurmilon5433 · 1 year ago
Thanks Man. It’s really a good tutorial ❤.