Trevor Hastie - Gradient Boosting Machine Learning 

H2O.ai · 22K subscribers
151K views

Professor Hastie takes us through Ensemble Learners like decision trees and random forests for classification problems. Don’t just consume, contribute your code and join the movement: github.com/h2oai
User conference slides on open source machine learning software from H2O.ai at: www.slideshare....

Published: 17 Sep 2024

Comments: 22
@rsachs · 8 years ago
Why did the video editor put the inset video right in the middle, over the last line of the slides? They should have placed it where it would not block the slides, like in the corner.
@jasmeetsasan902 · 9 years ago
Big fan of Prof. Hastie! Genius statistician indeed!
@newbie8051 · 10 months ago
Great lecture, got to revise the basics. Thanks, prof 🙌
@alkodjdjd · 5 years ago
This guy is awesome: he knows his stuff quite well and can explain it well too!
@masteronepiece6559 · 5 years ago
He is the only one who knows his stuff.
@hpent9940 · 8 years ago
It's funny how boosting applies to real life, like cleaning my basement of a 20-year accumulation of stuff, with emotional parameters attached: basically a divide-and-conquer method called boosting, while one processes painful emotion to stuff predictors out of each diminished remainder pile of stuff. At the end, you'll say, "Phew, that was not as herculean as cleaning the Augean stables."
@KanishkaSharma05 · 7 years ago
I know that Professor Trevor Hastie is good, but if you are here to learn about Gradient Boosting, this is not the video for you. The lecture is only 30 minutes, and the remaining 15 minutes are Q&A. It gives only a very brief overview of Gradient Boosting and also covers other algorithms, which doesn't match the title.
@Sam-AZ · 6 years ago
Thanks
@UtkarshMishra1958 · 6 years ago
Any resources or YouTube videos where I can learn GBM and XGBoost?
@scomeron · 8 years ago
Great presentation.
@seguranca2009 · 8 years ago
Incredible talk.
@MagicmathmandarinOrg · 6 years ago
Excellent advice in the Q&A. Thank you.
@TheCsePower · 1 year ago
I didn't know he had a South African accent! Greetings from Africa.
@adubey40 · 7 years ago
Bagging is the poor man's Bayes... haha ;)
@apanapane · 8 years ago
Amazing lecture. Thank you!
@emrahyigit · 7 years ago
Very good talk! Congrats!
@yitzweb · 6 years ago
At 9:40, why is bias slightly increased for bagging just because the trees are shallower? If it were just a single tree, then yes, bias would be increased with a shallow tree vs. a deep tree. However, if the definition of bagging is to average the results of N shallow trees, then shouldn't the definition of bias also take into account that we are defining the model as using N trees?
@puneetrajput2994 · 2 years ago
Shallow trees tend to miss the overall pattern, hence a slight increase in bias. Ensembling did reduce bias, but the shallowness contributes to increasing it too. Tradeoff.
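[Editor's note] A minimal, stdlib-only sketch of the point this thread is debating, not code from the talk (the stump fitter and all names here are illustrative). It fits a noiseless target f(x) = x², so any remaining error of the bagged model is pure bias: averaging many bootstrap stumps smooths the fit, but the average of depth-1 trees still cannot track the curvature, so its squared bias stays visibly above zero.

```python
import random

def fit_stump(xs, ys):
    """Depth-1 regression tree: pick the split minimizing squared error.
    Assumes xs is sorted; predicts the mean of each side."""
    best = None
    for i in range(1, len(xs)):
        t = (xs[i - 1] + xs[i]) / 2
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        sse = sum((y - ml) ** 2 for y in left) + sum((y - mr) ** 2 for y in right)
        if best is None or sse < best[0]:
            best = (sse, t, ml, mr)
    _, t, ml, mr = best
    return lambda x: ml if x <= t else mr

random.seed(0)
xs = [i / 50 - 1 for i in range(101)]   # grid on [-1, 1]
ys = [x * x for x in xs]                # noiseless truth: f(x) = x^2

# Bag 200 stumps: fit each on a bootstrap resample, then average predictions.
stumps = []
for _ in range(200):
    idx = sorted(random.randrange(len(xs)) for _ in range(len(xs)))
    stumps.append(fit_stump([xs[i] for i in idx], [ys[i] for i in idx]))

bag = lambda x: sum(s(x) for s in stumps) / len(stumps)

# With no noise in ys, this mean squared error is squared bias by construction.
bias_sq = sum((bag(x) - x * x) ** 2 for x in xs) / len(xs)
print(f"squared bias of bagged stumps: {bias_sq:.4f}")  # stays well above zero
```

The same experiment with deeper base trees would drive this number toward zero, which is the tradeoff the reply describes: bagging averages away variance, but the bias floor is set by how expressive each base learner is.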
@cashphattichaddi · 8 years ago
Brilliant!! :-)
@kparag01 · 5 years ago
Legend 🙏
@rostiposkis · 1 year ago
oh yeah
@user-bq4jb5os7x · 7 years ago