
193 - What is XGBoost and is it really better than Random Forest and Deep Learning? 

DigitalSreeni
108K subscribers
40K views

Published: 2 Oct 2024

Comments: 61
@riti_joshi • 5 months ago
I never comment on any RU-vid videos, but I am compelled to do here, because I learned most of my analyses for my dissertation following your tutorials. You're such a great tutor. Thank you so much.
@DigitalSreeni • 5 months ago
Wow, thank you!
@axe863 • 10 months ago
XGBoost with regularized rotations and synthetic feature construction can approximate the depth of a deep NN.
@mhh5002 • 2 years ago
Very well explained, sir. It was intuitive for beginners. The analogies are interesting as well.
@DigitalSreeni • 2 years ago
Glad to hear that
@pvrnaaz • 2 years ago
Very organized and clear with excellent examples that make it so easy to understand. Thank you!
@grantsmith3653 • 2 years ago
Sreeni said we need to normalize, but I always thought we didn't need to do that with trees... Am I confused about something? Thanks for the video!!
@vzinko • 1 year ago
Another case of data leakage. You can't scale X and then split it into test and train. The scaling needs to happen after the split.
@Beowulf245 • 1 year ago
Thank you. At least someone understands.
@andyn6053 • 1 year ago
So this video is incorrect?
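For what it's worth, a minimal leakage-free sketch (synthetic data, not the video's dataset): split first, then fit the scaler on the training fold only. Note that tree ensembles such as XGBoost and Random Forest are insensitive to monotonic feature scaling, so scaling is optional for them in the first place.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for the tabular features used in the video.
rng = np.random.default_rng(42)
X = rng.normal(loc=5.0, scale=2.0, size=(200, 4))
y = (X[:, 0] > 5.0).astype(int)

# Split FIRST, then fit the scaler on the training fold only,
# so test-set statistics never leak into preprocessing.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)
scaler = StandardScaler().fit(X_train)   # statistics come from train only
X_train_s = scaler.transform(X_train)
X_test_s = scaler.transform(X_test)      # reuse the train statistics
```

Wrapping the scaler and model in an sklearn `Pipeline` achieves the same ordering automatically inside cross-validation.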
@barrelroller8650 • 1 year ago
It's not clear where you got the dataset in CSV format; the .zip archive at the provided link includes only the `wdbc.data` and `wdbc.names` files.
@kakaliroy4747 • 3 years ago
The example of bagging is so funny and I can fully relate
@Ahmetkumas • 3 years ago
Thanks for the video and effort. Can you make a time series video using xgboost, or something with multiple features (lags, rolling mean, etc.)?
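Until such a video exists, here is a hedged sketch of the usual supervised framing for time series with XGBoost: build lag and rolling-mean features with pandas, then fit any tabular regressor on the result (the series below is synthetic).

```python
import numpy as np
import pandas as pd

# Hypothetical univariate series; substitute your own data.
df = pd.DataFrame({"y": np.arange(20, dtype=float)})

# Lag features: the target observed 1, 2, and 3 steps earlier.
for lag in (1, 2, 3):
    df[f"lag_{lag}"] = df["y"].shift(lag)

# Rolling mean of the three PREVIOUS values (shift first so the
# current target never leaks into its own feature).
df["roll_mean_3"] = df["y"].shift(1).rolling(window=3).mean()

df = df.dropna()  # the first rows lack a full history
```

Any tabular regressor (e.g. XGBRegressor) can then be fit on the lag columns against `y` to predict one step ahead.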
@rezaniazi4352 • 3 years ago
Thanks for the video. What do we have to change if we want to use XGBRegressor() instead of the classifier? The xgboost documentation is so confusing!
@user................. • 3 months ago
bro trying to share about life and forgot what he's teaching 🤣🤣🤣🤣 only place I got a complete idea about XGBoost, thank you
@mouraleog • 1 year ago
Awesome video, thank you ! Greetings from Brazil!
@DigitalSreeni • 1 year ago
Thanks for watching!
@Fran171UA • 24 days ago
An excellent tutorial! Thank you very much! 🙏
@DrNuyenVanFaulk • 2 months ago
I really appreciate this explanation! Thank you!
@andyn6053 • 1 year ago
This was very clear and useful! Do you have any link to your code? Also, could XGBoost be used for linear regression as well?
@sbaet • 3 years ago
Can you make a quick video on normalization and standardization for an image dataset?
@VarunKumar-pz5si • 3 years ago
Awesome tutorial, glad I got a great teacher. Thank you!
@ashift32 • 3 years ago
Very well explained, clear and concise. Thanks for taking the time.
@DigitalSreeni • 3 years ago
You're very welcome!
@caiyu538 • 2 years ago
Perfect tutorial. I am using XGBoost and random forest in some of my work. I always appreciate your continuous efforts to share your knowledge through YouTube.
@sathishchetla3986 • 1 year ago
Thank you so much for your explanation sir
@alejandrovillalobos1678 • 3 years ago
Can you talk about transformers, please?
@sudippandit6676 • 3 years ago
Very organized and straightforward! Waiting for your other videos. Thank you for sharing this knowledge.
@SP-cg9fu • 1 year ago
Very useful video! Thank you!
@semon00 • 6 months ago
Wow, your explanation is awesome!!! Don't stop plz
@Frittenfinger • 11 months ago
Nice T-Shirt 😃
@ahmedraafat8769 • 2 years ago
The dataset has been removed from the website. Is it possible to upload it?
@DigitalSreeni • 2 years ago
Just google the keywords and you'll find it somewhere, maybe on Kaggle. I do not own the data, so I cannot legally share it.
@v1hana350 • 2 years ago
I have a question about the XGBoost algorithm: how does parallelization work in XGBoost? Could you explain with an example?
@andromeda1534 • 3 years ago
Looks like when you demoed random forest, you didn't comment out the xgb line, so you actually showed the fitting for xgb twice with the same results.
@farhaddavaripour4619 • 2 years ago
Thanks for the video. One thing you might have missed: the figure shows the most evolved species with lighter hair than the less evolved ones, which could give the false impression that lighter-haired species are more evolved. It would be great if you could adjust the figure.
@multiversityx • 1 year ago
What's the method name? When are you presenting at NeurIPS? (I'll be attending it :)
@tannyakumar284 • 3 years ago
Hi. I have a 1500x11 dataset and I am trying to see which of cognitive ability, non-cognitive ability, family structure, parental involvement, and school characteristics predict academic performance (measured as grades ranging from 1-5). Should I be using XGBoost for this problem or random forest? Thanks!
@longtruong9935 • 3 years ago
The dataset at the UCI link is not available now. Could anyone provide an updated link?
@kangajohn • 3 years ago
If your explanations were a Kaggle competition, they would be top 1%.
@drforest • 2 years ago
Awesome comparison. Super thanks
@Lodeken • 1 year ago
Wow, that analogy! 😂 Amazingly apt lol!
@khairulfahim • 2 years ago
Where can I get the exact .csv file?
@omeremhan • 2 years ago
Magnificent!!! Thanks for the clear explanation, sir.
@RealThrillMedia • 1 year ago
Very helpful, thank you!
@Bwaaz • 1 year ago
Very clear, thanks :)
@evazhong4419 • 2 years ago
Your explanation is so interesting haha, it helps me a lot in understanding the material.
@venkatesanr9455 • 3 years ago
Thanks, Sreeni sir, for your valuable and knowledgeable content. Also waiting for the next semantic segmentation series, plus discussions of hyperparameters and their tuning, and time series analysis; those would be highly helpful.
@darioooc • 1 year ago
Great!
@ghafiqe • 2 years ago
Perfect
@evyatarcoco • 3 years ago
A very useful episode, thanks sir.
@abderrahmaneherbadji5478 • 3 years ago
Thank you very much.
@vikramsandu6054 • 3 years ago
Well explained. Thank you so much for the video.
@DigitalSreeni • 3 years ago
Glad you liked it
@3DComputing • 3 years ago
You're worth more money
@DigitalSreeni • 3 years ago
I am priceless :)
@ramakrishnabhupathi4995 • 3 years ago
Good one
@DigitalSreeni • 3 years ago
Thank you! Cheers!
@kangxinwang3886 • 3 years ago
Loved the arranged marriage example! Made it very intuitive and easy to understand. Thank you!
@agsantiago22 • 2 years ago
Merci! (Thank you!)
@DigitalSreeni • 2 years ago
Thank you very much.