
Gradient Boost Part 2 (of 4): Regression Details 

StatQuest with Josh Starmer
1.2M subscribers
289K views

Published: 24 Aug 2024

Comments: 887
@statquest 4 years ago
NOTE: Gradient boost uses Regression Trees, which are explained in this StatQuest: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-g9c66TUylZ4.html
Corrections:
4:27 The sum on the left hand side should be in parentheses to make it clear that the entire sum is multiplied by 1/2, not just the first term.
15:47 It should be R_jm, not R_ij.
16:18 The leaf in the script is R_1,2 and it should be R_2,1.
21:08 With regression trees, the sample will only go to a single leaf, and this summation simply isolates the one output value of interest from all of the others. However, when I first made this video I was thinking that because Gradient Boost is supposed to work with any "weak learner", not just small regression trees, that this summation was a way to add flexibility to the algorithm.
24:15 The header for the residual column should be r_i,2.
Support StatQuest by buying my book The StatQuest Illustrated Guide to Machine Learning or a Study Guide or Merch!!! statquest.org/statquest-store/
@giuseppefasanella5446 4 years ago
Hi, the video is great and gives a detailed insight into the algorithm, so thanks for your work. I have a note on min 15:47. I think the way the output gamma is defined has the wrong indices in the summation. To my understanding, for gamma_jm you don't want to sum over R_ij but over all the x_i which belong to R_jm, the same terminal region. Otherwise, if you sum over x_i belonging to R_ij you are jumping from one terminal region to another, while you want R_jm to be fixed and just pick up the different x_i in there. Hope I managed to explain myself. Cheers.
@statquest 4 years ago
@@giuseppefasanella5446 You are correct! That's another typo. One day, when StatQuest is making the Big Bucks, I'm going to hire an editor. That's the dream! :)
@giuseppefasanella5446 4 years ago
@@statquest It's a beautiful dream! If you want, from time to time, depending on my working constraints, I could do it for free. You can contact me in private if you want. Cheers!
@statquest 4 years ago
@@giuseppefasanella5446 That would be awesome. I have one on XGBoost math coming up in mid-january. Contact me through my website and I'll send it to you in advance. statquest.org/contact/
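For reference, the corrected output-value formula this thread settles on (summing over the samples x_i that land in terminal region R_jm, as in Friedman's gradient boosting algorithm) is:

```latex
\gamma_{jm} = \operatorname*{argmin}_{\gamma} \sum_{x_i \in R_{jm}} L\!\left(y_i,\; F_{m-1}(x_i) + \gamma\right)
```

With the squared-error loss used in the video, this works out to the average of the residuals in that leaf.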
@cosworthpower5147 2 years ago
@@statquest Hi, I still wonder why there is a similarity between gradient descent and gradient boosting when it comes to trees. Apparently there is no partial derivative with respect to a parameter in gradient boosting, simply because a decision tree has no internal model parameters, in contrast to a regression model, where it is obvious that the betas have to be iteratively tweaked in order to lower the applied loss function. It would be great if you could help me out there :)
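One way to answer the question above: in gradient boosting the "parameters" being descended on are the predicted values themselves. The pseudo-residual is the negative partial derivative of the loss with respect to the current prediction, evaluated at the previous model:

```latex
r_{i,m} = -\left[\frac{\partial L\big(y_i,\, F(x_i)\big)}{\partial F(x_i)}\right]_{F(x) = F_{m-1}(x)}
```

So each tree takes a gradient-descent step in function space rather than in parameter space; with L = (1/2)(Observed - Predicted)^2 this derivative is just (Observed - Predicted).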
@romans4436 3 years ago
You have what many others lack: clarity and simplicity. The visualization is very good. Thank you!
@statquest 3 years ago
Wow, thank you!
@ulrichwake1656 5 years ago
They said "Give a Man a Fish, and You Feed Him for a Day. Teach a Man To Fish, and You Feed Him for a Lifetime." Thank you very much for your video. I really like when you try to explain the algorithm and the math notation. I hope you keep doing that. :)
@statquest 5 years ago
Thank you! Yes, I plan on doing more algorithms for machine learning.
@daniyalahmed4440 4 years ago
@@statquest Thanks a lot for these videos, these are simply amazing and super helpful.
@magus3267 3 years ago
Seems familiar
@marcellusorlando3414 3 years ago
I realize it's kinda random to ask, but does anyone know of a good website to stream new series online?
@arlodamian4565 3 years ago
@Marcellus Orlando flixportal :D
@gunnvant 5 years ago
The visual description where you are adding consecutive models is the best summary of the gradient boosting description that I have seen so far.
@statquest 5 years ago
Thank you very much! :)
@HuyLe-nn5ft 1 year ago
This explanation cannot be found anywhere else. You won't ever know how thankful i am, dude. Keep up the good work!
@statquest 1 year ago
Thank you!
@madatbrahma4389 5 years ago
Josh, you are the best . Master in simplifying complex topics .
@statquest 5 years ago
Thank you very much! :)
@adityanjsg99 4 years ago
@@statquest I know a madat brahma from Bangalore who runs a food business.! You that Brahma?
@StackhouseBK 21 days ago
The content of this channel is what makes internet great
@statquest 21 days ago
Thanks!
@soumendas592 2 years ago
You are the best, when every shortcut to understanding ML algorithm fails, you come at last as our savior with all the necessary details.
@statquest 2 years ago
Thank you!
@MugiwaraSuponji 1 year ago
man the way you sound like a preschool teacher is making me emotional, you really made the first trauma-free math class 👍🏻👍🏻👍🏻👍🏻👍🏻
@statquest 1 year ago
BAM! :)
@varun0505 5 years ago
There are blogs explaining the gradient boosting on a dataset, there are blogs explaining the maths. I was facing difficulty in connecting those two. Hands down! Best video I came across in a long time. Thanks a lot. Please keep up the great work.
@statquest 5 years ago
Thank you! :)
@Shubhamkumar-ng1pm 3 years ago
I have no words for Josh Starmer. Teachers like him deserve a special place in heaven. Thank you Josh.
@statquest 3 years ago
Thank you! :)
@flavialan4544 3 years ago
@@statquest he really does
@statquest 3 years ago
@@flavialan4544 Thanks!
@meysamamini9473 3 years ago
100 % agreeed
@jasonfaustino8815 3 years ago
Timestamps!!
6:30 - Step 1 - Initialize model with a constant value. Comes out to be the average of the target values. Cool math trick
9:10 - Step 2.0 - Set M for number of iterations
10:02 - Step 2.A - Create residuals
12:47 - Step 2.B - Fit a regression tree
14:40 - Step 2.C - Calculate output values (I recommend jotting down notes as a lot is happening in this step)
20:39 - Step 2.D - Make predictions; if m == M, proceed to step 3, else repeat step 2
Step 3 - Output F_M(x)
Thanks Josh!! Really smoothed out my knowledge of Gradient Boosting methods.
@statquest 3 years ago
Awesome!!!
@thomashirtz 3 years ago
@@statquest If you put it in the description youtube will create chapters for you :)
@statquest 3 years ago
@@thomashirtz Great idea! BAM!
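The step list in the timestamps above can be turned into a runnable sketch. This is not the video's code: it is a minimal illustration with squared-error loss in which the "weak learner" is a depth-1 stump found by brute force, and the helper names (fit_stump, gradient_boost) and the data are made up.

```python
# Minimal gradient-boosting sketch for regression with squared-error loss,
# following the step list above. Assumptions: one feature, depth-1 stumps.

def fit_stump(x, residuals):
    """Return a one-split predictor minimizing squared error on the residuals."""
    best = None
    for t in sorted(set(x)):
        left = [r for xi, r in zip(x, residuals) if xi <= t]
        right = [r for xi, r in zip(x, residuals) if xi > t]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)  # leaf outputs (step 2.C)
        sse = sum((r - lm) ** 2 for r in left) + sum((r - rm) ** 2 for r in right)
        if best is None or sse < best[0]:
            best = (sse, t, lm, rm)
    _, t, lm, rm = best
    return lambda xi: lm if xi <= t else rm

def gradient_boost(x, y, M=10, lr=0.1):
    f0 = sum(y) / len(y)              # Step 1: constant prediction = the average
    pred = [f0] * len(y)
    stumps = []
    for m in range(M):                # Step 2: for m = 1 .. M
        resid = [yi - pi for yi, pi in zip(y, pred)]             # 2.A pseudo-residuals
        stump = fit_stump(x, resid)                              # 2.B + 2.C
        pred = [pi + lr * stump(xi) for pi, xi in zip(pred, x)]  # 2.D update predictions
        stumps.append(stump)
    return lambda xi: f0 + lr * sum(s(xi) for s in stumps)       # Step 3: output F_M(x)

# Toy data (made up, not the video's table)
x = [1.2, 2.8, 4.7, 1.1, 5.5, 3.0]
y = [88.0, 76.0, 56.0, 73.0, 77.0, 57.0]
model = gradient_boost(x, y, M=100)
```

Each pass fits a stump to the current residuals and nudges the predictions toward the targets by a fraction (the learning rate) of the leaf outputs, exactly as steps 2.A-2.D describe.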
@pranavraj3024 5 years ago
This is the best explanation for GB regression that i have ever seen/read. Thank you so much explaining it in such simple terms!
@statquest 5 years ago
Thank you very much!
@lokeshmadasu4146 4 years ago
You are one of the best teachers I have ever seen; the visualization gives me a clear understanding of the concept and the math behind it. Every time, I wish the video had been a few minutes longer.
@statquest 4 years ago
Thank you very much! :)
@Sorararawr 2 years ago
Probably the best explanation of this complex statistical method I have ever found in the entire semester. Thank you for all your hard work sir!!!
@statquest 2 years ago
Wow, thank you!
@hubert1990s 4 years ago
it's unbelievable how well you explain it all. following this, I can even imagine spending a Friday evening learning ML :)
@statquest 4 years ago
Wow! That's quite a compliment. :)
@heitornunes6225 3 years ago
I'm literally doing this right now hahah
@adarshtiwari7395 1 year ago
This channel is a blessing to prospective machine learning engineers. I am tired after the entire video, but a sense of pride in my efforts and a sense of gratitude towards you, Joshua, made this ride worthwhile!
@statquest 1 year ago
Awesome! :)
@matthewmiller3653 5 years ago
Absolutely fantastic. I graduated college "on the verge" of higher math knowledge, but never quite put in the work for the courses. I've now jumped into ML research, but have found notation to consistently be the hold-up in a lot of my understanding, despite that the equations often express intuitive concepts. Being able to "translate" as you've done with this video connects many dots in a world that's often unnecessarily thought of as sink or swim. Awesome!
@statquest 5 years ago
I’m so glad to read that you like this video. I want to make more like it, where we just go through complicated sounding, and looking, algorithms step-by-step and show that they are simple things in the end.
@navyasailu18 3 years ago
@@statquest Hence the world needs you
@davidcho8877 2 years ago
I am studying with all the videos in Machine Learning playlist to prepare for my interviews. These videos are all awesome. But this one is especially more awesome. I majored in Statistics and occasionally study the papers to catch up on some recent ML skills. I always had a hard time understanding the steps of algorithms even though I also minored in Mathematics. I have never seen a professor who can teach steps of an algorithm this easy and clear. Thank you Josh for this amazing video. Would really appreciate it if you can make more videos about the fundamental details of ML techniques more (and if you have time, some interesting papers too)! From. Biggest fan of StatQuest
@statquest 2 years ago
Wow!!! Thank you very much! :)
@gunjantoora863 2 years ago
Can't thank god (and you) enough for these videos. All those textbook chapters with just formulas and notations were driving me crazy. YOUR VIDEOS ARE AMAZING!!!!
@statquest 2 years ago
bam! :)
@aimenslamat1264 6 months ago
from Algeria, u are the best.. none can explain ML like you Master
@statquest 6 months ago
Thank you!
@shangauri 3 years ago
If the intention is to clearly explain a complex topic, then start with an example and then get into the equations step by step. Most academicians make the mistake of scaring people by showing the equations at the start itself. You are doing this perfectly Josh. Many thanks.
@statquest 3 years ago
Thank you! :)
@S2ReviewsS2 3 years ago
You are a Gem Josh, with so many new and old comments, you have replied to almost all of them. Can't believe such a great person and teacher actually exists. :)
@statquest 3 years ago
Thank you very much! :)
@kaicheng9766 2 years ago
I don't think I have ever enjoyed this much for a math-intensive video. You are Godsend!
@statquest 2 years ago
Wow, thank you!
@dungnintengtung8417 5 months ago
bro this is the best explanation on RU-vid. I love u man. You explain everything and make complex things so simple with simple word choice
@statquest 5 months ago
Thank you!
@fgfanta 4 years ago
First explanation of all the GB details I find on-line which is actually easier than reading the original paper. Thanks!
@statquest 4 years ago
Hooray! That was my goal. :)
@nguyendavid6396 5 years ago
"The chainnnn ruleeeee" LOL
@phungtruong6698 4 years ago
haha "The chainnnn rulleeeee " :v :v
@samerrkhann 3 years ago
Holy Smoke! I literally had to take small pauses to double-check if I am really living in reality. My God, how easily he explained all those intimidating math equations and notations. A BIG THANK YOU JOSH!!
@statquest 3 years ago
Hooray! I'm glad the video was helpful.
@heyim3854 5 years ago
Thank you So much for your video. You are the 'Mozart' of the ML. Simple but infinitely subtle! 😊
@edkaprost3623 1 year ago
After watching some of your videos I understand why it is so simple to understand your material compared with other sources. Most of them just give the theory without examples; you show an example and then the theory (use of induction). I hope that the next generation of statistics lecturers will use your videos as the state of the art in the teaching field.
@statquest 1 year ago
Thank you! :)
@sameershah141 3 years ago
There can not be a better and simpler explanation. Kudos for the efforts put in to make the presentation and the video.. (y)
@statquest 3 years ago
Thanks a lot!
@user-fi2vi9lo2c 11 months ago
Special thanks for correction on 21:08. I was thinking about it and was preparing to ask a question how it was possible that one sample ended in multiple leaves. Now there is no need to ask this question :)
@statquest 11 months ago
bam!
@abhijeetmhatre9754 3 years ago
I have become a fan of yours after going through your first ML videos. I haven't seen anyone explain topics better than you. You explain any complex topic such that, after watching, the viewer sees it as a simple topic. I started learning ML and deep learning in the past 6 months, and I am learning a lot from your videos; they have given me a lot of boost and confidence to learn more. I saw multiple study materials explaining gradient boosting, but it's only your video that helped me fully understand it in a single go. A very big thank you, sir, for such a wonderful video course on ML.
@statquest 3 years ago
Thank you! I'm glad my videos are helpful! :)
@angels8050 2 years ago
Best simplified and visual explanations I haver ever seen on algorithms. I am definitely recommending your channel to anyone who is getting started on ML or that needs some refreshing. Keep on with the awesome work!
@statquest 2 years ago
Wow, thanks!
@himanshutalegaonkar2522 3 years ago
By far the best video i've seen across all the platforms for machine learning !! I haven't come across anyone who goes to this extent into explaining the complicated maths behind such algorithms !! Please do more of such mathematical breakdown for famous research papers in ML and DL.
@statquest 3 years ago
Wow, thanks!
@saurabhkale4495 4 years ago
Best explanation available for gradient boost on the PLANET!!!!!!
@statquest 4 years ago
Thank you very much! :)
@pyarepiyush 5 years ago
You're making math interesting for me. I have a love-hate relationship with math, but because of the work I do (data scientist), I have to keep coming back to the math behind the algorithms. Your videos are a joy to watch... please continue to make these awesome videos
@statquest 5 years ago
Hooray! I'm glad you find my videos useful. :)
@k.y8274 1 year ago
This YouTube channel is damn amazing. I can't find any other videos with this kind of clear explanation anywhere around the globe.
@statquest 1 year ago
Thanks! :)
@jokmenen_ 2 years ago
I keep getting amazed by how good your videos are! You are truly a blessing
@statquest 2 years ago
Thank you! :)
@manojtaleka954 7 months ago
The best video tutorial for Gradient Boosting. Thank you very much.
@statquest 7 months ago
Thanks!
@thilinikalpana7206 3 years ago
This is awesome, the best I've seen so far that simplifies all the complex algorithms and math. Good job and keep doing more videos like this to simplify complex problems.
@statquest 3 years ago
Thank you very much! :)
@milay6527 4 years ago
I can't believe how clearly this guy explains everything
@statquest 4 years ago
Thank you very much!!! :)
@carazhang7416 3 years ago
I wish the lecturers in uni are half as good as you. This is just treasure.
@statquest 3 years ago
Thanks!
@charlesstrickland8839 5 years ago
Like Josh's videos before watching them. Watched bunch of Josh's videos, all of them are really helpful and easy to understand, thx a lot!
@statquest 5 years ago
Thanks! :)
@koshrai7080 1 year ago
It took some time but I think I was able to figure out how (or why) this works? We basically just make a base prediction, and then compute a step (the pseudo-residual) in the direction of the actual value. Then we model these steps with a decision tree, and use that model to slowly improve upon our previous prediction, and just do this over and over. Great Video. Very Intuitive.
@statquest 1 year ago
bam!
@debabrotbhuyan4812 4 years ago
Thank you so much for this video Josh. I never thought Boosting algorithms could be explained so clearly. Wish I had known about your channel one year back.
@statquest 4 years ago
Thanks! :)
@anjulkumar9183 4 years ago
Never seen a better video tutorial such as yours...I love you man....a lot of respect for you...you really are doing a great job...I really am going to recommend everyone to watch your videos and I hope you would keep helping in the form these videos to teach ML in the most fascinating and beautiful way...
@statquest 4 years ago
Thank you very much!!!! I'm glad you liked the StatQuest! :)
@robertomontalti3064 1 year ago
Insane content and very well exaplained! I appreciated a lot your correction in the description for 21:08 "With regression trees, the sample will only go to a single leaf, and this summation simply isolates the one output value of interest from all of the others. However, when I first made this video I was thinking that because Gradient Boost is supposed to work with any "weak learner", not just small regression trees, that this summation was a way to add flexibility to the algorithm." . Thank you!
@statquest 1 year ago
Glad it was helpful!
@rickandelon9374 4 years ago
Holy I finished this and actually understood everything you tried to make me understand!! The best man on youtube! Deeply grateful, Thanks a lot!!
@rickandelon9374 4 years ago
It was like a Quest in a beautiful puzzling game, just what the name 'StatQuest' implies!
@statquest 4 years ago
Awesome! This a hard video to get through, so congratulations!!!
@yohanjeong3869 4 years ago
I think among all the videos i saw about data science, this channel provides the best explanation. Bam!
@statquest 4 years ago
Thank you! :)
@markaitkin 5 years ago
easily the best video on youtube, can't wait for part 3 and 4.
@statquest 5 years ago
Thank you!
@sharanchhibbar7047 3 years ago
Hats off to your way of teaching. Wish you the best!
@statquest 3 years ago
Thank you! :)
@AdityaSingh-yp9jn 4 months ago
Best BEST BESTESTTTTT Lecture I have ever seen and heard. Literally, this is so engaging and maths seems so funny. I am from maths background and really loved the way of explanation. Bro HATS-OFF. Please continue making such content. Especially the core maths concept and its intuition are really missing now-a-days from a lot of explanations. KEEP it UP Man! Press 'F'
@statquest 4 months ago
Wow, thank you!
@ineedtodothingsandstuff9022 4 years ago
I have never seen a clearer explanation (literally), thank you so much!
@statquest 4 years ago
Great to hear!
@honza8939 9 months ago
In schools that teach data science and other statistics, I would play your videos. Because I don't know a teacher who can explain it that simply.
@statquest 9 months ago
Thank you very much! :)
@trisa_halder 7 months ago
i'm so glad i found this channel, thankyou so much!
@statquest 7 months ago
Glad you enjoy it!
@marryrram 1 year ago
Excellent way of explaining each and every step. Thank you very much
@statquest 1 year ago
Thank you!
@viswanathpotladurthy3383 4 years ago
WOW!!! How can it be so simple.I understand you take a lot of time to make it simple.Thanks on behalf of learning community!!
@statquest 4 years ago
Thank you very much! :)
@harshvardhanr5062 3 years ago
Legends say that Josh is so cool that he replies to comments even after 2 years
@statquest 3 years ago
Bam
@abhasupadhayay6420 4 years ago
Just started watching your videos and I am extremely glad I found you. The explanation is simply as detailed as it can get. Sometimes I wonder if you are overfitting our minds, lol..Thanks a lot
@statquest 4 years ago
Bam! :)
@SourabhSomvanshi 4 years ago
You Sir are just awesome!!! Saying awesome is just an understatement. You make the learning fun and interesting. I found these topics so difficult to understand from other sources. You make it so simple. There are many people who know how these things but its really an art to teach these topics with so much ease. Take a bow!!! A big fan of yours. Hope to see more such videos in the times to come :) BAM!!!
@statquest 4 years ago
Wow, thanks!
@silentsuicide4544 2 years ago
I love this, thank you! I find learning algorithms through math the best way to understand them, but sometimes the math behind them looks awful even though the idea and calculations are simple, and this is what I needed, to be honest. The same goes for other algorithms: I can take a "math recipe" and go through it with your explanation in the background, like I did with AdaBoost. Thank you!
@statquest 2 years ago
bam! :)
@nsp7537 2 years ago
excellent to see someone making a video of both the concepts, followed by the math concepts. Will subscribe for more of those
@statquest 1 year ago
Thanks!
@musasall5740 3 years ago
Best explanation on Gradient boosting!
@statquest 3 years ago
Wow, thanks!
@justfoundit 5 years ago
Thanks for clarifying the tree-building logic for me. Using a simple regression tree looked illogical to me, but using it on the gradient AND providing values for the leaves based on the actual loss function: now it makes sense :)
@statquest 5 years ago
Awesome! :)
@lenkahasova9428 4 years ago
I love the way you present this, it's exactly what my brain needs!
@statquest 4 years ago
Hooray! :)
@15Nero92 1 year ago
I was struggling with this, and you are helping me a lot. thankyou so much !
@statquest 1 year ago
Happy to help!
@SteveCamilleri 4 years ago
Finally, a mathematical explanation that can be understood! Thank You
@statquest 4 years ago
Thanks! :)
@kalpaashhar6522 4 years ago
Beautifully simple explanation for a complicated algorithm ! Thank you!
@statquest 4 years ago
Thank you very much! :)
@user-iq3ue1td4f 4 days ago
The best explanation ever heard, thx so much!
@statquest 4 days ago
Thank you!
@zhenli1965 4 years ago
This is the best explanation that I have ever seen. Thank you so much, Josh!
@statquest 4 years ago
Thanks! :)
@katielui131 5 months ago
This is so great - thank you so much for sharing this content with everyone
@statquest 5 months ago
Glad you enjoyed it!
@pratibhasingh8919 3 years ago
Great work! The way you explained was outstanding. It can be easily understood by a layman.
@statquest 3 years ago
Thank you! :)
@taochen746 2 years ago
Really appreciated your hard work, this is the best videos for stats and machine learning ever!
@statquest 2 years ago
Glad you think so!
@sashankvemulapalli6238 2 years ago
Thank you for this beautiful video. One suggestion I would love to make: it felt like the initial explanation was that the residuals are called pseudo residuals simply to differentiate them from the residuals in linear regression. However, the video goes on to explain that they are called pseudo residuals because they are not always (Observed - Predicted) and can be a multiple of that as well, depending on the choice of the loss function. Maybe the initial explanation could have been avoided in order to prevent confusion! Thanks as always, these videos are the best!! :D
@statquest 2 years ago
Noted!
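The point above about the loss function can be checked numerically. A small sketch (the numbers are made up, and the helper name neg_grad is illustrative): with the loss (1/2)(Observed - Predicted)^2 used in the video, the negative gradient is exactly (Observed - Predicted), while dropping the 1/2 scales the pseudo-residual by 2.

```python
# Numerically compare pseudo-residuals for two versions of squared-error loss.

def neg_grad(loss, y, f, eps=1e-6):
    """Central-difference estimate of -dLoss/dPrediction at f."""
    return -(loss(y, f + eps) - loss(y, f - eps)) / (2 * eps)

half_sq = lambda y, f: 0.5 * (y - f) ** 2  # the loss used in the video
plain_sq = lambda y, f: (y - f) ** 2       # same loss without the 1/2

y_obs, pred = 71.2, 73.3
r_half = neg_grad(half_sq, y_obs, pred)    # ≈ y_obs - pred
r_plain = neg_grad(plain_sq, y_obs, pred)  # ≈ 2 * (y_obs - pred)
```

Either way the tree is fit to something proportional to (Observed - Predicted), which is why the 1/2 is a notational convenience rather than a substantive choice.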
@jaivratsingh9966 5 years ago
I wonder why would someone dislike this video. This is great stuff!
@statquest 5 years ago
Thank you! I often wonder the same thing. What's not to like? I'm not sure.
@2050techgeek 5 years ago
Excellent video! You are the best! Can you please make one on how XGBoost achieves superior performance ?
@cyprienricque2692 5 years ago
Yes, please !
@aniketdatir2633 4 years ago
Wonderful video Josh......very clearly explained !!!! I appreciate it...Please keep posting such lectures. Thanks
@statquest 4 years ago
Thank you! :)
@deepranjan3474 2 years ago
best explanation till now for me.
@statquest 2 years ago
Thank you!
@meysamamini9473 3 years ago
U ARE THE BEST TEACHER EVER!
@statquest 3 years ago
Thank you! :)
@RaviShankar-jm1qw 4 years ago
Words evade me while praising Josh !!!
@statquest 4 years ago
Thank you! :)
@devran4169 1 year ago
statquest > my university machine learning courses TRIPLE BAMM!!
@statquest 1 year ago
Thanks!
@veronikaberezhnaia248 2 years ago
thank you for a (much!) clearer explanations than my professors in ML faculty have
@statquest 2 years ago
Glad I can help! :)
@emirhankartal1230 5 years ago
That's the best explanation I've seen so far...
@statquest 5 years ago
Thank you! :)
@NA-rq5dw 5 years ago
Great video! I found the explanation of the mathematical notation to be very helpful and would love to see more examples for other machine learning concepts. Thanks
@statquest 5 years ago
I'm glad to hear you appreciated the attention to the mathematical notation. I'll try to do more videos like this.
@hussaintanzim6958 2 years ago
The one word that describes all videos in this series is "BAM!"
@statquest 2 years ago
BAM! :)
@arjundhar7729 1 year ago
Sweetest technical turorial ever ! BAM, Double BAM... haha But that doesnt take away from the excellent content and the nuances. thank you
@statquest 1 year ago
Glad you enjoyed it!
@trillerperviu2752 4 years ago
Bro, I am from Russia and I barely understand English, but I understood all the stuff in this video, enjoyed it, and you made me laugh. I think I would understand the math of quantum physics if you explained it. YOU ARE THE BEST, THANK YOU!!!
@statquest 4 years ago
Awesome! Thank you so much!
@moindalvs 2 years ago
Your contribution for this channel is as same as the "Favorite Color" independent feature is to predicting the "Weight" of a person, if you haven't subscribed the channel yet and liked the video.
@statquest 2 years ago
bam!
@SimoneIovane 3 years ago
Really really good tutorials. I always watch them when I feel I want to revise some concepts. Thanks!
@statquest 3 years ago
BAM! :)
@SimoneIovane 3 years ago
@@statquest you mean... Triple Bam 💣
@statquest 3 years ago
@@SimoneIovane YES!
@tangibleoxygen1986 5 years ago
It would be really helpful if we got one class on how entropy is calculated for splitting decision trees, similar to what we learnt from your videos on the Gini index. We would eagerly wait for that!
@user-hi4vy7yq4m 2 years ago
This is very great to explain the math like you do! It is awesome! Thank you!
@statquest 2 years ago
Glad it was helpful!
@jaikishank 3 years ago
It was an awesome explanation down to the granular level. Kudos for your great effort...
@statquest 3 years ago
Thanks a ton!
@bevansmith3210 5 years ago
Thank you so much Josh, I was going through these algorithms in Elements etc. and it was so difficult to figure out. Awesome explanation!
@statquest 5 years ago
Thank you! :)
@venkateshmunagala205 2 years ago
Wow, you are a genius. Now I clearly understand the reasoning behind the gammas.
@statquest 2 years ago
:)
@aracelial9188 3 years ago
You are a really good teacher, thanks a lot for your videos!!!
@statquest 3 years ago
Thank you! 😃
@luattran5318 4 years ago
Much appreciated for your thorough and detailed explanation, wish u all the best!
@statquest 4 years ago
Thank you very much! :)
@madghostek3026 1 year ago
The realisation that argmin for the initial static model turns out to be just the average is mega BAM for me, I noticed it was different than last video when first step was just to find the average, and here's some funky stuff instead, but it's just "which point minimises the variance... the average!"
@statquest 1 year ago
bam! :)
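The observation above, that the argmin of the summed squared loss is just the average, can be checked directly. A small sketch with made-up numbers (not the video's table): scan a grid of candidate constants and confirm the winner matches the mean.

```python
# Step 1 of the algorithm: F_0(x) = argmin_gamma sum_i (1/2)*(y_i - gamma)^2.
# Brute-force the argmin over a fine grid and compare it to the plain average.
y = [88.0, 76.0, 56.0]  # made-up target values

def loss(gamma):
    return sum(0.5 * (yi - gamma) ** 2 for yi in y)

candidates = [g / 100 for g in range(4000, 10001)]  # gamma = 40.00 .. 100.00
best = min(candidates, key=loss)
mean = sum(y) / len(y)
```

Setting the derivative of the loss to zero gives the same answer analytically: sum_i (gamma - y_i) = 0, so gamma = mean(y).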
@mathematicalmusings429 3 years ago
this is amazing, you are a gifted teacher Josh.
@statquest 3 years ago
Thank you! :)
@markus_park 1 year ago
THIS BLEW MY MIND!!!
@statquest 1 year ago
BAM! :)