
Boosting - EXPLAINED! 

CodeEmporium
124K subscribers · 49K views

REFERENCES
[1] A Short Introduction to Boosting: cseweb.ucsd.edu/~yfreund/pape...
[2] A Theory of the Learnable (Valiant, 1984): web.mit.edu/6.435/www/Valiant8.... This introduced the PAC Learning model
[3] PAC Learning Model: • PAC learning
[4] Cryptographic Limitations on Learning Boolean Formulae & Finite Automata (Kearns et al., 1988): www.cis.upenn.edu/~mkearns/pa... (This paper defined weak learnability)
[5] The strength of weak learnability (Schapire, 1990): rob.schapire.net/papers/streng...
[6] A gentle intro to weak learners: www.cs.ox.ac.uk/people/varun....
[7] Boosting a weak learning algorithm by majority (Freund, 1995): pdfs.semanticscholar.org/d620...
[8] Adaptive Boosting (Section 4): rob.schapire.net/papers/Freund...
[9] Adaboost & overfitting discussion: stats.stackexchange.com/quest...
[10] Gradient Boosting: statweb.stanford.edu/~jhf/ftp...
[11] How boosting still learns even after training error hits 0: www.cc.gatech.edu/~isbell/tut...
[12] Difference between Adaboost & Gradient Boost: www.quora.com/What-is-the-dif...
[13] Adaboost Vs Gradient Boosting: subscription.packtpub.com/boo...
[14] XGBoost (Main Paper): arxiv.org/abs/1603.02754
[15] Compressed Sparse Column (CSC) format used for storing data in XGBoost: software.intel.com/en-us/mkl-...
CODE
[1] Starter code with built-in libraries: repl.it/@PulkitSharma1/Boosti...
IMAGE RESOURCES
[1] ConvNet: missinglink.ai/guides/convolu...
HIPPY COWBOY MUSIC
[1] Cowboy Sting by Kevin MacLeod is licensed under a Creative Commons Attribution license (creativecommons.org/licenses/...)
Source: incompetech.com/music/royalty-...
Artist: incompetech.com/

Published: 24 Jul 2024

Comments: 58
@SergioArroyoSailing · 3 years ago
Dude that was a fantastic explanation! and the video illustrations were excellent! and you really went over and above with the reference links for deeper studies! subscribed! keep up the good work! :D
@saharshayegan · 3 years ago
Exactly!
@CodeEmporium · 2 years ago
Sorry I'm super late to this. YouTube didn't notify me of this amazing comment. Thanks a ton! We can chat better on Discord since I'm more active there. Link in the latest video description.
@Robay146 · 2 years ago
Great explanation. Had no idea what boosting was and this video just demystified the whole thing. Big up!
@somerset006 · 2 years ago
Amazing quality of production! Appreciate your effort!
@dhineshkumarr3182 · 3 years ago
You got my respect man. I think this is the only video that actually cared enough to define what strong and weak learners are.
@CodeEmporium · 3 years ago
Thanks! Tried to get deep with this one
@mavichovizana5460 · 1 year ago
What a great explanation and fantastic work! Appreciated those references!
@arieframadhan1244 · 4 years ago
Thanks man. Great explanation as always. Wish you all the best!
@denisjosephbarrow8330 · 2 years ago
Thanks, Mr. Code Emporium, you are as good as 3Blue1Brown at explaining difficult things.
@drsandeepvm5622 · 1 year ago
Great simplified explanation 👍 Thanks 😊
@flavialan4544 · 2 years ago
one of the BEST videos for this subject
@CodeEmporium · 2 years ago
Thank you so much!
@70ME3E · 3 years ago
crazy good quality video, thank you!
@VietnamSteven · 1 year ago
This is beautifully explained!
@ShashankData · 2 years ago
Great video! I'm using this to research for a video I'm working on now!
@CodeEmporium · 2 years ago
I am honored! Can’t wait to see it!
@SivaranjanGoswami · 2 years ago
Awesome explanation 👏 👌
@justin.c249 · 1 year ago
Great Explanation!
@CodeEmporium · 1 year ago
Thanks so much! :)
@Klimaexperte · 4 years ago
So great, thanks man!
@yulinliu850 · 4 years ago
Excellent! Thanks!
@falak88 · 3 years ago
So freaking amazing!
@TawhidShahrior · 2 years ago
thank you man, this was amazing.
@healthdatascience6577 · 1 year ago
Thanks! This is helpful.
@hassanrevel · 2 years ago
What an amazing video.
@anissalhi5459 · 3 years ago
great explanation thanks bro
@danishnawaz7869 · 4 years ago
Thank you 🙏
@95Bloulou · 4 years ago
I like the format of "logistic regression - the math you should know" better. I think the intro here is a little long; viewers of this video will already know a bit about ML and are more interested in the details of boosting (speaking for myself at least). Thank you! Keep it up!
@CodeEmporium · 4 years ago
Yeah. I'm working on getting to the point much quicker. Thanks for the feedback!
@rajuofficial4205 · 2 years ago
Very nice
@ILoveMattBellamy · 2 years ago
Amazing video!
@CodeEmporium · 2 years ago
Thanks!
@ahmedoumar3741 · 2 years ago
This video is GREAT!
@CodeEmporium · 2 years ago
you are GREAT!
@Leibniz_28 · 4 years ago
Awesome, I will use XGBoost for some classification problems. What program do you use to make your videos? I would like to learn it, but I don't have a clear path for learning those skills on the Internet.
@CodeEmporium · 4 years ago
XGBoost can be used for regression too, since the base weak learner is a decision tree. I use Camtasia Studio for creating these videos. It's great for recording your screen, and if you play around with it long enough, you can create decent animations.
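The reply's point, that the same tree-based booster handles both tasks, can be sketched quickly. This is an editor's illustration using scikit-learn's gradient boosting classes as a stand-in for XGBoost's scikit-learn-style API (with the xgboost package installed, XGBClassifier / XGBRegressor are used the same way):

```python
# Boosted decision trees for classification AND regression on toy data.
from sklearn.datasets import make_classification, make_regression
from sklearn.ensemble import GradientBoostingClassifier, GradientBoostingRegressor

# Classification: trees are fit to a class-probability loss
Xc, yc = make_classification(n_samples=100, random_state=0)
clf = GradientBoostingClassifier(random_state=0).fit(Xc, yc)

# Regression: the very same kind of trees, fit to squared-error residuals
Xr, yr = make_regression(n_samples=100, random_state=0)
reg = GradientBoostingRegressor(random_state=0).fit(Xr, yr)
```

Only the loss being boosted changes between the two; the weak learner stays a shallow decision tree.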
@latinavenger7472 · 4 years ago
Hi, stupid question, but how do you find the research papers exactly? They're so great! Thanks for the gorgeous explanation, it helped me a lot!
@CodeEmporium · 4 years ago
That's a good question. If I know the topic I'm looking for, I'd just Google it (like XGBoost). For the "history" of boosting, though, I'd also try to find college lecture material. It gives a good high-level explanation, and I'd dig into its references for more info. Apart from that, I use arXiv Sanity and social media for more trending research (explained more in my video on "how to keep up with AI research" - check it out).
@satyamtripathi1732 · 3 years ago
Why does it select a weak model? Suppose I get 95% accuracy with the first model; why does it then select a weak model that has 65% accuracy?
@Kevin-fp6gk · 3 years ago
Which program do you use to create the videos?
@nurkleblurker2482 · 2 years ago
But how does a model "focus more on a problem to make sure it gets it right"? What does that mean?
@pankajshinde475 · 4 years ago
Sir, just one question: where did you learn the maths behind the machine learning algorithms? I am trying really hard to find courses on the mathematics, but I have failed. Where can I find resources to learn the mathematics behind machine learning algorithms?
@last_theorem · 4 years ago
There is a channel called StatQuest where you can get a decent idea of the math. MIT has a fab course called Artificial Intelligence by Patrick Winston that introduces some of the math. There are also a lot of Medium articles where you can see the math, though you'll have to dig deeper. Machine learning algorithms are not built on one single idea: in decision trees, and even in AdaBoost, you have the Gini score, a measure of impurity closely related to entropy, which comes from information theory. Libraries are the easiest way to get started; once you start digging into the math there are a lot of dependencies. A decent grounding in statistics, probability, and calculus will help you understand the ideas better, because these algorithms are built on top of them.
@70ME3E · 3 years ago
I think Andrew Ng's ML videos might come in handy too.
@mberoakoko24 · 2 years ago
SUBSCRIBED!!!!!
@CodeEmporium · 2 years ago
NO REGRETS! THANKS!
@aashishadhikari8144 · 2 years ago
You did not explain why increasing the sample weight makes the next iteration focus on the misclassified samples.
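This commenter's question has a concrete answer worth sketching. In AdaBoost, a weak learner is chosen to minimize *weighted* error, so raising the weights of misclassified samples makes them dominate the next learner's objective. A minimal numpy illustration of the standard weight update (editor's sketch, not from the video):

```python
import numpy as np

# True labels and one weak learner's predictions on five samples
y    = np.array([ 1,  1, -1, -1,  1])
pred = np.array([ 1, -1, -1, -1, -1])   # samples at index 1 and 4 are wrong

w = np.full(5, 0.2)                      # start with uniform weights
err = w[pred != y].sum()                 # weighted error of this learner: 0.4
alpha = 0.5 * np.log((1 - err) / err)    # this learner's vote weight
w = w * np.exp(-alpha * y * pred)        # misclassified weights grow, correct shrink
w = w / w.sum()                          # renormalize to sum to 1
```

After the update, the misclassified samples carry exactly half the total weight, so the previous learner looks like a coin flip under the new distribution and the next learner is forced to do something different on those samples.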
@Raven-bi3xn · 3 years ago
Who are you? Where were you all my life? You are amazing! Do you have a Patreon?
@LunaMarlowe327 · 2 years ago
nice
@mohanakumaran5815 · 4 years ago
So you finally uploaded a video 😂 I like your explanations very much.
@CodeEmporium · 4 years ago
Yup. :) I'm trying a different approach with more visuals and easier explanations (without losing detail). So it took longer. I'd actually been working on this almost every day for the last month after work. Next step is to probably decrease the video length to make it more palatable (?) - not too sure. But will see how it goes. Thanks for the support! :)
@omolluska · 2 years ago
This has 558 likes and cat videos have millions of likes. The world is not a fair place!
@CodeEmporium · 2 years ago
A cruel world we live in :)
@davidnassau23 · 11 months ago
You lost me when you didn't explain what a gradient is or how it differs from a weight. That made the rest unintelligible. I hope you can improve this.
@CodeEmporium · 11 months ago
Yea. This video is from 4 years ago. I have definitely improved over time. But to answer your question in a nutshell. Weight = parameter, gradient = change in said parameter
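The distinction in the reply above can be made concrete for gradient boosting, where the gradient is taken with respect to the ensemble's *predictions* rather than its parameters. A minimal numpy sketch (editor's illustration; the "learner" here is idealized to predict the residual exactly, where a real implementation would fit a tree to it):

```python
import numpy as np

# For squared-error loss 0.5*(y - F)^2, the negative gradient with
# respect to the current prediction F is just the residual y - F.
y = np.array([3.0, 5.0, 8.0])            # targets
F = np.zeros_like(y)                     # ensemble prediction, starts at 0
lr = 0.1                                 # learning rate (shrinkage)

for _ in range(50):
    residual = y - F                     # negative gradient = pseudo-residual
    F = F + lr * residual                # each round adds a learner's output
```

Each boosting round is one step of gradient descent in "function space": the ensemble's prediction moves down the loss gradient, so F converges toward y.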
@davidnassau23 · 11 months ago
@CodeEmporium ok thanks!