
Ensemble Method : Boosting ll Machine Learning Course Explained in Hindi 

5 Minutes Engineering
649K subscribers
228K views

I am Shridhar Mankar, an Engineer | YouTuber | Educational Blogger | Educator | Podcaster.
My aim: to make engineering students' lives EASY.
Instagram - www.instagram....
Playlists:
• 5 Minutes Engineering Podcast
• Aptitude
• Machine Learning
• Computer Graphics
• C Language Tutorial for Beginners
• R Tutorial for Beginners
• Python Tutorial for Beginners
• Embedded and Real Time Operating Systems (ERTOS)
• Shridhar Live Talks
• Welcome to 5 Minutes Engineering
• Human Computer Interaction (HCI)
• Computer Organization and Architecture
• Deep Learning
• Genetic Algorithm
• Cloud Computing
• Information and Cyber Security
• Soft Computing and Optimization Algorithms
• Compiler Design
• Operating System
• Hadoop
• CUDA
• Discrete Mathematics
• Theory of Computation (TOC)
• Data Analytics
• Software Modeling and Design
• Internet Of Things (IOT)
• Database Management Systems (DBMS)
• Computer Networks (CN)
• Software Engineering and Project Management
• Design and Analysis of Algorithm
• Data Mining and Warehouse
• Mobile Communication
• High Performance Computing
• Artificial Intelligence and Robotics

Published: 22 Aug 2024

Comments: 129
@kitagrawal3211 • 5 years ago
1. Bagging is a parallel learning process, whereas boosting is sequential. 2. Boosting is iterative; bagging doesn't have to be. 3. Boosting can increase over-fitting, whereas bagging generally decreases it. Ensemble learning works well when different models make independent mistakes, i.e. when different models make mistakes on different examples.
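A minimal side-by-side sketch of that parallel-vs-sequential contrast, not taken from the video: it assumes scikit-learn 1.2+ (for the `estimator` keyword) and uses a synthetic dataset purely for illustration.

```python
# Hedged sketch: bagging trains learners independently on bootstrap samples,
# boosting trains them sequentially with reweighting. Assumes scikit-learn >= 1.2.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, random_state=0)  # synthetic data
stump = DecisionTreeClassifier(max_depth=1)                  # a weak learner

# Bagging: independent learners on bootstrap samples (parallelizable).
bagging = BaggingClassifier(estimator=stump, n_estimators=50, random_state=0)
# Boosting: sequential learners, each reweighting the previous one's mistakes.
boosting = AdaBoostClassifier(estimator=stump, n_estimators=50, random_state=0)

for name, model in [("bagging", bagging), ("boosting", boosting)]:
    print(name, cross_val_score(model, X, y, cv=5).mean())
```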
@ytg6663 • 4 years ago
Ok 😔😔😔😔
@anur749 • 4 years ago
You, sir, should write a book! I bet it would be a bestseller... my university professors couldn't explain it this way in a whole semester!
@brain_body_soul • 1 year ago
Brother, leave that aside and focus on your studies. Sir earns plenty as it is; praise alone won't run a household.
@saurabhdhasmana2331 • 1 year ago
@brain_body_soul 😂😂
@rugved4503 • 2 years ago
1. Initialise the dataset and assign equal weight to each data point.
2. Provide this as input to the model and identify the wrongly classified data points.
3. Increase the weights of the wrongly classified data points.
4. If the required results have been achieved, stop; otherwise go to step 2.
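A minimal from-scratch sketch of that loop, in the AdaBoost style (an assumption, since the comment doesn't name an algorithm); the decision stump and the dataset interface are placeholders.

```python
# Hedged sketch of the numbered steps above, with AdaBoost-style reweighting.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def boost(X, y, n_rounds=10):
    n = len(y)
    w = np.full(n, 1.0 / n)                        # step 1: equal weights
    models, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)           # step 2: train on weighted data
        miss = stump.predict(X) != y               # wrongly classified points
        err = np.dot(w, miss) / w.sum()
        if err == 0 or err >= 0.5:                 # step 4: stop when no longer useful
            break
        alpha = 0.5 * np.log((1 - err) / err)      # this learner's say in the vote
        w *= np.exp(alpha * np.where(miss, 1.0, -1.0))  # step 3: up-weight mistakes
        w /= w.sum()
        models.append(stump)
        alphas.append(alpha)
    return models, alphas
```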
@salonilandge5353 • 1 year ago
Is this the algorithm?
@iamvbj • 4 years ago
0:35 Sir got no chill 😂😂 Awesome, he killed the confusion entirely... multiple-choice answer
@tarunvarma7671 • 3 years ago
He looks like Rohit Sharma
@subhadeep1802 • 4 years ago
Your videos on boosting and bagging are far more informative, clear and better than Udacity's. Thanks a lot, sir, and keep up the good work. Your videos are helping not only UG students with semester exams but all other levels too. I hold a master's in data analytics from one of the oldest IITs, and I still refresh my concepts with your videos for PG placements. Your way of teaching is amazing!
@lightsacross4663 • 6 months ago
What teaching, sir, absolutely first-rate!
@radhasingh3549 • 1 year ago
Your sweat shows that you are really working hard to deliver such amazing content!!
@studyafa7159 • 1 year ago
I guess it's because of the hot weather and a non-AC room
@prajwalbankar6532 • 1 year ago
@studyafa7159 😂
@ShaidaMuhammad • 4 years ago
WTF, today my lecturer spent an hour on this and I still didn't get it. How do you explain it so well within 10 minutes? I can't understand it... You're great, yaar. Lots of love and respect from Pakistan. As the class CR, it's now my duty to share your videos with my classmates.
@sam9620 • 3 years ago
Sir, please continue making such videos. We watch your videos while studying at US universities because you explain so well.
@emanrazzaq7189 • 1 month ago
Boosting is a powerful ensemble technique used in machine learning to improve the accuracy and performance of models. It combines multiple weak learners to create a strong learner: models are trained iteratively, weights are adjusted based on the performance of previous models, and their predictions are combined to produce a final, robust prediction.
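And a sketch of that final combining step, reusing the hypothetical `models` and `alphas` from the loop sketched earlier; labels are assumed to be encoded as -1/+1.

```python
# Hedged sketch: combine the weak learners' votes into one strong prediction.
import numpy as np

def predict(models, alphas, X):
    # Each weak learner votes with strength alpha; the sign of the weighted
    # sum of votes is the final, robust prediction (labels in {-1, +1}).
    votes = sum(a * m.predict(X) for m, a in zip(models, alphas))
    return np.sign(votes)
```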
@priyakhedekar184 • 5 years ago
Thanks, sir. Please make videos on cloud computing, and more videos on machine learning.
@sheetala_tiwari • 3 years ago
Every video is brilliant, seriously. Love you, sir.
@pushkarbansal1926 • 9 months ago
Didn't know Rohit Sharma could teach so well
@milliesadie486 • 1 year ago
Thank you, sir. I had been watching a lot of boosting videos, but after watching yours they're not needed anymore. Excellent teacher.
@SARTHAK15bhatnagar • 5 years ago
Yes, could you also make similar videos on gradient descent, gradient boosting machines and XGBoost...
@dr.junaidslectures583 • 10 months ago
What a good lecture. You explained it very clearly. Thank you so much.
@rohanbhavale6077 • 4 years ago
Machine learning in Hindi. Best thing I have seen on YouTube today. True democratisation of knowledge.
@bhavikdudhrejiya4478 • 4 years ago
Very nice video, brother. Please cover Adaptive Boosting, Gradient Boosting & XGBoost.
@storyofstories6341 • 1 year ago
Bagging (Bootstrap Aggregating): each base learner is trained on a different bootstrap sample (randomly selected with replacement) from the original dataset, so each learner sees a slightly different version of the data. Boosting: each base learner is trained on the entire dataset, but the weights on the data points are adjusted based on the performance of the previous models, so each learner focuses more on the points that previous learners misclassified.
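A tiny NumPy sketch of the sampling difference just described; the sizes and the misclassification mask are made up for illustration.

```python
# Hedged sketch: bootstrap sampling (bagging) vs per-row weights (boosting).
import numpy as np

rng = np.random.default_rng(0)
n = 10  # pretend the dataset has 10 rows

# Bagging: indices drawn uniformly WITH replacement; some rows repeat,
# some are left out, so each learner sees a slightly different dataset.
bootstrap_idx = rng.choice(n, size=n, replace=True)

# Boosting: every round sees all rows, but weights grow on the rows the
# previous learner got wrong (the mask here is hypothetical).
weights = np.full(n, 1.0 / n)
misclassified = np.array([0, 0, 1, 0, 1, 0, 0, 0, 0, 1], dtype=bool)
weights[misclassified] *= 2.0
weights /= weights.sum()

print(bootstrap_idx)
print(weights.round(3))
```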
@alipbhaumik6242 • 8 months ago
Commenting to boost your videos in the algorithm
@winviki123 • 4 years ago
Superb
@SARTHAK15bhatnagar • 5 years ago
Amazing, sir!! You made it so easy to understand!! Appreciate it!!
@SamruddhaShah • 5 years ago
Sir, please upload videos on PDAs for TOC; the exam is on 3 May. Your explanations go straight into the head, sir.
@SamruddhaShah • 5 years ago
Thanks, sir
@kiwi4916 • 4 months ago
Nice kathak, sir. Please upload more tutorials.
@shaiksuleman3191 • 4 years ago
You, Krish Naik and Codebasics are three doctors. You give the injection without any pain, simply superb. No more questions.
@aanchaldogra • 5 years ago
Bhai, you are awesome. Thank you so much.
@deepaksingh9318 • 4 years ago
Perfectly explained, bhai. For that alone, like and subscribe done :D
@vedantkale3785 • 5 years ago
What if 50% of the votes are given to class 0 and the remaining 50% to class 1? In that case, how will the model decide which class that particular instance belongs to?
@dharminshah2722 • 4 years ago
Use an odd number of classifiers.
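A quick illustration of why an odd voter count avoids ties in binary majority voting; the predictions are hypothetical.

```python
# Hedged sketch: with 3 binary voters a tie is impossible; with 4 it isn't.
from collections import Counter

def majority_vote(predictions):
    (label, count), *rest = Counter(predictions).most_common(2)
    if rest and rest[0][1] == count:
        return None  # a tie, only possible with an even number of voters
    return label

print(majority_vote([0, 1, 1]))     # 1
print(majority_vote([0, 0, 1, 1]))  # None (tie)
```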
@user-hj6zn8js3i • 11 months ago
Great clarity. Thanks!
@niluthonte45 • 1 year ago
Thank you, sir. Very helpful 😍
@shubhragarg875 • 3 years ago
Please make videos on AdaBoost and Gradient Boost as well.
@aniketgarg9575 • 1 year ago
Great explanation, sir
@malikasif8029 • 1 year ago
Terrific... sir, please make a series with practical Python implementations of all the ML algorithms.
@swagatmishra9350 • 3 years ago
Amazing, amazing explanation. Thank you very much...
@aniketdatir6926 • 4 years ago
Great work... very clearly explained the concept. Thanks.
@ManishKumar-qh6vg • 4 years ago
That sweat is worth it, for the teaching as well as the understanding...
@pratikfutane8131 • 5 years ago
Great!!
@Himricks • 3 years ago
Thanks, man. The paper is the day after tomorrow, an online exam. I hope I can clear it 😅
@sahilgaikwad4508 • 4 years ago
Explained brilliantly, sir. Thank you 😘
@purvimajoka4795 • 1 year ago
Great explanation 👍
@shubh13272anand • 3 years ago
Marvelous
@PURBEYVIKRAM • 4 years ago
Thank you, sir, for the awesome explanation. Please make a video on XGBoost.
@lokesh4258 • 5 years ago
Hello sir, your videos are really awesome. Can you please explain Gradient Boosting Machines in the same manner?
@kalpeshvarankar2574 • 8 months ago
I was, and still am, under the impression that in any kind of boosting method we start off with the entire dataset, not with bootstrap samples; the focus is on adjusting the weights of individual instances. If it uses a sequential approach, why use subsets of the original data? Can you please clarify this? Am I right, or do I still need to fill in some learning gaps?
@RohanVasava • 1 year ago
You are the reason I am passing my exams, 5ME.
@259_parthpatidar9 • 3 years ago
For choosing the right distribution, here are the steps:
Step 1: The base learner takes all the distributions and assigns equal weight or attention to each observation.
Step 2: If the first base learning algorithm causes any prediction error, we pay higher attention to the observations with prediction errors, then apply the next base learning algorithm.
Step 3: Iterate Step 2 until the limit of the base learning algorithm is reached or higher accuracy is achieved.
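Written as math, the reweighting in Step 2 has a standard closed form; AdaBoost is assumed here, since the comment doesn't name an algorithm, with labels $y_i \in \{-1,+1\}$ and weak learners $h_t$:

$$\varepsilon_t = \sum_{i:\,h_t(x_i)\neq y_i} w_i^{(t)}, \qquad \alpha_t = \frac{1}{2}\ln\frac{1-\varepsilon_t}{\varepsilon_t}, \qquad w_i^{(t+1)} \propto w_i^{(t)}\,e^{-\alpha_t y_i h_t(x_i)},$$

with the weights renormalized to sum to 1 each round, and the final strong model $M^*(x) = \operatorname{sign}\big(\sum_t \alpha_t h_t(x)\big)$.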
@shivampradhan6101 • 4 years ago
Make a video on the stacking ensemble method
@mdmynuddin1888 • 3 years ago
Super Bammmmmmm
@ll-rz9bd • 4 years ago
Thank you so much, sir!!
@shekharai225 • 1 year ago
Thank you
@abdulkabir7527 • 1 year ago
When M2 classifies again (after being trained once), it will take all the weights as equal, right?
@vedant6460 • 1 year ago
Great
@mandarhanchate5034 • 4 years ago
Good explanation 👍
@princevegeta7921 • 4 years ago
Sir, at the end, when all the models classify, what if the classification comes out 50-50? That is, if out of 8 models 4 say the instance is positive and 4 say it's negative, what will the result be in that condition? Thank you very much, sir.
@anubhavgupta6331 • 4 years ago
Bhai, please make a video on gradient boosting as well.
@live_mocha • 4 years ago
Crystal clear... :D
@deepakvyas3833 • 3 years ago
How do you assess or arrive at the value n of M? Meaning, how do you determine the number of iterations needed to arrive at the final strong model (M*)?
@hemavd7879 • 5 years ago
Hello sir, very nicely explained... could you please explain these techniques on multilabel data?
@61_shivangbhardwaj46 • 3 years ago
Thanks, sir 😊
@poojajagtap9222 • 5 years ago
Thank you, sir
@nutankumarnaik1984 • 2 years ago
Sir, I would like to know about the optimal weights of the ensemble method's base learners.
@jayshrikhamk97 • 2 years ago
Thanks
@spanco123 • 5 years ago
Please provide a video on R-squared as well
@poonamlad543 • 5 years ago
Thank you, sir... Please make videos on HCI...
@vedant6460 • 1 year ago
Thanks
@madhurirampalli4080 • 3 years ago
Please add videos on XGBoost and AdaBoost
@dhananjaykansal8097 • 5 years ago
Maybe it's a stupid question, but I find bagging and boosting to be 99% or 100% similar. Could anyone, without laughing, help me understand the key differences?
@ezpz4akash • 5 years ago
Bagging is complete randomization. We choose random records (with replacement) from the original dataset and form multiple small training datasets. Then we create multiple models by training them on these datasets, and ultimately we combine these multiple WEAK models to form one STRONG model. Boosting, in contrast, is randomization with a weight attached to each tuple. Since the weights are updated every time we create a model, the likelihood of a record being chosen from the original dataset differs from model to model. Once the multiple models are built and trained, the remaining procedure is similar to bagging. Thus: Bagging = complete randomization with no weights attached to the tuples; Boosting = randomization + weights attached to each tuple. Hope you understood. :)
@dhananjaykansal8097 • 5 years ago
@ezpz4akash My, my, my. So lucidly explained. Thanks so much, man. You have superb command of the language. Mind if I ask: are you a full-fledged data scientist now?
@ezpz4akash • 5 years ago
@dhananjaykansal8097 You're welcome, man. Actually, I'm a final-year student, studying concepts here in less time, just enough to write about them in exams ;)
@dhananjaykansal8097 • 5 years ago
@ezpz4akash Well, you're doing great. God bless, man.
@shantanughanekar3807 • 5 years ago
@ezpz4akash Nailed it, bro. If they ask the difference between these two, you've just answered it. Great explanation.
@badarshaban1998 • 5 years ago
Instance 5 can't be used to train M3, because 6's weight is higher than 5's: while training M2 we updated the weights of 4, 6 and 7. Correct me if I'm wrong.
@kitagrawal3211 • 5 years ago
It is a probability distribution, so the probability of picking 6 over 5 is high, but 5 can still be picked.
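A small illustrative draw of that weighted sampling; the weights are made up, with only instances 4, 6 and 7 up-weighted.

```python
# Hedged sketch: instance 6 is more likely to be drawn than 5, but 5 can
# still appear. The weights are invented for the example.
import numpy as np

rng = np.random.default_rng(0)
instances = np.arange(1, 8)                       # instances 1..7
weights = np.array([1, 1, 1, 2, 1, 3, 2], float)  # 4, 6, 7 were up-weighted
p = weights / weights.sum()                       # a probability distribution

print(rng.choice(instances, size=7, replace=True, p=p))
```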
@vandanabellani7585 • 4 years ago
Thank you, sir!
@shubhambhandari6663 • 5 years ago
Bro, please upload the TOC videos; the exam is on the 3rd, please post them before then.
@rushic24 • 5 years ago
Please add gradient tree boosting and AdaBoost
@durgeshsharma1890 • 3 years ago
How does this algorithm decide what is misclassified, so that M2, M3, ... will classify those instances?
@shivaprakashranga8688 • 4 years ago
Great video, sir. I have a doubt: suppose the weak classifiers split with equal probability between 0 and 1; how will the final model predict?
@naveendurgam • 4 years ago
Since the voting wouldn't decide a clear class, the final model might pick either class at random (needs verification)
@nomanshaikhali3355 • 3 years ago
Can misclassification be called regularization? Right?
@sayalishejwal8221 • 3 years ago
Hi, could you please explain the stacking ensemble learning technique?
@aadityanag4552 • 9 months ago
Everything is a variable except "exams kaafi nazdeek hai" ("exams are very near")
@yadav-vikas • 4 years ago
In bagging, do the weights matter?
@shashireddy7371 • 4 years ago
Thanks, sir
@FIFA_MANIA2023 • 4 years ago
Hello, can you please elaborate on what class 0 and class 1 are here?
@error-my9ut • 1 year ago
And if n is even, sir, which one do we take in the voting? A random one?
@kushchauhan3116 • 3 months ago
Watching the video in 2024, thanks
@TheSiddhantp • 5 years ago
Sir, love you 3000... please tell us your name
@tryme5364 • 4 years ago
How can M1 wrongly classify the 6th and 7th instances when they are already given in M1's training set 😅 Sweet explanation, though.
@adilkhannitianmtech188 • 4 years ago
Sir, there are methods other than voting too; please explain them as well.
@xxMegha33xx • 8 months ago
What are weights?
@seekersseries5386 • 4 years ago
Would the learning models M1, M2, M3 be the same?
@divyanggoswami4316 • 5 years ago
Sir, I have a doubt: what happens when the majority is tied? How will the resulting M* identify the proper class for a tuple of test data? Sir, please help me with this.
@nikhilgupta4859 • 4 years ago
Sir, how does boosting work for regression problems?
@PrashantThakre • 3 years ago
It seems it ends up with an infinite number of classifiers. :)
@tanmayjagtap78 • 4 years ago
Sir, will you please elaborate on soft and hard voting? Great content, sir. Keep it up.
@anshsachdeva1061 • 4 years ago
Bro, please upload a video on AdaBoost
@sakshijain5318 • 4 years ago
Hi, please make a video on gradient descent.
@shrivastavhoon • 11 months ago
You teach better than IIT professors, sir
@360Cumilla • 5 years ago
Boss
@nabashakeel2462 • 5 years ago
Can you explain the AdaBoost algorithm?
@shrikantlandage7305 • 4 years ago
It's the same as what he explained
@ROHITKUMAR-dl5ij • 5 years ago
Bhai, is Rohit Sharma your relative??
@navedshaikh8535 • 5 years ago
Bhai, how did class 0 and class 1 come about? Please answer.
@ankitnamdeo • 4 years ago
What if there is an equal number of votes?
@kavishgarg4152 • 5 years ago
How will the machine know which instances are NOT classified correctly?
@abhijitkulkarni1813 • 5 years ago
It's supervised learning; the training data is labeled.
@kitagrawal3211 • 5 years ago
@abhijitkulkarni1813 You have the original labels and the predicted labels. y != h(x), taken as 1 if true and 0 if false, finds the misclassified instances.
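In NumPy terms, a one-line version of that check; the arrays are illustrative.

```python
# Hedged sketch of y != h(x): 1 marks a misclassified instance, 0 a correct one.
import numpy as np

y_true = np.array([1, 0, 1, 1, 0])     # original labels
y_pred = np.array([1, 1, 1, 0, 0])     # predicted labels h(x)
print((y_true != y_pred).astype(int))  # [0 1 0 1 0]
```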
@AmitSharma-po1zb • 5 years ago
But my question is about how the random sampling is done. Suppose we have a dataset with 40,000 rows; how do we pick rows at random to form the datasets? Kindly explain the random sampling method.
@kitagrawal3211 • 5 years ago
If you are creating a sample of size 800, you can generate 800 random numbers uniformly distributed over [1, 40000] and then pick those rows.
@AmitSharma-po1zb • 5 years ago
Do the algorithms decide the random sampling on their own, with us just giving the range? Any link or code description that explains it in a practical way would help, please.
@kitagrawal3211 • 5 years ago
@AmitSharma-po1zb Are you new to ML? Imagine you are picking 800 out of 10,000 samples: are you not picking them at random, when you don't know anything about the data? Also, not to get too technical, but in theory there are formal guarantees showing that picking at random still reduces the overall error (you run many iterations, so one bad sample won't affect the overall model). Maybe look at the Python implementation directly; it's open source on GitHub. Look at pandas.DataFrame.sample().
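A concrete sketch of both routes mentioned in this thread; the DataFrame is made up for illustration.

```python
# Hedged sketch: draw 800 rows at random from a 40,000-row DataFrame.
import numpy as np
import pandas as pd

df = pd.DataFrame({"x": np.arange(40_000), "y": np.random.rand(40_000)})
rng = np.random.default_rng(0)

# Route 1: generate 800 random row indices (with replacement, bootstrap-style).
idx = rng.integers(0, len(df), size=800)
sample_np = df.iloc[idx]

# Route 2: let pandas do the sampling.
sample_pd = df.sample(n=800, replace=True, random_state=0)

print(len(sample_np), len(sample_pd))  # 800 800
```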