
Data Mining Lecture -- Decision Tree | Solved Example (Eng-Hindi) 

Well Academy
Subscribe 486K
304K views

~-~~-~~~-~~-~
Please watch: "PL vs FOL | Artificial Intelligence | (Eng-Hindi) | #3"
• PL vs FOL | Artificial...
~-~~-~~~-~~-~

Published: 17 Oct 2024

Comments: 299
@ftt5721
@ftt5721 3 years ago
I salute the hard work that went into this video. I think you hung your mobile on some thread five years ago and spent almost a full day creating it...
@drashtijani402
@drashtijani402 2 years ago
Sir, I watched your video in my B.Tech and now again in my M.Tech, one day before exams. Thank you for such a great explanation!
@GYANESHWAR27
@GYANESHWAR27 7 years ago
1) What about Temperature at the final internal node? Won't it be used as a node? 2) If the path is Rain -> Temperature (mild/cool) -> Humidity (high/normal) -> Wind (weak/strong), and I want to include Humidity in the decision: as per the output you got, if the outlook is "Rain" we exclude Humidity. Would that be possible in a practical scenario? Can you help me out here?
@cleanageteenage183
@cleanageteenage183 2 years ago
I watched your video one day before my exams; thank you for such a great explanation! Support from Pakistan. Please keep going.
@vaidehibam2224
@vaidehibam2224 5 years ago
Literally, I was not understanding anything at all from the textbook, but by watching your video I understood it so well that I could even explain it to others. Great explanation, thank you so much.
@shakuntalats3107
@shakuntalats3107 5 years ago
In this example, for Sunny we picked Humidity by gain, and under Humidity we got all Yes for "high" and all No for "normal". But if "high" had fallen on both Yes and No, which columns should I consider for the next gain computation? Should I consider the Outlook column once again?
@fairycloset
@fairycloset 5 years ago
Thank you so much. I never comment, but I liked your way of teaching; it will be very beneficial for the paper, and all the concepts are clear 😀😀
@nailashah6918
@nailashah6918 4 years ago
Very nice and clear explanation. After watching an incomplete lecture from Edureka, I was searching for how to select the next child nodes. I finally found you. Thank you, bro.
@kripeshshetty4269
@kripeshshetty4269 5 years ago
How have you used the summation in the entropy formula? Can you please elaborate? Excellent video, though. Can this be applied in Big Data as well?
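For context, the summation being asked about is the weighted average in the standard ID3 expected-information formula; in common textbook notation (which may differ slightly from the symbols used in the video):

```latex
\mathrm{Info}(D) = -\sum_{i=1}^{m} p_i \log_2 p_i,
\qquad
\mathrm{Info}_A(D) = \sum_{j=1}^{v} \frac{|D_j|}{|D|}\,\mathrm{Info}(D_j)
```

The first sum runs over the m classes; the second runs over the v partitions produced by attribute A, each weighted by its share of the tuples.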
@aravinth5223
@aravinth5223 7 years ago
I don't know Hindi, but I could understand the way you worked the problem. Awesome, brother. Thank you so much :-)
@bilalchandiabaloch8464
@bilalchandiabaloch8464 4 years ago
Superb. I was not clear about the second iteration; thanks for making it so clear. Salam from Quetta, Balochistan.
@NehaThakur-qj4gi
@NehaThakur-qj4gi 6 years ago
Sir, can we make "strong" the right child and "weak" the left subtree? Or is there any compulsory rule to put it on the left?
@babu7726
@babu7726 7 years ago
Why can't we use Humidity if it was already used under Sunny? Please reply. Thanks in advance.
@sharminsharna5001
@sharminsharna5001 4 years ago
Really good explanation, sir, but I have a query. When Wind has two values, weak and strong, and all the "weak" rows are Yes while all the "strong" rows are No, we can easily create the tree. But what if "weak" has a mix of Yes and No? What do I do then? Please answer me, sir.
@AmarjeetSingh-wo3ps
@AmarjeetSingh-wo3ps 5 years ago
First class, bhai. The classification at each level is clear to me now.
@waghmaredeepalidnyaneshwar3507
You have really worked hard to explain such a long procedure. It was great. I would appreciate it if you could explain the KNN algorithm the same way. Keep it up, thank you.
@gurpartapsingh1693
@gurpartapsingh1693 4 years ago
I checked the formulas for entropy and information gain online, and they are exactly the opposite of the ones you used, i.e., the formula you used for information gain is actually the formula for entropy. What is the reason for this?
@shaheenask3432
@shaheenask3432 6 years ago
You had two options to continue with, Sunny and Rainy, and you started with Sunny. Is there a specific rule for choosing which node to work on first in such cases?
@prasadnagarale6274
@prasadnagarale6274 5 years ago
Why did you omit the Humidity column under the Rainy node? What if it had given a better gain than Wind?
@awaisjavaid9372
@awaisjavaid9372 6 years ago
I am confused about the Temperature attribute: where do we put it? As descendants of Humidity you directly put Yes or No; do we also need to check Temperature there or not?
@danishbhatia5004
@danishbhatia5004 7 years ago
Thanks a ton for this video; the concept is very clear to me now. The only thing I want to ask: at 27:44 you said that for every "strong" there is a "no" and for every "weak" there is a "yes". What if for 2 "weak" rows there is a "yes" but for 1 "weak" row there is a "no"?
@rajarsis
@rajarsis 6 years ago
Wonderful! Just wonderful! Thanks for your effort and the wonderful explanation! The best I have found so far!
@granturismoautos
@granturismoautos 7 years ago
Couldn't have explained it any better!! Made my life really easy, thank you! The best!
@kiranfazal1272
@kiranfazal1272 3 years ago
Your videos are the best of all.
@sandeshmore6510
@sandeshmore6510 6 years ago
If you are using the table to make the final decision tree, then what is the point of finding the entropy? I didn't understand.
@tanusreedasgupta9310
@tanusreedasgupta9310 3 years ago
Excellent explanation. It is very fruitful.
@amnaarshed8011
@amnaarshed8011 5 years ago
Each step is clearly defined. Thank you.
@iamsajjadalidev
@iamsajjadalidev 4 years ago
Sir, what do we do if two attributes have the same gain value?
@zweack
@zweack 5 years ago
The video is good, but it would be great if you could summarize the steps at the end of every video. It's a long process, so remembering it as an algorithm would be useful.
@sudhanshushakya4579
@sudhanshushakya4579 5 years ago
Good explanation, but you should have mentioned that this method is the "ID3 algorithm". The "C4.5 algorithm" uses Gain Ratio, and the "CART algorithm" uses the Gini index.
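For reference, the three split criteria named in this comment, in their standard textbook forms (summarized from the literature, not from the video itself):

```latex
\mathrm{Gain}(A) = \mathrm{Info}(D) - \mathrm{Info}_A(D)                  % ID3
\qquad
\mathrm{GainRatio}(A) = \frac{\mathrm{Gain}(A)}{\mathrm{SplitInfo}_A(D)}  % C4.5
\qquad
\mathrm{Gini}(D) = 1 - \sum_{i=1}^{m} p_i^2                               % CART
```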
@Om-iy9ix
@Om-iy9ix 7 years ago
Why didn't we go further under Humidity = "high" and split again on Wind (strong/weak)? Why did we stop there?
@Areeva2407
@Areeva2407 5 years ago
The presentation is good, but please let us see the board; it is not fully visible. Thanks for posting such a helpful lecture.
@harshitchhabra5572
@harshitchhabra5572 5 years ago
Is this an example of the C4.5 algorithm for constructing a decision tree?
@anandinamdar4054
@anandinamdar4054 7 years ago
The formula for information gain you have used here is actually the formula for entropy. The formula for information gain is: IG = entropy(total data set) - Σ_i [(p_i + n_i)/(p + n)] * entropy(v_i), where v_i is the i-th value of the attribute.
@_khooman_chronicles_
@_khooman_chronicles_ 6 years ago
Entropy is just another word used here for IG. And the equation for the gain term is [Σ_i (p_i + n_i)/(p + n)] * IG(p_i, n_i).
@hrishikeshkulkarni2856
@hrishikeshkulkarni2856 6 years ago
I agree, @Anand.
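To untangle the naming dispute in this thread: in the classical ID3 notation this lecture appears to follow (p positive and n negative examples), the three quantities are

```latex
I(p, n) = -\frac{p}{p+n}\log_2\frac{p}{p+n} - \frac{n}{p+n}\log_2\frac{n}{p+n}
```
```latex
E(A) = \sum_{i=1}^{v} \frac{p_i + n_i}{p + n}\, I(p_i, n_i),
\qquad
\mathrm{Gain}(A) = I(p, n) - E(A)
```

So I(p, n) is the entropy of the whole set, E(A) is the expected information for a split on attribute A, and the gain is their difference; calling I(p, n) "information gain" on the board would be a naming slip rather than a different formula.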
@mdrajaulislam4526
@mdrajaulislam4526 7 years ago
Awesome, perfect, and clear explanation. Thanks a lot for sharing your valuable lectures with us. If you share the lecture documents, we will benefit even more.
@WellAcademy
@WellAcademy 7 years ago
+Mohammad Rajaul Islam Will provide the material soon.
@mdrajaulislam4526
@mdrajaulislam4526 7 years ago
Thank you very much.
@deepak161091
@deepak161091 5 years ago
Excellent teaching skills. Superb, bro. God bless you....👍🏻👍🏻👌🏻
@pearlminestudios7568
@pearlminestudios7568 5 years ago
Great explanation. Why don't you use a tripod?
@peshimamnadeem8959
@peshimamnadeem8959 5 years ago
An excellent explanation that any student can grasp easily. Thank you so much, sir.
@haripalthakur2996
@haripalthakur2996 5 years ago
Awesome, bhai. Please also make a video on Random Forest. Excellent work; you have put in so much effort.
@malikusamahameed988
@malikusamahameed988 4 years ago
I really understood this clearly. Your method was outstanding, though if you had used your native language it would have been even better. English is not the purpose of this topic; the main thing is the decision tree.
@aneekaazmat6653
@aneekaazmat6653 4 years ago
What about the Temperature column? It is not included in the tree. What do we do with it?
@mohammaddanish764
@mohammaddanish764 4 years ago
Good question
@payalbajpai4259
@payalbajpai4259 7 years ago
Really great work... Are decision trees and the CART algorithm the same thing?
@WellAcademy
@WellAcademy 7 years ago
payal bajpai Thank you so much for the appreciation; share it with your friends. This is referred to as a decision tree, but on some platforms, like R, it is referred to by the more modern term CART.
@ayeshafatima4163
@ayeshafatima4163 7 years ago
If there are three types of output, not just yes or no, how do we solve it? Do we take three variables instead of just p and n?
@WellAcademy
@WellAcademy 7 years ago
+ayesha fatima Yes, you solve it by taking more variables.
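A sketch of how the entropy formula generalizes, assuming k output classes instead of the two-class p/n setup used in the video:

```latex
\mathrm{Info}(D) = -\sum_{i=1}^{k} p_i \log_2 p_i
```

The log stays base 2; the only change is that the sum now has k terms, and the maximum possible entropy becomes log_2 k instead of 1.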
@padhiyarkunalalk6342
@padhiyarkunalalk6342 4 years ago
Awesome. Explained it really well. #thankyou
@sanjidaalamdristy444
@sanjidaalamdristy444 7 years ago
Just wow, your tutorial is very good. Thank you so much for making this tough math so easy for me.
@lalitbarai
@lalitbarai 4 years ago
Is this the ID3 method or the CART method?
@sohamchauhan4372
@sohamchauhan4372 6 years ago
Can't we just check for the minimum entropy instead of finding the maximum gain? Why add an extra step?
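The observation is algebraically sound: at a fixed node the parent entropy Info(D) is the same constant for every candidate attribute, so maximizing the gain and minimizing the weighted child entropy select the same split:

```latex
\arg\max_A \bigl[\mathrm{Info}(D) - \mathrm{Info}_A(D)\bigr] = \arg\min_A \mathrm{Info}_A(D)
```

The gain form is conventional mostly because it yields a nonnegative score that is comparable across nodes.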
@rafeedrahman5420
@rafeedrahman5420 6 years ago
I think the reason you gave for not considering "Humidity" after going down the "Rain" branch is not right. "Humidity" does not appear simply because its gain is lower than that of the "Wind" attribute with respect to the T_rainy table. The same attribute can appear on multiple paths, but on a single path it will appear only once.
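A minimal Python sketch of ID3 makes this point concrete (hypothetical helper and column names, not code from the video): the chosen attribute is dropped from the candidate list only in the recursive calls beneath it, so an attribute excluded on one path can still be chosen on a sibling path.

```python
import math
from collections import Counter

def entropy(labels):
    """Info(D): entropy of a list of class labels."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def info_gain(rows, attr, target):
    """Gain(attr) = Info(D) minus the size-weighted entropy of attr's partitions."""
    total = len(rows)
    weighted = 0.0
    for value in {r[attr] for r in rows}:
        subset = [r[target] for r in rows if r[attr] == value]
        weighted += len(subset) / total * entropy(subset)
    return entropy([r[target] for r in rows]) - weighted

def id3(rows, attributes, target):
    labels = [r[target] for r in rows]
    if len(set(labels)) == 1:          # pure partition -> leaf
        return labels[0]
    if not attributes:                 # nothing left to split on -> majority leaf
        return Counter(labels).most_common(1)[0][0]
    best = max(attributes, key=lambda a: info_gain(rows, a, target))
    # 'best' is excluded only for THIS subtree; branches grown elsewhere
    # in the tree, from different subsets, may still select it.
    remaining = [a for a in attributes if a != best]
    return {best: {value: id3([r for r in rows if r[best] == value],
                              remaining, target)
                   for value in {r[best] for r in rows}}}
```

On the standard Play Tennis table (rows as dicts, with a hypothetical "Play" target column), calling id3(rows, ["Outlook", "Temperature", "Humidity", "Wind"], "Play") should reproduce the tree from the lecture: Outlook at the root, Humidity under Sunny, Wind under Rain.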
@PARDEEP349
@PARDEEP349 4 years ago
It is a good explanation, but I need to focus: I need to know a book from which I can practice such numericals.
@ilashrivastava2408
@ilashrivastava2408 7 years ago
Sir, is it possible to use Bayesian classification with nearest-neighbour clustering? If yes, please explain. Please provide an answer as soon as possible. Thank you.
@narasimhaa1341
@narasimhaa1341 6 years ago
Suppose I wish to employ a one-level decision tree (i.e., a decision stump) to classify data. Recall that decision trees split data into subsets, and there are many measures that can be used to determine the best split. I'm interested in two measures of impurity: the Gini index and classification error. Help me decide how to classify the above data according to Feature 1 by completing the following tasks: (a) draw and label a line indicating the best split as measured by the Gini index; (b) draw and label a line indicating the best split as measured by classification error.
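For anyone attempting this exercise: the two impurity measures it names are standardly defined, for a node t with class proportions p_i, as

```latex
\mathrm{Gini}(t) = 1 - \sum_i p_i^2,
\qquad
\mathrm{Error}(t) = 1 - \max_i p_i
```

and the best split is the one minimizing the size-weighted impurity of the resulting children. (The data set the comment refers to is not shown here, so only the formulas can be given.)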
@ChathuraJayalath
@ChathuraJayalath 7 years ago
At around 1:55 you say "columns", but I think you meant "rows", didn't you?
@hassanmurtaza8061
@hassanmurtaza8061 4 years ago
This is my first comment. Brother, your explanation is really awesome.
@amaranasir8686
@amaranasir8686 4 years ago
What if the entropy of an attribute is zero and its information gain is equal to that of the parent table?
@kashishmiglani6576
@kashishmiglani6576 7 years ago
Awesome video. I have an exam this morning; I hope this helps. Thanks a lot, sir.
@nandini412
@nandini412 5 years ago
Proper teaching and very understandable. Thank you.
@Urmarwelfare
@Urmarwelfare 7 years ago
Sir, where is Temperature in the decision tree? Please explain.
@nikhilnaik5340
@nikhilnaik5340 6 years ago
What are we supposed to choose if two attributes have the same gain value?
@techbugtlg
@techbugtlg 6 years ago
Why should we not use Humidity under Rain?
@MuhammadAli-dg7nq
@MuhammadAli-dg7nq 7 years ago
Great video... great explanation... great teacher...
@arslan3058
@arslan3058 4 years ago
Thank you so much, brother; stay blessed.
@vishavgupta3717
@vishavgupta3717 5 years ago
Please tell me: what are the social impacts of data mining?
@shivdesai6857
@shivdesai6857 5 years ago
At 18:14 you said that if p and n are the same then the IG will also be the same, but that is wrong: if both are equal, the IG is exactly 1.
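The arithmetic behind this correction: when p = n, both classes have probability 1/2, so

```latex
I(p, n) = -\tfrac{1}{2}\log_2\tfrac{1}{2} - \tfrac{1}{2}\log_2\tfrac{1}{2}
        = \tfrac{1}{2} + \tfrac{1}{2} = 1
```

which is the maximum entropy for a two-class problem.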
@TheAbhimait
@TheAbhimait 6 years ago
Great video, bro. Really helpful. I do this in R, but I always wanted to learn the concept. Nice work.
@Full_Drama777
@Full_Drama777 7 years ago
Outstanding video... really helpful in my studies... thanks a ton.
@alokchaudhary5045
@alokchaudhary5045 6 years ago
Do I use 0.94 or 0.970 for calculating Gain(sunny, ...)?
@alokchaudhary5045
@alokchaudhary5045 6 years ago
Confused.
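Assuming the standard Play Tennis table this lecture appears to use (9 Yes / 5 No overall, with 2 Yes / 3 No in the Sunny subset), the two numbers belong to different sets:

```latex
\mathrm{Info}(D) = -\tfrac{9}{14}\log_2\tfrac{9}{14} - \tfrac{5}{14}\log_2\tfrac{5}{14} \approx 0.940
```
```latex
\mathrm{Info}(D_{\mathrm{sunny}}) = -\tfrac{2}{5}\log_2\tfrac{2}{5} - \tfrac{3}{5}\log_2\tfrac{3}{5} \approx 0.971
```

So 0.940 is used for gains at the root, while gains computed inside the Sunny subtree start from 0.971.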
@shubhanshu57
@shubhanshu57 7 years ago
Let's suppose one of the attributes is a unique identifier. How will you calculate the information gain and make a decision tree for it? Please explain.
@WellAcademy
@WellAcademy 7 years ago
If there is a unique identifier, let us suppose product_id, then the split on product_id gives you a large number of partitions, each containing one tuple, which means every partition is pure. The expected information Info_product_id(D) is therefore 0 and the gain is maximal, yet the split is useless for prediction, so there is no point in using it.
@akashsingh1780
@akashsingh1780 7 years ago
For a unique identifier we use the gain ratio instead of information gain, since plain information gain is biased toward attributes with many distinct values.
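A sketch of why the gain ratio fixes the unique-identifier problem, using C4.5's definition (n below is the number of tuples):

```latex
\mathrm{SplitInfo}_A(D) = -\sum_{j=1}^{v} \frac{|D_j|}{|D|}\log_2\frac{|D_j|}{|D|},
\qquad
\mathrm{GainRatio}(A) = \frac{\mathrm{Gain}(A)}{\mathrm{SplitInfo}_A(D)}
```

For a unique identifier every partition holds exactly one tuple, so SplitInfo = log_2 n, which grows with the table size and drives the gain ratio toward zero even though the raw gain is maximal.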
@sohaibluck
@sohaibluck 5 years ago
Very good and useful video; keep it up.
@08_aparnagupta98
@08_aparnagupta98 2 years ago
Is it your hand shaking, or the camera?
@iamsajjadalidev
@iamsajjadalidev 4 years ago
Thank you so much, dear. Extraordinary explanation.
@daniyalhabib7900
@daniyalhabib7900 5 years ago
Very well explained. It helped a lot, thank you!!!
@educationinformer3880
@educationinformer3880 7 years ago
Shukriya, sir. Great, sir; your technique is good. Thank you, sir; have a great day.
@anushkagurjar8340
@anushkagurjar8340 5 years ago
Amazing explanation, thank you so much! Just one doubt: we are not considering Temperature in our decision tree; is it okay to skip it?
@MuhammadHassan-tn9xm
@MuhammadHassan-tn9xm 4 years ago
Sir, why does the camera keep shaking in your videos?
@rajbopche7992
@rajbopche7992 5 years ago
@Well Academy, why don't we ever find the entropy of the Day attribute? :(
@vikaspathak2411
@vikaspathak2411 5 years ago
You can think of it this way: just by knowing a student's roll number, one cannot determine whether the student will pass or not. The Day attribute here is the day number, a unique identifier for each day, and knowing it doesn't contribute to the decision. In fact, whenever an attribute column contains only unique values, it cannot be used for decision making.
@99ansh
@99ansh 5 years ago
Why is Temperature not present in the decision tree?
@harisahmed2871
@harisahmed2871 7 years ago
Simply the best, dude.
@tarunkashyap1496
@tarunkashyap1496 4 years ago
Very well explained, sir. Thanks a lot.
@akashshukla7159
@akashshukla7159 6 years ago
While calculating the entropy and information gain for Outlook, why did you divide by 5?
@Letsmakeit101
@Letsmakeit101 7 years ago
Excellent, brother. Thank you so much...
@shahriarrahman8425
@shahriarrahman8425 6 years ago
Very helpful! The concepts are clearly explained.
@rashmikashah8222
@rashmikashah8222 7 years ago
What if there were no Wind column while making the decision tree?
@qadeemkhan804
@qadeemkhan804 6 years ago
How do we use log2 and log3 in the ID3 algorithm, and in which situations?
@muhammadjunaidmajeed1211
@muhammadjunaidmajeed1211 5 years ago
How do I build a fuzzy C4.5 tree on the same data set?
@namanmamodia5731
@namanmamodia5731 4 years ago
Thanks for this good explanation of decision trees.
@manuranjangogoi7393
@manuranjangogoi7393 7 years ago
Very good explanation. Please keep it up!!!
@Shrapnel2603
@Shrapnel2603 7 years ago
In the Rain table the Wind column had two values, Strong and Weak, which mapped cleanly to Yes and No. What if Strong had 1 Yes and 1 No?
@sajjadshah2096
@sajjadshah2096 7 years ago
Great video. Thank you so much, bro; it solved my problem.
@poorvasinha5377
@poorvasinha5377 6 years ago
What if one of the values for Weak was No? Would we have to calculate further?
@manishkumar-gq1jm
@manishkumar-gq1jm 7 years ago
Bhai, thanks to you we passed DWDM. Thank you, man.
@aks3475
@aks3475 5 years ago
Hi, thanks for the great explanation!! Could you please guide me on how to make a confusion matrix for the table you used as an example?
@taruninbox
@taruninbox 7 years ago
A really detailed explanation. Much appreciated.
@ThePissucase
@ThePissucase 6 years ago
It is great! But I would prefer it if you could do the same in English.
@DebjaneeDhar
@DebjaneeDhar 5 years ago
Best video and channel. 👌
@prashantjoshi8847
@prashantjoshi8847 7 years ago
Bro, it's great. This video cleared all of my doubts. Keep it up, and never stop teaching!!!
@WellAcademy
@WellAcademy 7 years ago
Prashant Joshi Thank you so much... Keep sharing.
@a.n.7338
@a.n.7338 5 years ago
@@WellAcademy Hi, can you tell me why you calculated the IG using only the last column?
@usama57926
@usama57926 5 years ago
Very nice explanation.
@User_435gyu6
@User_435gyu6 2 years ago
Thank you, sir 👏🏻👏🏻 Clear explanation.
@asherferoze4107
@asherferoze4107 6 years ago
Well explained, and a good effort.
@mohtashimnawaz4957
@mohtashimnawaz4957 4 years ago
A little mistake: "entropy" is actually Info(D), not Info_A(D), so your entropy term should go with the first formula.
@sanzayy
@sanzayy 6 years ago
Thanks, bro; your lectures are lifesavers.