
Tutorial 36 - Logistic Regression Multiclass Classification (OneVsRest) - Part 3 | Data Science

Krish Naik
1M subscribers
111K views

Please join as a member of my channel to get additional benefits like Data Science materials, live streams for members, and more
/ @krishnaik06
Code example:
chrisalbon.com/machine_learni...
Please do subscribe to my other channel too
/ @krishnaikhindi
Connect with me here:
Twitter: / krishnaik06
Facebook: / krishnaik06
Instagram: / krishnaik06

Published: 20 Mar 2020

Comments: 71
@ahmedaman6384 · 3 years ago
Fantastic as always, Krish. You are by far the best resource for learning data science concepts. Appreciate you.
@ankanbhattacharyya8805 · 3 years ago
This is pretty helpful. Such a complicated concept as multiclass classification, explained so simply. Thank you so much. Keep it up.
@datatorture3086 · 3 years ago
Thanks for your great intuition, Sir. This tutorial has weeded out all my confusion.
@fadlyahmad8565 · 2 years ago
Thank you for the explanation. It's very helpful!
@maheshchoudhury2719 · 4 years ago
Your video helps me a lot... Thanks a lot.
@hrushikeshkulkarni7353 · 1 year ago
Thank you Krish. This is actually very simple to understand 😀
@cosmin2889 · 3 years ago
Thanks man, you are pure gold :D
@krishnamishra8598 · 4 years ago
Great technique.. loved it 😘
@thepresistence5935 · 3 years ago
Cleared everything about logistic regression, so thanks dude.
@mrunaldusane3905 · 3 years ago
Sir, for multiclass classification (having more than 2 categories in the dependent variable), can we use a Multinomial Logistic Model from the GLM family?
@egorgavriushkin9230 · 1 year ago
You know the material so well and you are so talented as a teacher! Thank you for your videos. The IBM data science P.C. is no match for you.
@sandipansarkar9211 · 3 years ago
Thanks Krish. I am still wondering why I didn't see this video earlier; things would have been far easier. Now I need to study some easy-to-read articles or blogs, especially on Medium / Towards Data Science, to get a better idea about this.
@trivendratiwari9231 · 3 years ago
Thank you Krish for these videos 👍
@yannickpezeu3419 · 3 years ago
Thanks, really clear explanation.
@VarunKumar-pz5si · 3 years ago
Great explanation.
@aadarshshekhar6645 · 3 years ago
Sir, could you please explain the difference between multiclass classification and clustering algorithms?
@jihedjerbi2148 · 3 years ago
Great technique ...
@louerleseigneur4532 · 3 years ago
Thanks Krish
@kv11gaming · 3 years ago
Please publish a video on multinomial classification using sklearn for all classification models in Python, so that we can understand it properly.
@prashantjoshi8847 · 1 year ago
Clean and precise :-)
@PavanKumar-ef1yy · 8 months ago
Thank you
@thedataguyfromB · 4 years ago
Awesome
@kasturikasturi2551 · 3 years ago
Thank you sir
@Martinxddxdxdxdx · 3 years ago
Great technique. Is it the same as using OneVsRestClassifier from sklearn?
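The one-vs-rest strategy described in the tutorial is exactly what sklearn's `OneVsRestClassifier` implements. A minimal sketch (the dataset and parameters are illustrative, not from the video):

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier

# Load a 3-class dataset and fit one binary logistic model per class
X, y = load_iris(return_X_y=True)
ovr = OneVsRestClassifier(LogisticRegression(max_iter=1000)).fit(X, y)

print(len(ovr.estimators_))        # 3 -- one binary model per class
probs = ovr.predict_proba(X[:1])   # per-class scores, rescaled to sum to 1
print(probs.sum())
```

Note that `predict_proba` here rescales the independent per-class scores so each row sums to 1, which answers the normalization questions further down the thread.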
@soumyadeeproy6611 · 3 years ago
Hi, I have a small doubt: you said that for each category one model is trained which outputs the probability of that category. Now, when each of the models is independent and thus yields independent probability values, how can their sum be equal to 1?? I hope I made my question clear.
@trashantrathore4995 · 2 years ago
Did you get an explanation for it from anywhere? If yes, please let me know.
@r0cketRacoon · 22 hours ago
@@trashantrathore4995 Through the softmax function.
@babaabba9348 · 3 years ago
GOOD, THANK YOU
@MuhammadAhmad-bx2rw · 3 years ago
Great
@fastlearningpoint4721 · 3 years ago
Good sir
@gauravtak9787 · 4 years ago
Sir, I think this is similar to the softmax function. Am I right or not? Please clarify.
@samardwivedi6090 · 4 years ago
Sir, how can we plot the AUC and ROC curves for a huge multiclass dataset?
@samriddhlakhmani284 · 4 years ago
Is there something like that? Please let me know.
@Ruhgtfo · 4 years ago
Great tutorial~
@deepankzanwar2187 · 4 years ago
Sir, what if we have 3 classes in our target variable and we apply one-hot encoding, and while doing the train-test split we select one of the 3 variables as our y and build a model; likewise we can build 3 models. Is this way correct or not?
@GauravSharma-ui4yd · 4 years ago
Absolutely correct, but at the end it is also recommended to normalize the predicted probabilities from all three classes using a softmax/linear normalizer or any other function that satisfies the normalization properties. Also, in the video Krish may have forgotten to add that we generally normalize the probabilities at the end; scikit-learn does so using a linear normalizer, but softmax can also be used, just like at the output layer of a neural net in multiclass problems.
@a.mo7a · 3 years ago
@@GauravSharma-ui4yd Hello dear sir, can you please explain what probability normalization is?
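To illustrate the normalization discussed in this thread (the numbers are hypothetical, not from the video): the independent per-model sigmoid outputs need not sum to 1 on their own, so they are rescaled afterwards, either linearly (divide by the sum) or with a softmax.

```python
import numpy as np

# Hypothetical raw sigmoid outputs from 3 independent one-vs-rest models
raw = np.array([0.5, 0.5, 0.3])           # sums to 1.3, not 1.0

linear = raw / raw.sum()                   # linear normalization: divide by the sum
softmax = np.exp(raw) / np.exp(raw).sum()  # softmax normalization

print(linear.sum(), softmax.sum())         # both sum to 1.0
```

Either way, the relative ordering of the classes is preserved; only the scale changes so the outputs can be read as probabilities.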
@adityay525125 · 4 years ago
Sir, I have one question: please tell us where to practice Machine Learning problems. I am starting to feel that theory alone is not going to cut it anymore.
@amoldeepgupta1400 · 3 years ago
@db1 make small projects
@anandkumar-cy3st · 3 years ago
@krish Could you please upload a video about a logistic regression implementation?
@divyashetty3181 · 2 years ago
So for every new test data point the output would be o3??
@datasciencegyan5145 · 2 years ago
Can you please make a video on coding multi-class classification using logistic regression?
@InfinitelyScrolling · 4 years ago
Does logistic regression always fit a line, or can it fit curves also?
@revanthshalon5626 · 4 years ago
Logistic regression is a linear model that fits a sigmoid curve, so that the output stays between 0 and 1.
@siddharthgurav6407 · 1 year ago
It is like creating dummy variables, right?
@MaLik-gz9vb · 1 year ago
How do we calculate the probabilities of m1, m2 and m3? What is the formula?
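For reference, each binary one-vs-rest model scores its own class with the standard logistic (sigmoid) function, p = sigma(w.x + b). A small sketch with made-up weights (the values are illustrative, not from the video):

```python
import numpy as np

def sigmoid(z):
    # standard logistic function: maps any real z into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical learned weights and bias for one binary one-vs-rest model
w = np.array([0.8, -0.4])
b = 0.1
x = np.array([1.0, 2.0])

p = sigmoid(w @ x + b)   # this model's probability for its own class
print(p)                 # ~0.525 for these made-up numbers
```

Each of m1, m2, m3 has its own learned w and b; the class with the highest p wins.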
@keshavrjaput1522 · 4 years ago
Please send me the iNeuron course form link.
@GauravSharma-ui4yd · 4 years ago
Nice video Krish. Why do we take -1 instead of 0? Also, just one thing to add: we normalize the output probabilities at the end using either a linear normalizer or softmax.
@dragon_warrior_ · 4 years ago
Just for explanation purposes, I guess.
@darshanpadaliya9894 · 4 years ago
What we take for o1, o2, o3 is the value of y, not the value of the sigmoid function.
@borispasynkov1404 · 2 years ago
In 6 minutes he explained the whole material of a 1-hour university lecture.
@thallamkarthik7020 · 4 years ago
What if 2 of those 3 classes get the same probability values? How is the model going to classify that new test point?
@vasanth3029 · 3 years ago
It's generally best practice to get the probabilities, set a threshold manually, and decide the output.
@frischidn3869 · 1 year ago
Sir, the link is not working.
@mgaf7864 · 2 months ago
The code link is not working.
@rohankavari8612 · 3 years ago
How is the sum of the probabilities from all the models equal to 1? I think in some cases it might be greater than one. For example, if 2 models give 0.5 and the 3rd gives 0.3, then the sum is 1.3.
@a.mo7a · 3 years ago
Exactly my question... but I tried it in Python and got 1 for all samples, which is kind of odd.
@rohankavari8612 · 3 years ago
@@a.mo7a Check the documentation... it might have divided by some number.
@raoashwi · 3 years ago
Is it 0 0 1 or -1 -1 1?
@nileshmandlik9662 · 3 years ago
Implementation video for this?
@ghaythalmahadin4994 · 3 years ago
What if the probabilities are equal? Which class will be chosen?
@sumitmhaiskar722 · 3 years ago
Did you get the answer?
@sumitmhaiskar722 · 3 years ago
@Epoxy To The World No, sir.
@sumitmhaiskar722 · 3 years ago
@Epoxy To The World I'm 23 years old, and may I know what path you are following?
@sumitmhaiskar722 · 3 years ago
@Epoxy To The World Yes, or just go with the flow and try to understand the maths behind each and every algorithm.
@nit235 · 3 years ago
This is what I understood after doing some research on this question. Assume we have 3 binary classification models, each trained to output whether the input belongs to class 1, 2 or 3 respectively. Given a query or test point, we need to return its labels. Since it is a multi-label setting, the input could have up to 3 labels. Assume we set our threshold to 0.6 and get these probabilities from the 3 models: 0.52, 0.7, 0.8. Since our threshold is 0.6, the input doesn't belong to the first class, because 0.52 is less than 0.6. We output labels 2 and 3, since their probabilities are greater than the threshold. I'm not sure about my answer, but this is what I got after googling it.
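The thresholding rule sketched in this comment can be written out as follows (the threshold and probabilities are the commenter's hypothetical numbers, not from the video):

```python
import numpy as np

# Hypothetical per-class probabilities from 3 independent one-vs-rest models
probs = np.array([0.52, 0.7, 0.8])
threshold = 0.6

# Multi-label decision: keep every class whose model clears the threshold
labels = [k + 1 for k, p in enumerate(probs) if p >= threshold]
print(labels)  # [2, 3]
```

For plain multiclass classification (exactly one label per point), the usual rule is instead `np.argmax(probs) + 1`, i.e. pick the single highest-scoring class.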
@arunbharathapu2944 · 4 years ago
Can you do this multiclass classification in deep learning?
@GauravSharma-ui4yd · 4 years ago
You can easily do so in a neural net by applying a sigmoid activation at the output layer. But the probabilities across the classes are then not normalized, so you could add a softmax layer after it. However, first applying sigmoid and then softmax doesn't make sense, hence we directly apply softmax at the end. In this case as well, it is recommended to normalize the probabilities across the classes.