
Logistic Regression in Python from Scratch | Simply Explained 

Coding Lane
28K subscribers
42K views

Published: 21 Oct 2024

Comments: 83
@CodingLane · 3 years ago
If you found this video valuable, then hit the *_like_* button👍, and don't forget to *_subscribe_* ▶ to my channel as I upload a new Machine Learning Tutorial every week.
@CodingLane · 2 years ago
@Quran and Hadith yes
@saumyashah6622 · 3 years ago
Very few people explain things mathematically, and very few people want a mathematical explanation. People just want to code without understanding the algorithm. You and your subscribers are the best :)
@CodingLane · 3 years ago
Thank you so much for such a good compliment 😊
@pavithranpavithran7354 · 8 months ago
Man, for the past 2 days I have been searching for this explanation. Man, you rocked it. Keep going, brother
@CodingLane · 8 months ago
Thanks a lot ! 😁
@justtimepass2948 · 1 year ago
Please don't stop uploading videos; it's a really, really superb explanation in a precise manner. Great job, keep it up bro 😎
@MeetPatel-sk7pu · 3 years ago
I don't have words to compliment your explanation, bro. CLEAREST EXPLANATION I HAVE EVER SEEN. 🍺🥂
@CodingLane · 3 years ago
And I don't have words to appreciate your comment! Thank you very much! It really means a lot to me
@Itsavinashdubey · 3 years ago
I was searching a lot and finally, bro!!! I got you!!!!! Thanks a lot
@CodingLane · 3 years ago
Thank you so much! This means a lot to me.
@AmanTheDisciple · 3 years ago
A very good video, been searching for something like this for so long. Finally found it. Thanks bro.
@CodingLane · 3 years ago
Thank You!
@ipapergrey6654 · 7 months ago
Thank you so much you really helped me start my ML journey
@gratusrichard535 · 1 year ago
You, sir, are a legend. I took several tutorials on machine learning; your videos are the only ones that make sense to me. I don't know if you have any paid course out there; if you do, please let me know, I will definitely purchase it. Good luck :)
@CodingLane · 1 year ago
Thanks a lot for the compliment 😇. Means a lot. Currently, I don’t have any paid courses, hoping to make them in future!
@jasurtoshpolatov9153 · 1 year ago
500th like by me, good luck👍
@IbrahimAli-kx9kp · 3 years ago
Thanks for the video, Jay 💙 Just a simple question about 5:31: you used the method (reshape) to modify Y but (transpose) to modify X! Why don't we use transpose for both? I tried it and I think it works; otherwise you may have another reason! Thanks again for your amazing content 😄
@CodingLane · 3 years ago
You can perform the operation using reshape or transpose. Both are fine. There is no specific reason for me to use reshape instead of transpose. You can use any 😇
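One caveat worth adding (my note, not from the video): `.T` is a no-op on a 1-D NumPy array, so if `Y` loads with shape `(m,)`, only `reshape` will actually change it; on 2-D arrays the two are interchangeable here. A minimal sketch with made-up arrays:

```python
import numpy as np

Y = np.array([0, 1, 1, 0])        # 1-D array, shape (4,)

print(Y.T.shape)                  # (4,) — transposing a 1-D array does nothing
print(Y.reshape(1, 4).shape)      # (1, 4) — reshape gives the row vector

X = np.array([[1, 2], [3, 4], [5, 6]])   # 2-D array, shape (3, 2)
print(X.T.shape)                  # (2, 3) — transpose behaves as expected on 2-D
```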
@brindhasenthilkumar7871 · 3 years ago
Hi, I have been learning machine learning on my own and have seen many videos. Your explanation was remarkable, keep going. There is a clarity after we listen to your videos. Great, great. All the best for making more videos on all ML algorithms
@CodingLane · 3 years ago
Thank you so much ! I am elated after reading this. I am glad you find my videos helpful.
@brindhasenthilkumar7871 · 3 years ago
@@CodingLane certainly yes, kindly upload more videos that teach us from scratch, so that it will be easier for us to understand than blindly using the Python machine learning libraries. Great job, keep going
@CodingLane · 3 years ago
Sure ! Thanks @@brindhasenthilkumar7871
@larrysummer2015 · 3 years ago
Keep it up, man... you couldn't teach it any better...
@CodingLane · 3 years ago
Thank You so much !
@003kazimehrabrashid4 · 1 year ago
Well, in your cost function video you said that dCost/dW = (A-Y).X, but in the code you wrote dCost/dW = (1/m)(A-Y).X. Should I multiply by (1/m) or not? Please tell me, bro
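A hedged note on this question (mine, not the channel's answer): the 1/m comes from averaging the cost over the m training examples; dropping it only rescales the gradient by a constant, which the learning rate can absorb. A sketch with made-up data (shapes follow the video's convention of X as (features, examples)):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: X is (n, m), Y is (1, m), W is (n, 1). All values are made up.
m = 4
X = np.array([[0.5, 1.0, 1.5, 2.0],
              [1.0, 0.5, 2.0, 1.5]])
Y = np.array([[0, 0, 1, 1]])
W = np.zeros((2, 1))
B = 0.0

A = sigmoid(np.dot(W.T, X) + B)        # predictions, shape (1, m)
dW = (1.0 / m) * np.dot(A - Y, X.T).T  # averaged gradient, shape (n, 1);
                                       # omitting 1/m just multiplies the step
                                       # size, which the learning rate absorbs
print(dW.ravel())                      # [-0.25 -0.25]
```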
@nhs881 · 3 years ago
Thanks for this useful video. I just have one question: I have a dataset of students' performance in a course, and I am required to split it into 70% for training and 30% for testing without using sklearn. How do I do that?
@CodingLane · 3 years ago
For this you can learn numpy and pandas from any video tutorial. That will help you in all these sorts of data preprocessing.
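For reference, a shuffle-and-slice split needs only NumPy and pandas. A sketch with a made-up DataFrame (column names are illustrative):

```python
import numpy as np
import pandas as pd

# Hypothetical dataset: 10 rows, two feature columns and a label column.
df = pd.DataFrame({
    "hours": np.arange(10),
    "score": np.arange(10) * 7,
    "passed": [0, 0, 0, 1, 0, 1, 1, 1, 1, 1],
})

rng = np.random.default_rng(seed=42)   # fixed seed for reproducibility
idx = rng.permutation(len(df))         # shuffled row indices
cut = int(0.7 * len(df))               # 70% boundary

train = df.iloc[idx[:cut]]             # first 70% of shuffled rows
test = df.iloc[idx[cut:]]              # remaining 30%

print(len(train), len(test))           # 7 3
```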
@sauravsharma690 · 3 years ago
Hey, thanks a lot for the video! So I'm facing a major problem. When I run the model, I am getting cost as NaN for every iteration after the 0th iteration. Why is this happening? How do I fix this? For context, I am using a different dataset (adult census income dataset from Kaggle) but all the preprocessing has been done and all the columns have numerical values.
@CodingLane · 3 years ago
It's because you might be using a very large "learning_rate". Try reducing its value by 100 times or 10,000 times or maybe more. Once you see the cost function take some value that is not NaN, you can increase the learning_rate or adjust it to train the model faster. If it still shows NaN, then check whether you have implemented the equations of logistic regression properly. A slight change in an equation can also cause the model not to train.
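Besides lowering the learning rate, clipping the sigmoid outputs before taking logs avoids log(0) = -inf, another common source of NaN costs. A hedged sketch (my illustration, not the video's code):

```python
import numpy as np

def safe_cost(A, Y, eps=1e-15):
    # Clipping keeps log() away from log(0), which would give -inf and
    # then NaN once multiplied by a zero label.
    A = np.clip(A, eps, 1 - eps)
    m = Y.shape[1]
    return -(1.0 / m) * np.sum(Y * np.log(A) + (1 - Y) * np.log(1 - A))

# Saturated predictions that would break an unclipped log-loss:
A = np.array([[1.0, 0.0, 0.7]])
Y = np.array([[1, 0, 1]])
print(np.isfinite(safe_cost(A, Y)))   # True
```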
@soujanyapm480 · 2 years ago
Thanks for this video, it was very informative. Could you please explain the formula you have used for accuracy in accuracy function?
@CodingLane · 2 years ago
Hi… I calculated the error rate, which is the % of wrong predictions, and then subtracted it from 100
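That recipe can be sketched as follows (shapes and values are made up; counting wrong predictions with `abs(P - Y)` works because both arrays contain only 0s and 1s):

```python
import numpy as np

# Hypothetical probabilities and labels, shape (1, m).
A = np.array([[0.2, 0.7, 0.8, 0.3]])
Y = np.array([[0, 1, 0, 0]])

P = (A > 0.5).astype(int)                  # threshold probabilities to 0/1
error_rate = np.mean(np.abs(P - Y)) * 100  # % of wrong predictions
accuracy = 100 - error_rate                # subtract from 100

print(accuracy)                            # 75.0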
@mukeshgupta9806 · 3 months ago
When implementing the same code, it shows an error: "weight is not defined". What should I do?
@priyankagarg8999 · 2 years ago
Thanks for sharing these videos 😀. All your videos are informative and make it so simple for me to understand the concepts 🤓.
@CodingLane · 2 years ago
It's my pleasure. Happy to hear that! 🙂
@babaabba9348 · 3 years ago
Very informative, you are the best, continue
@CodingLane · 3 years ago
Thank You so much 😇 !!
@babaabba9348 · 3 years ago
@@CodingLane I have a question concerning the boundary and logistic regression. How can I contact you in person?
@CodingLane · 3 years ago
@@babaabba9348 mail me on codeboosterjp@gmail.com
@babaabba9348 · 3 years ago
@@CodingLane thank you so much mate
@babaabba9348 · 3 years ago
@@CodingLane maybe it would be better if you deleted your address
@jagajaga6908 · 1 year ago
Bro, thank you for the good video
@CodingLane · 1 year ago
You’re welcome!
@taruchitgoyal3735 · 3 years ago
Hi, at 1:11 you are uploading csv files for train and test. I am using Google Colab, so the code I got for uploading the files was files.upload(). How do I then read the same files using Pandas as you demonstrated?
@CodingLane · 3 years ago
Hello, here are the ways to use the files on google colab and load into pandas: towardsdatascience.com/3-ways-to-load-csv-files-into-colab-7c14fcbdcb92 Hope it helps!
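Sketching one of the approaches from that article: `files.upload()` returns a dict mapping filename to bytes, which `pd.read_csv` can read via `io.BytesIO`. The Colab-only lines are shown as comments so the pandas part stays runnable anywhere; the filename "train.csv" is an assumption:

```python
import io
import pandas as pd

# In Colab you would first run:
#     from google.colab import files
#     uploaded = files.upload()          # opens a file picker in the notebook
#     csv_bytes = uploaded["train.csv"]  # filename key is an assumption
# Here we fake the uploaded bytes so the pandas call is runnable anywhere:
csv_bytes = b"feature,label\n1.0,0\n2.0,1\n"

train = pd.read_csv(io.BytesIO(csv_bytes))   # same call works on real uploads
print(train.shape)                           # (2, 2)
```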
@adrenochromeaddict4232 · 11 months ago
you just saved my life mate thx
@anuradhashukla6059 · 3 years ago
it was really helpful
@CodingLane · 3 years ago
Glad I could help!
@dinmadaniel9830 · 1 year ago
In your cost function, where did you get y from? Because you never defined it
@-alfeim2919 · 2 years ago
Amazing job!
@CodingLane · 2 years ago
Thank you!
@acno_randintrandint4182 · 3 years ago
Would A > 0.5 get us a sum of correct predictions or just one class? Can you please explain a bit more clearly? Maybe I missed it
@CodingLane · 3 years ago
Sure. Let's say A = [0.2, 0.7, 0.8, 0.3, 0.4, 0.6]
Then A > 0.5 will be [False, True, True, False, False, True]
And if you convert it into integers: Afinal = [0, 1, 1, 0, 0, 1]
Thus A initially were just probabilities; Afinal are the predictions for classes 0 and 1
@CodingLane · 3 years ago
Hope I made it clear now.
@acno_randintrandint4182 · 3 years ago
@@CodingLane Hey, thanks, but accuracy is the sum of all correct predictions / total predictions, meaning (apologies if I am wrong)... comparing y truth to y predict, how many in [0, 1, 1, 0, 0, 1] were right / total, not simply > 0.5, which yes will simply separate the classes... I did calculate it back yesterday and my accuracy was around 68-71%. Super sorry if I did it all wrong, and big thanks again
@acno_randintrandint4182 · 3 years ago
def accuracy_manual(slope, intercept, X_test, Y_test):
    predictions = np.dot(slope.T, X_test) + intercept
    predictions_log = sigmoid(predictions)
    all_predictions = [1 if i >= 0.5 else 0 for i in predictions_log[0]]
    print("all predictions == ", len(all_predictions))
    count = 0
    for i in range(len(all_predictions)):
        if Y_test[0][i] == all_predictions[i]:
            count += 1
    print("correct count ", count)
    # alternate way
    s = sum(all_predictions[i] == Y_test[0][i] for i in range(len(all_predictions)))
    print("correct count ", s)
    # accuracy = correct count / total count
    accuracy = count / len(all_predictions)
    print("accuracy of model ", accuracy)
    return accuracy
@acno_randintrandint4182 · 3 years ago
np.mean(P == y_test)
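Putting the thread together as a runnable sketch (my illustration, with made-up labels): threshold the probabilities as in the earlier reply, then the one-line mean gives the fraction of correct predictions directly:

```python
import numpy as np

A = np.array([0.2, 0.7, 0.8, 0.3, 0.4, 0.6])   # probabilities from sigmoid
Afinal = (A > 0.5).astype(int)                  # class predictions
print(Afinal)                                   # [0 1 1 0 0 1]

y_true = np.array([0, 1, 1, 1, 0, 1])           # hypothetical ground truth
print(np.mean(Afinal == y_true))                # 5 of 6 correct ≈ 0.833
```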
@ridwanridlonugroho9849 · 1 year ago
thanks, very good video 👍
@mdtufajjalhossain1246 · 3 years ago
Thanks a lot. Your explanation was just awesome. Would you please make a similar video on Multiclass Logistic Regression from scratch? I am expecting it from you, bro.
@CodingLane · 3 years ago
Thank you so much! And yea... I will try to make that video too
@philtoa334 · 2 years ago
Very Nice.
@CodingLane · 2 years ago
Thank you!
@ShivamSharma-eh8vb · 1 year ago
You are awesome
@nauman_26 · 2 years ago
Thank you for this video, it is really helpful. Can you make a video on feature scaling from scratch?
@CodingLane · 2 years ago
Thanks for the suggestion… I will see if I can make a video on it
@nauman_26 · 2 years ago
@@CodingLane Thank you for your support
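Since there is no scaling video yet, here is a hedged sketch of one common flavor of feature scaling, standardization (z-score); this is my illustration, not the channel's code:

```python
import numpy as np

# X has shape (m, n): rows are examples, columns are features. Values made up.
X = np.array([[1.0, 200.0],
              [2.0, 400.0],
              [3.0, 600.0]])

mu = X.mean(axis=0)            # per-feature mean
sigma = X.std(axis=0)          # per-feature standard deviation
X_scaled = (X - mu) / sigma    # each column now has mean 0 and std 1

print(X_scaled.mean(axis=0))   # ~[0. 0.]
print(X_scaled.std(axis=0))    # [1. 1.]
```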
@proterotype · 3 years ago
Great video. You made it seem easy. And Easy is good. Thanks a lot
@CodingLane · 3 years ago
Thank you so much ! I really appreciate it
@kalidasabuj8940 · 2 years ago
Please make more videos on ml algorithms
@CodingLane · 2 years ago
Thanks for the suggestion. Will also make videos on other ML algorithms. Though it might take some time.
@vg5675 · 4 months ago
How to select the best features to get the highest possible F1 score?
@ajinkyajagtap5151 · 3 years ago
How to plot logistic regression?
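This question goes unanswered, but for two features the decision boundary of logistic regression is the line w1·x1 + w2·x2 + b = 0, which can be plotted directly. A sketch with hypothetical learned weights (matplotlib assumed available):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")              # headless backend; no display window needed
import matplotlib.pyplot as plt

# Hypothetical learned parameters (these are made up, not from the video):
w = np.array([1.5, -2.0])
b = 0.5

x1 = np.linspace(-3, 3, 100)
x2 = -(w[0] * x1 + b) / w[1]       # solve w1*x1 + w2*x2 + b = 0 for x2

plt.plot(x1, x2, label="decision boundary")
plt.xlabel("x1")
plt.ylabel("x2")
plt.legend()
plt.savefig("boundary.png")        # writes the plot to a file
```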
@RajivKumar-nv2gj · 3 years ago
Thanks bro
@CodingLane · 3 years ago
You're welcome!
@theopensparrow3274 · 3 years ago
Content is very good, but the presentation is not satisfactory.
@CodingLane · 3 years ago
Thank you for your feedback. I have tried to improve the presentation style in the newer videos. I hope you find it better.
@glenmason6680 · 3 years ago
sorry mate you need to slow down a bit
@CodingLane · 3 years ago
Okay 👍🏻
@danish5326 · 3 years ago
I love your explanation, but please don't fake your accent. It's quite annoying.
@CodingLane · 3 years ago
Thank You Danish !
@kian0902 · 2 years ago
What do you mean by "fake your accent"?