
Machine Learning Tutorial Python - 4: Gradient Descent and Cost Function 

codebasics
Subscribe · 1.1M
639K views
In this tutorial, we cover a few important concepts in machine learning: cost function, gradient descent, learning rate, and mean squared error. We will use a home price prediction use case to understand gradient descent. After going over the math behind these concepts, we will write Python code to implement gradient descent for linear regression. At the end, I have an exercise for you to practice gradient descent.
#MachineLearning #PythonMachineLearning #MachineLearningTutorial #Python #PythonTutorial #PythonTraining #MachineLearningCourse #CostFunction #GradientDescent
Code: github.com/codebasics/py/blob...
Exercise csv file: github.com/codebasics/py/blob...
Topics that are covered in this Video:
0:00 - Overview
1:23 - What is a prediction function? How can we calculate it?
4:00 - Mean squared error
4:57 - Gradient descent algorithm and how it works
11:00 - What is a derivative?
12:30 - What is a partial derivative?
16:07 - Python code to implement gradient descent
27:05 - Exercise: come up with a linear function for given test results using gradient descent
Topic Highlights:
1) Theory - We will talk about MSE, cost function, and global minima
2) Coding - Plain Python code that finds a linear equation for given sample data points using gradient descent
3) Exercise - Come up with a linear function for given test results using gradient descent
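The plain-Python approach described in highlight 2 can be sketched roughly like this. This is a minimal illustration with made-up sample points and hyperparameters, not the exact code from the video (see the GitHub link above for that):

```python
# Minimal gradient descent for fitting y = m*x + b to sample points.
# Sample data, learning rate, and iteration count are illustrative assumptions.

def gradient_descent(xs, ys, learning_rate=0.01, iterations=1000):
    m = b = 0.0
    n = len(xs)
    for _ in range(iterations):
        preds = [m * x + b for x in xs]
        # Mean squared error: (1/n) * sum((y - y_pred)^2)
        cost = sum((y - p) ** 2 for y, p in zip(ys, preds)) / n
        # Partial derivatives of MSE with respect to m and b
        md = -(2 / n) * sum(x * (y - p) for x, y, p in zip(xs, ys, preds))
        bd = -(2 / n) * sum(y - p for y, p in zip(ys, preds))
        # Step downhill, scaled by the learning rate
        m -= learning_rate * md
        b -= learning_rate * bd
    return m, b, cost

m, b, cost = gradient_descent([1, 2, 3, 4, 5], [5, 7, 9, 11, 13])
print(m, b, cost)  # m and b approach 2 and 3, since the data follows y = 2x + 3
```

With a learning rate that is too large, the cost oscillates or diverges instead of shrinking; tuning it until the cost decreases on every iteration is the point of the exercise.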
Do you want to learn technology from me? Check codebasics.io/?... for my affordable video courses.
Next Video:
Machine Learning Tutorial Python - 5: Save Model Using Joblib And Pickle: • Machine Learning Tutor...
Very Simple Explanation Of Neural Network: • Neural Network Simply ...
Popular Playlists:
Data Science Full Course: • Data Science Full Cour...
Data Science Project: • Machine Learning & Dat...
Machine learning tutorials: • Machine Learning Tutor...
Pandas: • Python Pandas Tutorial...
matplotlib: • Matplotlib Tutorial 1 ...
Python: • Why Should You Learn P...
Jupyter Notebook: • What is Jupyter Notebo...
To download the csv files and code for all tutorials: go to github.com/codebasics/py, click on the green button to clone or download the entire repository, and then go to the relevant folder to access that specific file.
🌎 My Website For Video Courses: codebasics.io/?...
Need help building software or data analytics and AI solutions? My company www.atliq.com/ can help. Click on the Contact button on that website.
#️⃣ Social Media #️⃣
🔗 Discord: / discord
📸 Dhaval's Personal Instagram: / dhavalsays
📸 Codebasics Instagram: / codebasicshub
🔊 Facebook: / codebasicshub
📱 Twitter: / codebasicshub
📝 Linkedin (Personal): / dhavalsays
📝 Linkedin (Codebasics): / codebasics
🔗 Patreon: www.patreon.com/codebasics?fa...

Published: 1 Aug 2024

Comments: 705
@codebasics · 2 years ago
Check out our premium machine learning course with 2 industry projects: codebasics.io/courses/machine-learning-for-data-science-beginners-to-advanced

@honeymilongton8401 · 2 years ago
Sir, can you please upload the slides as well?

@codebasics · 4 years ago
Stochastic vs batch vs mini-batch gradient descent: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-IU5fuoYBTAM.html
Step-by-step roadmap to learn data science in 6 months: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-H4YcqULY1-Q.html
Machine learning tutorials with exercises: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-gmvvaobm7eQ.html

@mukulbarai1441 · 3 years ago
It has become so clear that I am gonna teach it to my dog.

@codebasics · 3 years ago
👍🙂

@farazaliahmad3257 · 3 years ago
Just do it....

@Austin-pw2ud · 3 years ago
Don't do it! He may become a threat to humanity!

@eresque7766 · 2 years ago
@Austin-pw2ud he may become the cleverest, but he'll remain a good boy

@Austin-pw2ud · 2 years ago
@eresque7766 ouuuuuchh! That touched my ♥
@vanlindertpoffertje3032 · 5 years ago
Thank you so much for the detailed explanation! I have difficulty understanding these theories, and most channels just explain without covering the basics. With your explanation it is now so clear! Amazing!!

@angulimaldaku4877 · 4 years ago
3Blue1Brown is a great channel, and so is your explanation. Kudos to you! Also, it is quite appreciable how you positively promote and credit others' good work. That kind of genuineness is much needed.

@officesuperhero9611 · 6 years ago
I'm so excited to see you uploaded a new video on machine learning. I've watched your other 3 a couple of times. They're really top notch. Thank you. Please keep this series going. You're a great teacher too.

@alidi5616 · 4 years ago
This is the best tutorial I have ever seen. This is truly from scratch. Thank you so much.

@waytosoakshya1127 · 1 year ago
Finally found the best ML tutorials. Coding combined with mathematics, explained very clearly. Thank you!

@IVIRnathanreilly · 1 year ago
I've been struggling with my online lectures on machine learning. Your videos are so helpful. I can't thank you enough!

@codebasics · 1 year ago
👍👍🙏

@mamtachaudhary5281 · 3 years ago
I have gone through so many materials and couldn't understand a thing, but this video is amazing. Thanks for putting up all your videos.

@codebasics · 3 years ago
Glad it was helpful!
@mdlwlmdd2dwd30 · 3 years ago
For people who want to know what's behind the scenes: the partial derivative of the cost function (MSE) with respect to m, -(2/n) Σ x_i (y_i - (m x_i + b)), comes from the chain rule in calculus. Suppose we have a function F(m) = (am + b)^2. The chain rule dissects the function: differentiate the outer square first, giving 2(am + b), then multiply by the derivative of the inner expression (am + b) with respect to m, giving 2(am + b) · a. Applying the same chain rule to the MSE yields -(2/n) Σ x_i (y_i - (m x_i + b)). Please don't just accept it as it is, or you will never learn why things work and never come up with your own solutions. The easy way never gets you where you want to go.

@datalearningsihan · 1 year ago
No one really explains why we have -(2/n) instead of 2/n. If you do the calculation, even with the chain rule, you get 2/n; you never get negative values!

@awakenwithoutcoffee · 1 month ago
@datalearningsihan I think it is indeed to prevent any negative values from occurring.
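For readers puzzling over the sign debated in this thread, writing out the chain rule on the MSE settles it (a standalone derivation, not a quote from the video):

```latex
\frac{\partial}{\partial m}\,\frac{1}{n}\sum_{i=1}^{n}\bigl(y_i-(mx_i+b)\bigr)^2
  = \frac{1}{n}\sum_{i=1}^{n} 2\bigl(y_i-(mx_i+b)\bigr)\cdot\frac{\partial}{\partial m}\bigl(y_i-mx_i-b\bigr)
  = \frac{1}{n}\sum_{i=1}^{n} 2\bigl(y_i-(mx_i+b)\bigr)\cdot(-x_i)
  = -\frac{2}{n}\sum_{i=1}^{n} x_i\bigl(y_i-(mx_i+b)\bigr)
```

The minus sign comes from differentiating the inner term, where m appears as -(m x_i + b); it is not a convention added to avoid negative values.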
@ayushlabh · 5 years ago
It's the most helpful video I have seen so far on gradient descent. Great work. Looking forward to more videos on machine learning.

@khushidonda7168 · 11 months ago
Can you help me plot all the values of m and b on a chart?

@saltanatkhalyk3397 · 3 years ago
Thank you for such an easy explanation. I had read about gradient descent many times, but this is the first time I understood the math behind it.

@codebasics · 4 years ago
How to learn coding for beginners | Learn coding for free: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-CptrlyD0LJ8.html

@SudiKrishnakum · 3 years ago
I followed tonnes of tutorials on gradient descent. Nothing came close to the simplicity of your explanation. Now I have a good grasp of this concept! Thanks for this, sir!

@codebasics · 3 years ago
👍☺️

@sharathchandrachowdary6828 · 1 year ago
This video is just enough to describe the excellence of your explanation. Simply mind blowing.

@ishaanverma9asn523 · 2 years ago
This is the best ML course I've ever come upon!

@yashagarwal3999 · 4 years ago
So calmly and nicely you have explained a tough topic to beginners.

@moududhassan3026 · 5 years ago
The best one for gradient descent. Thank you.
@ramsawasthi · 5 years ago
Great tutorial, explained in very easy language in very little time.

@codebasics · 5 years ago
Glad you liked it, Ram.

@hfe1833 · 5 years ago
Thank you, I think I found the right channel for machine learning.

@codebasics · 5 years ago
Great. Happy learning.

@jhansisetty9429 · 5 years ago
It was a very useful video. After watching many other videos, I understood the concept best after watching yours. Keep making tutorials that make complex topics simple and easy to understand. Thank you.

@AlonAvramson · 3 years ago
You present this complex material in such a nice and easy way. Thank you!

@jenglong7826 · 5 years ago
This was an excellent explanation! Not too technical, and explained in simple terms without losing the key elements. I used this to supplement Andrew Ng's machine learning course on Coursera (which gets technical real quick), and it's been really helpful. Thanks!

@codebasics · 5 years ago
Glad you found it useful, Chia Jeng.

@akashmishra5553 · 3 years ago
Hey, thanks for creating all these playlists. These are so good. I think viewers should at least like and comment to show some love and support.

@shijilts4139 · 4 years ago
All your tutorials are amazing!! Thanks a lot.

@i.t.878 · 2 years ago
Such an excellent tutorial, the clearest I have seen on this topic. Kudos. Thank you.
@yousufali_28 · 5 years ago
Thanks for taking a step-by-step approach and making it easy. 👍

@TheSocialDrone · 4 years ago
This was a difficult topic for me; then I spent the time to watch your video. Thank you for making my learning easier! Very nice explanation.

@codebasics · 4 years ago
👍😊

@GlobalDee_ · 3 years ago
Waoh, waoh. Codebasics to the world. You are such a great teacher, sir. Thanks for sharing this series.

@wolfisraging · 6 years ago
I am glad someone gives a perfect explanation.

@vijaydas2962 · 5 years ago
Perfect explanation. Thanks for your effort.

@princeekanim1804 · 4 years ago
This tutorial made me finally understand gradient descent and the cost function... I don't know how you did it, but you did... thanks, man. I really appreciate it.

@codebasics · 4 years ago
You're very welcome, Prince :) I am glad your concepts are clear now.

@princeekanim1804 · 4 years ago
@codebasics no problem, keep it up, you're a great teacher.

@fridayemmanueljames4873 · 1 year ago
Waooo, for a long time I've struggled to really understand the gradient descent algorithm. Now I feel like a pro.

@rajanalexander4949 · 2 years ago
Sharp, to the point, succinct. Great stuff!

@mohamedarif3464 · 5 years ago
Thanks for teaching with this approach... great!!!

@chokoprty · 2 months ago
Watching this at 2x, like if you are too 😂
@srishtikumari6664 · 3 years ago
Insightful! A deep understanding of ML is necessary. You explained it very well.

@prajwal3114 · 3 years ago
One of the best tutorials on gradient descent.

@aayush135 · 2 years ago
Superb!! Your lectures are very good and make complicated things very easy. May you keep growing in your life.

@alokpratap2094 · 5 years ago
Sir, your videos are really awesome. Please try to complete this series as soon as possible and cover all the topics of machine learning, like cluster analysis, principal component analysis, etc.

@Opinionman2 · 2 years ago
Best video on the topic I've seen so far! Thanks.

@VijaykumarS7 · 9 months ago
You explained this complex concept in the simplest way. Best teacher in the world 🎉🎉

@codebasics · 9 months ago
Glad you liked it! 😊

@amandaahringer7466 · 2 years ago
Exceptionally done! Great work and thank you!

@fernandoriosleon · 4 years ago
Finalmente aprendí gradient descent. Finally I learned gradient descent, thank you so much 🙏

@justforfun7855 · 3 years ago
BONNE!!!

@ireshaweerasinghe8705 · 5 years ago
Thanks... I am new to ML and your tutorial is very useful to me :)

@mayankjain24in · 4 years ago
Awesome explanation, please keep it up... I also appreciate how you credit others for their work; that's very rare.

@sarikamishra7051 · 3 years ago
Sir, you are the best teacher I have ever had for machine learning.

@codebasics · 3 years ago
Glad it was helpful!
@yangfarhana3660 · 3 years ago
Clearly broken-down concepts, a very, very good video. Thank you for this amazing guide!

@codebasics · 3 years ago
Glad it was helpful!

@krystianprogress4521 · 2 years ago
Thanks to you I finally understood what gradient descent is.

@mahmoudnady4388 · 2 years ago
Thank you, teacher. Your explanation is clear, interesting and useful ❤👌

@MondayMotivations · 4 years ago
I don't know why you are so underrated. Only 73K subscribers. You deserve way more than that, I mean the way you clear up the concepts. You're simply awesome, man.

@codebasics · 4 years ago
I am happy this was helpful to you.

@geekyprogrammer4831 · 3 years ago
Now he has 281K, and in the future I expect it to be more :D

@clarizalook2396 · 4 years ago
I'm confused. This is something very new to me even though I studied calculus in my undergrad years. I did not get it fully, but the code worked on my end. Perhaps the more I get into different models, the more I'll come to understand this. Thanks for sharing all of this.

@codebasics · 4 years ago
Yup, Clarie. The tip here is to go slowly without getting overwhelmed. Don't give up, and slowly you will start understanding it 😊👍

@vishnusagubandi8274 · 4 years ago
I think this is the best gradient descent tutorial, even better than Andrew Ng sir's. I got stuck with Andrew sir's tutorial and later came here. Finally got it... Thanks a lot, bro 🙏🙏

@sararamadan1907 · 3 years ago
I wanted to thank you before even finishing the video, just to tell you that you made my day with this lesson.

@codebasics · 3 years ago
Sara, I am glad you liked it, and thanks for leaving a comment :)

@ishitasadhukhan1 · 2 years ago
The best tutorial on gradient descent!

@boubacaramaiga4408 · 5 years ago
Excellent tutorial. Many thanks.

@kasahunabdisa6022 · 1 year ago
A great and simple approach to learning gradient descent. Thank you for your effort.

@RAJESHMANDALGAU-C- · 6 years ago
Here is the video I found. Great to watch!

@jc-co1ck · 2 years ago
Thanks for your explanation; it is really clear and easy to understand. They are really awesome, thank you.
@shubhamkanwal8977 · 4 years ago
This is pure gold!

@HARSHRAJ-2023 · 6 years ago
I hope, sir, you will upload videos more regularly. It will help us a lot. Eagerly waiting for a new upload.

@wasirizvi2437 · 4 years ago
Explained well in easy language! Thanks, bro.

@daychow4659 · 1 year ago
Omg!!! This is my first time seeing someone calculate how gradient descent works!!!!

@usmanriaz8396 · 2 years ago
Best video on gradient descent and the cost function. Understood the math pretty well. Excellent. Love from Pakistan.

@How_About_This428 · 2 years ago
Indians, as always, such smart and brilliant people. Thank you for the video, it helped me a lot.

@yourlifeonpower · 4 months ago
Very clear, concise and helpful! Thank you!

@ajaykushwaha-je6mw · 2 years ago
Best video ever on gradient descent.

@ou8xa1vkk64 · 3 years ago
A little hard for me! I can't do the exercise myself. But I'm 100% sure no one teaches it more simply than this anywhere in the world. Keep doing it, love you lots!!!!!!!!!

@sukumarroychowdhury4122 · 3 years ago
Hey, you are absolutely excellent. I have seen many guys offering machine learning tutorials. None is as simple, as clear and as educative as you are. Best regards, Sukumar Roy Chowdhury - ex Kolkata; Portland, OR, USA

@codebasics · 3 years ago
Sukumar, I am glad this video helped 👍🙏

@halyan2033 · 4 years ago
Thank you so much. You saved my life!

@hrithvikreddy6643 · 1 month ago
The exercise that you shared takes many iterations to get to the correct intercept and coefficient... my laptop hung many times doing that problem 😵‍💫😵‍💫😵‍💫😵‍💫
@Haven_Hue · 1 month ago
Code (try this):

import pandas as pd
import math

data = pd.read_csv("D:\\Machine_learning\\Grad_des\\test_scores.csv")
x = data['math'].to_numpy()
y = data['cs'].to_numpy()

def gradient_descent(x, y):
    m_curr = b_curr = 0
    iterations = 10000
    n = len(x)
    learning_rate = 0.001
    prev = 0
    for i in range(iterations):
        y_predicted = m_curr * x + b_curr
        md = -(2/n) * sum(x * (y - y_predicted))
        bd = -(2/n) * sum(y - y_predicted)
        m_curr = m_curr - learning_rate * md
        b_curr = b_curr - learning_rate * bd
        cost = (1/n) * sum([val**2 for val in (y - y_predicted)])
        # Stop once the cost stops changing between iterations
        if math.isclose(cost, prev, rel_tol=1e-09):
            break
        print("m {}, b {}, cost {}, iteration {}".format(m_curr, b_curr, cost, i))
        prev = cost

gradient_descent(x, y)
@akhileshchauhan7422 · 5 years ago
In the coming few days I will watch your whole channel.

@rishabkumar9578 · 3 years ago
You are the best teacher for data science... thanks.

@codebasics · 3 years ago
Glad you enjoyed it.

@AYUSHKUMAR-dm1xg · 4 years ago
Who are the people disliking these videos? These people work hard and make these videos for us. Please, if you don't like it, don't watch it, but don't dislike it. It is misleading to the people who come to watch these videos. I know many of us have studied some of these concepts before, but he is making videos for everyone, not for a small section of people. I feel that this channel's videos are amazing and don't deserve any dislikes.

@codebasics · 4 years ago
Thanks, Ayush. I am moved by your comment and kind words. I indeed put a lot of effort into making these videos. Dislikes are fine, but if those people gave a reason why they disliked, it would help me a lot in terms of feedback and future improvements 😊

@satyanarayanaguggilapu3735 · 1 year ago
Very effective teaching. Thanks for the videos.

@mohankrishna8112 · 6 years ago
Hello sir, I'm a great fan of yours. Please upload videos on RNN, LSTM, CNN, hidden Markov models, Monte Carlo simulations, and bootstrap implementation in Python.

@valijoneshniyazov · 3 years ago
When you calculate partial derivatives, don't assume x or y is zero; treat them as constants instead. For example, for f(x,y) = x*y, the partial derivatives are y (with respect to x) and x (with respect to y), not 0.

@rajdipdas1329 · 2 years ago
No, why would the partial derivative be zero? We have to treat the other variable as a constant: ∂f(x,y)/∂x = y and ∂f(x,y)/∂y = x.
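The point made in this thread can be sanity-checked numerically with finite differences. This is a small illustration of the idea, not code from the video; the helper `partial` is a name invented here:

```python
# Numerically verify that for f(x, y) = x*y, df/dx = y and df/dy = x,
# i.e. the other variable is held constant (not set to zero).

def f(x, y):
    return x * y

def partial(fn, point, index, h=1e-6):
    # Central finite difference along one coordinate.
    lo = list(point)
    hi = list(point)
    lo[index] -= h
    hi[index] += h
    return (fn(*hi) - fn(*lo)) / (2 * h)

x0, y0 = 2.0, 5.0
print(partial(f, (x0, y0), 0))  # df/dx at (2, 5): approximately 5.0, which equals y
print(partial(f, (x0, y0), 1))  # df/dy at (2, 5): approximately 2.0, which equals x
```

If the other variable were truly "set to zero," both results would come out 0, which the numbers show is not what happens.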
@ecgisamal · 2 years ago
If there were an award for the best teacher in the world, it would go to this person, Programming Hero, and Brackeys.

@codebasics · 2 years ago
🙏 Thanks for your kind words, Ayuro.

@ecgisamal · 2 years ago
@codebasics Please can you make a playlist on OpenCV with Python?

@user-ys1qg4cl9g · 10 months ago
Best explanation ever seen. Love from Bangladesh.

@nice2meetyou631 · 4 years ago
This is so amazing. Thank you.

@arpanasingh8957 · 3 years ago
So calmly and nicely you have explained a tough topic to beginners. Very thankful to you. Waiting for your new videos.

@codebasics · 3 years ago
Glad you enjoyed it.

@nirmalyamisra · 5 years ago
This was such a great video... many thanks!

@codebasics · 5 years ago
Nirmalya, thanks for leaving a comment :)

@imtiyazshaik9950 · 5 years ago
Big bow, sir. Please continue making videos!!!!!!!

@stephenadjei8413 · 3 years ago
Best I have watched... totally clear now.

@harryinatube · 6 years ago
Great video and teaching method. You have the art of keeping things simple while still teaching advanced concepts. I get a very good and quick overview and understanding from your videos. Thanks a lot.

@naderasaleh1222 · 4 years ago
You gave a good explanation. Great video.

@deshabhaktg6530 · 3 years ago
You're a legend 🔥🔥 Thank you so much for such an amazing explanatory video.

@codebasics · 3 years ago
Glad it was helpful!
@MrThirupathit · 4 years ago
Very nicely explained and clear. However, I expected code for the graphs of cost vs. b and cost vs. m, and also code to graph the regression line and its outcome. Looking forward to the same.
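Several comments ask how to plot the cost and the m/b values over the iterations. One way to do it (an illustrative sketch using matplotlib, not code from the video; the sample data and learning rate are assumptions) is to record the values each iteration and plot the recorded history:

```python
# Record m, b, and cost at every gradient descent step, then plot the histories.
import matplotlib
matplotlib.use("Agg")  # render off-screen; drop this line for interactive use
import matplotlib.pyplot as plt
import numpy as np

x = np.array([1, 2, 3, 4, 5])
y = np.array([5, 7, 9, 11, 13])

m = b = 0.0
n = len(x)
learning_rate = 0.01
history = {"m": [], "b": [], "cost": []}

for _ in range(1000):
    y_pred = m * x + b
    cost = ((y - y_pred) ** 2).mean()          # MSE
    m -= learning_rate * (-(2 / n) * (x * (y - y_pred)).sum())
    b -= learning_rate * (-(2 / n) * (y - y_pred).sum())
    history["m"].append(m)
    history["b"].append(b)
    history["cost"].append(cost)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
ax1.plot(history["cost"])
ax1.set(xlabel="iteration", ylabel="cost (MSE)")
ax2.plot(history["m"], label="m")
ax2.plot(history["b"], label="b")
ax2.set(xlabel="iteration")
ax2.legend()
fig.savefig("gradient_descent_history.png")
```

The same `history` lists can also be used to redraw the regression line `m*x + b` every few iterations to visualize the fit improving.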
@vinodkinoni4863 · 6 years ago
Happy to see you, great work sir.

@mdrafikbk · 5 years ago
Excellent and simple explanation of gradient descent. Thank you, bro.

@codebasics · 5 years ago
Mohamed Rafik, I am glad you liked it.

@arnav707 · 3 years ago
This deserves a sub.

@saifullahshahen · 3 years ago
I watched this video three times, and after the third time it is now understandable.

@codebasics · 3 years ago
Glad that it helped.

@krishkonnect814 · 4 years ago
Thank you very much, sir ji. Thank you for all the important and essential info improving my learning curve.

@codebasics · 4 years ago
Thanks for your kind words of appreciation.

@dhairyashiil · 3 years ago
Thank you... You saved my marks! ❤
@ayushgupta80 · 3 months ago
In machine learning we have inputs and outputs, and from these values we derive an equation known as the prediction function. First we draw the line that best fits the output values, passing as close as possible to the data points. Then we calculate the Mean Squared Error (a popular cost function):

MSE = (1/n) * Σ (actual_i - predicted_i)^2, where predicted = m*x + b

Gradient descent is an algorithm that finds the best-fit line for a given training data set.

Code:

import numpy as np

def gradient_descent(x, y):
    m_curr = b_curr = 0
    iterations = 10000
    n = len(x)
    learning_rate = 0.08
    for i in range(iterations):
        y_predicted = m_curr * x + b_curr
        cost = (1/n) * sum([val**2 for val in (y - y_predicted)])
        md = -(2/n) * sum(x * (y - y_predicted))  # partial derivative w.r.t. m
        bd = -(2/n) * sum(y - y_predicted)        # partial derivative w.r.t. b
        m_curr = m_curr - learning_rate * md
        b_curr = b_curr - learning_rate * bd
        print("m {}, b {}, cost {}, iteration {}".format(m_curr, b_curr, cost, i))

x = np.array([1, 2, 3, 4, 5])
y = np.array([5, 7, 9, 11, 13])
gradient_descent(x, y)

We are supposed to find a learning rate for which the cost decreases continuously.
@muhammedthayyib9202 · 2 years ago
Thank you very much. I solved the problem on my own.

@tsangwingho2508 · 3 years ago
Hi, I want to know how to plot the learned regression line for each iteration on the same graph, to show how it changes. Thanks.

@simranjeetkaur7829 · 2 years ago
Great explanation!

@vidushi55 · 5 years ago
Thanks! Nice video :)