Stochastic gradient descent, batch gradient descent, and mini-batch gradient descent are three flavors of the gradient descent algorithm. In this video I will go over the differences among these three and then implement them in Python from scratch using a housing price dataset. At the end of the video there is an exercise for you to solve.
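The idea in one sketch (a minimal illustration, not the tutorial's exact code: it uses a tiny synthetic dataset instead of the housing price data, and simple linear regression y = w*x + b with an MSE loss):

```python
import numpy as np

# Synthetic stand-in data (the video uses a housing price dataset instead).
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, 100)
y = 2.0 * X + 1.0 + rng.normal(0, 0.01, 100)  # true w=2, b=1 plus noise

def gradient_step(w, b, xb, yb, lr):
    """One MSE-gradient update of (w, b) using the batch (xb, yb)."""
    err = w * xb + b - yb
    w -= lr * 2 * np.mean(err * xb)
    b -= lr * 2 * np.mean(err)
    return w, b

def batch_gd(X, y, lr=0.1, epochs=1000):
    # Batch GD: every update uses the FULL dataset.
    w = b = 0.0
    for _ in range(epochs):
        w, b = gradient_step(w, b, X, y, lr)
    return w, b

def stochastic_gd(X, y, lr=0.05, epochs=50):
    # SGD: one update per randomly ordered single sample.
    w = b = 0.0
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            w, b = gradient_step(w, b, X[i:i+1], y[i:i+1], lr)
    return w, b

def mini_batch_gd(X, y, lr=0.1, epochs=200, batch_size=10):
    # Mini-batch GD: small random batches, a middle ground between the two.
    w = b = 0.0
    n = len(X)
    for _ in range(epochs):
        idx = rng.permutation(n)
        for start in range(0, n, batch_size):
            sel = idx[start:start + batch_size]
            w, b = gradient_step(w, b, X[sel], y[sel], lr)
    return w, b

for name, fn in [("batch", batch_gd), ("stochastic", stochastic_gd),
                 ("mini-batch", mini_batch_gd)]:
    w, b = fn(X, y)
    print(f"{name}: w={w:.2f}, b={b:.2f}")  # each should approach w≈2, b≈1
```

All three reach roughly the same solution; they differ in how much data each update sees, which trades per-step cost against gradient noise.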
🔖 Hashtags 🔖
#stochasticgradientdescentpython #stochasticgradientdescent #batchgradientdescent #minibatchgradientdescent #gradientdescent
Do you want to learn technology from me? Check codebasics.io/... for my affordable video courses.
Next Video: • Chain Rule | Deep Lear...
Previous video: • Implement Neural Netwo...
Code of this tutorial: github.com/cod...
Exercise: Scroll to the end of the above link to find the exercise description.
Deep learning playlist: • Deep Learning With Ten...
Machine learning playlist: www.youtube.co...
Prerequisites for this series:
1: Python tutorials (first 16 videos): www.youtube.co...
2: Pandas tutorials (first 8 videos): • Pandas Tutorial (Data ...
3: Machine learning playlist (first 16 videos): www.youtube.co...
#️⃣ Social Media #️⃣
🔗 Discord: / discord
📸 Dhaval's Personal Instagram: / dhavalsays
📸 Instagram: / codebasicshub
🔊 Facebook: / codebasicshub
📝 Linkedin (Personal): / dhavalsays
📝 Linkedin (Codebasics): / codebasics
📱 Twitter: / codebasicshub
🔗 Patreon: www.patreon.co...
6 Sep 2024