How I think about Gradient Descent 

How I think about
345 subscribers
1.4K views

What is gradient descent optimizing exactly?
Source code to generate these animations: github.com/gallettilance/repr...
#gradientdescent #machinelearning #neuralnetworks #optimization #math #datascience #educational #machinelearningtutorialforbeginners #datasciencebasics #datasciencetutorial #machinelearningalgorithm #logisticregression #machinelearningbasics #maths #softmax #multinomial #classification #linearregression #probability #probabilitytheory #education #machinelearningtutorial
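As a minimal illustration of the question the video asks (this is a standard textbook sketch, not the code from the linked repository): vanilla gradient descent minimizes a function by repeatedly stepping opposite its derivative.

```python
# Minimal sketch of vanilla gradient descent on f(x) = (x - 3)^2,
# whose minimum is at x = 3. Hypothetical example, not the video's code.
def f(x):
    return (x - 3) ** 2

def grad(x):
    return 2 * (x - 3)  # derivative of f

def gradient_descent(x0, lr=0.1, steps=100):
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)  # step downhill, scaled by the learning rate
    return x

x_min = gradient_descent(0.0)  # converges toward the minimizer x = 3
```

The learning rate `lr` is the "step size" discussed in the comments below: too small and convergence is slow, too large and the iterates overshoot.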

Science

Published: 12 Apr 2024

Comments: 25
@arihansharma6384 2 months ago
This is an AWESOME introduction to gradient descent! I also love that it's more of a high-level overview rather than delving into the nitty-gritty details of the calculus required to make it happen; it's surprisingly beneficial for those who are already used to the concepts. Looking forward to watching Part 2 soon!
@howithinkabout 2 months ago
So glad to hear that! That means a lot to me, especially at this early stage of starting this channel! And I completely agree. The nitty-gritty often gets in the way of truly understanding certain concepts, but since those details are often the only ones we're tested on in school, it's hard to realize that something is missing.
@alefalfa 2 months ago
I watched 3blue1brown and thought there was nothing else to learn about gradient descent. I was wrong. Thank you for the video!
@howithinkabout 2 months ago
thank you so much for the kind words!! It means so much
@MarcOBrien-ie4vz 2 months ago
Wow, what an informative and clear summary with such cool animations. Well done!
@Shourya-bc7ku 2 months ago
loved the video, the format, the animations. hope to see more from you
@howithinkabout 2 months ago
thank you so much for the encouraging words!
@kryzhaist2483 1 month ago
Just discovered your channel. Amazing content! Thank you very much for your work, looking forward to seeing more of it!
@howithinkabout 1 month ago
It's definitely hard work to make these videos but comments like yours make it so worth it - thank you so much!
@sama32lambda 2 months ago
Awesome video. It's super intuitive when looked at this way.
@howithinkabout 2 months ago
Thank you so much! That makes me so happy to hear!
@FreerunnerCamilo 2 months ago
First year CS major here dipping my toes in ML and this explanation makes a lot of sense, would love more videos like this! Subbed.
@howithinkabout 2 months ago
That’s so great to hear, thank you for your encouraging words! Please feel free to suggest topics you would like me to cover
@mrjackrabbitslim1 2 months ago
Awesome. We'll watch as many of these as you're going to make.
@ijosakawi 2 months ago
Very nice video! I think there's a slight issue: the derivative of x^5 - 40x^3 - 5x can be solved really easily. Its derivative is just 5x^4 - 120x^2 - 5, and you can set that to zero, substitute u for x^2 to get 5u^2 - 120u - 5 = 0, use the quadratic formula to solve for u, take its square roots to get x, and check which is lowest in the original f(x). But the specific equation isn't what's important, and the video is very nice otherwise!
@ijosakawi 2 months ago
(by "the derivative can be solved really easily" I mean "you can easily find the zeroes of the derivative")
@howithinkabout 2 months ago
You're absolutely right! I had to make a decision as to what is "easy" to solve and decided that u-substitutions are not :D But great to point out! I'm sure many watching the video will learn something from your comment!
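The critical-point calculation described in this thread can be checked numerically. A quick sketch (assuming, as the comment states, f(x) = x^5 - 40x^3 - 5x, so f'(x) = 5x^4 - 120x^2 - 5 and u = x^2 gives 5u^2 - 120u - 5 = 0):

```python
import math

# f(x) = x^5 - 40x^3 - 5x; setting f'(x) = 5x^4 - 120x^2 - 5 to zero
# and substituting u = x^2 gives the quadratic 5u^2 - 120u - 5 = 0.
def f(x):
    return x**5 - 40 * x**3 - 5 * x

a, b, c = 5.0, -120.0, -5.0
disc = math.sqrt(b * b - 4 * a * c)
roots_u = [(-b + disc) / (2 * a), (-b - disc) / (2 * a)]

# Only non-negative u values give real critical points x = ±sqrt(u).
critical_xs = []
for u in roots_u:
    if u >= 0:
        critical_xs.extend([math.sqrt(u), -math.sqrt(u)])

# Evaluate f at each critical point to see which is lowest.
for x in sorted(critical_xs):
    print(f"x = {x:+.4f}, f(x) = {f(x):.2f}")
```

One root of the quadratic is negative, so only u = 12 + sqrt(145) yields real critical points, and evaluating f at x = ±sqrt(u) picks out the local minimum the commenter describes.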
@dann_y5319 2 months ago
Cool video!!!!
@cornevanzyl5880 5 days ago
Mitochondria are the powerhouse of the cell
@user-rn1ky8kq4k 2 months ago
wow!
@dhruvssharma1458 2 months ago
How would the concept of momentum tie into your explanation? Because when using Adam one usually specifies the learning rate (which is the step size) and the momentum.
@howithinkabout 2 months ago
Great question! I'm planning to talk more about variants of GD (including Adam) in an upcoming video about how to avoid some of the pitfalls of GD. But the TL;DR is this: Adam tries to use historical information contained in the successive gradients to make better step adjustments.
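For readers curious how "historical information" enters the update, here is a minimal sketch of classic heavy-ball momentum (a simpler relative of Adam, not Adam itself and not this channel's code), using the same 1-D setup as a toy example:

```python
# Gradient descent with momentum on f(x) = x^2 (minimum at x = 0).
# Hypothetical illustration: the velocity term carries a decaying
# history of past gradients into each step.
def grad(x):
    return 2 * x  # derivative of f(x) = x^2

def gd_momentum(x0, lr=0.1, beta=0.9, steps=200):
    x, velocity = x0, 0.0
    for _ in range(steps):
        velocity = beta * velocity + grad(x)  # accumulate gradient history
        x -= lr * velocity                    # step along the velocity
    return x

x_final = gd_momentum(5.0)  # approaches the minimizer x = 0
```

Adam goes further than this sketch: it keeps running averages of both the gradients and their squares, and rescales each step accordingly.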
@Cheke_180 2 months ago
Brooo, really loved your content haha. To add to the analogy, let's say you have your eyes closed / you have no touch sensation; I think that might be a great idea, but correct me if I'm wrong. In any case, really loved the video, keep up the good work 🎉🎉
@howithinkabout 2 months ago
Love that idea! If I find a way to visualize it I'll include this in part 2 :)
@Cheke_180 2 months ago
@@howithinkabout sounds awesome! I will be watching it ;)