
How I think about Logistic Regression - Technical Interlude 

How I think about
603 views

The Math Behind Logistic Regression.
Negative Log Likelihood 00:00-05:25
Gradient Descent Step by Step 05:26-07:20
Scale Your Data 07:21-09:09
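For readers following along, the three chapters above (negative log likelihood, gradient descent, scaling your data) can be sketched in a few lines of NumPy. This is my own minimal illustration, not the code from the video or the linked GitHub repo; the data and variable names are assumptions:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def nll(w, X, y):
    # Negative log likelihood for binary labels y in {0, 1}
    p = sigmoid(X @ w)
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

def gradient_descent(X, y, lr=0.1, steps=500):
    # Repeatedly step against the gradient of the mean NLL
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = sigmoid(X @ w)
        w -= lr * (X.T @ (p - y)) / len(y)
    return w

# "Scale Your Data": standardize features to zero mean and unit variance
# so gradient descent is not dominated by the largest-scale feature.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2)) * np.array([1.0, 50.0])  # second feature badly scaled
y = (X[:, 0] + X[:, 1] / 50.0 > 0).astype(float)      # synthetic separable labels
X_std = (X - X.mean(axis=0)) / X.std(axis=0)
w = gradient_descent(X_std, y)
```

After standardizing, the same learning rate works for both features and the NLL drops steadily; on the raw `X` the second column's scale would force a much smaller step size.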
Part 1: • How I think about Logi...
Part 2: • How I think about Logi...
Visualization and animation code on GitHub: github.com/gallettilance/repr...
Thumbnail by / endless.yarning
#mathformachinelearning #gradientdescent #machinelearning #datascience #datasciencebasics #datasciencetutorial #machinelearningalgorithm #logisticregression #machinelearningbasics #maths #softmax #multinomial #classification #linearregression #probability #probabilitytheory #education #math #machinelearningtutorialforbeginners #machinelearningtutorial #neuralnetworks

Science

Published: 6 Jul 2024

Comments: 12
@jakesimonds5051 6 days ago
These videos are fantastic. Your pacing is (for me at least) excellent, the illustrations are awesome, and you're doing a fantastic job of motivating everything. Keep it up!!!!!
@howithinkabout 6 days ago
So glad to hear it! Thanks for the kind and encouraging words :) I'll do my best!
@frannydonington9925 8 days ago
Another great video!! Such good quality explanations. A really great study tool :)
@howithinkabout 8 days ago
thank you so much!! 🙏
@ssingh7317 8 days ago
Keep this machine learning concepts video series going :)
@howithinkabout 8 days ago
I got big plans for this channel :) but let me know what you would like to learn about!
@ssingh7317 8 days ago
@@howithinkabout I would love to watch the mathematics, for a better understanding of the algorithms.
@parsami7101 6 days ago
I just found your channel by chance and subscribed. Great job with both the explanation and the video! You could also show the proof for linear regression in the next video, since it's easy to understand even for people with little math background.
@howithinkabout 5 days ago
thanks so much!! which proof are you talking about exactly? I'll definitely consider adding it in the next one of these!
@parsami7101 5 days ago
@@howithinkabout here is a video to understand it: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-DSQ2plMtbLc.htmlsi=AtQxjTAvlTnjzxMX but long story short, linear regression gives us a function y = ax + b where a = (n·Σxy − Σx·Σy) / (n·Σx² − (Σx)²). We can derive the formulas for a and b by summing the squared residuals, taking the partial derivatives with respect to a and b, and solving the resulting system of two equations. This helped me understand linear regression much better.
@howithinkabout 5 days ago
@@parsami7101 I see - sadly in logistic regression we don’t have such formulas (that’s why we use gradient descent) because the maximum likelihood problem in this case doesn’t have a closed form solution. Hope that helps clarify things!
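(For reference, the closed-form least-squares slope and intercept the thread above is discussing can be checked numerically. This is my own hedged sketch in NumPy; the synthetic data and variable names are assumptions, not from the video:)

```python
import numpy as np

# Synthetic data from a known line y = 2.5x + 1 plus noise
rng = np.random.default_rng(1)
x = rng.uniform(0, 10, size=50)
y = 2.5 * x + 1.0 + rng.normal(scale=0.5, size=50)

n = len(x)
# Closed-form slope from the least-squares derivation:
# a = (n Σxy − Σx Σy) / (n Σx² − (Σx)²)
a = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / (n * np.sum(x * x) - np.sum(x) ** 2)
# Intercept follows from the means: b = ȳ − a·x̄
b = (np.sum(y) - a * np.sum(x)) / n
```

The estimates land close to the true slope 2.5 and intercept 1.0. No such one-shot formula exists for logistic regression, which is exactly why the video reaches for gradient descent.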
@parsami7101 4 days ago
@@howithinkabout oh okay, thanks anyways