How I think about Logistic Regression - Part 1 

How I think about
325 subscribers
1.6K views

A (hopefully) simple and intuitive explanation of logistic regression for binary classification.
Intro and Overview 00:00-02:15
How to think about the Threshold 02:15-04:35
Assigning Likeliness 04:36-08:14
Maximum Likelihood 08:15-11:02
Finding the Best Threshold 11:03-11:34
Recap 11:35-12:35
Part 2: • How I think about Logi...
The Math Behind Logistic Regression: • How I think about Logi...
Part 3: Coming Soon!
Gradient Descent Video: • How I think about Grad...
Some Reading on Probability: / overview-of-probability
Visualization and animation code on GitHub: github.com/gallettilance/repr...
Thumbnail by / endless.yarning
#machinelearning #logisticregression #education #classification #optimizationalgorithm #explained #mathexplained #machinelearningalgorithm #mathformachinelearning #machinelearningbasics #datascience #datasciencebasics #linearregression #probability #probabilitytheory

Science

Published: Jul 4, 2024
Comments: 17
@jakeaustria5445 18 hours ago
Wow, this channel needs to explode in views!
@FitsumWondessen 9 hours ago
really great video
@sriveralopez 11 days ago
Yet another one of those videos that, in your head, you think have 100k+ views but it turns out we're just lucky to be here first.
@howithinkabout 11 days ago
🙏🙏🙏 that’s so encouraging and kind! Thank you!
@HM-wo6ic 20 days ago
That was a pleasure to watch.
@howithinkabout 19 days ago
So glad to hear it!
@frannydonington9925 a month ago
This was so helpful! A really great, new perspective (and way of explaining it)
@ryanschofield6160 a month ago
Awesome!
@nickwendt4023 a month ago
Great video, excited for more
@howithinkabout a month ago
Thank you so much for watching!! Part 2 (and 3!) should be out real soon
@shahulrahman2516 22 days ago
Great video
@howithinkabout 22 days ago
thanks so much!! I hope you enjoy part 2 when you get around to it :)
@mrjackrabbitslim1 a month ago
Awesome. Question: why would you be adjusting the constants, i.e. in the hours/exam length basic chart? The final probability went up, but what does that two-fold increase represent in the real world, not math language?
@howithinkabout a month ago
Thanks so much for watching!! And great question! As you change these parameters you generate different probabilities across your space, allowing you to describe it better (and thus make better predictions, etc.). So it's less about what these parameters **mean** and more about what they let you **do** (which is to mold the sigmoid function to the data). In logistic regression there is an interpretation of the parameters as increases to the log-odds, but that's pretty mathy and, as far as I can tell, just happenstance and not by design (happy to elaborate on this if you want). In probit regression, for example, there exists no such interpretation, but the mechanism is the same. Similarly for neural networks.
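To make the reply above concrete, here is a minimal Python sketch (not from the video) with a single hypothetical feature, hours studied, and made-up parameter values: the same input gets assigned different probabilities as the weight and constant change, which is what "molding the sigmoid to the data" means in practice.

```python
import math

def sigmoid(z):
    """Map any real number to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical parameters: w weights hours studied, b is the constant.
# Different (w, b) pairs assign different probabilities to the same point.
hours = 4.0
for w, b in [(0.5, -1.0), (1.0, -2.0), (2.0, -4.0)]:
    p = sigmoid(w * hours + b)
    print(f"w={w}, b={b}: P(pass | hours={hours}) = {p:.3f}")
```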
@mrjackrabbitslim1 a month ago
@@howithinkabout ah I see. It makes sense that you'd want to shape the sigmoid by manipulating the parameters; intuitively, it's still hard to convert the abstracted constant value into a tangible example. But as von Neumann said, "in mathematics you don't understand things. You just get used to them." Waiting eagerly for the next vid!
@howithinkabout a month ago
@@mrjackrabbitslim1 I'm more of the opinion that if you don't understand it, someone's not explaining it well enough :) I made an animation just for you github.com/gallettilance/reproduce-those-animations/blob/main/examples/linear_function.gif to demonstrate what happens to the threshold as you change the parameters. This lets you rotate, shift, and center the sigmoid function. The constant specifically is responsible for shifting things. I'll try to make this more clear in part 2 - thanks for sharing your thoughts!
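The shifting behavior described in the reply above can also be seen numerically. In a one-feature model P = sigmoid(w*x + b), the 50% decision threshold sits where w*x + b = 0, i.e. at x = -b/w, so changing the constant b slides that threshold along the x-axis. A small sketch with hypothetical values:

```python
import math

def sigmoid(z):
    """Map any real number to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

# The P = 0.5 threshold sits where w*x + b = 0, i.e. x = -b/w.
# Varying the constant b shifts that threshold; w = 1 here for simplicity.
w = 1.0
for b in [-2.0, 0.0, 2.0]:
    threshold = -b / w
    print(f"b={b}: threshold at x={threshold}, P(x=threshold)={sigmoid(w * threshold + b):.1f}")
```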
@mrjackrabbitslim1 a month ago
@@howithinkabout wow, thank you!