
Maximum Likelihood : Data Science Concepts 

ritvikmath
163K subscribers
36K views

The story behind max likelihood .... fully explained!
Sigmoid Video : • The Sigmoid : Data Sci...
Logistic Regression Video : • Logistic Regression - ...
My Patreon : www.patreon.co...

Published: Sep 5, 2024

Comments: 62
@corybeyer1 · 3 years ago
this is art
@ritvikmath · 3 years ago
wow thanks!
@bannerlad01 · 3 years ago
You are a brilliant teacher - thanks so much for doing this
@abhishekchandrashukla3814 · 1 month ago
I swear to god, when I was searching for maximum likelihood estimation, I was hoping you would make a video on this, scrolled down a bit and bingo!! I see Rithvik Math. My happiness knows no bounds. Thank you for existing.
@Aruuuq · 6 months ago
Undeniably you're creating some of the best videos concerning statistics out there. And this is another one. Thank you so much
@xxshogunflames · 3 years ago
The collection of info *chef's kiss*
@DaquanMHall · 2 years ago
Man I watch your videos all the time. I can write the code and understand the outcome but you’re the only way I can understand the math. Thanks!
@KareemMnour · 2 years ago
Thank you so much for preferring to actually help people understand concepts rather than throwing fancy multi-step jargon that gets people frustrated at math topics. I would do anything I can to help keep the channel alive and I will recommend your channel to all my friends and colleagues. Thanks again and keep the excellent work.
@user-xg3wn9ip4u · 3 years ago
Love your videos! Very nice for revising and learning new things, without missing out on intuition either. Hope your follower count soars soon.
@mehtipsz · 3 years ago
As always, great videos! I mainly use them as supplement to masters level courses. What I love are the parts where you cover the intuitions about the formulas, it makes them so much more understandable. Keep up the good work!
@dilinijayasinghe8134 · 3 months ago
Thank you very much. Been struggling to get the intuition of MLE and you helped me to understand it. Would be awesome if you could do a video on GMM estimation. Thank you!!!
@yashpundir2044 · 3 years ago
Just 3K views on this? People are crazy. This deserves wayyyy more.
@omniscienceisdead8837 · 2 years ago
this was a very beautiful lecture
@dataman6744 · 2 years ago
Just Brilliant! thanks for demystifying logistic regression equations for me🤝
@rmiliming · 2 years ago
Excellently explained. Very clear and logical! Thanks!
@TheScawer · 3 years ago
Thank you for the video! I wanted to say they are great for revision, but I usually learn a lot more than I did in school on the topic... so thank you!
@MariaBarcoj · 1 year ago
Thanks for making things seem quite a bit simpler ☺
@arshadkazi4559 · 2 years ago
Excellent explanation, very good as an introduction. Can you make something that delves into the math even more? An explanation of the last part is necessary and would be fun to understand. :)
@moravskyvrabec · 1 year ago
Great stuff. I'm taking an online MIT class. Complicated topic? I come to your channel to solidify my understanding!
@ritvikmath · 1 year ago
Glad it was helpful!
@alessandro5847 · 3 years ago
Thanks for these lectures. You're great at explaining this stuff. Keep it up!
@shubhampandilwar8448 · 3 years ago
Very well explained. I am gaining confidence from these fundamentals lectures.
@jansanda544 · 3 years ago
Amazing video. But the whole time I was distracted trying to figure out what number series is on the tattoo. :D
@yodarocco · 1 year ago
Have you ever done a video on Maximum a Posteriori (MAP)?
@yulinliu850 · 3 years ago
Thanks for the great lecture. I really liked the word "seeing" outcomes.
@NickKravitz · 3 years ago
In English most people use the terms Probability and Likelihood interchangeably - I can't help but correct this when I hear it. One nuance is that the Maximum Likelihood result is often very small, meaning the parameter value isn't very likely, it is just more likely than the alternatives. Ranked Choice Voting is designed to promote the Most Likable Choice past the 50% threshold. Great video as always; I hope you become a stats and data science professor!
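The nuance in this comment, that the maximized likelihood is usually a tiny number that is merely larger than the alternatives, can be sketched with a hypothetical Bernoulli example (the flip data and parameter grid below are made up for illustration, not from the video):

```python
# 10 hypothetical coin flips: 7 heads (1), 3 tails (0)
flips = [1, 1, 1, 1, 1, 1, 1, 0, 0, 0]

def likelihood(p, data):
    """Probability of seeing exactly this sequence under Bernoulli(p)."""
    out = 1.0
    for y in data:
        out *= p if y == 1 else (1 - p)
    return out

# Evaluate the likelihood on a grid of candidate parameters
grid = [i / 100 for i in range(1, 100)]
best_p = max(grid, key=lambda p: likelihood(p, flips))

print(best_p)                     # 0.7, the MLE for 7 heads in 10 flips
print(likelihood(best_p, flips))  # ~0.0022: tiny, yet larger than any alternative
```

Even at the best parameter, the probability of the exact observed sequence is only about 0.002; the MLE "wins" only relative to the other candidate values of p, just as the comment says.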
@fszhang9010 · 3 years ago
Great & helpful video! At 12:40 you say: "This is the probability of seeing all the real world outcomes that I actually see in my data". I think it's better to replace "real world" with "predicted" or another synonym, since the former may mislead viewers into thinking those "outcomes" are records of events that actually happened, which they are not; they stem from the model's prediction. It's the data (y: n×1) that records the real results. At 20:05 it's expressed the correct way.
@maneechotunpuang5299 · 3 years ago
Your videos are absolutely helpful!! You're such a damn good teacher and really good at turning complicated lessons into something easier to digest. I hope I can pass this semester with your videos, because without them it would be even worse! 😂 THANKS A MILLION ❤️
@kaym2332 · 3 years ago
Amazing style of teaching. Thank you!
@harshads885 · 3 years ago
In the logistic regression part on the left, it's probably better to call out that the probability p is not the same as the number of data points p.
@Whatever20237 · 1 year ago
WOW! Thank you!
@ChocolateMilkCultLeader · 2 years ago
Keep putting out your bangers. I use them to learn how to communicate concepts. Shared this one with my network.
@goelnikhils · 1 year ago
Amazing Content. Thanks
@ritvikmath · 1 year ago
My pleasure!
@aminmohammadigolafshani2015 · 2 years ago
Amazing! Amazing! thanks a lot
@yerzhant701 · 1 year ago
Shouldn't the likelihood be the inverse of the probability P(y|x,beta), i.e. L(beta|x,y)?
@kisholoymukherjee · 7 months ago
Exactly my thoughts. From what I read in other sources, the likelihood is given by L(parameters or distribution | observed data). Perhaps @ritvikmath can explain better.
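On the notation question in this thread: by the usual convention the two expressions denote the same number, since L(beta | x, y) is defined as P(y | x, beta); only which variable is held fixed changes. A minimal sketch with a hypothetical one-feature logistic model (all data values below are made up for illustration):

```python
import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

# Hypothetical data: single feature x, binary outcome y
xs = [0.5, 1.5, -1.0, 2.0]
ys = [1, 1, 0, 1]

def prob_y_given_x_beta(y, x, beta):
    """P(y | x, beta) for a logistic model: beta fixed, y varies."""
    p = sigmoid(beta * x)
    return p if y == 1 else 1 - p

def likelihood(beta):
    """L(beta | x, y): the same product of probabilities,
    now read as a function of beta with the data fixed."""
    out = 1.0
    for x, y in zip(xs, ys):
        out *= prob_y_given_x_beta(y, x, beta)
    return out

print(likelihood(1.0))  # identical number, different interpretation
```

So nothing is "inverted" numerically; the likelihood is not P(beta | y), which would be a posterior and would require a prior over beta.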
@jiaqint961 · 1 year ago
Gold!
@oneclickdiy · 3 years ago
Thank you! These videos are a good refresher.
@hameddadgour · 2 years ago
The Gods of Statistics finally decided to send us a prophet :)
@Leila0S · 5 months ago
100% He is a magician. I don't understand how smoothly he makes the concept sweep into one's brain.
@user-ty9bv6kx1b · 1 year ago
Thanks for the video. Could you point to a reference on why EM converges to a maximum, either local or global?
@amjedbelgacem8218 · 1 year ago
This guy makes Machine Learning easy bro, subscribed
@jakobforslin6301 · 2 years ago
You're an awesome teacher, thanks a lot!
@bryany7344 · 3 years ago
Can I ask what the difference is between the log likelihood and the negative log likelihood, graphically? How do I choose which loss function to use?
@prateekcaire4193 · 4 months ago
What should the probability P(y_i | x_i, beta) be when the actual y_i is a reject (= 0)? If P(y_i | x_i, beta) is close to 0, the likelihood will not be maximal even though the beta parameters are fine-tuned.
@tianhelenaa · 2 years ago
This is truly amazing!!!
@fotiskamanis8592 · 10 months ago
Thank you!!!
@ianstats97 · 1 year ago
Great video, I just did not understand where the sigma came from.
@bobby2636 · 1 year ago
Question: At 8:03 you introduce the concept of the likelihood, which from my understanding is the probability of the real observations emerging given the y; but in the formula it looks like the posterior probability, not the likelihood. Is there something missing?
@TheRish123 · 3 years ago
What a guy! Amazing stuff
@jijie133 · 1 year ago
I love your videos.
@robertpollock8617 · 10 months ago
I am confused. You are saying that probability and likelihood are the same, according to what you have written in your equations. For likelihood, are you not trying to say: given acceptance into med school, the likelihood of having these values for GPA, MCAT score, etc.? For instance, if the probability is P(y|x), then the likelihood is L(x|y)? You have these two being equal.
@fatriantobong8169 · 1 year ago
Hmmm, how do you bind the y_i to the sigmoid function?
@montycardman2535 · 1 year ago
Would the likelihood function be between 0 and 1?
@ireoluwaTH · 3 years ago
Neat 'Mathematese'...
@ling5544 · 1 year ago
When the derivative is 0, it could also be a local minimum, right? How do we know that when the derivative is 0 the likelihood is maximized?
@ritvikmath · 1 year ago
While it's true that derivative = 0 could mean a min or a max, we can distinguish them since a min has a decreasing gradient on the left and an increasing gradient on the right; a max is the opposite. Hope that helps!
@ling5544 · 1 year ago
@ritvikmath Thanks! I got it.
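The gradient-sign check described in the reply above can be verified numerically for a hypothetical Bernoulli log-likelihood (7 successes in 10 trials, chosen just for illustration):

```python
import math

heads, n = 7, 10  # hypothetical data: 7 successes in 10 Bernoulli trials

def log_lik(p):
    """Log-likelihood of p for the observed counts."""
    return heads * math.log(p) + (n - heads) * math.log(1 - p)

def grad(p, h=1e-6):
    """Central-difference approximation of the derivative."""
    return (log_lik(p + h) - log_lik(p - h)) / (2 * h)

p_mle = heads / n  # 0.7, where the derivative is zero

print(grad(p_mle - 0.01))  # positive: log-likelihood increasing on the left
print(grad(p_mle + 0.01))  # negative: log-likelihood decreasing on the right
```

A positive gradient just left of the critical point and a negative gradient just right of it is exactly the pattern of a maximum; a minimum would show the opposite signs.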
@asadkhanbb · 2 years ago
Wow that t-shirt ❣️❣️❣️ cool 😎
@redherring0077 · 2 years ago
Please marry me😍😂😂. I can listen to you forever. Such a passionate teacher!
@akashswain7939 · 5 months ago
L