
How does Netflix recommend movies? Matrix Factorization 

Serrano.Academy
156K subscribers
344K views

Science

Published: Oct 2, 2024

Comments: 331
@conintava514 · 5 years ago
So informative and easy to follow. I love this. Thank you so much for taking the time to create this video. It's so important to know how the concepts we learn in class can be applied in real life. This has changed everything for me. Thank you again.
@Xraid32 · 5 years ago
Sharknado = Twister + Jaws. This was gold. That was the moment all of Machine Learning made sense.
@ianboard544 · 4 years ago
It sounds like a pitch you might make to a production executive.
@AmeyPanchpor02 · 4 years ago
Very true
@computerguycj1 · 5 years ago
Sir, I've seen almost all of these concepts painfully "explained" in many different ways, but never have I seen them presented as elegantly and intuitively! Excellent video!
@TheGenerationGapPodcast · 3 years ago
Most of us who have been watching your videos are changed forever. We are convinced now that there are better ways to teach machine learning and your way is one of the better ways. Thanks
@crazywebhacker9769 · 4 years ago
There are not many really good ML videos on YT. This is by far one of the best.
@sarthaktiwari1889 · 4 years ago
It is one thing to have a great hold over technical concepts and another thing to be able to explain them. You have both. Very well explained!!!
@azurewang · 2 years ago
Watched it again after 3 years, still amazed!
@jonlenescastro1662 · 2 years ago
Definitely the best explanation on YT
@guanyanlin1933 · 4 years ago
No doubt. It is definitely an excellent tutorial, and it gives a reasonable answer to why the weights in the hidden layer are the embedding of a movie or a person. Thanks a lot.
@killuawang677 · 4 years ago
Actually, I still have a question after this video: how do we know how many features we are supposed to have? I.e., how were you able to decide the factorized matrices are 2 x N and M x 2? Does it mean you might end up getting a feature that is a combination of multiple "actual features" and you need to further break it down?
@samarthpianoposts8903 · 2 years ago
That is something people experiment with, by seeing what gives them the best result. In general, people try values proportional to the logarithm of the number of unique items.
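The number of latent features (the rank) is indeed usually tuned empirically, as the reply above says. A minimal sketch of that experiment in numpy, assuming a toy, fully observed ratings matrix and a hypothetical `factorize` helper (all names and values are made up for illustration, not taken from the video):

```python
import numpy as np

def factorize(R, k, steps=2000, lr=0.01, seed=0):
    """Toy matrix factorization via gradient descent (hypothetical helper)."""
    rng = np.random.default_rng(seed)
    n_users, n_movies = R.shape
    U = rng.normal(scale=0.1, size=(n_users, k))
    V = rng.normal(scale=0.1, size=(k, n_movies))
    for _ in range(steps):
        E = R - U @ V          # residual on every entry (matrix fully observed here)
        U += lr * E @ V.T      # move both factor matrices along the negative gradient
        V += lr * U.T @ E
    return U, V

R = np.array([[3., 1., 1., 3.],
              [1., 2., 4., 1.],
              [3., 1., 1., 3.],
              [4., 3., 5., 4.]])   # a rank-2 toy ratings matrix

for k in (1, 2, 3):
    U, V = factorize(R, k)
    rmse = np.sqrt(np.mean((R - U @ V) ** 2))
    print(f"k={k}: RMSE {rmse:.3f}")   # training error shrinks as k grows
```

On held-out ratings the error curve typically turns back up past some k, which is one practical way to pick the rank.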
@anushkagupta79 · 3 years ago
I read so many articles about this topic but was never able to understand. You made it all so easy. Excellent work!!!
@atulitraopervatneni9320 · 5 years ago
You are one of the best teachers on YouTube. Thanks
@blackstallion9605 · 2 years ago
This is amazing, it has really opened my mind. Thank you so much
@armasaaf6180 · 1 year ago
Thank you for making it easy to understand. Great job!
@skymagickid · 2 years ago
Very clear explanation, amazing, thank you!
@iamdanielkip · 5 years ago
I was driven here after reading a chapter in RGA's book where they mention "collaborative filtering". I was curious and decided to learn more about it. I would like to know, though: what programming language is generally used to achieve this? Thank you for the very simple and fun explanation.
@shivanshkaushik383 · 2 years ago
This is a work of art. Never thought matrix factorization could be explained so effortlessly yet so clearly. You have helped me a lot with this sir! Thank You, God bless you!
@amandaahringer7466 · 2 years ago
Excellent explanation, great job! Thank you for sharing!
@josephhsueh6456 · 5 years ago
Appreciate your efforts to make such a good video! Thank you! Everything is detailed! Love it.
@jjj78ean · 1 year ago
Amazing explanation! Thank you Luis
@sargun.nagpal · 3 years ago
Given a new movie M6, how do we assign feature values for the movie? A related question: given a new person E, how do we come up with their interest in the different features?
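One common answer to the new-movie question above (a sketch of the usual "fold-in" trick, not necessarily what the video or Netflix does): keep the already-learned user factors fixed and solve a small least-squares problem for the new movie's feature vector using whatever ratings it has received. The matrices and ratings below are hypothetical:

```python
import numpy as np

# Hypothetical learned user factors (4 users, k = 2 features), held fixed.
U = np.array([[3., 1.],
              [1., 2.],
              [2., 2.],
              [4., 1.]])
ratings_for_m6 = np.array([4., 3., 4., 5.])   # ratings the new movie M6 received

# Solve the least-squares problem  U @ v ≈ ratings  for the movie vector v.
v_m6, *_ = np.linalg.lstsq(U, ratings_for_m6, rcond=None)
print(v_m6)          # → approximately [1. 1.]

# M6's predicted rating for any user is then the usual dot product:
print(U @ v_m6)      # → approximately [4. 3. 4. 5.]
```

The same idea transposed handles a new person: fix the movie factors and solve for the person's preference vector.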
@reyhanehhashempour8522 · 6 years ago
Luis! You are a fantastic teacher! Not everyone can explain complicated concepts in a way that everybody understands. Your teaching style shows the depth of your knowledge! Thank you!
@AmeyPanchpor02 · 4 years ago
Really, this is one of the best introductory videos I have found. Knowledge + simple examples = a very good understanding of the topic.
@kevdaag2523 · 5 years ago
That was great, the way you explained matrix factorization and then turned it into an explanation of ML.
@ws-ob4wy · 5 years ago
I find your teaching method not only to be great but also very valuable to motivate young people to take up Machine Learning. You could make it even better by also relating it to the math (Linear Algebra, Calculus, Probability) in a more familiar form. Make sure that anyone teaching and learning ML in a college environment will be aware of your videos. Great stuff.
@andykim1614 · 3 years ago
Not even going to lie, very hard NOT to follow. Thank you for the explanations!
@mahimahans111 · 3 years ago
Very nicely explained. Thanks.
@NidaSyeda · 5 years ago
This is excellent! Thank you for simplifying it. You are incredibly talented.
@bayesml · 4 years ago
Thank you for the video. I just wonder: do I need a matrix with all the numbers filled in for training, and then test the trained model on a sparse test matrix?
@UmeshRajSatyal · 5 years ago
Well explained and very easy to understand. Stopping here to thank you.
@guitar300k · 2 years ago
In your example we have two latent factors, so how do we know which one should increase and which one should decrease to reduce the error? It seems like you have to increase/decrease both of them at the same time.
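On the question of which factor to move: in gradient descent you don't have to choose; both latent vectors are updated in the same step, each using the other's current value. A hedged numpy sketch of a single-rating SGD step (the learning rate and numbers are illustrative, not from the video):

```python
import numpy as np

def sgd_step(u, v, r, lr=0.1):
    """One gradient step on (r - u.v)^2, moving both vectors at once."""
    err = r - u @ v            # prediction error for this one rating
    u_new = u + lr * err * v   # derivative w.r.t. u is -2*err*v (the 2 folds into lr)
    v_new = v + lr * err * u   # derivative w.r.t. v is -2*err*u
    return u_new, v_new

u = np.array([1.0, 0.5])       # a user's latent features
v = np.array([0.8, 1.2])       # a movie's latent features
r = 3.0                        # the observed rating
for _ in range(50):
    u, v = sgd_step(u, v, r)
print(u @ v)                   # converges toward the target rating 3.0
```

Note that each update reads the other vector's old value, so the order of the two lines inside the step does not matter.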
@nurkleblurker2482 · 3 years ago
Great video. But how do you determine a user's preferences for movies in the first place?
@dudibs1 · 3 years ago
I needed a refresher on this and found this video helpful. Thx
@기바랜드 · 5 years ago
Thanks for the wonderful explanation! Really like the way you teach.
@NoahAndABadger · 4 years ago
I'm a bit confused about the derivative bit. What are we taking the derivative with respect to?
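To the derivative question: the squared error is treated as a function of the entries of the two factor matrices (the ratings are constants), so partial derivatives are taken with respect to every feature value. A sketch for a single rating in generic notation (the symbols here are illustrative, not necessarily the video's):

```latex
E_{ij} = \Bigl(r_{ij} - \sum_{k} u_{ik} v_{kj}\Bigr)^{2},
\qquad
\frac{\partial E_{ij}}{\partial u_{ik}}
  = -2\Bigl(r_{ij} - \sum_{k'} u_{ik'} v_{k'j}\Bigr) v_{kj},
\qquad
\frac{\partial E_{ij}}{\partial v_{kj}}
  = -2\Bigl(r_{ij} - \sum_{k'} u_{ik'} v_{k'j}\Bigr) u_{ik}.
```

Gradient descent then nudges every feature value $u_{ik}$ and $v_{kj}$ a small step opposite to its derivative.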
@willdog4352 · 2 years ago
Very good explanation
@karannchew2534 · 1 year ago
Hi Serrano, a suggestion, please: before walking through a detailed example, first introduce the overall concept/algorithm/intuition and the content/agenda. First tell learners what they should expect to see/learn, then start teaching them. Thanks for all the useful videos!
@c0t556 · 5 years ago
Can you get overfitting with matrix factorization?
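Yes, matrix factorization can overfit, especially with many latent features and few ratings; a standard remedy is to add an L2 penalty on the factor entries to the squared error. A sketch under those assumptions (the penalty weight `lam`, the matrix, and the hyperparameters are all illustrative):

```python
import numpy as np

def penalized_loss(U, V, R, lam):
    """Squared reconstruction error plus an L2 penalty on the factors."""
    return np.sum((R - U @ V) ** 2) + lam * (np.sum(U ** 2) + np.sum(V ** 2))

def step(U, V, R, lr=0.01, lam=0.1):
    """One gradient-descent step on the penalized loss."""
    E = R - U @ V
    U += lr * (E @ V.T - lam * U)   # the penalty shrinks the factors toward 0
    V += lr * (U.T @ E - lam * V)
    return U, V

rng = np.random.default_rng(0)
R = np.array([[3., 1., 1., 3.],
              [1., 2., 4., 1.],
              [4., 3., 5., 4.]])
U = rng.normal(scale=0.1, size=(3, 2))
V = rng.normal(scale=0.1, size=(2, 4))
before = penalized_loss(U, V, R, lam=0.1)
for _ in range(500):
    U, V = step(U, V, R)
print(penalized_loss(U, V, R, lam=0.1) < before)   # training lowered the loss
```

The penalty keeps the factor values small, which trades a little training error for better behavior on ratings the model has never seen.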
@MegaVikas1990 · 5 years ago
I think your students don't know what failing an exam feels like. One doubt at 25:15: I didn't understand how exactly we change those values. Should some ratio between the features be followed?
@jackshi7613 · 2 years ago
Well explained concepts, really appreciate your nice video
@nandankakadiya1494 · 3 years ago
Just amazing can't describe in words❤️🙏
@dayan5402 · 4 years ago
Real-life application + theory in simple terms. Very nice! Thank you!
@chandanroy1789 · 3 years ago
Great explanation! I was looking for something cool and simple to refresh my past learnings.
@rishabhchoudhary0 · 2 months ago
Do you take the blank cells in the sparse matrix as zeros when calculating its factors?
@nikhithasagarreddy · 4 years ago
Super, sir, every class is very clear. There are only a few classes available. Please upload every class, sir 😘😘😘
@killuawang677 · 4 years ago
This video deserves 10x more likes. I've got to say, it is so much better than Google's own recommendation system crash course...
@SurajAdhikari · 4 years ago
Thanks Luis. This is one of the first videos I watched on Matrix Factorization and I understood them really well. Great job. Keep posting.
@usmanabbas7 · 2 years ago
Great video, Luis :) I have one question, though: how do we decide the number of latent features, and what are the trade-offs of using a high/low number of latent features? Thanks
@ultraviolenc3 · 4 years ago
Great video! So much easier now to comprehend more complicated material after your explanation
@gmdl007 · 3 years ago
Hi Luis, Yannet's GitHub code is very hard to follow on how the factorization is done, for a beginner with no previous exposure to word embeddings or PyTorch. Is there any other code or tutorial you can recommend that shows how the factorization is done? Best of all would be if you could make one for word2vec!
@hasanyousef6782 · 3 years ago
Man that's deep.
@Dat-Potato · 5 years ago
As soon as I saw twister and jaw.... i was like... oh noes..... its sharknadooooo.
@mohitaggarwal6220 · 11 months ago
The explanation for gradient descent was great, but I'm a little confused about the 25:00 minute part. In the matrix, the (1,1) element is 1.44, but the actual value is 3. So, we need to increase something. It could be [f1][m1], [f2][m1], [A][f1], or [B][f2]. How do we decide which one to increase? And by increasing which value and by what factor can we get accurate results? Increasing a single value or multiple values can potentially bring us closer to the answer. If anyone has an answer for this doubt, please clarify. I'm curious to know.
@ASHISHDHIMAN1610 · 4 years ago
Hey, sorry for nit-picking, but at 17:01 the red triangle should have the transposed shape, i.e., greater height (2000 users) than width (1000 movies)!! Great video though!! Please make one on Gaussian mixture models.
@Yan-dh5yc · 2 years ago
I think you got Sharknado wrong; it should be 3 in comedy and 1 in action.
@pratikmandlecha6672 · 1 year ago
This was so good
@fernandobezerra4040 · 4 years ago
THE BEST VIDEO ABOUT MATRIX FACTORIZATION EVER! CONGRATULATIONS, TEACHER!
@DevanshiRuparel · 4 years ago
Do you think the dot product calculation assumes that a low rating (of 1 on 5) will not decrease their overall rating for the movie? (I.e. the calculation only adds the preferences & if Person A doesn't like Action, their rating for a comedy + action movie will equal that of only a comedy movie and the overall rating for a movie will not be reduced because they dislike action). Is this a limitation of the dot product calculation? How do you think Netflix takes this into account?
@보라색사과-l1r · 4 years ago
You made a whole bunch of other content about CF and Matrix Factorization look boring and meaningless. Thank you so much for this incredibly plain explanation.
@peaceandlov · 3 years ago
Best video ever. Thanks mate.
@MrChristian331 · 3 years ago
If NMF is a type of dimensionality reduction / unsupervised algorithm... then I thought unsupervised methods don't have an error function and therefore don't need an optimization procedure like gradient descent to reduce it?
@iitgupta2010 · 5 years ago
Such an awesome video... very informative.
@kathyker3498 · 3 years ago
Hi, I'm curious: why is the table at around 2:22 not realistic? Why do you say there's a similarity between them? How do I observe the similarity? I thought the middle and right tables were both fine... And thanks! It's a great video :)
@kohjiarong6452 · 4 years ago
Not too sure how realistic the data in the 5x4 matrix are, but not all users who have watched a movie would give a rating, and not every user would have watched every single movie. How do we deal with such sparsity? I don't think we can just assign zeros, right?
@bdubbs717 · 5 years ago
Lol, "In machine learning we never say that's good, it's just close enough." Too true.
@spitalhelles3380 · 2 years ago
Ok, but Dana definitely netflix&chills with both Betty and Carlos. What a philandering rapscallion!
@phaniramsayapanen5890 · 3 years ago
Great explanation; you clearly understand the concept very well. Subscribed immediately! Any videos on expectation maximization, SVD, or dimensionality reduction? Or resources that you liked most?
@XxAssassinYouXx · 1 year ago
My name is also Luis, but I pronounce it LooEEs, and it sounds like you say it as lwis. Never thought I'd see a Luis pronounce it like that.
@MohitJaggi-f8h · 1 month ago
Nicely explained. Small nit: you say we square to avoid ambiguity between positive and negative, which is a misleading simplification. The real reason is to keep the errors from canceling each other out when you add them up over all ratings. That is indeed the step you show next, so it would be easy to add an accurate explanation.
@ZavierBanerjea · 2 months ago
As always, a big fan of Luis! He is a master of the "explain this concept to a kid" idea. Of course, that is what greatness is!
@thanhthanhtungnguyen8536 · 7 months ago
Hi prof. Thank you for an amazing lecture, but can you tell me how I can deal with cold-start problems, e.g., if the user is new and doesn't have any info, or the movie is new?
@psychopedia1631 · 8 months ago
The video just summarised that graphs and matrices can be viewed as isomorphic systems in machine learning! How many of you feel so? 👇
@camille_leon · 3 years ago
This video ends way too early. So when you're training your model on sparse data, what do you do with the unrated movies? Do you treat them as zeros? Do you exclude them from your error function?
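A common answer to the sparsity question above (a sketch of masked gradient descent; an assumption about the usual practice, not a claim about the video's exact code): the blanks are excluded from the error with a mask, so they are never treated as zeros and contribute nothing to the updates.

```python
import numpy as np

R = np.array([[3., 0., 1.],
              [0., 2., 4.],
              [3., 1., 0.]])              # 0s here stand for "unrated", not real ratings
mask = np.array([[1, 0, 1],
                 [0, 1, 1],
                 [1, 1, 0]], dtype=float)  # 1 = observed, 0 = blank

rng = np.random.default_rng(0)
U = rng.normal(scale=0.1, size=(3, 2))
V = rng.normal(scale=0.1, size=(2, 3))
for _ in range(3000):
    E = mask * (R - U @ V)    # blanks never enter the loss or the gradients
    U += 0.02 * E @ V.T
    V += 0.02 * U.T @ E

print(np.round(U @ V, 1))     # the blank cells are now filled with predictions
```

The filled-in values in the blank cells are exactly the recommendations: what the model thinks those users would rate those movies.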
@aakashyadav1589 · 1 year ago
Are we giving the features of users and movies as input, or are they extracted by the MF algorithm itself? During gradient descent, is the algorithm learning weights for each of the features, or does the algorithm change the features, as shown in the video?
@SajjadZangiabadi · 1 year ago
The instructor does an excellent job of breaking down concepts and explaining them step by step in a way that is easy to understand. I appreciate the time and effort put into creating such an informative and well-presented video. Thank you for sharing your knowledge with us.
@obesechicken13 · 4 years ago
One issue I see right away is that this solution doesn't scale. Netflix, for instance, has way more movies and way more users than in this table, and the number of calculations grows rapidly as those two dimensions increase.
@nguyenhiep6639 · 3 years ago
Thanks for your full explanation. It helped me a lot to understand my project.
@nikhilbelure · 5 years ago
Elegantly explained. I like the description "a very friendly introduction". I was struggling to see how matrix factorization plays a role in recommendation systems. Now I get it. Thanks.
@osmanovitch7710 · 2 years ago
Hello, teacher, thank you. Can we say that matrix factorization = Boltzmann machine?
@dpdp006 · 2 years ago
Thank you for your attention to detail. Why isn't all teaching made as simple as what you just explained?
@MrGreen087 · 3 years ago
I am 34 with no tech background, learning ML stuff on Friday night)
@amanzholdaribay9871 · 5 years ago
WooooW! That was as simple as possible! If a person truly understands something, they can explain it even to a child, and your level of understanding is amazing! Thank you!
@sandrabernal5630 · 2 years ago
Excellent presentation. Especially the representation of two sexy Canadians named Ryan 🤣
@ChetanRawattunein · 4 years ago
Thank you, YouTube recommender, for the video 🤗. I was really looking for something this informative.
@spikeydude114 · 2 years ago
You really did a great job of distilling what I saw as a complex topic to something practical and understandable. Great video!
@ishand8209 · 2 years ago
Amazing explanation. Totally worth watching.
@bendim94 · 3 months ago
How do you choose the number of features in the two matrices, i.e., how did you choose to have only 2 features?
@boywithacoin · 9 months ago
I thought you were going to show actual factorization methods like QR or LU.
@codingpineappl3480 · 2 years ago
Best video you can find about matrix factorization. Thanks a lot.
@gholamrezadar · 3 years ago
Amazing explanations. Thank you for this video.
@Azureandfabricmastery · 4 years ago
Thanks. Nicely explained with visuals to understand matrix factorization.
@title601a · 5 years ago
Awesome!!! Nice presentation, simple and easy to understand. Thank you so much :)
@dhruvbarot · 4 years ago
Awesome, simple, mind-blowing... both the explanation and the presentation.
@Lae56 · 5 years ago
Awesome! Truly appreciate. Very informative and easy to follow.
@serafeiml1041 · 5 months ago
Great explanation. Is this factorization a Non-negative matrix factorization?
@giangpham6044 · 5 years ago
Well explained and very easy to understand. Thank you.
@Utkarsh_vns · 6 months ago
Instead of finding relations between different users, we can find the relationship between a user and the features of the target variable.
@TitasSaha-er5ye · 2 years ago
You are the best!! I am amazed that I understood the video in just one go. Thank you :D
@HimanshuRobo · 5 years ago
How do you decide the number of features in general? Is there a technique to identify the optimum number of features? Can you suggest some algorithms?
@alirezariazi5325 · 3 years ago
Complex information explained very simply, TNX!