
10. Introduction to Learning, Nearest Neighbors 

MIT OpenCourseWare
5M subscribers
265K views

Published: 29 Oct 2024

Comments: 107
@MM-uh2qk 5 years ago
Thank you, MIT. I just found out today that Professor Patrick passed away on the 19th of July, 2019. I am immensely saddened by this news. I was actually looking forward to meeting you, but I guess that is no longer possible. Rest in peace, legend!
@nikolasevan3564 3 years ago
Instablaster...
@AhmedSALAH-bb7un 2 years ago
RIP Professor Patrick Winston; your work will definitely live on forever.
@dragolov 2 years ago
Deep respect!
@cristiannievesp 3 years ago
"You have to be very careful about the confusion of correlation with cause. They see the correlation, but they don't understand the cause, so that's why they make a mistake." This is so simple but so meaningful!
@user-ol2gx6of4g 7 years ago
The last minute of the lecture is gold.
@Andrei-fg6uv 7 years ago
...aaaaand this is why MIT is one of the top educational institutions in the world! Thanks, MIT!
@BapiKAR 6 years ago
This is why I love this professor's lectures: so much passion along with simplicity and fun!
@HangtheGreat 4 years ago
His stories and jokes inspire learning and intuition about the subject. That's good teaching skill right there. I was lucky enough to meet teachers with this skill during my secondary school years, but I find it really rare in university-level education. Thank you, MIT and the late professor, for this lecture series.
@cheng-haochang3509 7 years ago
The world is better with you. Thanks, Prof. Winston and MIT.
@EranM 7 years ago
46:06 Another thing, not especially related to the topic: even when deprived of sleep, the brain works better in the middle of the day than at the start or end of it. The huge drops in performance happen around the times a subject is used to sleeping or needs to sleep, while performance doesn't drop at all (and even rises relative to the usual sleeping time) during midday. So linear regression can tell you the obvious hypothesis (losing sleep = losing performance), while the cubic spline can teach you new things you didn't even think of.
@bxh062000 10 years ago
Professor Winston is the best. He is amazing!
@sassoleo 6 years ago
These lessons just keep getting better and better.
@ameerhamza-zr5oc 3 years ago
hy
@martinmadsen1199 7 years ago
I started out thinking "this is too slow". I'm now on day two, another 10-hour session. The pace of new information is just perfect. You are a great teacher!
@user-ol2gx6of4g 7 years ago
I always put it on 1.25x speed and occasionally pause to ponder. ;)
@saikumarmv1250 6 years ago
I really love the way the professor teaches; his confidence and body language are great. Thank you very much, sir.
@qzorn4440 8 years ago
This MIT OpenCourseWare is like eating potato chips: you cannot eat just one, or view just one lecture. Thank you.
@TGPadm 7 years ago
Except they are healthy.
@hengyue6596 7 years ago
I can't imagine how much the knowledge contained in this course is worth.
@ranjeetchoice 9 years ago
Love this professor. Thanks, MIT!
@2flight 7 years ago
Thanks, Patrick Winston, for the lively presentations! Thanks, MIT!!!
@MathsatBondiBeach 5 years ago
Taught by Marvin Minsky, and a truly class act on so many levels.
@pjakobsen 6 years ago
Excellent teacher, very organized. He has clearly taught this course many times.
@MacProUser99876 7 months ago
Boy, he packed a lot into this lecture, but made it so engaging!
@nb1587 2 years ago
I wish I were at MIT; such outstanding teaching.
@mangaart3366 3 years ago
Amazing lecture. Thank you, MIT, for providing us with free knowledge!
@thechesslobster2768 4 years ago
Absolutely blessed to be able to get MIT courses for free.
@redthunder6183 1 year ago
I can't believe I just willingly watched an entire lecture.
@angelmcorrea1704 4 years ago
I love these lectures. Thanks, MIT and Mr. Patrick, for sharing.
@KulvinderSingh-pm7cr 6 years ago
He looks so cool!!! He's absolutely amazing in every way.
@acal790 10 years ago
That was hilarious, the part about the Rangers and sleep deprivation. So the question should be: how do they get major decisions out of soldiers who are down to 25 percent ability? And naps do help immensely, if you can handle rounds or just nervousness.
@KulvinderSingh-pm7cr 6 years ago
Best professor, funny in a sense that is often senseless... Love this guy!!!!
@sakcee 2 years ago
RIP Professor Winston!
@WepixGames 5 years ago
R.I.P Patrick Winston
@TheBirdBrothers 9 years ago
Love his curmudgeonly persona, but always lively!
@HaiderKhanZ 10 years ago
Great lecture; he explains it with nice animations on the blackboard. Great for programmers :-)
@яобщественныйдеятель
Amazing lecture.
@amrdel2730 6 years ago
Very useful for graduate students, PhD researchers, or anyone needing an introduction to the AI field. I guess the foundations you get here from Prof. Winston are a great basis for tackling or using any notions in this vast field. Thanks from Algeria!
@elivazquez7582 6 years ago
Thank you, Patrick Winston!
@Jackeeba 2 years ago
At 23:01 the professor says "so that's just the dot product, right", but that amounts to saying that cosine similarity = dot product, which is not precise, right? The dot product is only the numerator in this case.
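The distinction the comment above raises can be checked directly. A minimal sketch (the function name and example vectors are my own, not from the lecture): cosine similarity is the dot product divided by the product of the vector norms, so it depends only on direction, not magnitude.

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between u and v: dot product over the product of norms."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# The raw dot product of these two is 50, but they point the same way,
# so the cosine similarity is 1.0.
u, v = [3.0, 4.0], [6.0, 8.0]
print(cosine_similarity(u, v))  # 1.0
```

The dot product alone would double if you doubled v; dividing by the norms removes that scale dependence, which is why the lecture uses the angle between feature vectors.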
@oguzhanakgun9591 2 years ago
What a great lecture.
@sansin-dev 5 years ago
R.I.P. Professor Winston
@dragolov 2 years ago
Deep respect!
@user-hf2dr7sh4y 8 years ago
I hope that his x-axis grows at a much faster rate than his y-axis; otherwise the example used to get the idea across makes less sense. Still a great lecture, though! Thumbs up.
@donbasti 7 years ago
The deeper you go into the series, the more hard-core programmers you meet in the comment section :D
@zingg7203 3 years ago
Nicecore
@marceloflc 3 years ago
Wait, so the majority here are programmers and not math people? I thought it would be the other way around. I don't know where one subject starts and the other begins anymore.
@pmcate2 5 years ago
Isn't there an ambiguous way to divide the graph into 4 areas? That little triangle in the middle looks like it could be included in any of the four boundaries.
@ThePeterDislikeShow 10 years ago
I'm surprised that in the 21st century we still haven't found a way to reduce our need for sleep.
@KaosFireMaker 9 years ago
I present you coffee!
@ThePeterDislikeShow 9 years ago
Coffee doesn't reduce the *need* for sleep. It just prevents you from getting what you need.
@KaosFireMaker 9 years ago
FortNikitaBullion It does if you BELIEVE!
@ThePeterDislikeShow 9 years ago
Well, what I'm thinking of is something you could take that would make it feel like you had 8 hours of sleep, even though you didn't (or maybe only had 2 hours). Caffeine doesn't do that; it just makes it impossible to sleep without really improving your productivity.
@KaosFireMaker 9 years ago
FortNikitaBullion I understood what you meant.
@michafilek6883 8 years ago
Incredible lecture. Thanks, MIT.
@pokixd2298 4 years ago
As always, great stuff.
@stephk42 5 years ago
When the lecture is over, he just nods and walks away...
@kevnar 4 years ago
I once used a nearest-neighbor algorithm to create a Voronoi diagram. I didn't even know there was a name for either of them. I was just playing around with pixels.
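The pixel experiment described above works because labeling every point of the plane with its nearest seed is exactly what a Voronoi diagram is. A small sketch of that idea (the function name, grid size, and seed points are illustrative assumptions, not anything from the lecture):

```python
def voronoi_labels(width, height, seeds):
    """Label each pixel with the index of its nearest seed (squared Euclidean distance).
    The resulting regions of equal label are the Voronoi cells of the seeds."""
    labels = []
    for y in range(height):
        row = []
        for x in range(width):
            d2 = [(x - sx) ** 2 + (y - sy) ** 2 for sx, sy in seeds]
            row.append(d2.index(min(d2)))
        labels.append(row)
    return labels

# Two seeds in opposite corners split the grid along the diagonal.
grid = voronoi_labels(4, 4, [(0, 0), (3, 3)])
```

Mapping each label to a color would render the diagram; the decision boundaries of a 1-nearest-neighbor classifier are these same cell edges.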
@abjkgp 2 years ago
What is "comparitor" at 8:32? I couldn't find it on the web. Is this a spelling mistake?
@ally_jr 7 years ago
Amazing lecturer!
@thetranslator1044 1 year ago
Legend.
@yoyokagus9245 9 years ago
Great lecture.
@anishreddyellore6002 3 years ago
Just wow!!
@hdfhfdhdfhdfghdfh3306 6 years ago
Can anyone please help me?
1. Regarding the robotic-hand solutions table: if I understand correctly, we start from an empty table and drop a ball from a fixed height onto the robotic hand. When the hand feels the touch of the ball, we give a random hit while recording the hand's movements. Only if the arm detects after X seconds that the ball has hit its surface again does it conclude that the previous movement was successful, and it then records the movements that produced the successful result in the table for future use. I guess there is a way to calculate where on the surface the ball fell, so that when the hand later feels the ball touch a region close to one it remembers, it will try the stored movement closest to that point. Now there are a few things I do not understand:
A. The ball arrives at an angle, so touching the same point on the board at different angles should require a different response. Our table only holds the desired point and effect, and knows nothing about the intensity or angle of the fall; will the data in the table be corrupted, or never fully filled?
B. How do we update the table? We might drop a ball and, with the table still empty, give a random hit that sends the ball flying off to the side, so we write nothing in the table. This case could repeat over and over, always leaving us with an empty table.
It seems I did not quite understand the professor, hence these questions. I would be very happy if anyone could explain exactly what he meant by this method.
2. Regarding finding properties by vector: if I understand correctly, we fill in the data we know in advance, and then when a new data point arrives and we do not know much about it, we measure the angle it makes with the x-axis (the angle of the vector) and check which group that angle fits best. Here is the point I do not understand. Suppose I have two groups: group 1 has points with very low y and very high x, and group 2 has points with high x and high y. When I get a new point with low y and low x, the vector-angle method will probably assign it to group 1, although on paper the point looks more suitable for group 2. It seems that if we used a simple division of the plane here (as in the first case the professor presented), we would get more accurate results than matching by vector angle?
@edusson 9 years ago
Does anybody know the authors of the robot balancing the tennis ball? Thanks!
@shumakriss 8 years ago
Thanks for posting. Is there somewhere I can go to ask questions or discuss the lecture?
@famishedrover 7 years ago
Amazing teacher!
@amitgrover 6 years ago
At 41:45, the professor indicates that you cannot use AI to predict bankruptcies in credit card companies; that it's like making cake without flour. Wouldn't the credit card company have relevant data to be able to use AI to predict bankruptcies? Why is the answer "no"?
@suniltech7586 9 years ago
Good work, sir.
@philippg6023 6 years ago
With nearest-neighbor learning I got 92% accuracy on the MNIST database (with Euclidean distance); 97% with neural nets.
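The classifier behind results like the one above is short enough to sketch in full. This is a generic k-nearest-neighbors vote on toy 2-D points, not the MNIST experiment itself; the function name, data, and k value are illustrative assumptions.

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among the k training points
    nearest to it in Euclidean distance. `train` is a list of
    (point, label) pairs."""
    nearest = sorted(train, key=lambda pair: math.dist(pair[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Two well-separated clusters of toy points.
train = [((0, 0), "a"), ((0, 1), "a"), ((1, 0), "a"),
         ((5, 5), "b"), ((5, 6), "b"), ((6, 5), "b")]
print(knn_predict(train, (1, 1)))   # "a"
print(knn_predict(train, (5, 4)))   # "b"
```

For MNIST the points would be 784-dimensional pixel vectors and the labels digits 0-9; the algorithm is otherwise unchanged, which is why plain Euclidean 1-NN already does respectably there.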
@whutismyname 5 years ago
Wish he could be my machine learning professor!!
@samirelzein1978 2 years ago
The longest and least efficient way to deliver the intuition!
@sainiarvind3660 2 years ago
Good
@이종법-o2m 3 years ago
So... what is nearest neighbor??
@xXxBladeStormxXx 8 years ago
I don't think I'd even be able to walk after 36 hours of sleep deprivation.
@AdhityaMohan 6 years ago
xXxBladeStormxXx I start hallucinating after 26 hours.
@oudarjyasensarma4199 5 years ago
good!!
@katateo328 2 years ago
Some of these principles look very abstract and supernatural; humans have treated them as a mystery and classified them as AI, but actually they are very simple, and a computer can simulate them easily. The brain is small but can do a lot of things, not because of mystery but because of very simple structure.
@sauravfalia9676 6 years ago
Can someone help me expand the cos(theta) equation?
@freeeagle6074 11 months ago
Take two vectors u, v in R^2 as an example. Let u = [x11 x12] and v = [x21 x22]. Then cos(theta) = (x11*x21 + x12*x22) / (sqrt(x11^2 + x12^2) * sqrt(x21^2 + x22^2)). If u = v, then x11 = x21 and x12 = x22, so cos(theta) = 1.
@ffzcdbnc9679 6 years ago
Possible image KNN code in Java?
@magnfiyerlmoro3301 7 years ago
Did anyone understand the derivative of x at 40:00?
@MrFurano 7 years ago
If you are talking about "x prime", that's not the derivative of x. It's a new random variable; more precisely, a random variable transformed from the original x. With the definition of "x prime", you can calculate its variance by plugging it into the formula; you will get 1.
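Assuming the x' in question is the usual standardization x' = (x - mean) / stddev, the "variance comes out to 1" claim above can be verified numerically. A minimal sketch (function name and sample data are my own):

```python
import statistics

def standardize(xs):
    """Map each x to (x - mean) / stddev so the result has mean 0 and variance 1."""
    mu = statistics.fmean(xs)
    sigma = statistics.pstdev(xs)  # population standard deviation
    return [(x - mu) / sigma for x in xs]

xs = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]  # mean 5, population variance 4
zs = standardize(xs)
print(statistics.fmean(zs))      # 0.0
print(statistics.pvariance(zs))  # 1.0
```

This rescaling matters for nearest neighbors too: without it, a feature measured in large units dominates the Euclidean distance.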
@surflaweb 5 years ago
If you think this is about the KNN algorithm, it's not! Get out of here; this is not about the KNN algorithm.
@dragolov 2 years ago
Deep respect to Patrick Winston. And this solo is made using a KNN classifier: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-K2PQOgmlQwY.html
@110Turab 6 years ago
Wonderful
@keira1412 6 years ago
The sleep data is helpful to me. This professor is very typical of how a robotics professor would teach.
@asmadjaidri1219 8 years ago
Thanks a lot ^^
@MattyHild 6 years ago
C'mon Pierre...
@angtuanetblwse8896 4 days ago
This white guy with a telescope is killing me.
@dhruvjoshi8744 5 years ago
11:20 turn on captions... lol
@bubbleman2059 9 years ago
lama
@aureliassong 2 years ago
:(
@luke8489 3 years ago
asdf
@bryanjohnson7781 10 years ago
There he GOES again: blaming his age on poor GAIN control.
@akshayakumars2814 3 years ago
Diet coke
@bensalemn271 1 year ago
This course is a waste of time...
@maar2001 10 months ago
Could you explain why you think that?