
9. Four Ways to Solve Least Squares Problems 

MIT OpenCourseWare

MIT 18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning, Spring 2018
Instructor: Gilbert Strang
View the complete course: ocw.mit.edu/18-065S18
YouTube Playlist: • MIT 18.065 Matrix Meth...
In this lecture, Professor Strang details the four ways to solve least-squares problems. Solving least-squares problems comes into play in the many applications that rely on data fitting.
License: Creative Commons BY-NC-SA
More information at ocw.mit.edu/terms
More courses at ocw.mit.edu

Published: May 15, 2019

Comments: 46
@intuitivej9327 · 2 years ago
Fantastic lecture... I am so lucky to watch and learn. I am an ordinary mother of two little girls, and recently I find myself full of joy learning linear algebra. It all happened to me because of a wonderful lecturer and MIT. Thank you for sharing with us. I am going to continue to learn. Many thanks from Korea.
@patf9770 · 3 years ago
MIT has still got it. What a time to be alive that we can watch this for free
@brendawilliams8062 · 2 years ago
I am sure happy about it.
@SheikhEddy · 5 years ago
Thanks for the lecture! I've tried to learn these things before and gotten out more confused than I was when I came in, but Dr. Strang's approach makes it all seem so simple!
@iwonakozlowska6134 · 5 years ago
The "idea" of orthogonal projection allowed me to understand the Christoffel symbols. I "studied" all the lectures on MIT 18.06 but I am still discovering the linear algebra anew. Thanks G.S. , thanks MIT.
@kristiantorres1080 · 3 years ago
Who dares to dislike this masterpiece of a lesson?
@georgesadler7830 · 2 years ago
Professor Strang, thanks for showing different ways to solve least-squares problems in linear algebra and statistics. Least squares is used every day to fit data.
@shiv093 · 5 years ago
lecture starts at 5:18
@oldcowbb · 4 years ago
Thanks, I have five more minutes to study for the final now.
@alecunico · 3 years ago
Hero
@ryanjackson0x · 2 years ago
I am not skipping anything from Strang
@abay669 · 1 year ago
I wish you were my professor, Mr. Strang, but hey, I have you as my professor here online. Thank you very much for your elegant explanation. Wishing you good health and a long life, Mr. Strang.
@mengyuwang5159 · 5 years ago
One questionable point in the lecture: u_i, not v_i, is in the column space of A. v_i should be in A's row space.
@user-jt7kw4jf9o · 1 year ago
Thanks, I agree with you. It tripped me up when I first saw it.
@omaraymanbakr3664 · 12 hours ago
Ruthless. 25 people have disliked this video. Who dares to dislike a lecture by Prof. Strang!!
@unalcachofa · 5 years ago
The first question from the problem set asks for the eigenvalues of A+ when A is square. I know that A and A+ have the same number of zero eigenvalues, but I'm stuck searching for a relationship for the nonzero ones. Any hint? I checked numerically and verified that they are not 1/λ_i, as one might have conjectured.
@hasan0770816268 · 4 years ago
Least-squares problem: to solve a system of equations that has more equations than unknowns, i.e. a non-square matrix A. We can't invert a non-square A directly, so we solve A^T A x = A^T b; when even A^T A is singular, we use the SVD (pseudoinverse) instead.
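A minimal NumPy sketch of the two routes this comment mentions (the 3×2 data matrix below is just an illustrative example, not taken from the lecture):

```python
import numpy as np

# Overdetermined system: 3 equations, 2 unknowns.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

# Route 1: normal equations A^T A x = A^T b (needs independent columns).
x_normal = np.linalg.solve(A.T @ A, A.T @ b)

# Route 2: pseudoinverse from the SVD, x = A^+ b (works even if A^T A is singular).
x_svd = np.linalg.pinv(A) @ b

# With independent columns, both routes give the same least-squares solution.
assert np.allclose(x_normal, x_svd)
```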
@chiahungmou7351 · 1 year ago
The last two minutes on Gram-Schmidt are really remarkable; two minutes is hardly enough time to see the heart of that mathematical machine.
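A quick sketch of the orthogonalization route (NumPy's `qr` uses Householder reflections rather than classical Gram-Schmidt, but it produces the same A = QR factorization with orthonormal columns; the numbers are illustrative):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

# Factor A = QR with orthonormal columns in Q; least squares then
# reduces to the small triangular system R x = Q^T b.
Q, R = np.linalg.qr(A)
x = np.linalg.solve(R, Q.T @ b)

# Same answer as the normal equations.
assert np.allclose(x, np.linalg.solve(A.T @ A, A.T @ b))
```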
@forheuristiclifeksh7836 · 11 days ago
5:41 least squares
@paganisttc · 3 years ago
The best of the bests
@daweedcito · 2 years ago
Thought I'd be watching for 5 minutes; ended up staying for the whole class...
@matheusvillaas · 1 year ago
29:40 Why do we use the p = 2 norm rather than any other p?
@heidioid · 2 years ago
Bookmark: Least Squares Problem 23:00
@Enerdzizer · 4 years ago
The professor claimed at 40:39 that A+ b gives the same result as (A^T A)^-1 A^T b if the matrix A^T A is invertible. But if it is not invertible, what is the geometric meaning of A+ b? Is it still the projection of b onto the column space of A?
@rushikeshshinde2325 · 4 years ago
If it's not invertible, in general the vector gets mapped into a null space smaller than n dimensions. This means it gets mapped to a lower-dimensional space, hence it's impossible to recover/map it back to the column space.
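A small numerical check of the question above (my own rank-deficient example, not from the lecture): even when A^T A is singular, x = A+ b is the minimum-norm least-squares solution, and p = A A+ b is still the projection of b onto the column space of A.

```python
import numpy as np

# Rank-deficient A: the second column is twice the first, so A^T A is singular.
A = np.array([[1.0, 2.0],
              [1.0, 2.0],
              [0.0, 0.0]])
b = np.array([1.0, 3.0, 5.0])

x = np.linalg.pinv(A) @ b   # minimum-norm least-squares solution
p = A @ x                   # p = A A^+ b

# p is still the projection of b onto C(A): the residual b - p
# is orthogonal to every column of A.
assert np.allclose(A.T @ (b - p), 0.0)
```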
@ajiteshbhan · 4 years ago
At 46:00 the professor says the SVD in this case is not a two-sided inverse, only a one-sided inverse. Then at the end he says that, with independent columns, the SVD gives the same result as Gauss; but the sigma matrix in the pseudoinverse should still have missing values, so how will they give the same result?
@yuchenzhao6411 · 4 years ago
Under the independent-columns assumption, A has a left inverse, and its form is exactly the same as Gauss's method.
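This equality is easy to verify numerically; a sketch with a random tall matrix (assuming the columns come out independent, which happens almost surely for random data):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))        # tall matrix, independent columns

gauss = np.linalg.inv(A.T @ A) @ A.T   # normal-equations ("Gauss") form
pseudo = np.linalg.pinv(A)             # SVD-based pseudoinverse

assert np.allclose(gauss, pseudo)            # the two matrices coincide
assert np.allclose(pseudo @ A, np.eye(3))    # a left inverse: A^+ A = I
```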
@user-rn6ff8fq5n · 4 months ago
If b is perpendicular to the column space of A, what is the solution for Ax=b?
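For the question above: if b is perpendicular to C(A), the exact system Ax = b has no solution (unless b = 0), and the least-squares/pseudoinverse answer is x = 0, because the projection of b onto the column space is the zero vector. A tiny check with illustrative numbers:

```python
import numpy as np

A = np.array([[1.0],
              [1.0]])          # column space = span{(1, 1)}
b = np.array([1.0, -1.0])      # perpendicular to (1, 1)

x = np.linalg.pinv(A) @ b
assert np.allclose(x, 0.0)     # least-squares solution is x = 0
assert np.allclose(A @ x, 0.0) # projection of b onto C(A) is the zero vector
```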
@dariuszspiewak5624 · 3 years ago
"You can't raise it from the dead"... How true, how true, prof. Strang :))) Even though there are some in this world that think it's actually possible to raise people from the dead, LoL :)))
@dohyun0047 · 4 years ago
In the notice box at 43:30, why don't both equations produce the same identity matrix?
@jayantpriyadarshi9266 · 4 years ago
Because you cannot expand the brackets in the second expression: the inner matrices are not square, so they don't have inverses.
@dohyun0047 · 4 years ago
@@jayantpriyadarshi9266 thank youuuu
@jayantpriyadarshi9266 · 4 years ago
@@dohyun0047 no worries bro.
@Irfankhan-jt9ug · 4 years ago
Cameraman... follow Prof. Strang!!!
@thatsfantastic313 · 7 months ago
Mathematicians teach machine learning way better than machine learning experts do, lol. Hats off to Prof. Strang
@srinivasg20 · 4 years ago
Sir, you are the father of linear algebra. Nice teaching, sir.
@user-fh4xl3xz1f · 4 years ago
"Well, this matrix here is doing its best to be the inverse. Actually, everybody here is just doing their best to be an inverse." (c) This phrase really describes me fighting my procrastination all day.
@alshbsh2 · 4 years ago
How did he get (Ax - b)^T (Ax - b)?
@hiltonmarquessantana8202 · 4 years ago
The dot product in matrix form.
@phsamuelwork · 4 years ago
Ax - b is a column vector, so (Ax - b)^T is a row vector. Writing Ax - b = w, w^T w gives us sum_i w_i^2, which is exactly the sum of the squares of all elements in w.
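The identity in this reply is easy to check numerically (the matrix, right-hand side, and candidate x below are illustrative):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])
x = np.array([1.0, 1.0])      # any candidate x

w = A @ x - b
# w^T w is exactly the sum of squared errors ||Ax - b||^2.
assert np.isclose(w @ w, np.sum(w**2))
```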
@Fan-vk8tl · 3 years ago
The pseudoinverse part is unclear; the book gives more details and its relationship to the normal solution.
@meyerkurt5875 · 3 years ago
Could you tell me where to find the book, or the name of the book? Thank you!
@Fan-vk8tl · 3 years ago
@meyerkurt5875 His own book: Linear Algebra and Learning from Data.
@drscott1 · 2 years ago
👍🏼
@chrischoir3594 · 3 years ago
This guy could be the worst professor of all time
@paradoxicallyexcellent5138 · 2 years ago
Far from the worst but he ain't great, that's for sure.