
Linear Regression 

Steve Brunton
351K subscribers
47K views

Linear regression is a cornerstone of data-driven modeling; here we show how the SVD can be used for linear regression.
Book PDF: databookuw.com/databook.pdf
Book Website: databookuw.com
These lectures follow Chapter 1 from: "Data-Driven Science and Engineering: Machine Learning, Dynamical Systems, and Control" by Brunton and Kutz
Amazon: www.amazon.com/Data-Driven-Sc...
Brunton Website: eigensteve.com
This video was produced at the University of Washington
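As a sketch of the video's theme, here is a minimal Python/NumPy example (my own illustration, not taken from the video or book) of solving a least-squares regression through the SVD-based pseudoinverse, on made-up data b ≈ 2a:

```python
import numpy as np

# Synthetic data (names follow the a, b convention of the lecture): b = 2a + noise
rng = np.random.default_rng(0)
a = np.arange(1.0, 11.0)                   # column of x-values
b = 2.0 * a + rng.normal(0, 0.1, a.size)   # noisy responses

# Economy SVD of the (n x 1) data matrix, then x = V * Sigma^-1 * U^T * b
A = a.reshape(-1, 1)
U, S, Vt = np.linalg.svd(A, full_matrices=False)
slope = (Vt.T @ np.diag(1.0 / S) @ U.T @ b).item()

print(slope)  # close to the true slope of 2
```

For a single-column A this reduces to the familiar slope = (aᵀb)/(aᵀa); the SVD route is what generalizes cleanly to many columns.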

Science

Published: 26 Jan 2020

Comments: 33
@Ash-bc8vw · 2 years ago
I don't think anybody else is teaching LR with respect to the SVD on YouTube right now, which makes this video all the more informative! Loved it, immediately subscribed.
@nikosips · 4 years ago
These videos are the best resource for someone who wants to understand data-driven models! Thank you very much for your work, from an engineering student!!
@appliedmathness8397 · 3 years ago
I love these videos! But in this one you point out the "squared projection error" while showing the segment going from the fitted line to the outlier (as in PCA); in linear regression the residuals should instead be vertical lines.
@Chloe-ty9mn · 1 month ago
I've been watching all the videos in this chapter, and this is the one that got me to cave and purchase the book!! I was so surprised that it was so affordable. Thank you and your team so, so much for the high-quality, accessible information.
@SoroushRabiei · 4 years ago
Dear professor, you're a great teacher! Thank you so much for these videos.
@dmitrystikheev3384 · 3 years ago
I was looking for copper but found gold! Boss, excellent as always. Love your way of conveying the material. I hope you will continue presenting more topics on statistics, because in the multivariate case it can get really intimidating. Best regards from Russia!
@patrickxu8795 · 2 years ago
The lecture is so clear and well organized! IT IS IMPRESSIVE!!!!
@linnbjorkholm9237 · 1 year ago
Wow! Great video! I really liked your shirt, where is it from?
@Eigensteve · 1 year ago
It's a Patagonia Capilene. My favorite shirt. I almost only wear them.
@ParthaPratimBose · 3 years ago
Hi Steve, I am a pharmaceutical data analyst, and you're just outstanding.
@Eigensteve · 3 years ago
Wow, thanks!
@shakibyazdani9276 · 3 years ago
Absolutely awesome series; I will finish the whole series today :)
@Eigensteve · 3 years ago
Hope you enjoy it!
@hengzhang7039 · 4 years ago
Thank you, sir, your courses are awesome!
@vahegizhlaryan5052 · 1 year ago
I just accidentally discovered this channel, and I am honestly surprised that such an excellent resource is not more popular with YouTube's algorithms.
@saitaro · 4 years ago
This is gold, professor!
@motbus3 · 3 years ago
Besides the very awesome explanation, the book is awesome, and he writes mirrored as if it were nothing 😄
@engr.israrkhan · 4 years ago
Sir, what a great teacher you are!
@kindleflow · 4 years ago
Thanks
@a.danielhernandez2839 · 3 years ago
Excellent explanation! What happens with the y-intercept of the line? Is it b?
@user-ih4mv5hl9i · 8 months ago
Interesting. In the first lecture of this series, individual faces (i.e., people) were in the columns, but a face was really a column of many pixels. In this lecture, people are in the rows. So each use of the SVD is different, and each setup of a data matrix is different.
@spidertube1000 · 3 years ago
Good vid bruh
@sachavanweeren9578 · 3 years ago
Very nice series... though it has been a while and I might be a bit rusty on my math. If I recall correctly, there is nowhere an explicit link made between the SVD and least squares. It is explained that there is an SVD, and a theorem that it gives the best approximation in some norm, but I have not seen an explicit link with OLS. It would be nice if that were more explicit in the video series...
@udriss1 · 2 years ago
Hello. In your book DATA DRIVEN SCIENCE & ENGINEERING, page 24, relation (1.26), you express the matrix B. In this relation you should write B = X − X̄, and not B = X − B̄ as one can read there, where X̄ is the matrix of means.
@clickle23 · 3 years ago
Can you explain why, in the example at the end, U = a/|a|? Is it because AAᵀ has only one nonzero eigenvalue, whose eigenvector is just a itself (normalized)?
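On the commenter's question: a quick numerical check (my own, not from the video) that for a single-column matrix A = a, the economy SVD has exactly one left singular vector, and it is a/‖a‖ up to sign, with singular value ‖a‖:

```python
import numpy as np

# Single-column data matrix A = a, with ||a|| = 5
a = np.array([3.0, 4.0])
U, S, Vt = np.linalg.svd(a.reshape(-1, 1), full_matrices=False)

print(np.abs(U.ravel()))  # a/||a|| up to sign, i.e. [0.6, 0.8]
print(S)                  # [5.0], the norm of a
```

Equivalently: AAᵀ = aaᵀ is rank one, so its only eigenvector with nonzero eigenvalue is a itself, normalized.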
@Martin-iw1ll · 7 months ago
In mechanics, an overdetermined system is called statically indeterminate.
@robsenponte3308 · 2 years ago
Cool
@anilsenturk408 · 4 years ago
How's it going?
@moshuchitu203 · 2 months ago
By cross-referencing ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-ualmyZiPs9w.html, one can clearly see that the slope derived at the end is nothing but covariance(a, b)/variance(a).
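The commenter's identity is easy to verify numerically. A small sketch (my own, with made-up data): for mean-centered data, the pseudoinverse slope (aᵀb)/(aᵀa) equals cov(a, b)/var(a), since both numerator and denominator carry the same 1/(n−1) factor:

```python
import numpy as np

# Made-up data: b = 3a + noise, then center both variables
rng = np.random.default_rng(1)
a = rng.normal(size=50)
b = 3.0 * a + rng.normal(scale=0.5, size=50)
a, b = a - a.mean(), b - b.mean()

# Least-squares slope via the pseudoinverse (the SVD route from the video)
slope_pinv = (np.linalg.pinv(a.reshape(-1, 1)) @ b).item()

# The commenter's formula: cov(a, b) / var(a), sample versions (ddof=1)
slope_cov = np.cov(a, b)[0, 1] / np.var(a, ddof=1)

print(slope_pinv, slope_cov)  # identical up to floating-point error
```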
@uzferry5524 · 2 years ago
based
@philrincon · 1 year ago
Is he writing in reverse?
@ralfschmidt3831 · 3 years ago
I am slightly confused: the orthogonal projection of b onto a minimizes the distance between b and its projection, which is ORTHOGONAL to the span of a. If I remember correctly, least squares should minimize the VERTICAL distance between the projected and the original point. I am sure there is something wrong with my assumptions, but maybe someone can point me in the right direction.
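To the commenter's point: ordinary least squares minimizes vertical residuals, while the orthogonal fit (total least squares, the SVD/PCA picture) minimizes perpendicular distances, and the two lines generally differ. A small sketch of the contrast (my own illustration, not from the video):

```python
import numpy as np

# Made-up data: y = 1.5x + noise, centered so both lines pass through the origin
rng = np.random.default_rng(2)
x = rng.normal(size=200)
y = 1.5 * x + rng.normal(scale=0.8, size=200)
xc, yc = x - x.mean(), y - y.mean()

# OLS: minimize vertical residuals
slope_ols = (xc @ yc) / (xc @ xc)

# TLS: minimize orthogonal residuals; the first right singular vector
# of the centered data matrix spans the fitted line
X = np.column_stack([xc, yc])
_, _, Vt = np.linalg.svd(X, full_matrices=False)
slope_tls = Vt[0, 1] / Vt[0, 0]

print(slope_ols, slope_tls)  # similar but not identical slopes
```

Note that the projection-onto-span(a) picture in the lecture is orthogonality in the n-dimensional space of observations (the residual vector b − x·a is orthogonal to a), which is consistent with minimizing vertical distances in the 2-D scatter plot; the two notions of "orthogonal" live in different spaces.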
@SkyaTura · 1 year ago
Besides the undeniable quality of the video overall, isn't it awesome that he writes backwards in the air just to explain his points? 🤔