
9 Regression as an Orthogonal Projection 

Shaina Race Bennett
3K views

Published: 21 Jun 2021

Comments: 8
@AlphansoEric · 1 year ago
That's an amazing video. Beautiful explanation of linear regression in terms of linear algebra.
@chiarasacchetti8284 · 2 months ago
This video saved my life
@MrOndra31 · 1 year ago
Great content! This was the missing link between my linear algebra and econometrics courses :D
@asifzahir7512 · 1 year ago
Amazing! Cleared up a lot of confusion.
@breathemath4757 · 1 year ago
This is just way too good. Thanks a lot!
@antonlinares2866 · 8 months ago
Thank you so much, you made algebra and linear regression click for me
@teklehaimanotaman3150 · 1 year ago
Amazing lecture! Thank you very much for your efforts. Is the line from the origin to the point y_hat the regression line?
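
For readers trying to connect the question above to the video's title, here is a minimal NumPy sketch (illustrative toy data, not code from the video) of regression as an orthogonal projection: the fitted vector y_hat = X @ beta_hat is the orthogonal projection of the response y onto the column space of the design matrix X, so the residual y - y_hat is perpendicular to every column of X.

```python
import numpy as np

# Hypothetical toy data: one predictor plus an intercept column.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 2.9, 4.2, 4.8])
X = np.column_stack([np.ones_like(x), x])      # design matrix [1, x]

# Least-squares coefficients (solves the normal equations X^T X b = X^T y).
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

# y_hat is the orthogonal projection of y onto col(X).
y_hat = X @ beta_hat
residual = y - y_hat

print(X.T @ residual)   # ~[0, 0]: the residual is orthogonal to col(X)
print(beta_hat)         # intercept and slope of the fitted regression line
```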
@sum1sw · 9 months ago
I'm not sure this is what I'm looking for; if it is, then I missed it. I have an implicit function f(x,y,z)=0 (it is actually a model with adjustable parameters) and an experimental data point (Xexp, Yexp, Zexp). You can probably see where I am heading with this. I want to know where a line orthogonal/perpendicular to the surface will intersect the surface; I'm calling this point of intersection (Xcalc, Ycalc, Zcalc). How do I proceed? Based on other videos I watched, it looks like the first step is to linearize the surface using a Taylor series. So now I have a plane (in terms of partial derivatives and (Xcalc, Ycalc, Zcalc)), which is still unknown, and I want the point of intersection (Xcalc, Ycalc, Zcalc) of the orthogonal line from (Xexp, Yexp, Zexp). At first I thought it was a trial-and-error iterative procedure (I have to guess Xcalc, Ycalc, Zcalc), so I programmed that, but the answers I am getting do not seem to be correct. I'm also beginning to suspect that the solution can be direct, not iterative. Any thoughts?
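
One common way to set up the iteration the comment above describes (linearize the surface at the current estimate and orthogonally project the data point onto that tangent plane, then repeat) is sketched below. This is an assumption-laden illustration, not an answer from the video or the channel; the function names and the unit-sphere example are hypothetical, and convergence to the true foot point assumes a well-behaved surface and a starting point reasonably close to it.

```python
import numpy as np

def project_onto_surface(f, grad_f, p_exp, tol=1e-10, max_iter=50):
    """Approximate the foot point (Xcalc, Ycalc, Zcalc) of p_exp on f(x) = 0.

    At each step the surface is replaced by its first-order Taylor expansion
    (tangent plane) at the current estimate x, and p_exp is orthogonally
    projected onto that plane.  Sketch only, under the assumptions above.
    """
    x = np.asarray(p_exp, dtype=float)          # start at the data point
    for _ in range(max_iter):
        g = grad_f(x)                           # surface normal at x
        # Tangent-plane model: f(x) + g . (p - x) = 0.
        # Orthogonal projection of p_exp onto that plane:
        step = (f(x) + g @ (p_exp - x)) / (g @ g)
        x_new = p_exp - step * g
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Hypothetical example: unit sphere x^2 + y^2 + z^2 - 1 = 0.
f = lambda p: p @ p - 1.0
grad_f = lambda p: 2.0 * p
p_exp = np.array([1.2, 0.7, -0.3])
print(project_onto_surface(f, grad_f, p_exp))   # lands on the sphere along p_exp's direction
```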