
Linear Regression: Derivation 

numericalmethodsguy
67K subscribers
71K views

Published: 16 Sep 2024

Comments: 86
@hottoniapalustris1541 3 years ago
Man, thank you! I thought my school project was really doomed before I saw this, but with your explanation, I finally found a way to make sense of my project data. Once more, thanks a lot!
@sharifahmed45 3 years ago
Prof, I can't say anything but immense gratitude for you and your channel; I am a grateful student. Thanks again.
@numericalmethodsguy 3 years ago
Thank you. Please subscribe and ask your friends to subscribe - our goal is to get to 100,000 subscribers by the end of 2021. To get even more help, subscribe to the numericalmethodsguy channel ru-vid.com, and go to MathForCollege.com/nm and MathForCollege.com/ma for more resources. Follow the numerical methods blog at AutarKaw.org. You can also take a free massive open online course (MOOC) at canvas.instructure.com/enroll/KYGTJR Please share these links with your friends and fellow students through social media and email. Support the channel if you are able to do so at ru-vid.com/store
@SaintRudi85 5 years ago
Nice explanation. It would also be really useful to have a similar video for multiple linear regression.
@ahmadibrahim3596 3 years ago
Thank you, professor, your explanation is very clear. I did the calculation and got the formulas for a and b.
@sergten 4 years ago
Fantastic explanation.
@matard2940 3 years ago
All of this guy's videos are so clear and helpful - the best for numerical methods!
@numericalmethodsguy 3 years ago
Thank you. Please subscribe and ask your friends to subscribe - our goal is to get 100,000 subscribers by the end of 2021. To get even more help, subscribe to the numericalmethodsguy channel ru-vid.com, and go to MathForCollege.com/nm and MathForCollege.com/ma for more resources. Follow the numerical methods blog at AutarKaw.org. You can also take a free massive open online course (MOOC) at canvas.instructure.com/enroll/KYGTJR Please share these links with your friends and fellow students through social media and email. Support the channel if you are able to do so at ru-vid.com/store
@kvs123100 3 years ago
This is so awesome! Sir, pranam from my side! After having gone through so many videos, this is the perfect video I have seen!
@numericalmethodsguy 3 years ago
Thank you. Please subscribe and ask your friends to subscribe - our goal is to get to 100,000 subscribers by the end of 2021. To get even more help, subscribe to the numericalmethodsguy channel ru-vid.com, and go to MathForCollege.com/nm and MathForCollege.com/ma for more resources. Follow the numerical methods blog at AutarKaw.org. You can also take a free massive open online course (MOOC) at canvas.instructure.com/enroll/KYGTJR Please share these links with your friends and fellow students through social media and email. Support the channel if you are able to do so at ru-vid.com/store
@Vishwesh2 4 years ago
THANKS A LOT SIR!!!! I was choking at the derivative part but you made it clear. I have watched some other videos of yours. All are great. You earned a like and a subscriber. Really huge thanks sir. I'll watch other videos of yours also. You're a really good teacher
@delaware137 5 years ago
Enlightening! Thank you for teaching me this.
@numericalmethodsguy 5 years ago
Thank you. To get even more help, subscribe to the numericalmethodsguy channel, and go to MathForCollege.com/nm and MathForCollege.com/ma for more resources, and share the link with your friends through social media and email. Support the site by buying the textbooks at www.lulu.com/shop/search.ep?keyWords=autar+kaw&type= Follow my numerical methods blog at AutarKaw.org. You can also take a free online course at www.canvas.net/?query=numerical%20methods Best of learning, Autar Kaw, AutarKaw.com
@ayushshaw3681 3 years ago
After watching the derivation I would say, awesome explanation.
@edwardmansal8459 2 years ago
Well explained. Grateful
@藍梁勻 1 year ago
That is a fantastic explanation! I'm thankful for this video.
@numericalmethodsguy 1 year ago
Thank you. Please subscribe and ask your friends to subscribe - our goal is to get to 100,000 subscribers by the end of 2021. To get even more help, subscribe to the numericalmethodsguy channel ru-vid.com, and go to MathForCollege.com/nm and MathForCollege.com/ma for more resources. Follow the numerical methods blog at AutarKaw.org. You can also take a free massive open online course (MOOC) at canvas.instructure.com/enroll/KYGTJR Please share these links with your friends and fellow students through social media and email. Support the channel if you are able to do so at ru-vid.com/store
@imglenngarcia 3 years ago
Wow! This will definitely be a key ingredient for my endeavor in transport, urban and regional planning. Thank you!
@cheznikos 3 years ago
It seems you can set a0 = 0, find a1 very easily, and then deduce a0 also easily. The reason is that the slope a1 of the straight line does not change if all the yi are decreased by any constant. Also, in the end we can verify that a1 = cov(x,y)/var(x) = cov(x, y-a0)/var(x) for any a0. This would simplify the computations.
@numericalmethodsguy 3 years ago
I do not know about setting a0=0. If we are minimizing with respect to a0, we cannot assume it to be zero. A simpler derivation should not come at the expense of a logical explanation.
@cheznikos 3 years ago
@@numericalmethodsguy You're right, I was badly confused :(
@bulakornsi7285 4 years ago
Thank you so much. You explain so clearly.
@kunalparihar9224 1 year ago
Thank you, sir, for the clear explanation 🙏
@studycenter8941 3 years ago
Very helpful 💓 thank you sir.
@arunbm123 5 years ago
Brilliant explanation.
@numericalmethodsguy 5 years ago
To get even more help, subscribe to the numericalmethodsguy channel, and go to MathForCollege.com/nm and MathForCollege.com/ma for more resources and share the link with your friends through social media and email.
@y_p7 3 years ago
This helped me a ton!!! God bless ya professor
@visualizetheinfinitys.g.5048 2 years ago
Thank you so much, sir.
@twinklecloud6645 3 years ago
Thank you so much Sir for explaining the derivation in such an easy way😇.
@numericalmethodsguy 3 years ago
Always welcome
@dharasheth4107 4 years ago
I love it. Thank you so much.
@lukepaluso9863 5 years ago
Wondrous! Thank you!!!
@stephenbarnes5145 3 years ago
Excellent explanation! Thank you
@mjf6125 5 years ago
Thanks, good explanation. Question: why does the partial derivative in this case yield a 'minimum'? How do we know it's not a maximum? Is it because SSR = (Y - a.o - sum(a.i*x.i))^2 is the multivariable function we're trying to minimize, and since it's squared we assume it's parabolic and opens upwards? Therefore the solution to the first partial derivative = 0 is a minimum?
@mjf6125 5 years ago
I'm sorry, I misspoke when I placed the a.i and x.i in the sum. I was getting confused with multiple regression. Is solving multiple regression the same process? Just taking the partial derivative with respect to each unknown variable and then solving the resulting equations?
@numericalmethodsguy 5 years ago
SSR = sum(y_i - a_0 - a_1*x_i)^2, where _ stands for subscript. Setting the first partial derivatives to 0 ONLY yields a possible location of a local minimum or maximum (we do not know yet whether it is a local minimum, local maximum, or inflection point). It has to be followed by a second derivative test to see whether it is the location of a local minimum or a local maximum. The second derivative test shows it is the location of a local minimum (see link below). Since the equations obtained by setting the first partial derivatives to zero have only one solution, and SSR is a continuous function of a_0 and a_1, it also has to be the location where the absolute minimum occurs. To see the complete math behind it, go here: autarkaw.org/2012/09/03/prove-that-the-general-least-squares-model-gives-the-absolute-minimum-of-the-sum-of-the-squares-of-the-residuals/ or look at the derivation and appendix of mathforcollege.com/nm/mws/gen/06reg/mws_gen_reg_txt_straightline.pdf
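In symbols, a sketch of the second-derivative argument the reply points to, using the video's a_0, a_1 notation (spelled out here, not quoted from the video):

```latex
S_r = \sum_{i=1}^{n}\left(y_i - a_0 - a_1 x_i\right)^2,
\qquad
H =
\begin{bmatrix}
\dfrac{\partial^2 S_r}{\partial a_0^2} & \dfrac{\partial^2 S_r}{\partial a_0\,\partial a_1}\\[6pt]
\dfrac{\partial^2 S_r}{\partial a_1\,\partial a_0} & \dfrac{\partial^2 S_r}{\partial a_1^2}
\end{bmatrix}
=
\begin{bmatrix}
2n & 2\sum x_i\\
2\sum x_i & 2\sum x_i^2
\end{bmatrix}.
```

Since 2n > 0 and det H = 4( n*sum(x_i^2) - (sum(x_i))^2 ) > 0 whenever the x_i are not all equal, H is positive definite, so the single stationary point found from the first partial derivatives is indeed the minimum.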
@numericalmethodsguy 5 years ago
@@mjf6125 Yes, multiple regression follows the same procedure, as it is all about minimizing SSR.
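To see the "same procedure" in code, here is a minimal sketch (my own illustration, not from the video) that fits y = a0 + a1*x1 + a2*x2 by solving the normal equations with NumPy; the data are made up:

```python
import numpy as np

# Made-up example data: two explanatory variables x1, x2 and a response y.
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0])
y  = np.array([3.1, 4.9, 9.2, 10.8, 16.1])

# Design matrix with a column of ones for the intercept a0.
X = np.column_stack([np.ones_like(x1), x1, x2])

# Normal equations (X^T X) a = X^T y: the multivariable analogue of setting
# each partial derivative of the sum of squared residuals to zero.
a = np.linalg.solve(X.T @ X, X.T @ y)
print("a0, a1, a2 =", a)
```

np.linalg.lstsq(X, y, rcond=None) returns the same coefficients and is the more numerically robust choice for larger problems.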
@sparrowp2251 1 year ago
Thank you, sir, really 🙏🙏🙏🙏
@shreyanawani4218 4 years ago
Sir, is it correct to call this method minimization using partial derivatives? Kindly reply, as I have an exam tomorrow.
@numericalmethodsguy 4 years ago
One cannot conflate the two items. What is shown is the derivation of the linear regression model. The least-squares linear regression method is used to find the best-fit straight line for given data. The straight-line regression model is found by minimizing the sum of the squares of the residuals. "Minimization using partial derivatives" is the concept used to find the constants of the model. math.libretexts.org/Courses/University_of_Maryland/MATH_241/03%3A_Differentiation_of_Functions_of_Several_Variables/3.08%3A_Maxima/Minima_Problems
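Stated in one line (a restatement of the reply, not a quote from the video):

```latex
\text{model: } y = a_0 + a_1 x,
\qquad
\text{constants: } (a_0, a_1) = \arg\min_{a_0,\,a_1} \sum_{i=1}^{n}\left(y_i - a_0 - a_1 x_i\right)^2 .
```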
@michaeljburt 3 years ago
@@numericalmethodsguy Good answer. Also @numericalmethodsguy, this derivation was fantastic, thanks much. I'm now using regression models in electrical engineering (power distribution demand forecast models) and wanted to take a bit of a dive to understand where the coefficients for linear regression came from.
@AffanSamad 6 years ago
Very well explained.
@numericalmethodsguy 6 years ago
Thank you. Go to mathforcollege.com/nm/mws/gen/06reg/mws_gen_reg_txt_straightline.pdf to see how the second derivative test is done as it is not shown in the video.
@lucasmoratoaraujo8433 1 year ago
Nice!
@gp6957 2 years ago
Sir, I have learnt basic calculus and I'm in doubt about how the exponent 2 becomes minus 2. When we use the power rule it is simply 2, but you are using -2 - how did you get that?
@numericalmethodsguy 2 years ago
d/dx(u^2)=2*u*du/dx. The du/dx may be negative!
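Written out for one term of the sum (same chain rule as in the reply):

```latex
\frac{\partial}{\partial a_1}\left(y_i - a_0 - a_1 x_i\right)^2
= 2\left(y_i - a_0 - a_1 x_i\right)\cdot\frac{\partial}{\partial a_1}\left(y_i - a_0 - a_1 x_i\right)
= -2\,x_i\left(y_i - a_0 - a_1 x_i\right).
```

Here u = y_i - a_0 - a_1*x_i and du/da_1 = -x_i, which is where the minus sign comes from.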
@seal0118 3 years ago
It's very clear, thank you.
@natashawanjiru1018 4 years ago
What about using partial differentiation to derive the normal equations for a regression model - is it the same?
@numericalmethodsguy 4 years ago
That is what is being done in the video. I do not understand the question.
@natashawanjiru1018 4 years ago
The question is: "Using partial differentiation, derive the normal equations of a two-variable regression model."
@numericalmethodsguy 4 years ago
@@natashawanjiru1018 The question is ill-posed. First, the kind of model should be defined - is it y=a0+a1*x? Is it y=a*exp(b*x)? If it is just the straight line, go to nm.mathforcollege.com/mws/gen/06reg/mws_gen_reg_txt_straightline.pdf and look at the derivation as well as the appendix.
@natashawanjiru1018 4 years ago
Thanks so much
@jamalnuman 1 year ago
Great!
@nD-ci7uw 5 years ago
Can you explain how you derived a0? I got it from Cramer's rule, but I can't derive it like a1 :/
@nD-ci7uw 5 years ago
OK, I did the derivation in reverse: I took your equation for a0 and set it equal to the Cramer's rule a0. So the equation is true, but how did you hit on this idea? :)
@numericalmethodsguy 5 years ago
@@nD-ci7uw If you look at the equations, you already got a1 using Cramer's rule. You will get a similar-looking expression for a0 by using Cramer's rule. But how I get the expression for a0 is just by using equation (1) without Cramer's rule, that is n*a0 + sum(xi)*a1 = sum(yi), and writing a0 in terms of a1. Also, sum(xi)/n = xbar and sum(yi)/n = ybar.
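Written out, the step described in the reply is:

```latex
n\,a_0 + a_1\sum_{i=1}^{n} x_i = \sum_{i=1}^{n} y_i
\quad\Longrightarrow\quad
a_0 = \frac{\sum y_i}{n} - a_1\,\frac{\sum x_i}{n} = \bar{y} - a_1\,\bar{x}.
```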
@hirakmondal6174 4 years ago
What is this method called? Is it the same as the gradient descent method?
@numericalmethodsguy 4 years ago
One cannot conflate the two items. What is shown is the derivation of the linear regression model. The gradient descent method is used to find a local minimum of any differentiable function. The least-squares linear regression method is used to find the best-fit straight line for given data. The straight-line regression model is found by minimizing the sum of the squares of the residuals. The gradient descent method surely can be used to find the minimum of the sum of the squares of the residuals.
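As a concrete illustration of the distinction (a sketch with made-up data, not from the video): the closed-form formulas and gradient descent both minimize the same sum of squares, one directly and one iteratively.

```python
import numpy as np

# Made-up data for illustration.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.2, 1.9, 3.2, 3.8, 5.1])
n = len(x)

# Closed-form least-squares constants (the formulas derived in the video).
a1 = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / (n * np.sum(x**2) - np.sum(x)**2)
a0 = np.mean(y) - a1 * np.mean(x)

# Gradient descent on the same objective Sr = sum((y - b0 - b1*x)^2).
b0, b1, lr = 0.0, 0.0, 0.01
for _ in range(5000):
    r = y - b0 - b1 * x            # residuals
    b0 += lr * 2 * np.sum(r)       # step opposite to dSr/db0 = -2*sum(r)
    b1 += lr * 2 * np.sum(r * x)   # step opposite to dSr/db1 = -2*sum(r*x)

print("closed form:      ", a0, a1)
print("gradient descent: ", b0, b1)   # converges to the same values
```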
@hirakmondal6174 4 years ago
@@numericalmethodsguy Thanks a lot for your reply. So both of these, i.e. OLS and gradient descent, can be used to achieve the same purpose, right?
@numericalmethodsguy 4 years ago
@@hirakmondal6174 No. You have to think of Sr (the sum of the squares of the residuals) as an objective function, and we can use a method such as GD to find where it is minimum.
@Jayesh-uf6th 3 years ago
Sir... thank you sir.
@A.K04 4 years ago
Thank you very much, Sir.
@muhammadkashim3229 4 years ago
Can you explain it for a model with 3 independent or explanatory variables?
@gp6957 1 year ago
Sir, I couldn't solve the matrix; please show how to solve it.
@numericalmethodsguy 1 year ago
Multiply equation (1) by the sum of xi, and equation (2) by n. Subtract, and you will get rid of the a0 unknown. You will get the equation for a1. To find a0, simply use equation (1) and write it in terms of a1, the sum of xi, and the sum of yi. You have already found a1. You can also look at the matrix form and use Cramer's rule. See equations 9.8.5 and 9.8.6 of math.libretexts.org/Bookshelves/Precalculus/Precalculus_(OpenStax)/09%3A_Systems_of_Equations_and_Inequalities/9.08%3A_Solving_Systems_with_Cramer's_Rule
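Carrying out that elimination (the same result Cramer's rule gives):

```latex
\begin{aligned}
(1)\quad n\,a_0 + a_1\sum x_i &= \sum y_i\\
(2)\quad a_0\sum x_i + a_1\sum x_i^2 &= \sum x_i y_i\\[4pt]
n\cdot(2) - \Bigl(\sum x_i\Bigr)\cdot(1):\qquad
a_1 &= \frac{n\sum x_i y_i - \sum x_i \sum y_i}{n\sum x_i^2 - \left(\sum x_i\right)^2},
\qquad a_0 = \bar{y} - a_1\,\bar{x}.
\end{aligned}
```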
@nipulsindwani117 4 years ago
Thanks professor
@samirah1534 4 years ago
Why is the derivative of the minimum error taken with respect to a0 and a1? I mean, what is the general theory behind differentiating w.r.t. a0 and a1?
@numericalmethodsguy 4 years ago
nm.mathforcollege.com/mws/gen/06reg/mws_gen_reg_txt_straightline.pdf
@samirah1534 4 years ago
@@numericalmethodsguy Thanks loads
@buttegowda 3 years ago
Thanks a lot sir
@col.aureliano7352 3 years ago
Where did the -1 come from at 6:35?
@numericalmethodsguy 3 years ago
Taking the derivative of (-a0) with respect to a0 gives -1. It is a chain rule example: if u=u(a), then d/da(u^2) = 2*u*du/da.
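In symbols, for one term of the sum:

```latex
\frac{\partial}{\partial a_0}\left(y_i - a_0 - a_1 x_i\right)^2
= 2\left(y_i - a_0 - a_1 x_i\right)\cdot\frac{\partial}{\partial a_0}\left(y_i - a_0 - a_1 x_i\right)
= 2\left(y_i - a_0 - a_1 x_i\right)\cdot(-1).
```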
@col.aureliano7352 3 years ago
@@numericalmethodsguy Yes, figured it out! But thanks for replying.
@manamsetty2664 1 year ago
Why do we add the errors?
@numericalmethodsguy 1 year ago
We cannot reduce each residual individually. If we reduce one, another will increase or decrease, and when you have many points it is hard to do that. So as a next step we say: let us add the residuals up and make the sum as small as possible. We find that this is not a good criterion. The sum of the absolute residuals is also not a good criterion. Both of these methods result in non-unique straight lines. Minimizing the sum of the squares of the residuals works, and it gives a unique straight line as well.
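A small numerical illustration of the reply's point (made-up data, not from the video): every line that passes through (xbar, ybar) has a residual sum of exactly zero, so minimizing the plain sum cannot pick a unique line, while the sum of squares does.

```python
import numpy as np

# Made-up data.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.2, 1.9, 3.2, 3.8, 5.1])
xbar, ybar = x.mean(), y.mean()

for a1 in (0.0, 0.5, 1.0, 2.0):      # several very different slopes
    a0 = ybar - a1 * xbar            # each line passes through (xbar, ybar)
    r = y - a0 - a1 * x              # residuals for that line
    print(f"slope {a1}: sum of residuals = {r.sum():+.1e}, "
          f"sum of squares = {(r**2).sum():.3f}")

# The plain sum of residuals is (numerically) zero for every one of these
# lines, so it cannot pick a unique best fit; the sum of squares can.
```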
@MuhammadHussain-ol4lw 4 years ago
Can anyone explain the a0 value - how is it obtained?
@numericalmethodsguy 4 years ago
Just look at the first equation and write a0 in terms of a1.
@jamesoseiowusu8212 4 years ago
Thanks, Prof, but you didn't derive a0.
@numericalmethodsguy 4 years ago
If you look at the equations, you already got a1 using Cramer's rule www.chilimath.com/lessons/advanced-algebra/cramers-rule-with-two-variables/ or by using Gaussian elimination symbolically. You will get a similar-looking expression for a0 by using Cramer's rule. But how I get the expression for a0 is just by using equation (1) without Cramer's rule, that is n*a0 + sum(xi)*a1 = sum(yi), and writing a0 in terms of a1. Also, sum(xi)/n = xbar and sum(yi)/n = ybar.
@romanemul1 3 years ago
Police line on the ground. DO NOT CROSS!
@ethanhunt987 5 years ago
I needed the solution of those equations where you stopped solving and wrote the formulas for a0 and a1; this video is not of much use for me.
@numericalmethodsguy 5 years ago
You can simply use Gaussian elimination symbolically to get the solution. Give it a try - it won't hurt. Or use the cofactor method as explained here: www.nabla.hr/MD-SysLinEquMatrics2.htm
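If you would rather let software do the symbolic elimination, here is a sketch with SymPy (my own illustration, not part of the video; Sx, Sy, Sxx, Sxy are placeholder symbols for the sums):

```python
import sympy as sp

# Placeholder symbols: Sx = sum(xi), Sy = sum(yi), Sxx = sum(xi^2), Sxy = sum(xi*yi).
a0, a1, n, Sx, Sy, Sxx, Sxy = sp.symbols('a0 a1 n Sx Sy Sxx Sxy')

# The two normal equations from the video.
eq1 = sp.Eq(n * a0 + Sx * a1, Sy)
eq2 = sp.Eq(Sx * a0 + Sxx * a1, Sxy)

sol = sp.solve([eq1, eq2], [a0, a1])
print(sol[a1])   # equivalent to (n*Sxy - Sx*Sy)/(n*Sxx - Sx**2)
print(sol[a0])   # equivalent to (Sxx*Sy - Sx*Sxy)/(n*Sxx - Sx**2)
```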
@numericalmethodsguy 5 years ago
Thank you. To get even more help, subscribe to the numericalmethodsguy channel, and go to MathForCollege.com/nm and MathForCollege.com/ma for more resources, and share the link with your friends through social media and email. Support the site by buying the textbooks at www.lulu.com/shop/search.ep?keyWords=autar+kaw&type= Follow my numerical methods blog at AutarKaw.org. You can also take a free online course at www.canvas.net/?query=numerical%20methods
@alexm9744 4 years ago
VERY well explained. Thanks so much!
@sathiyanarayanan7245 1 year ago
Thank you very much, sir.
@numericalmethodsguy 1 year ago
Most welcome. Thank you. Please subscribe and ask your friends to subscribe - our goal is to get 100,000 subscribers by the end of 2022. To get even more help, subscribe to the numericalmethodsguy channel ru-vid.com, and go to MathForCollege.com/nm and MathForCollege.com/ma for more resources. Follow the numerical methods blog at blog.AutarKaw.com. You can also take a free massive open online course (MOOC) on Numerical Methods at canvas.instructure.com/enroll/KYGTJR and on Introduction to Matrix Algebra at canvas.instructure.com/enroll/J4BFME. Please share these links with your friends and fellow students through social media and email. Support the channel if you are able to do so at ru-vid.com/store