
Assumptions of Linear Regression | What are the assumptions for a linear regression model 

Unfold Data Science
94K subscribers · 47K views

Published: 25 Oct 2024

Comments: 125
@UnfoldDataScience · 2 years ago
In this lecture, the 3rd point is "NO heteroscedasticity"; please do not get confused.
@manushrivetal6098 · 2 years ago
3. Homoscedasticity is one of the most critical assumptions. It states that the errors should be equally distributed (constant variance). It is not heteroscedasticity.
@UnfoldDataScience · 2 years ago
Access Hindi and English courses here: www.unfolddatascience.com/s/store. Please register on the website.
@jeremiahayegbusi2269 · 1 year ago
Best video on assumptions of OLS on RU-vid.
@aswanthshanmugam · 11 days ago
Thank you Sir. Very informative 🙏
@interestingstudies4422 · 2 years ago
Becoming a huge fan of this channel.. great explanation.
@anbesivam7686 · 3 years ago
I am actively searching for a job. Sometimes I feel like I won't get one, but after watching your videos I feel I have really learned something, and it gives me some confidence. Thanks for the video. Please keep sharing videos.
@UnfoldDataScience · 3 years ago
Best of luck, buddy. You will do it, mark my words. Cheers!
@abhinavkale4632 · 3 years ago
It would be really helpful if you could provide a video lecture in which you put all the assumptions to the test on any Kaggle dataset. Cheers.. great work, sir..
@UnfoldDataScience · 3 years ago
Hi Abhinav, what do you mean by putting all the assumptions to the test?
@abhinavkale4632 · 3 years ago
@UnfoldDataScience All the assumptions as in multicollinearity, normality of residuals, autocorrelation.. all these assumptions applied to a real dataset (basically checking all the assumptions in Python).
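A minimal sketch (not from the video) of checking exactly these three assumptions in Python with statsmodels and scipy; the file name "data.csv" and the target column "y" are hypothetical placeholders.

import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor
from statsmodels.stats.stattools import durbin_watson
from scipy import stats

df = pd.read_csv("data.csv")              # hypothetical dataset
X = df.drop(columns=["y"])                # numeric predictors
y = df["y"]                               # target

model = sm.OLS(y, sm.add_constant(X)).fit()
residuals = model.resid

# 1. Multicollinearity: VIF per predictor (values near 1 mean little collinearity).
X_const = sm.add_constant(X)
vif = pd.Series(
    [variance_inflation_factor(X_const.values, i) for i in range(1, X_const.shape[1])],
    index=X.columns,
)
print(vif)

# 2. Normality of residuals: Shapiro-Wilk test (a large p-value is consistent with normality).
print(stats.shapiro(residuals))

# 3. Autocorrelation of errors: Durbin-Watson statistic (values near 2 suggest none).
print(durbin_watson(residuals))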
@sktarikaziz6529 · 2 years ago
Good explanation, thanks.
@siddhawan5190 · 2 years ago
Your explanation is very good and easy to understand. Thanks for this awesome video.
@UnfoldDataScience · 2 years ago
Glad it was helpful!
@gauravkamble9702 · 3 years ago
You explained it in the easiest manner possible! Thanks for sharing this, Sir 😊
@UnfoldDataScience · 3 years ago
Welcome Gaurav.
@prachigupta1963 · 3 years ago
Clear explanation in very little time. Thank you 😊👍
@UnfoldDataScience · 3 years ago
Thanks Prachi. Keep watching.
@amitbudhiraja7498 · 2 years ago
Can you make a live demo video to check all of these assumptions on a dataset?
@yoyomovieclips8813 · 3 years ago
Clean, simple and precise explanation.
@UnfoldDataScience · 3 years ago
Thanks Neeraj.
@salikmalik7631 · 3 years ago
Great video. We need more videos like this, sir (assumptions of other algorithms)...
@UnfoldDataScience · 3 years ago
Thanks Salik. Will try to create.
@raofaizanali1548 · 2 years ago
Lovely, brother.. Thanks.
@UnfoldDataScience · 2 years ago
Thanks a lot.
@annonymous6555 · 2 years ago
Please make a video on assumptions of logistic regression
@deepakadik1210 · 3 years ago
Crystal clear explanation!! Hoping for more such content.
@UnfoldDataScience · 3 years ago
Thanks Deepak.
@nayanparnami8554 · 3 years ago
Very precise and very informative!! Thank you for sharing this, sir...
@UnfoldDataScience · 3 years ago
Thanks Nayan.
@deangibson5283 · 2 years ago
Very easy to understand. Please provide a video about RMSE
@UnfoldDataScience · 2 years ago
I have a video on R squared, not specific to RMSE; will create one.
@ashulohar8948 · 1 year ago
Please make a video on end-to-end projects for all the algorithms.
@kratiagrawal2742 · 2 years ago
Hi, could you please answer how we should approach this situation in a regression problem: the target variable is distributed in a skewed manner (50% of the values lie in the range 0-300, 30% in 300-500, and 10% in the remaining 500-1000). How would you approach such a scenario?
@nivednambiar6845 · 3 years ago
Good way of explaining.
@UnfoldDataScience · 3 years ago
Thanks Nived
@rahulramachandran761 · 3 years ago
You deserve more subscribers, Aman.. Your teaching is very good 👍👍
@UnfoldDataScience · 3 years ago
Thanks a lot Rahul.
@sadhnarai8757 · 3 years ago
Very much needed.
@UnfoldDataScience · 3 years ago
Thanks a lot :)
@divyanshuchaudhari5416 · 3 years ago
Good Explanation.
@UnfoldDataScience · 3 years ago
Thanks Divyanshu.
@sandhya_exploresfoodandlife · 3 years ago
Good explanation! Thank you!
@UnfoldDataScience · 3 years ago
Thanks Sandhya.
@ankurdubey960 · 3 years ago
You should explain the "why" behind the last three assumptions.
@UnfoldDataScience · 3 years ago
Sure, thanks for the feedback.
@ramub7657 · 2 years ago
Awesome explanation. Thank you, Aman. I have one question: aren't assumptions 3 and 4 the same, heteroscedasticity and no autocorrelation of errors?
@UnfoldDataScience · 2 years ago
No, not the same.
@UnfoldDataScience · 2 years ago
It's a slightly long explanation, sorry I am not able to write it here; will talk about it in a live session or the interview videos.
@sowjiadabala · 2 years ago
@UnfoldDataScience Sir, have you explained this in any of your videos?
@peterjohn2899 · 2 years ago
Sir.. please explain whether points 2 and 6 are different or not, and why?
@firassami7399 · 3 years ago
Thanks. I just have a couple of questions. 1- What are the disadvantages of multicollinearity? 2- In several cases, the distribution of the error vector does not follow the normal distribution. How can I deal with that?
@AshokKumar-rh7ey · 3 years ago
I have the same question..
@UnfoldDataScience · 3 years ago
I explained this in the multicollinearity video: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-ekuD8JUdL6M.html
@AMVSAGOs · 2 years ago
Hi Aman, I think we need a strong justification for points 3, 4 and 5: why should they not happen? I was asked in an interview and I was not able to justify points 3, 4 and 5. Could you please elaborate a little more on these points?
@rohitbhosale4614 · 3 years ago
Thanks! Please explain the 1st assumption.
@UnfoldDataScience · 3 years ago
Sure.
@sandipansarkar9211 · 2 years ago
Finished watching.
@umeshrawat8827 · 6 months ago
Why are these called assumptions? They seem to be mandatory conditions to ensure a better regression model...?
@RamanKumar-ss2ro · 3 years ago
Amazing content.
@UnfoldDataScience · 3 years ago
Thanks a lot :)
@manushrivetal6098 · 2 years ago
You are always great.
@yoyomovieclips8813 · 3 years ago
In the next video, please tell us how to check these assumptions and how to correct the data so that it follows them.
@UnfoldDataScience · 3 years ago
Sure Neeraj, will do.
@shadiyapp5552 · 1 year ago
Thank you sir ♥️
@UnfoldDataScience · 1 year ago
Most welcome
@sharanm5718 · 3 years ago
Similarly, do we have any assumptions and limitations for logistic regression?
@UnfoldDataScience · 3 years ago
Yes Sharan, that also comes under the regression umbrella.
@nickrogers4408 · 3 years ago
So clear and informative ❤️
@UnfoldDataScience · 3 years ago
Glad you think so Sai.
@mesfin.ayalew · 2 years ago
How can I check the assumptions in multiple linear regression with categorical independent variables?
@developerboy8341 · 3 years ago
Very good explanations, keep it up.
@UnfoldDataScience · 2 years ago
Thanks, will do!
@RohanDreamerz · 3 years ago
Great Explanation! So much clarity on concepts.
@UnfoldDataScience · 3 years ago
Glad it was helpful Rohan.
@_itachi7904 · 3 years ago
Excellent explanation. Can you demonstrate a linear regression?
@UnfoldDataScience · 3 years ago
ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-8PFt4Jin7B0.html
@ajaybandlamudi2932 · 2 years ago
Could you please answer my question: what are the similarities and differences between a generalised linear model (GLM) and a gradient boosting machine (GBM)?
@sandipansarkar9211 · 3 years ago
Great explanation.
@UnfoldDataScience · 3 years ago
Thanks Sandipan.
@nagnathsatav9978 · 3 years ago
Thank you for this video, clean explanation, was waiting for this. I would like a video on Kaggle, especially creating a submission file, using the SMOTE technique.
@UnfoldDataScience · 3 years ago
Thanks Nagnath. Sure :)
@genai-guru · 2 years ago
1. Linear relationship
2. Very low / no multicollinearity (independent variables correlated with each other)
3. Heteroscedasticity
4. No autocorrelation; normally distributed errors
5. All the observations are independent of each other
@UnfoldDataScience · 2 years ago
Thanks. 3rd point: "No heteroscedasticity".
@luistorres7297 · 2 years ago
Will (or could) heteroskedasticity imply autocorrelation?
@UnfoldDataScience · 2 years ago
Here heteroskedasticity is in the context of the error terms. It does not imply autocorrelation.
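A small sketch (an addition, not from the video) showing that the two are separate diagnostics run on the same residuals, assuming a fitted statsmodels OLS result named "model" as in the earlier sketch.

from statsmodels.stats.diagnostic import het_breuschpagan
from statsmodels.stats.stattools import durbin_watson

# Breusch-Pagan: does the residual variance depend on the regressors (heteroscedasticity)?
# A small p-value flags non-constant error variance.
lm_stat, lm_pvalue, f_stat, f_pvalue = het_breuschpagan(model.resid, model.model.exog)
print("Breusch-Pagan p-value:", lm_pvalue)

# Durbin-Watson: is there serial correlation between consecutive errors (autocorrelation)?
# Values near 2 indicate no first-order autocorrelation.
print("Durbin-Watson:", durbin_watson(model.resid))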
@anirbansarkar6306 · 3 years ago
Thank you so much for this wonderful content. It was really helpful. In the multicollinearity part, I have a small doubt. I understood from your example that it is better to remove one feature out of two if they are positively correlated. Does the same apply to negatively correlated features too? I mean, should I drop one feature in case two features are negatively correlated?
@UnfoldDataScience · 3 years ago
Thanks Anirban. Positive/negative correlation does not matter; we should not keep two variables which are highly correlated (in either direction).
@yashpandey5484 · 3 years ago
How do we remove multicollinearity from the dataset if the features are highly correlated? Can we solve the problem without removing any features or losing any information? Is PCA helpful?
@abhinavkale4632 · 3 years ago
I think we can use the VIF (variance inflation factor) and then decide which features should be included in the model. In addition, we also need to check the significance value from the OLS regression model. There is a threshold limit (generally for VIF
@arvindadari3390 · 3 years ago
PCA will definitely help to tackle multicollinearity, but you will lose interpretability.
@UnfoldDataScience · 2 years ago
To answer the question from Yash: there is very little information loss if you remove a highly correlated variable.
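One common recipe that matches this thread, sketched below (not from the video): iteratively drop the predictor with the highest VIF until everything falls under a chosen threshold. The threshold of 10 is a widely used rule of thumb, not a value given by the channel. PCA (e.g. sklearn.decomposition.PCA) is the alternative mentioned above: it removes the collinearity entirely, but at the cost of coefficient interpretability.

import pandas as pd
from statsmodels.stats.outliers_influence import variance_inflation_factor

def drop_high_vif(X: pd.DataFrame, threshold: float = 10.0) -> pd.DataFrame:
    """Drop the worst-offending column until all VIFs fall below the threshold."""
    X = X.copy()
    while X.shape[1] > 1:
        vif = pd.Series(
            [variance_inflation_factor(X.values, i) for i in range(X.shape[1])],
            index=X.columns,
        )
        if vif.max() <= threshold:
            break
        X = X.drop(columns=[vif.idxmax()])  # remove the most collinear predictor and re-check
    return X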
@dsklife · 3 years ago
Can you help me with regression models with multi-dimensional data?
@UnfoldDataScience · 3 years ago
It's there on my channel. Search with the channel name and topic.
@spandanswain2879 · 3 years ago
Is this applicable to all regression models or only to linear regression models?
@UnfoldDataScience · 3 years ago
Mostly all regression-based models.
@sreevidyahothur2313 · 3 years ago
Sir, may I ask a doubt? Should we consider multicollinearity in multiple linear regression on time series financial data?
@UnfoldDataScience · 3 years ago
Yes Srividya, if you are training a regression model.
@ultra_legend23 · 3 years ago
Should the data have homoscedasticity or heteroscedasticity? Other blogs say it should have homoscedasticity. Please clarify. Thanks.
@UnfoldDataScience · 2 years ago
No heteroscedasticity; that is what I gave an example of and said it should not happen. I may have missed writing "NO" before hetero in the 3rd point.
@sanketsanap1076 · 2 years ago
What are the assumptions for classification?
@UnfoldDataScience · 2 years ago
If it's logistic regression you are asking about, it will be mostly the same.
@AshokKumar-rh7ey · 3 years ago
My data has no multicollinearity (the value is 1) and is not normal. Can I run regression?? Please answer.
@UnfoldDataScience · 3 years ago
You can run it, but the model may not be very robust.
@rajeshvenaganti6797 · 3 years ago
What is meant by a linear relation? This is an interview question.
@UnfoldDataScience · 3 years ago
A linear relation means a y = mx + c kind of relationship.
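A tiny illustration with made-up numbers: a straight-line y = m*x + c relationship that a least-squares fit recovers.

import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = 3.0 * x + 2.0 + np.random.default_rng(1).normal(scale=0.1, size=5)  # true m = 3, c = 2, plus noise

m, c = np.polyfit(x, y, deg=1)   # fit y = m*x + c by least squares
print(m, c)                      # approximately 3 and 2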
@pixelff5044 · 3 years ago
Is this an assumption of OLS?
@UnfoldDataScience · 3 years ago
Yes. OLS is the internal methodology for linear regression (if we don't use gradient descent).
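A short sketch of that point (synthetic data, not from the video): scikit-learn's LinearRegression solves the OLS problem with a direct least-squares solve, while SGDRegressor reaches a similar fit by gradient descent.

import numpy as np
from sklearn.linear_model import LinearRegression, SGDRegressor

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + 4.0 + rng.normal(scale=0.1, size=500)

ols = LinearRegression().fit(X, y)                                      # closed-form OLS solution
sgd = SGDRegressor(max_iter=5000, tol=1e-6, random_state=0).fit(X, y)   # gradient-descent fit

print(ols.coef_, ols.intercept_)   # close to [2, -1, 0.5] and 4
print(sgd.coef_, sgd.intercept_)   # similar values after enough iterations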
@leecreations9133 · 8 months ago
The explanation is OK, but rather than just writing on the board, please provide notes for the students; that's my suggestion.
@abhishekanand5898 · 2 years ago
Because of this question, I was rejected in the final round interview for ZS Data Science Associate 😂
@UnfoldDataScience · 2 years ago
Now that you have understood, you will do great 😊
@arvindadari3390 · 3 years ago
Just a note: the relationship between the dependent and independent variables should be linear in terms of the coefficients, not necessarily in the variables. When we are doing polynomial regression, linearity between a variable and the target will not hold true, as we have raised-power terms.
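A small sketch of this note with synthetic data: the model below is quadratic in x but linear in the coefficients (b0, b1, b2), so ordinary linear regression can still fit it once x^2 is added as a feature.

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 100).reshape(-1, 1)
y = 1.5 * x[:, 0] ** 2 - 2.0 * x[:, 0] + rng.normal(scale=0.5, size=100)

X_poly = PolynomialFeatures(degree=2, include_bias=False).fit_transform(x)  # columns: x, x^2
model = LinearRegression().fit(X_poly, y)
print(model.intercept_, model.coef_)   # roughly 0 and [-2, 1.5]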
@UnfoldDataScience · 3 years ago
Thanks Arvind.
@veenap3682 · 2 years ago
Sir, can you please provide the notes?
@RajdeepBorgohainRajdeep · 2 years ago
I got this question today and couldn't answer the other assumptions! I felt really bad about myself.
@UnfoldDataScience · 2 years ago
No problem, it happens with all of us. Keep trying.
@RajdeepBorgohainRajdeep · 2 years ago
@UnfoldDataScience Thanks for motivating, Aman :)
@shreyashyadav1521 · 3 years ago
Not able to get a job in the field of data science. Don't know what to do.. Frustrated!!
@UnfoldDataScience · 3 years ago
Hi Shreyash, please make your resume strong. Learn new stuff, and also approach people for more opportunities.
@kalam_indian · 3 years ago
Can you explain more deeply by taking a real-life example? What you explained are the basics with no in-depth explanation, so if possible please explain in depth with a good example, even if the video becomes longer.
@UnfoldDataScience · 3 years ago
Thanks for the suggestion; will definitely take it into consideration.
@minhazuddin5169 · 3 years ago
You are extremely good at the kind of teaching I am looking for. Can I have your email address? I am from Bangladesh, a beginner in research (M.Phil). I'm struggling with some topics in data analysis. I would like to contact you if you approve. Thanks.
@UnfoldDataScience · 3 years ago
Please connect on LinkedIn.
@TJ-wo1xt · 3 years ago
Very generic explanation; you should cover these in a little more detail.
@UnfoldDataScience · 3 years ago
Thanks for the feedback, will try to cover it.