
Linear Regression + Mediation + Moderation 

Jekaterina Rogaten
70K views

Published: 4 Oct 2024

Comments: 23
@MahamEmadi 10 years ago
This video is amazing. It's gold. I spent weeks looking for a video like this, and here it is! Thank you very much, Jekaterina, for your thorough explanation. (Maham from Durham University)
@erentl 9 years ago
Thanks a lot for this helpful video. Totally illustrative.
@Mosbah81 8 years ago
One of the best videos on statistical analysis I have ever watched. Thanks a lot, Prof. Rogaten. In the video you said that you would give a link to another video related to moderation. Can we still have it?
@jekaterinarogaten3013 8 years ago
+Mosbah Salim Hi Mosbah, if you want to do a moderation analysis, the method I demonstrated in my video is quite old and out of fashion, but good for understanding the principle. It depends on what you need the analysis for and what the requirements are. For a dissertation, conference presentation or publication, the best way is to get the PROCESS macro for SPSS and use model 1. PROCESS can be downloaded from afhayes.com/. Another good video on moderation and mediation that I personally like a lot is by Andy Field: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-RqkGMqDU20Q.html.
@danielafuentes1369 10 years ago
Thank you so much for this in-depth explanation! It has been very helpful. I have one doubt I was hoping you could help with. How do I know I am running "support" as the moderator and not "hassles"? Is it the order in which one enters the data?
@rubabanawrin1250 8 years ago
Hi, I found your video very informative. Thanks a lot. I have a question on moderation: in your example, both before and after entering the variable named CHCS, the significance of the variable "support" is the same, i.e. insignificant. Can we still say that it has a moderating effect even if the significance status doesn't change?
@jekaterinarogaten3013 8 years ago
+Rubaba Nawrin Hi, yes, support in this case is a moderator. It is not a significant predictor of symptoms, and it makes sense that support should not predict symptoms. However, the centered variable CHCS, which is the moderating term for support, is significant. As such, support moderates the relationship between hassles and symptoms. I always find it easier to understand moderation by thinking about it in terms of an interaction in ANOVA. The first step shows the main effects, and the centered multiplied variable in step 2 is basically an interaction. As such, you can have no significant main effect (the predictor being non-significant) and a significant interaction (the moderator being significant). I hope I am not making it more confusing.
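A minimal Python (statsmodels) sketch of the two-step moderation setup described in the reply above; the column names hassles, support and symptoms and the file name are assumptions based on the example discussed, and this illustrates the idea rather than the exact SPSS procedure shown in the video.

import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("stress_data.csv")  # hypothetical data file

# Center the predictor and the moderator, then build the product term (CHCS).
df["hassles_c"] = df["hassles"] - df["hassles"].mean()
df["support_c"] = df["support"] - df["support"].mean()
df["chcs"] = df["hassles_c"] * df["support_c"]

# Step 1: main effects only; step 2: add the centered interaction term.
step1 = smf.ols("symptoms ~ hassles_c + support_c", data=df).fit()
step2 = smf.ols("symptoms ~ hassles_c + support_c + chcs", data=df).fit()
print(step1.rsquared, step2.rsquared)  # R2 before and after adding the interaction
print(step2.summary())
# A significant coefficient on chcs means support moderates the hassles -> symptoms
# relationship, even if support itself is not a significant predictor.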
@MaximaPax 8 years ago
Thank you very much for this video. I'm wondering why the Betas change with each additional predictor. How do you interpret the Betas in the 2nd model for the same predictors? I'm confused. Thank you for the explanation.
@BenG5292 8 years ago
Thank you for this video, it's very informative. I do, however, have a question. Suppose I have a number of control variables and independent variables that possibly predict a dependent variable, and I perform a hierarchical analysis as follows: model 1 (control variables), model 2 (model 1 + independent variables), model 3 (model 2 + interaction term). If I want to discuss the coefficients of individual variables, should I only consider the last model? Or should I, for instance, consider model 1 when I want to assess the coefficients of the control variables, model 2 to assess the coefficients of the independent variables, and so on? Thank you in advance.
@jekaterinarogaten3013 8 years ago
Ok, in short, the last option is correct. Basically, when you do a hierarchical regression you are doing an ANOVA but with scale rather than nominal variables. For each step of the model you are interested in two main statistics: the R2 change with its associated significance, and the standardised regression coefficients with their associated significance. With the regression coefficients you will have output for 3 models. The model 3 output, if you look at the whole model, is basically the output you would get if you ran a multiple linear regression with all your variables, and the whole model 2 output would be the same as a multiple regression with the control and independent variables. Therefore, if you need to determine the effect of any one IV on the DV, you would look for its effect in model 2. If you want to see whether the interaction was significant, then you look at the interaction term output in model 3. I hope that helps.
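A rough Python sketch of the three-step hierarchical regression described above, using statsmodels to obtain the R2 change and its significance at each step; all variable and file names (dv, control1, iv1, ...) are placeholders.

import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

df = pd.read_csv("study_data.csv")  # hypothetical data file

m1 = smf.ols("dv ~ control1 + control2", data=df).fit()                        # step 1: controls
m2 = smf.ols("dv ~ control1 + control2 + iv1 + iv2", data=df).fit()            # step 2: + IVs
m3 = smf.ols("dv ~ control1 + control2 + iv1 + iv2 + iv1:iv2", data=df).fit()  # step 3: + interaction

print("R2 step 1:", m1.rsquared)
print("R2 change step 2:", m2.rsquared - m1.rsquared)
print("R2 change step 3:", m3.rsquared - m2.rsquared)
print(anova_lm(m1, m2, m3))  # F-tests for each added block of predictors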
@shinx2ran 8 years ago
Thank you, this video is very helpful. I want to ask a question: suppose I have a latent variable called Creativity, and I measured it using a Likert scale. Creativity is measured with 4 question items: CR1, CR2, CR3, and CR4. Now in SPSS I have to compute CR1, CR2, CR3, and CR4 into Creativity in order to analyze it, but should it be the SUM or the MEAN?
@jekaterinarogaten3013 8 years ago
+shinx2ran It does not really matter whether you choose the mean or the sum. I usually work with means, as it is easier to understand and interpret what a score means for each participant, i.e. if their mean is 1.2 they are low on creativity and if their mean is 3.7 they are high. It is kind of intuitive. Now, computing the variable as latent is a different procedure from computing an overall variable. Having a latent variable allows you to control for measurement error, which a simple mean of the scores does not do. I only ever use latent variables when doing SEM. I think it is reasonably alright (and everyone does it) to just stick with the means for regression, mediation and moderation analyses, but do not call them latent variables, just call them variables. Also, if you run mediation and moderation you may consider using PROCESS (it is an add-on macro for SPSS). Andy Field beautifully describes how to use this add-on. Hope that helps.
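A small pandas sketch of computing the overall Creativity score as the mean (or sum) of the four items mentioned in the question; the file name is a placeholder.

import pandas as pd

df = pd.read_csv("survey.csv")  # hypothetical data file
items = ["CR1", "CR2", "CR3", "CR4"]

df["creativity"] = df[items].mean(axis=1)      # mean keeps the original Likert metric
df["creativity_sum"] = df[items].sum(axis=1)   # sum works too; only the scale differs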
@shinx2ran 8 years ago
Jekaterina Rogaten, sorry to bother you, I have another question about mean centering (to reduce multicollinearity). I am analyzing the effect of organizational identification on creative process engagement, with creative self-efficacy as the moderating variable. However, multicollinearity is high in my interaction score (VIF > 60). How do we do mean centering? I learned from another YouTube video to create a new variable by aggregating our variable, and after that to subtract the aggregated variable from the total (or mean; I use the mean) score of the variable. But the result is a mix of positive and negative values. How do we deal with this? I mean, is it normal to have negative values (especially in social research), or do I have to take the absolute value?
@jekaterinarogaten3013 8 years ago
+shinx2ran When you center a variable, let's say your overall group mean is 3 and you want to center on that group mean. Now participant 1 scored 2 and participant 2 scored 4. When you center, which way round you do the subtraction is not really a big deal as long as you are consistent, i.e. you can compute (P1 - group mean) and (P2 - group mean), which gives new centered scores of -1 (P1) and 1 (P2). The minus in this case is totally alright, as you can now say that participant 1 scored, let's say on creativity, 1 point below the average, whereas participant 2 scored 1 point above the average. Does that make sense? It is in effect the same as z-scores, but instead of dealing with SDs you can refer to the actual scale. Some people find it easier to interpret centered variables rather than z-scores.
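A minimal pandas sketch of the centering arithmetic in the reply above; the two scores come straight from the example (group mean 3, P1 = 2, P2 = 4).

import pandas as pd

scores = pd.Series({"P1": 2, "P2": 4}, name="creativity")
centered = scores - scores.mean()    # P1 -> -1 (below average), P2 -> +1 (above average)
z_scores = centered / scores.std()   # the same idea expressed in SD units
print(centered)
print(z_scores)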
@MaximaPax 8 years ago
One more question: is it OK to add the interaction in the same model as the main effects? What difference does that make for the interpretation?
@jekaterinarogaten3013 8 years ago
Ok, you can put the main effects and interactions together in one step, and the standardised Beta coefficients will be the same as in the last step of the hierarchical regression. The reason you do a hierarchical rather than a standard multiple regression is that in a hierarchical regression you get additional R2 statistics for each step of the model (R2 and R2 change). Basically, what you will get is the R2 for the main effects (with its significance level) and then a separate R2 change for the interactions (with its significance level). In terms of the Betas, you will interpret the interaction Beta as the change in the relationship between the predictor and the DV as a result of the moderator. If it is significant you have moderation, and if it is not, then you do not have moderation. It is also possible to draw graphs, and it is easier to draw them if you evaluate the moderator at -1, 0 and +1 standard deviations. Hope that helps.
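A hedged sketch of the graphing idea mentioned above: simple slopes of the predictor at -1 SD, the mean, and +1 SD of the moderator. The column and file names are the same hypothetical ones as in the earlier sketch, and the model is an ordinary OLS with a centered interaction term.

import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("stress_data.csv")  # hypothetical data file
for col in ["hassles", "support"]:
    df[col + "_c"] = df[col] - df[col].mean()

fit = smf.ols("symptoms ~ hassles_c * support_c", data=df).fit()

sd = df["support_c"].std()
for level, label in [(-sd, "-1 SD"), (0.0, "mean"), (sd, "+1 SD")]:
    slope = fit.params["hassles_c"] + fit.params["hassles_c:support_c"] * level
    print(f"simple slope of hassles at support {label}: {slope:.3f}")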
@yinuojin999 8 years ago
Can I use your method for conducting a moderated regression with multiple IVs and multiple interaction terms simultaneously? They are all on scale measurement. You mentioned PROCESS is a more up-to-date way to do this, but it only allows the input of one IV, so I am kind of stuck.
@jekaterinarogaten3013 8 years ago
Hi, in short, yes, you can use multiple IVs and interactions (moderators) when doing a hierarchical regression. Another alternative would be to use PROCESS and estimate a separate model for each IV, using the other IVs as covariates. For example, if we want to see how approaches to studying (deep, strategic and surface) predict academic performance, and we want to use prior academic performance as a moderator, we can estimate one model for each approach to studying, using the other two approaches as covariates. I hope this helps.
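A hedged Python sketch of the "one model per IV, other IVs as covariates" strategy described above; deep, strategic, surface, prior and performance are placeholder column names, and this only loosely mirrors what PROCESS model 1 would estimate.

import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("studying.csv")  # hypothetical data file
approaches = ["deep", "strategic", "surface"]

for iv in approaches:
    covariates = [a for a in approaches if a != iv]
    formula = f"performance ~ {iv} * prior + " + " + ".join(covariates)
    fit = smf.ols(formula, data=df).fit()
    print(iv, "x prior interaction p =", round(fit.pvalues[f"{iv}:prior"], 4))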
@yinuojin999 8 years ago
+Jekaterina Rogaten I've run multiple analyses with one IV at a time in PROCESS, and they all have significant coefficients. Then I did an ordinary multiple regression and included all the predictors, and only two of the predictors remained significant. From that I also know the overall model fit (R2). But with PROCESS I don't know the overall model fit, since PROCESS only creates the interaction between the specified IV and the specified moderator in the M variables section. The other interactions with the IVs (listed in the covariates section) are missing. So what I am trying to say is that the estimates (beta coefficients) of a given predictor tend to change when more (strong) predictors enter the model. So if the model is missing some interaction terms, how can we say that the estimates are correct for the whole model? In your example, how do you combine the results of the three models involving the three different study approaches, since they all have different R squared values? Thank you in advance!
@yinuojin999 8 years ago
+Jekaterina Rogaten To give you more context, I have 5 IVs, 4 moderators and 9 interaction terms in total. If I switched out the IV every time in PROCESS, I would end up with 5 sets of model results with different R squared values. The whole equation containing the 5 IVs, 4 moderators and the 9 interaction terms cannot be estimated in one analysis in PROCESS, as far as I am aware.
@jekaterinarogaten3013 8 years ago
That is a limitation of PROCESS, and in my case I just said that this model is for one particular IV, controlling for the effect of the other IVs.
@jekaterinarogaten3013 8 years ago
I am a bit confused as to what exactly it contains, as moderators are interactions, so it is the same thing. Basically you will have 5 IVs (main effects), which will give you ten 2-way interactions and then a number of 3-way interactions, and so on. Technically you could test all possible interactions, but pretty much beyond 3-way interactions the results will be largely uninterpretable. The beauty of testing interactions with regression is that you can specify which interactions you are testing, and your decision will be theory driven. You cannot control for interaction effects in one model in the same way as you control for other IVs by entering them as covariates. I would just stick with 5 separate models, one for each IV.
@pah0831 9 years ago
horrible illustration!