====== Partial and semi-partial correlation ======

references
{{https://web.stanford.edu/~hastie/Papers/ESLII.pdf|The Elements of Statistical Learning}} or local copy
[{{  :pasted:20191127-150222.png?250}}]
A simple explanation of the procedure below:
  * Separately regress Y and X1 against X2, that is, remove what X2 explains from each and keep the residuals.
  * Regress the Y residuals against the X1 residuals.
In the example below,
  * regress gpa against sat (and get residuals of gpa = a + b)
  * regress clep against sat (and get residuals of clep = b + c)
  * regress the gpa residuals against the clep residuals (''%%lm(a+b ~ b+c)%%''; see the sketch below)
  * In this case, $r^{2} = \displaystyle \frac{b}{(a+b)}$ and $b$ is very small.

Take a close look at the right graph, especially the ''%%b%%'' area: it is very small, although clep significantly explains gpa (before controlling for sat).
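
The same three steps as a minimal R sketch (it assumes a data frame ''tests'' with columns ''gpa'', ''sat'', and ''clep''; the object names are ours):
<code>
# partial correlation of gpa and clep, controlling sat
res.gpa.sat  <- lm(gpa ~ sat, data = tests)$residuals   # gpa with sat removed
res.clep.sat <- lm(clep ~ sat, data = tests)$residuals  # clep with sat removed
summary(lm(res.gpa.sat ~ res.clep.sat))  # regress residuals against residuals
cor(res.gpa.sat, res.clep.sat)           # the same relationship as a single r
</code>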
  
For more, see https://stats.stackexchange.com/questions/28474/how-can-adding-a-2nd-iv-make-the-1st-iv-significant
===== regression gpa against sat =====
  
linear model
''y hat = 0.0024 X + 1.7848''
''gpa hat = 0.0024 sat + 1.7848''
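
As a quick check, the fitted values can be rebuilt from the coefficients by hand (a minimal sketch; it reuses ''lm.gpa.sat'' and ''sat'' from the surrounding code):
<code>
b <- coef(lm.gpa.sat)           # intercept (about 1.7848) and slope (about 0.0024)
head(b[1] + b[2] * sat)         # y hat computed manually
head(lm.gpa.sat$fitted.values)  # should be identical
</code>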
  
  
<code>> cor(sat, gpa)
[1] 0.7180529</code>
  
Collect
  - sat,
  - gpa,
  - predicted values (y hat),
  - residuals (errors)
and see the correlations among them.
<code>
> cor.gpa.sat <- as.data.frame(cbind(sat, gpa, lm.gpa.sat$fitted.values, lm.gpa.sat$residuals))
> colnames(cor.gpa.sat) <- c("sat", "gpa", "pred", "resid")
> round(cor.gpa.sat,5)
10  550 2.9 3.13544 -0.23544
> round(cor(cor.gpa.sat),4)
        sat    gpa   pred  resid
sat   1.0000 0.7180 1.0000 0.0000
gpa   0.7180 1.0000 0.7180 0.6960
pred  1.0000 0.7180 1.0000 0.0000
resid 0.0000 0.6960 0.0000 1.0000
  
</code>
Note that
  * r (sat and gpa) = .718 (r<sup>2</sup> = 0.5156)
  * r (sat and pred) = 1. In other words, the predicted values (y hats) are a linear function of the x (sat) values (''y hat = 0.0024 X + 1.7848'').
  * r (sat and resid) = 0. The residuals are orthogonal to the independent variable (sat).
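
The last two identities are easy to verify from ''cor.gpa.sat'' (a minimal check):
<code>
cor(cor.gpa.sat$sat, cor.gpa.sat$pred)   # exactly 1: pred is a linear function of sat
cor(cor.gpa.sat$sat, cor.gpa.sat$resid)  # 0 up to rounding: residuals are orthogonal to sat
</code>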
===== regression gpa against clep =====
<code># import test score data "tests_cor.csv"
Residual standard error: 0.1637 on 8 degrees of freedom
Multiple R-squared:  0.7679, Adjusted R-squared:  0.7388 
F-statistic: 26.46 on 1 and 8 DF,  p-value: 0.0008808
</code>

''y hat = 0.06054 * clep + 1.17438''
  
<code>
res.lm.gpa.clep <- lm.gpa.clep$residuals
</code>

{{lm.gpa.clep.png?500}}

<code>
# get cor between clep, gpa, pred, and resid from lm.gpa.clep
cor.gpa.clep <- as.data.frame(cbind(clep, gpa, lm.gpa.clep$fitted.values, lm.gpa.clep$residuals))
colnames(cor.gpa.clep) <- c("clep", "gpa", "pred", "resid")
cor(cor.gpa.clep)
</code>
<code>
> round(cor(cor.gpa.clep),4)
        clep    gpa   pred  resid
clep  1.0000 0.8763 1.0000 0.0000
gpa   0.8763 1.0000 0.8763 0.4818
pred  1.0000 0.8763 1.0000 0.0000
resid 0.0000 0.4818 0.0000 1.0000

> round(cor(cor.gpa.sat),4)
        sat    gpa   pred  resid
sat   1.0000 0.7180 1.0000 0.0000
gpa   0.7180 1.0000 0.7180 0.6960
pred  1.0000 0.7180 1.0000 0.0000
resid 0.0000 0.6960 0.0000 1.0000
</code>
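
Note the gpa-resid entries in both matrices: the correlation between y and its own residuals is $\sqrt{1 - r^{2}}$, so both values can be recovered from the simple correlations (a quick check):
<code>
sqrt(1 - 0.7180529^2)  # 0.6960, gpa vs. resid in the sat model
sqrt(1 - 0.8763^2)     # 0.4818, gpa vs. resid in the clep model
</code>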
  
  
Regressing gpa on both clep and sat simultaneously (see the sketch below) gives:

''Multiple R-squared:  0.7778''
''F (2, 7) = 12.25, p = 0.005157''

''intercept 1.1607560, p = 0.0249''
''clep 0.0729294, p = 0.0239''
''sat 0.0007015, p = 0.5940''

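A minimal sketch of the model behind these figures (the object name ''lm.gpa.clep.sat'' is ours, not the page's):
<code>
lm.gpa.clep.sat <- lm(gpa ~ clep + sat)  # gpa on both predictors at once
summary(lm.gpa.clep.sat)                 # R-squared .7778, F(2,7) = 12.25
</code>
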
One other thing that we could do to help settle the argument pragmatically is to regress GPA on both SAT and CLEP at the same time and see what happens. If we do that, we find that the R-square for the model is .78, F = 12.25, p < .01. The intercept and the b weight for CLEP are both significant, but the b weight for SAT is not. The values are:

  * ''Intercept = 1.16, t = 2.844, p < .05''
  * ''CLEP = 0.07, t = 2.874, p < .05''
  * ''SAT = 0.0007, t = 0.558, n.s.''

In this case, we would conclude that the significant unique predictor is CLEP. Although SAT is highly correlated with GPA, it adds nothing to the prediction equation once the CLEP score is entered. (These data are fictional and the sample size is much too small to run this analysis; it is here for illustration only.)

Now suppose we wanted to argue something a little different. Suppose we had a theory that said that all measures of math achievement share a common explanation, which is math ability: the reason that various (all) math achievement tests are correlated is that they share the math ability factor. In other words, math ability explains the correlation between achievement tests. In path diagram form, we might represent it like this:

===== checking partial cor 1 =====
<code>