multiple_regression
multiple_regression — revisions 2022/11/02 21:58 and 2023/10/19 08:39 (current), by hkimscil
====== e.g. ======
Data set again.

<code>
datavar <- read.csv("
</code>

^ DATA for regression analysis ^
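The csv URL above is truncated in this copy of the page, so as a stand-in the same workflow can be sketched with R's built-in mtcars data (the dataset and variable choices here are assumptions, not the page's datavar):

```r
## Sketch of the regression workflow with built-in data
## (mtcars stands in for datavar, whose URL is truncated here)
datavar <- mtcars
model <- lm(mpg ~ wt + hp, data = datavar)  # DV: mpg; IVs: wt, hp
summary(model)                              # coefficients, R-squared, overall F
```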
Then how much explanatory power is unique to each independent variable?
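One way to answer this in R is to compare the full model's R<sup>2</sup> with the R<sup>2</sup> of models that drop one IV at a time; the drop is that IV's squared semipartial correlation (sr<sup>2</sup>), its unique share. A minimal sketch, using mtcars as an assumed stand-in dataset:

```r
## Unique explanatory power of each IV via squared semipartial correlation.
## mtcars stands in for the page's data set (an assumption).
full  <- lm(mpg ~ wt + hp, data = mtcars)
no.wt <- lm(mpg ~ hp,      data = mtcars)
no.hp <- lm(mpg ~ wt,      data = mtcars)
r2    <- function(m) summary(m)$r.squared

sr2.wt <- r2(full) - r2(no.wt)  # R^2 lost when wt is dropped = wt's unique share
sr2.hp <- r2(full) - r2(no.hp)  # hp's unique share
## squared partial correlation: unique share relative to what the other IV leaves
pr2.wt <- sr2.wt / (1 - r2(no.wt))
c(sr2.wt, sr2.hp, pr2.wt)
```

Because the two IVs are correlated, sr2.wt + sr2.hp is smaller than the full model's R<sup>2</sup>; the remainder is variance the IVs explain jointly.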
- | |||
- | ====== Why overall model is significant while IVs are not? ====== | ||
- | see https:// | ||
- | |||
- | < | ||
- | RSS = 3:10 #Right shoe size | ||
- | LSS = rnorm(RSS, RSS, 0.1) #Left shoe size - similar to RSS | ||
- | cor(LSS, RSS) # | ||
- | |||
- | weights = 120 + rnorm(RSS, 10*RSS, 10) | ||
- | |||
- | ##Fit a joint model | ||
- | m = lm(weights ~ LSS + RSS) | ||
- | |||
- | ##F-value is very small, but neither LSS or RSS are significant | ||
- | summary(m) | ||
- | </ | ||
- | |||
- | |||
<code>
> LSS = rnorm(RSS, RSS, 0.1)  # Left shoe size - similar to RSS
> cor(LSS, RSS)
[1] 0.9994836
>
> weights = 120 + rnorm(RSS, 10*RSS, 10)
>
> ## Fit a joint model
> m = lm(weights ~ LSS + RSS)
> summary(m)

Call:
lm(formula = weights ~ LSS + RSS)

(output truncated in this copy; recoverable parts:
 LSS estimate -14.162, neither IV significant;
 Residual standard error: 7.296 on 5 degrees of freedom)

> ## Fitting RSS or LSS separately gives a significant result.
> summary(lm(weights ~ LSS))

Call:
lm(formula = weights ~ LSS)

(output truncated in this copy; recoverable part:
 Residual standard error: 7.026 on 6 degrees of freedom)
</code>
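The transcript above was generated without a seed, so its exact numbers cannot be reproduced. A reproducible sketch of the same demonstration (the seed and the p-value extraction are additions to the original code):

```r
## Reproducible version of the shoe-size demonstration above.
## set.seed is an addition; any seed shows the same pattern.
set.seed(101)
RSS <- 3:10
LSS <- rnorm(length(RSS), RSS, 0.1)      # nearly identical to RSS
weights <- 120 + rnorm(length(RSS), 10 * RSS, 10)

joint <- summary(lm(weights ~ LSS + RSS))
alone <- summary(lm(weights ~ LSS))

f <- joint$fstatistic                    # overall F test of the joint model
p.overall <- pf(f["value"], f["numdf"], f["dendf"], lower.tail = FALSE)

cor(LSS, RSS)                            # near 1: severe multicollinearity
p.overall                                # small: the model as a whole is significant
joint$coefficients[, "Pr(>|t|)"]         # individual IV p-values are typically large
alone$coefficients["LSS", "Pr(>|t|)"]    # LSS alone is clearly significant
```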
^ Standard Multiple Regression ^^
| r<sup>2</sup>  | IV<sub>1</sub> |
| :::            | IV<sub>2</sub> |
| sr<sup>2</sup> | IV<sub>1</sub> |
| :::            | IV<sub>2</sub> |
| pr<sup>2</sup> | IV<sub>1</sub> |
| :::            | IV<sub>2</sub> |
multiple_regression.1667393881.txt.gz · Last modified: 2022/11/02 21:58 by hkimscil