===== r-square =====
  * $\displaystyle r^2=\frac{SS_{total}-SS_{res}}{SS_{total}} = \frac{\text{Explained sample variability}}{\text{Total sample variability}}$
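
A minimal Python sketch of this identity (not part of the original page): it fits a least-squares line to a small, made-up data set and computes $r^2$ as explained over total sample variability. The x and y values are hypothetical.

<code python>
import numpy as np

# Hypothetical data, for illustration only
x = np.array([1., 2., 3., 4., 5.])
y = np.array([2., 3., 5., 4., 6.])

# Least-squares slope and intercept
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x

ss_total = np.sum((y - y.mean()) ** 2)  # total sample variability
ss_res = np.sum((y - y_hat) ** 2)       # residual (unexplained) variability

r_squared = (ss_total - ss_res) / ss_total  # explained / total
print(r_squared)
</code>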
  
  
  
===== Adjusted r-square =====
  * $\displaystyle r^2=\frac{SS_{total}-SS_{res}}{SS_{total}} = 1 - \frac{SS_{res}}{SS_{total}} $
  
      * the R<sup>2</sup> value goes down, which means
      * more IVs are not always good
  * Therefore, the Adjusted r<sup>2</sup> = 1 - (.367 / 1.5) = 0.756 (green color cell; see the check below)
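
As a hedged check of the line above: assuming .367 and 1.5 are the residual and total variance estimates ($SS_{res}/df_{res}$ and $SS_{total}/df_{total}$) from the worked example the text refers to (the green cell), the arithmetic is:

<code python>
# Values taken from the worked example cited in the text (assumed to be
# variance estimates, i.e., SS divided by their degrees of freedom)
var_res = 0.367    # SS_res / (n - k - 1)
var_total = 1.5    # SS_total / (n - 1)

adj_r2 = 1 - var_res / var_total
print(round(adj_r2, 3))  # 0.755 here; the text's 0.756 reflects less-rounded inputs
</code>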
  
===== Slope test =====
If we take a look at the ANOVA result:
  
| b Dependent Variable: y    |||||||
<WRAP clear />
F-test recap (a short numeric sketch follows this list):
  * ANOVA, F-test, $F=\frac{MS_{between}}{MS_{within}}$
    * MS_between?
    * MS_within?
  * In regression, the residual is what corresponds to the "within" part
    * $s = \sqrt{s^2} = \sqrt{\frac{SS_{res}}{n-2}} $
    * because this SS<sub>res</sub> is what expresses the random difference (MS<sub>within</sub>): $s^2 = \frac{SS_{res}}{n-2} $
  * MS for regression . . . Obtained difference
    * apply the same procedure as above to the MS for <del>residual</del> regression.
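
A minimal Python sketch of this recap, assuming a simple regression with one IV, so the regression df is 1 and the residual df is n - 2. The x and y values are made up and are not the data behind the ANOVA table above.

<code python>
import numpy as np

# Hypothetical data, for illustration only
x = np.array([1., 2., 3., 4., 5., 6.])
y = np.array([2., 4., 3., 6., 5., 7.])
n = len(x)

# Least-squares fit
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x

ss_reg = np.sum((y_hat - y.mean()) ** 2)  # obtained (explained) difference
ss_res = np.sum((y - y_hat) ** 2)         # random (residual) difference

ms_reg = ss_reg / 1        # df for regression = number of IVs = 1
ms_res = ss_res / (n - 2)  # df for residual = n - 2

F = ms_reg / ms_res        # analogue of MS_between / MS_within
print(F)
</code>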
  
  * t-test
    * $\displaystyle t=\frac{b_{1} - \text{Hypothesized value of }\beta_{1}}{s_{b_{1}}}$
    * The hypothesized value of $\beta_{1}$ is usually 0, so the t value becomes
    * $\displaystyle t=\frac{b_{1}}{s_{b_{1}}}$
    * The standard error (se) of the slope is obtained as follows; a numeric check is sketched after the derivation.

\begin{eqnarray*}
\displaystyle s_{b_{1}} & = & \sqrt {\frac {MSE}{SS_{X}}} \\
 & = & \displaystyle \sqrt { \frac{1}{n-2} * \frac{SSE}{SS_{X}}} \\
 & = & \displaystyle \sqrt { \frac{1}{n-2} * \frac{ \Sigma{(Y-\hat{Y})^2} }{ \Sigma{ (X_{i} - \bar{X})^2 } } } \\
\end{eqnarray*}
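
A minimal Python sketch of the slope test just derived, again on made-up x and y values: it computes MSE with n - 2 degrees of freedom, the standard error $s_{b_{1}} = \sqrt{MSE/SS_{X}}$, and $t = b_{1}/s_{b_{1}}$ against the hypothesized $\beta_{1} = 0$.

<code python>
import numpy as np

# Hypothetical data, for illustration only
x = np.array([1., 2., 3., 4., 5., 6.])
y = np.array([2., 4., 3., 6., 5., 7.])
n = len(x)

# Least-squares fit
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x

sse = np.sum((y - y_hat) ** 2)     # SSE = sum of squared errors
ssx = np.sum((x - x.mean()) ** 2)  # SS_X
mse = sse / (n - 2)                # MSE with n - 2 degrees of freedom

se_b1 = np.sqrt(mse / ssx)  # standard error of the slope
t = b1 / se_b1              # t statistic for H0: beta_1 = 0
print(se_b1, t)
</code>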
  
  
^ X  ^ Y  ^ $X-\bar{X}$  ^ ssx  ^ sp  ^ y<sub>predicted</sub>  ^ error  ^ error<sup>2</sup>  ^
SSE = Sum of Squared Errors
The standard error of the slope beta (b) is obtained as follows.

\begin{eqnarray*}
se_{\beta} & = & \frac {\sqrt{SSE/(n-2)}}{\sqrt{SSX}} \\