100% Correct Answers
Slope Coefficient (b̂1) for the Regression Line - CORRECT ANSWER✔✔Describes the change in Y
for a one-unit change in X. It can be positive, negative, or zero, depending on the relationship
between the regression variables. The slope term is calculated as:
b̂1 = Cov(X,Y) / σ²(X)
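The slope formula can be checked numerically. This is a minimal sketch using hypothetical sample data; the values are illustrative only:

```python
import numpy as np

# Hypothetical sample data (illustrative only)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# b̂1 = Cov(X, Y) / Var(X); the ddof choice cancels as long as it matches in both
cov_xy = np.cov(x, y, ddof=1)[0, 1]
var_x = np.var(x, ddof=1)
b1 = cov_xy / var_x

# Cross-check against numpy's least-squares fit
slope_check = np.polyfit(x, y, 1)[0]
assert abs(b1 - slope_check) < 1e-9
```

Note that sample vs. population covariance (ddof = 1 vs. 0) does not matter here, since the same divisor appears in numerator and denominator.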
Durbin-Watson (DW) Statistic - CORRECT ANSWER✔✔DW = [∑(t=2 to T) (ε̂t − ε̂t−1)²] / [∑(t=1 to T) ε̂t²]
OR, if the sample size is very large:
**DW ≈ 2(1 − ρ)** --> This is more likely on the test
Where:
ε̂t = residual for period t
ρ = correlation coefficient between residuals from one period and those from the previous
period
-DW ≈ 2 if the error terms are homoskedastic and not serially correlated (ρ = 0)
-DW < 2 if the error terms are POSITIVELY serially correlated (ρ > 0)
-DW > 2 if the error terms are NEGATIVELY serially correlated (ρ < 0)
Test Procedure for Positive Serial Correlation:
-H0: the regression has no positive serial correlation
-->If DW < dl, the error terms are positively serially correlated (i.e., reject the null hypothesis of
no positive serial correlation).
-->If dl < DW < du, the test is inconclusive.
-->If DW > du, there is no evidence that the error terms are positively correlated. (i.e., fail to
reject the null of no positive serial correlation).
Where:
dl = Lower Critical DW value
du = Upper Critical DW Value
**Remember: Cannot use DW to test for Serial Correlation in an AR model**
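Both the exact DW formula and the large-sample shortcut can be sketched with numpy. The residual series below is hypothetical, chosen to alternate in sign so DW comes out above 2 (negative serial correlation):

```python
import numpy as np

# Hypothetical residual series (illustrative only)
resid = np.array([0.5, -0.3, 0.2, -0.4, 0.1, 0.3, -0.2, 0.4])

# Exact DW: sum of squared first differences over sum of squared residuals
dw = np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)

# Large-sample shortcut: DW ≈ 2(1 − ρ), where ρ is the correlation
# between each residual and the previous period's residual
rho = np.corrcoef(resid[1:], resid[:-1])[0, 1]
dw_approx = 2 * (1 - rho)
```

Because the residuals flip sign from period to period, ρ is negative and DW lands above 2, matching the decision rule in the card.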
Total Sum of Squares (SST) - CORRECT ANSWER✔✔Measures the total variation in the
dependent variable. SST is equal to the sum of the squared differences between the actual Y-
values and the mean of Y (Ȳ).
SST = ∑(Yi−Ȳ)^2
Where:
Ȳ = Mean of Y
Remember:
SST = RSS + SSE
Regression Sum of Squares (RSS) - CORRECT ANSWER✔✔Measures the variation in the
dependent variable that is explained by the independent variable. RSS is the sum of the squared
distances between the predicted Y-values and the mean of Y.
This is the "explained" portion of the variation of the dependent variable
RSS = ∑(Ŷi−Ȳ)^2
Where:
Ŷ = Predicted Y-Values
Ȳ = Mean of Y
Sum of Squared Errors (SSE) - CORRECT ANSWER✔✔Measures the UNEXPLAINED variation in the
dependent variable, i.e., the variation NOT explained by the independent variable. It's also known
as the sum of squared residuals or the residual sum of squares. SSE is the sum of the squared
vertical distances between the actual Y-values and the predicted Y-values on the regression line.
Regression lines minimize the SSE measure for a given scatter plot of data.
SSE = ∑(Yi−Ŷi)^2
Where:
Ŷ = Predicted Y-Values
Total Variation (SST) - CORRECT ANSWER✔✔Total Variation = Explained Variation + Unexplained
Variation
or
SST = RSS + SSE
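The SST = RSS + SSE decomposition can be verified numerically. This sketch reuses the hypothetical x/y data from above and fits the line with numpy's least-squares routine:

```python
import numpy as np

# Hypothetical data and OLS fit (illustrative only)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
b1, b0 = np.polyfit(x, y, 1)
y_hat = b0 + b1 * x

sst = np.sum((y - y.mean()) ** 2)      # total variation
rss = np.sum((y_hat - y.mean()) ** 2)  # explained variation
sse = np.sum((y - y_hat) ** 2)         # unexplained variation

assert np.isclose(sst, rss + sse)      # SST = RSS + SSE
```

The identity holds exactly (up to floating-point error) for any OLS fit that includes an intercept.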
Standard Error of Estimate (SEE) - CORRECT ANSWER✔✔The Standard Error of Estimate (SEE)
for a regression is the standard deviation of its residuals. SEE measures the accuracy of predicted
values from the regression equation.
Remember: The lower the SEE, the better the model fit. In this way, SEE is a metric of quality of
model fit.
Remember: A LOW SEE implies a HIGH R^2
SEE = (MSE)^(1/2) = (SSE/(n−k−1))^(1/2)
Where, in the regression ANOVA table, MSE = SSE/(n−k−1) for n observations and k independent variables
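A quick numerical sketch of SEE, continuing the hypothetical example above (n = 5 observations, k = 1 independent variable, SSE = 0.092 from that fit):

```python
import numpy as np

# Hypothetical values carried over from the example fit above
n, k = 5, 1
sse = 0.092                        # sum of squared residuals

# SEE = sqrt(SSE / (n − k − 1)): the standard deviation of the residuals,
# adjusted for the degrees of freedom used by the regression
see = np.sqrt(sse / (n - k - 1))
```

A smaller SEE means the observations sit closer to the fitted line, which is why a low SEE goes hand in hand with a high R².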
Mean Squared Error (MSE) - CORRECT ANSWER✔✔MSE measures the average of the squares of
the errors—that is, the average squared difference between the estimated values and the actual
value. MSE is a risk function, corresponding to the expected value of the squared error loss.
MSE = Mean Squared Error = (1/n)(∑(Yi−Ŷ)^2)
or
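The general (1/n) average form of MSE can be sketched directly. The actual and predicted values below are hypothetical, matching the example fit used earlier:

```python
import numpy as np

# Hypothetical actual vs. predicted values (illustrative only)
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
y_hat = np.array([2.10, 4.06, 6.02, 7.98, 9.94])

# MSE = average of the squared differences between actual and predicted values
mse = np.mean((y - y_hat) ** 2)
```

Note the contrast with the ANOVA-table version used for SEE: dividing by n gives the plain average squared error, while dividing by n−k−1 adjusts for the regression's degrees of freedom.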