The Least Squares Approach - ANSWER- Statistical technique to determine the line of
best fit for a model
- looks for the regression line that MINIMIZES the sum of the squared deviations (the SSE)
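As a quick sketch (the data values here are illustrative, not from the notes), the least squares slope and intercept can be computed directly from the deviation sums:

```python
# Minimal sketch of a least squares fit for simple linear regression.
# Example data below are made up for illustration.

def least_squares(x, y):
    """Return (b0, b1) minimizing the sum of squared deviations."""
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    # slope: b1 = sum((x - x_bar)(y - y_bar)) / sum((x - x_bar)^2)
    sxy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
    sxx = sum((xi - x_bar) ** 2 for xi in x)
    b1 = sxy / sxx
    b0 = y_bar - b1 * x_bar  # the fitted line passes through (x_bar, y_bar)
    return b0, b1

b0, b1 = least_squares([1, 2, 3, 4], [2, 4, 6, 8])
print(b0, b1)  # -> 0.0 2.0
```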
Interpret B0 and B1 - ANSWERB0: the y-intercept (the value of y when x = 0)
B1: the slope (for every 1-unit increase in x, y increases/decreases by the
slope, B1)
Correlation Coefficient = ? - ANSWER"r"
Coefficient of Determination = ? - ANSWER"r^2"
Facts about "r" (correlation coefficient) - ANSWER- ranges from -1 to +1
- Measures STRENGTH OF ASSOCIATION
* remember the graphs that represent the line as either + or -
Facts about "r^2" (Coefficient of Determination) - ANSWERmeasures the goodness
of fit of the relationship between independent/dependent variables in a regression
analysis
The equation for "r^2" - ANSWERr^2 = SSR / SST
Where:
- SSR = sum of squares due to regression
- SST = total sum of squares
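A short sketch of that equation in code (the data and the fitted coefficients are illustrative assumptions, not from the notes):

```python
# Sketch: computing r^2 = SSR / SST for a fitted line.
# Data and fit below are made-up examples.

def r_squared(x, y, b0, b1):
    y_bar = sum(y) / len(y)
    y_hat = [b0 + b1 * xi for xi in x]
    sst = sum((yi - y_bar) ** 2 for yi in y)               # total sum of squares
    sse = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))  # error sum of squares
    ssr = sst - sse                                        # sum of squares due to regression
    return ssr / sst

x, y = [1, 2, 3, 4], [2, 4, 5, 7]
# least squares fit for these points: b0 = 0.5, b1 = 1.6
print(round(r_squared(x, y, 0.5, 1.6), 4))  # -> 0.9846
```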
The 4 Assumptions about ANOVA: - ANSWER1. Residuals are NORMALLY
DISTRIBUTED
2. Residuals have a MEAN OF ZERO
3. Residuals have CONSTANT VARIANCE
4. Residuals are INDEPENDENT
When residuals are NOT independent, what are they known as? - ANSWERAutocorrelation
Simple Linear Regression: Null and Alternative Hypothesis - ANSWER(Null) H0: B1
= 0
, (Alt) Ha: B1 does NOT equal 0
** we REJECT H0 when the p-value is < alpha (the significance level, e.g., 0.05)
In Simple Linear Regression, we use Null and Alt. Hypo. for testing .....? -
ANSWERTesting the t-ratio, which is the number of standard errors between the
estimated slope and a slope equal to 0.
When the t-ratio is LARGE: p-value DECREASES
When the t-ratio is SMALL: p-value INCREASES
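A sketch of how the t-ratio is formed (the data and fitted coefficients are illustrative assumptions; the p-value itself would come from a t-distribution table or software):

```python
import math

# Sketch of the t-ratio for the slope: t = b1 / SE(b1),
# where SE(b1) = sqrt(MSE / Sxx) and MSE = SSE / (n - 2).
# Data and fitted values below are made-up examples.

x, y = [1, 2, 3, 4], [2, 4, 5, 7]
b0, b1 = 0.5, 1.6  # least squares fit for these points
n = len(x)
x_bar = sum(x) / n

sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
mse = sse / (n - 2)                       # mean square error
sxx = sum((xi - x_bar) ** 2 for xi in x)
se_b1 = math.sqrt(mse / sxx)              # standard error of the slope
t_ratio = b1 / se_b1

print(round(t_ratio, 2))  # a large t-ratio -> a small p-value
```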
simple linear regression model - ANSWERan equation that describes the straight-
line relationship between a dependent variable and an independent variable
**describes how y is related to x and an error term**
How do you calculate F from a partial table? (ANOVA) - ANSWERF = MS Model /
MSE
MS Model = mean square for the model (SSR / model degrees of freedom)
MSE = mean square error (SSE / error degrees of freedom)
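Filling in a partial ANOVA table can be sketched like this (the sums of squares and degrees of freedom below are made-up numbers for illustration):

```python
# Sketch: completing F = MS Model / MSE from a partial ANOVA table.
# The SSR/SSE values and degrees of freedom are illustrative assumptions.

ssr, df_model = 120.0, 2   # sum of squares due to regression, model df
sse, df_error = 30.0, 12   # sum of squared errors, error df

ms_model = ssr / df_model  # mean square for the model -> 60.0
mse = sse / df_error       # mean square error -> 2.5
f_stat = ms_model / mse

print(f_stat)  # -> 24.0
```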
Ch 15 - ANSWER.....
Multiple Regression Model - ANSWERthe equation that describes how the
dependent variable (y) is related to the independent variables (x1,x2,...,xp) and an
error term
In MULTIPLE regression, the F and t-tests have _____________? -
ANSWERDifferent purposes: the F-test checks the overall significance of the model,
while each t-test checks the significance of an individual coefficient.
Multi-collinearity - ANSWERWhen the independent variables are correlated with
each other; we measure it with the Variance Inflation Factor (VIF)
VIF Rules (multi-collinearity) - ANSWER1. If VIF > 10, you've got a serious problem
with collinearity
2. If VIF < 10 (or near 1) you're good to go and there's no serious problem
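The VIF rules above can be sketched for the two-predictor case, where VIF = 1 / (1 - r^2) and r is the correlation between the two independent variables (the data below are illustrative, deliberately near-collinear):

```python
import math

# Sketch of VIF for two predictors: VIF = 1 / (1 - r^2),
# where r is the correlation between the two independent variables.
# Data below are made up to be nearly collinear.

def vif_two_predictors(x1, x2):
    n = len(x1)
    m1, m2 = sum(x1) / n, sum(x2) / n
    sxy = sum((a - m1) * (b - m2) for a, b in zip(x1, x2))
    sxx = sum((a - m1) ** 2 for a in x1)
    syy = sum((b - m2) ** 2 for b in x2)
    r = sxy / math.sqrt(sxx * syy)
    return 1.0 / (1.0 - r ** 2)

vif = vif_two_predictors([1, 2, 3, 4], [2.1, 3.9, 6.2, 7.8])
print(vif > 10)  # -> True: serious collinearity problem
```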
Dummy Variables - ANSWERan independent variable that is nominal, or binary [0,1]
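A minimal sketch of coding a nominal variable as a 0/1 dummy (the category names are made up for illustration):

```python
# Sketch: coding a binary nominal variable as a 0/1 dummy for regression.
# The "region" categories are illustrative assumptions.

def make_dummy(values, reference):
    """Code a binary categorical column as 0/1, with `reference` coded 0."""
    return [0 if v == reference else 1 for v in values]

region = ["East", "West", "West", "East"]
x_dummy = make_dummy(region, reference="East")
print(x_dummy)  # -> [0, 1, 1, 0]
```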
Multiple Regression Model equation - ANSWERy = B0 + B1x1 + B2x2 + ... + Bpxp + e (error term)