1. All regularized regression approaches can be used for variable selection. - CORRECT ANSWER-False
2. Before performing regularized regression, we need to standardize or rescale the predicting variables. - CORRECT ANSWER-True
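A quick illustration of item 2: the penalty treats all coefficients on a common scale, so predictors are typically rescaled to mean 0 and standard deviation 1 before fitting. A minimal pure-Python sketch (the variable names are illustrative):

```python
import math

def standardize(xs):
    """Rescale a list of values to mean 0 and (population) standard deviation 1."""
    n = len(xs)
    mean = sum(xs) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in xs) / n)
    return [(x - mean) / sd for x in xs]

# Predictors on very different scales would otherwise be penalized unequally.
income = [30_000, 45_000, 60_000, 90_000]  # large scale
age = [25, 35, 45, 55]                     # small scale
z_income = standardize(income)
z_age = standardize(age)
```

After standardizing, both predictors contribute to the penalty on the same footing.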
3. The larger the number of predicting variables, the larger the bias but the smaller the variance. - CORRECT ANSWER-False
4. Variable selection is a simple and solved statistical problem since we can implement
it using the R statistical software. - CORRECT ANSWER-False
5. BIC penalizes model complexity more than the AIC or Mallows' Cp statistics do. - CORRECT ANSWER-True
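Item 5 can be checked directly from the penalty terms: for a model with k parameters fit to n observations, AIC adds 2k to the lack-of-fit term while BIC adds k·log(n), so BIC penalizes harder whenever log(n) > 2, i.e. n > e² ≈ 7.4. A small check:

```python
import math

def aic_penalty(k):
    """AIC complexity penalty: 2 * (number of parameters)."""
    return 2 * k

def bic_penalty(k, n):
    """BIC complexity penalty: (number of parameters) * log(sample size)."""
    return k * math.log(n)

k, n = 5, 100
print(aic_penalty(k))     # 10
print(bic_penalty(k, n))  # 5 * log(100) ≈ 23.03
```

For any realistic sample size (n ≥ 8), the BIC penalty exceeds the AIC penalty, which is why BIC tends to select smaller models.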
6. The penalty constant λ in penalized or regularized regression controls the trade-off
between lack of fit and model complexity. - CORRECT ANSWER-True
7. The L1 penalty measures the sparsity of a vector. - CORRECT ANSWER-True
8. The lasso regression requires a numerical algorithm to minimize the penalized sum of squared residuals. - CORRECT ANSWER-True
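A sketch of why item 8 is true: unlike ridge regression, the lasso has no closed-form solution for the full coefficient vector, so it is typically minimized numerically, e.g. by cyclic coordinate descent, where each one-dimensional subproblem is solved by soft-thresholding. A minimal illustrative implementation (no convergence check; assumes non-zero predictor columns):

```python
def soft_threshold(z, gamma):
    """Soft-thresholding operator: closed-form solution of each 1-D lasso subproblem."""
    if z > gamma:
        return z - gamma
    if z < -gamma:
        return z + gamma
    return 0.0

def lasso_coordinate_descent(X, y, lam, n_iter=200):
    """Minimize 0.5*||y - X b||^2 + lam*||b||_1 by cyclic coordinate descent."""
    n, p = len(X), len(X[0])
    b = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual leaving out coordinate j.
            r = [y[i] - sum(X[i][k] * b[k] for k in range(p) if k != j)
                 for i in range(n)]
            zj = sum(X[i][j] * r[i] for i in range(n))
            norm = sum(X[i][j] ** 2 for i in range(n))
            b[j] = soft_threshold(zj, lam) / norm
    return b

X = [[1.0, 0.0], [0.0, 1.0]]
y = [3.0, 0.0]
print(lasso_coordinate_descent(X, y, lam=1.0))  # → [2.0, 0.0]
```

Note the exact zero in the output: the soft-threshold sets small coefficients to exactly 0, which is what makes the lasso usable for variable selection (items 1 and 7).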
9. The training risk is an unbiased estimator of the prediction risk. - CORRECT ANSWER-False
10. Backward and forward stepwise regression will generally provide different sets of
selected variables when p, the number of predicting variables, is large. - CORRECT
ANSWER-True
11. If there are variables that need to be used to control for bias in the model, they should be forced into the model and not be part of the variable selection process. - CORRECT ANSWER-True
12. Penalization in linear regression models means penalizing for complex models, that
is, models with a large number of predictors. - CORRECT ANSWER-True
13. Elastic net regression uses both the ridge and lasso penalties and hence combines the benefits of both. - CORRECT ANSWER-True
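The combined penalty in item 13 can be written out explicitly. Following the common glmnet-style parameterization, with mixing weight alpha (alpha=1 recovers the lasso, alpha=0 recovers ridge), a sketch:

```python
def elastic_net_penalty(b, lam, alpha):
    """Elastic net penalty: lam * (alpha*||b||_1 + (1-alpha)/2 * ||b||_2^2).

    A convex mix of the lasso (L1) and ridge (squared L2) penalties;
    parameterization follows the glmnet convention (illustrative function name).
    """
    l1 = sum(abs(bj) for bj in b)
    l2sq = sum(bj ** 2 for bj in b)
    return lam * (alpha * l1 + (1 - alpha) / 2 * l2sq)

b = [1.0, -2.0]
print(elastic_net_penalty(b, lam=1.0, alpha=1.0))  # 3.0  (pure lasso: |1| + |-2|)
print(elastic_net_penalty(b, lam=1.0, alpha=0.0))  # 2.5  (pure ridge: (1 + 4)/2)
```

The L1 part yields exact zeros (variable selection), while the L2 part stabilizes the fit and lets groups of correlated predictors enter together.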