SOLUTIONS GRADED A+
✔✔Deep Neural Networks have only 1 hidden layer and multiple input layers. (True or
False) - ✔✔False
✔✔We refer to each of the links between nodes as a connection, and each of those
connections has a ________ - ✔✔Activation function
✔✔(From our possibly overly simplistic explanation) In the attempt to fit values from
the input layer to the output layer, the hidden layer applies some weights to the input
values. (True or False) - ✔✔True
✔✔The example we walked through was from a fairly famous dataset for learning about
machine learning. The dataset is called: - ✔✔MNIST
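For reference, you can poke at a digits dataset in code. A minimal sketch, assuming scikit-learn is available; its bundled `digits` set is a small 8x8 stand-in for the 28x28 MNIST images, not the lecture's exact data:

```python
# Sketch: load scikit-learn's small `digits` dataset as a stand-in for MNIST
# (8x8 images instead of 28x28, ~1,800 samples instead of 70,000).
from sklearn.datasets import load_digits

digits = load_digits()
print(digits.data.shape)   # (1797, 64) -- each row is a flattened 8x8 image
print(digits.target[:5])   # the labels: which digit each image shows
```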
✔✔All the nodes prior to the output nodes essentially 'guess' at the correct weights.
Then the algorithm checks to see if the initial guess is correct (usually not). When it is
wrong... - ✔✔It tries again (runs another epoch)
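The "guess, check, try again" loop can be sketched numerically. This is a toy single-weight example using plain gradient descent in NumPy, not the lecture's code; each pass over the data plays the role of one epoch:

```python
import numpy as np

# Toy sketch of the guess-and-correct loop: one weight, nudged each epoch.
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 3.0 * x                           # the true weight is 3.0

w = 0.0                               # the initial "guess"
for epoch in range(200):              # each pass over the data is one epoch
    pred = w * x                      # apply the current weight
    grad = np.mean(2 * (pred - y) * x)  # how wrong, and in which direction
    w -= 0.1 * grad                   # adjust the weight and try again

print(round(w, 3))                    # ~3.0 -- the guesses converge
```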
✔✔Neural networks are an unsupervised technique, because there is no target variable.
(True or False) - ✔✔False
✔✔When viewing a diagram of a neural network there are several layers. The input
layer: - ✔✔Are the Xs, or inputs from your data
✔✔When viewing a diagram of a neural network there are several layers. The Output
layer: - ✔✔Are the Ys (The target variable you are interested in)
✔✔When viewing a diagram of a neural network there are several layers. The hidden
layer: - ✔✔Something you don't see; here there is some computation to transform X
into the Y
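Putting the three layers together, a single forward pass can be sketched in NumPy. Layer sizes and weights here are made up purely for illustration:

```python
import numpy as np

# Sketch: input layer (the Xs) -> hidden layer (computation you don't
# see) -> output layer (the Y). Sizes and weights are arbitrary.
rng = np.random.default_rng(42)
X = rng.normal(size=(1, 4))        # input layer: 4 features from your data

W1 = rng.normal(size=(4, 3))       # weights into the hidden layer
hidden = np.maximum(0, X @ W1)     # hidden layer: transforms X (ReLU here)

W2 = rng.normal(size=(3, 1))       # weights into the output layer
y_hat = hidden @ W2                # output layer: the predicted Y

print(y_hat.shape)                 # (1, 1) -- one prediction
```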
✔✔NLP stands for - ✔✔natural language processing
✔✔Tokenization, as defined in the lecture, is - ✔✔A computer turning letters and/or
words into something it can read and understand, like numbers
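A minimal illustration of that definition in plain Python; the word-to-integer mapping here is invented for the example, not a real tokenizer:

```python
# Sketch of tokenization as defined above: turn words into numbers.
sentence = "the model reads the data"
tokens = sentence.split()                      # split into word tokens
vocab = {word: i for i, word in enumerate(dict.fromkeys(tokens))}
ids = [vocab[word] for word in tokens]         # map each word to its number

print(tokens)  # ['the', 'model', 'reads', 'the', 'data']
print(ids)     # [0, 1, 2, 0, 3] -- repeated words share an id
```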
✔✔Recommenders come in many flavors. 2 of the most common, often used together
and discussed in the lecture are: - ✔✔User based and Item based
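A sketch of the item-based idea, using a tiny made-up ratings matrix and cosine similarity (one common similarity choice; the lecture may have used a different measure):

```python
import numpy as np

# Item-based sketch: rows are users, columns are items. Items whose
# rating columns point the same way are treated as "similar".
ratings = np.array([
    [5, 4, 1],   # user 1 likes items A and B, dislikes C
    [4, 5, 1],   # user 2 agrees
    [1, 1, 5],   # user 3 prefers item C
], dtype=float)

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Item A should look much more like item B than like item C.
print(cosine(ratings[:, 0], ratings[:, 1]) > cosine(ratings[:, 0], ratings[:, 2]))  # True
```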
✔✔Imagine you have a dataset with 2 columns, both filled with continuous numbers.
You believe the first column is a predictor of the second column. Which of the model
approaches below could work when building a model? - ✔✔Random forests,
regression, decision trees (maybe not the BEST solution; decision trees have some
problems, like the overfitting we discussed)
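As a sketch, two of those model approaches can be fit to such a 2-column dataset with scikit-learn (synthetic data and illustrative hyperparameters, not the lecture's example):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, size=(100, 1))                  # column 1: the predictor
y = 2.5 * x.ravel() + rng.normal(scale=0.5, size=100)  # column 2: the target

lin = LinearRegression().fit(x, y)                   # regression
tree = DecisionTreeRegressor(max_depth=3).fit(x, y)  # decision tree, depth capped
print(lin.score(x, y) > 0.9, tree.score(x, y) > 0.9) # both fit this data well
```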
✔✔Decision trees have a few problems; you should probably review those for the final
exam! The problem we talked about the most is: - ✔✔Overfitting
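Overfitting is easy to demonstrate in a sketch (scikit-learn assumed, synthetic data): an unconstrained tree scores perfectly on its own training data but worse on data it has never seen.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 1))
y = np.sin(X.ravel()) + rng.normal(scale=0.3, size=200)  # noisy signal
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

tree = DecisionTreeRegressor().fit(X_tr, y_tr)  # no depth limit: memorizes
print(round(tree.score(X_tr, y_tr), 2))         # 1.0 on the training data
print(tree.score(X_te, y_te) < tree.score(X_tr, y_tr))  # True -- worse on held-out data
```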
✔✔We will start with the most familiar linear regression, a straight-line fit to data. A
straight-line fit is a model of the form
y=ax+b
Where a is commonly known as the - ✔✔Slope
✔✔We will start with the most familiar linear regression, a straight-line fit to data. A
straight-line fit is a model of the form
y=ax+b
Where b is commonly known as the - ✔✔Intercept
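Both answers can be checked in code. A sketch assuming scikit-learn: fit y = ax + b on noiseless data and read the slope and intercept back off the fitted model.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

x = np.arange(10).reshape(-1, 1)   # features matrix
y = 2.0 * x.ravel() + 5.0          # y = ax + b with a = 2, b = 5, no noise

model = LinearRegression().fit(x, y)
print(round(model.coef_[0], 2))    # 2.0 -- a, the slope
print(round(model.intercept_, 2))  # 5.0 -- b, the intercept
```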
✔✔The LinearRegression estimator is only capable of simple straight-line fits. (True or
False) - ✔✔False
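One reason the statement is false: paired with basis functions such as PolynomialFeatures, the same LinearRegression estimator fits curves. A sketch with scikit-learn and synthetic data:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 100).reshape(-1, 1)
y = x.ravel() ** 2 + rng.normal(scale=0.1, size=100)  # a parabola, not a line

# "Linear" means linear in the *coefficients*: feed the estimator x and
# x^2 as features and it happily fits a curve.
curve = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
curve.fit(x, y)
print(curve.score(x, y) > 0.99)  # True -- a straight line could not do this
```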
✔✔In class we walked through 5 steps to building a machine learning model. The
textbook also goes over in some depth the 5 steps. What is step 1? - ✔✔Choosing a
class of model
✔✔In class we walked through 5 steps to building a machine learning model. The
textbook also goes over in some depth the 5 steps. What is step 2? - ✔✔Choose
hyperparameters
✔✔In class we walked through 5 steps to building a machine learning model. The
textbook also goes over in some depth the 5 steps. What is step 3? - ✔✔Arrange data
✔✔In class we walked through 5 steps to building a machine learning model. The
textbook also goes over in some depth the 5 steps. What is step 4? - ✔✔Fit the model
✔✔In class we walked through 5 steps to building a machine learning model. The
textbook also goes over in some depth the 5 steps. What is step 5? - ✔✔Predict
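The five steps above can be sketched end to end (scikit-learn assumed, synthetic data standing in for yours):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
x_raw = rng.uniform(0, 10, 50)
y = 2 * x_raw + 1 + rng.normal(scale=0.2, size=50)

model = LinearRegression(fit_intercept=True)  # Steps 1-2: choose a class of model and its hyperparameters
X = x_raw.reshape(-1, 1)                      # Step 3: arrange data into a features matrix and target vector
model.fit(X, y)                               # Step 4: fit the model to the data
pred = model.predict(np.array([[4.0]]))       # Step 5: predict on new, unseen data
print(pred.round(1))                          # close to [9.] (= 2*4 + 1)
```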