Exam (elaborations)

CS7643 Quiz 3 Exam – Deep Learning Concepts & Neural Networks Study Guide

Pages
10
Grade
A+
Uploaded on
01-05-2026
Written in
2025/2026
Contains
Questions & answers

Institution
Computer Tech
Course
Computer Tech

Content preview

Modeling Error - ANSWER-Given a particular NN architecture, the true model of the real world may not lie in the hypothesis space that architecture can represent.


When model complexity increases, modeling error decreases, but optimization error increases.


Estimation Error - ANSWER-Even if you find the best hypothesis (the weights and parameters that minimize training error), it may not generalize to the test set.


Optimization Error - ANSWER-Even if your NN can perfectly model the world, your optimization algorithm may not find the good weights that realize that function.


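The three error sources above fit a standard excess-risk decomposition (a common formulation, not written out on the card itself; here R is the true risk, f* the ideal predictor, f_F the best predictor in the architecture's hypothesis class F, f-hat the training-error minimizer, and f_alg what the optimizer actually returns):

```latex
R(f_{\mathrm{alg}}) - R(f^{*})
  = \underbrace{R(f_{\mathcal{F}}) - R(f^{*})}_{\text{modeling error}}
  + \underbrace{R(\hat{f}) - R(f_{\mathcal{F}})}_{\text{estimation error}}
  + \underbrace{R(f_{\mathrm{alg}}) - R(\hat{f})}_{\text{optimization error}}
```

Enlarging the hypothesis class F shrinks the first term but makes the last term harder to control, which is the trade-off the card describes.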


Effectiveness of transfer learning under certain conditions - ANSWER-Remove the last FC layer of the CNN and initialize it randomly, then run the new data through the network to train only that layer.

To train the NN for transfer learning, freeze the CNN layers (or early layers) and learn parameters only in the FC layers.
Performs very well on a very small amount of training data if it is similar to the original data.
Does not work very well if the target task's dataset is very different.
If you have enough data in the target domain and it is different from the source, it is better to just train on the new data.




Transfer learning = reuse features learned on a very large dataset for a completely new task.
Steps:
Train on a very large dataset.
Take the custom dataset and initialize the network with the weights trained in Step 1 (replace the last fully connected layer, since the classes in the new network will be different).
Final step -> continue training on the new dataset.
Can either retrain all weights ("finetune") or freeze (i.e., not update) the weights in certain layers; freezing reduces the number of parameters you need to learn.
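The freeze-the-backbone, retrain-the-head idea above can be sketched in plain NumPy. This is a toy stand-in, not a real CNN: a frozen "backbone" matrix plays the role of the pretrained conv layers, and only a freshly initialized head (for 5 new classes) receives gradient updates.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Pretrained" backbone weights from the large source dataset (toy stand-in).
W1 = rng.normal(size=(8, 4))             # early layer: frozen, never updated
# Replace the old head with a randomly initialized one for the 5 new classes.
W2_new = rng.normal(size=(4, 5)) * 0.01

X = rng.normal(size=(16, 8))             # new-task inputs
Y = np.eye(5)[rng.integers(0, 5, 16)]    # one-hot labels for the new task

W1_before = W1.copy()
W2_init = W2_new.copy()
lr = 0.1
for _ in range(100):
    H = np.maximum(X @ W1, 0)            # frozen features (ReLU); no grad for W1
    logits = H @ W2_new
    P = np.exp(logits - logits.max(axis=1, keepdims=True))
    P /= P.sum(axis=1, keepdims=True)    # softmax probabilities
    grad_W2 = H.T @ (P - Y) / len(X)     # cross-entropy gradient w.r.t. head only
    W2_new -= lr * grad_W2               # only the new head is updated
```

Freezing means the backbone is used purely as a fixed feature extractor, so only the head's 4x5 = 20 parameters are learned here.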


AlexNet - ANSWER-2x(CONV=>MAXPOOL=>NORM)=>3xCONV=>MAXPOOL=>3xFC
ReLU, specialized normalization layers, PCA-based data augmentation, Dropout,
Ensembling (used 7 NNs with different random weights)
Critical developments: more depth and ReLU
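The ensembling trick mentioned above (several networks trained from different random initializations, predictions averaged) can be illustrated with toy models; the seven random linear "networks" here are stand-in assumptions, not AlexNet:

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(1)
x = rng.normal(size=10)                  # one input feature vector

# Seven models with different random weights (stand-ins for 7 trained NNs).
models = [rng.normal(size=(10, 3)) for _ in range(7)]

# Ensemble prediction: average the per-model softmax outputs, then argmax.
probs = np.mean([softmax(x @ W) for W in models], axis=0)
pred = int(np.argmax(probs))
```

Averaging the class probabilities of independently initialized models reduces the variance of the final prediction, which is the point of the ensemble.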


VGGNet - ANSWER-2x(2xCONV=>POOL)=>3x(3xCONV=>POOL)=>3xFC
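VGG's stacked CONV blocks use small 3x3 filters; a quick arithmetic check of the usual rationale (well known for VGG but not stated on the card): three stacked 3x3 convs cover the same 7x7 receptive field as one 7x7 conv with fewer parameters, assuming C input and C output channels throughout.

```python
# Parameter counts (ignoring biases) for C input and C output channels.
C = 64
stacked_3x3 = 3 * (3 * 3 * C * C)   # three stacked 3x3 conv layers
single_7x7 = 7 * 7 * C * C          # one 7x7 conv layer (same receptive field)
```

With C = 64 the stack needs 27*C*C = 110,592 weights versus 49*C*C = 200,704 for the single 7x7 layer, and it also inserts extra nonlinearities between the layers.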
