CS 7641 FINAL EXAM QUESTIONS ANSWERED CORRECTLY LATEST UPDATE 2026

Four optimization approaches - Answers 1) Generate and test
2) Calculus
3) Newton's Method
4) Randomized Optimization

Hill Climbing Algorithm - Answers Guess x∈X
Repeat the following:
Let n* = argmax_{n∈N(x)} f(n)
If f(n*) > f(x): x = n*
Else: stop

Disadvantage:
- Gets stuck in local optima

Randomized Restart Hill Climbing - Answers Same as Hill Climbing, but once a local optimum is reached, restart with a different starting x

Advantages:
- Won't get stuck in a local optimum
- Not much more expensive than HC (constant factor)
Disadvantage:
- May not do better than enumeration (depends on the size of the attraction basin around the global optimum)

Entropy - Answers H(X) = -∑ p(s) log₂ p(s)
Expected number of bits per symbol (each symbol's probability times the number of bits needed to describe that symbol)

Joint Entropy - Answers H(X,Y) = -∑ p(x,y) log₂ p(x,y)
Randomness contained in two variables together

Conditional Entropy - Answers H(Y|X) = -∑ p(x,y) log₂ p(y|x)
Randomness of one variable given the other variable

Entropy if X and Y are independent - Answers H(Y|X) = H(Y): Y gets no information from X
H(X,Y) = H(X) + H(Y): the joint entropy is the sum

Mutual Information - Answers I(X,Y) = H(Y) - H(Y|X) = I(Y,X)
Measure of the reduction in randomness of one variable given knowledge of another.
A specific case of KL Divergence

Kullback-Leibler Divergence - Answers D(P||Q) = ∑ p(x) log₂ (p(x)/q(x))
Always non-negative
Zero when P equals Q
Measures the "distance" between any two distributions (not a true metric: it is not symmetric)

Supervised Learning - Answers Use labeled training data to generalize labels to new instances (function approximation)

Unsupervised Learning - Answers Make sense out of unlabeled data (data description)

Single Linkage Clustering (SLC) Algorithm - Answers - Consider each object a cluster (n objects)
- Define the intercluster distance as the distance between the closest two points in the two clusters
- Merge the 2 closest clusters
- Repeat n-k times to make k clusters

Notes:
- Running time: O(n²) per merge × (~n) merges ≈ O(n³)

k-Means Clustering Algorithm - Answers - Pick k centers at random
- Each center claims its closest points
- Recompute the centers by averaging the clustered points
- Repeat until convergence

Notes:
- Can get stuck in local optima

Expectation Maximization - Answers Expectation (soft clustering):
E[Zᵢⱼ] = P(X=xᵢ | µ=µⱼ) / ∑ⱼ P(X=xᵢ | µ=µⱼ)
"Likelihood that data element i comes from cluster j"
Maximization (recomputing the means of the clusters):
µⱼ = ∑ᵢ E[Zᵢⱼ] xᵢ / ∑ᵢ E[Zᵢⱼ]
with P(X=xᵢ | µ=µⱼ) ∝ exp[-(xᵢ-µⱼ)²/(2σ²)]

Properties:
- Monotonically non-decreasing likelihood
- Does not converge in theory (but does in practice)
- Will not diverge
- Can get stuck - overcome with random restarts
- Works with any distribution

Clustering Properties - Answers Richness
Scale Invariance
Consistency

Impossibility Theorem - Answers No clustering scheme can achieve all three of:
- richness
- scale invariance
- consistency

Why Feature Selection? - Answers 1) Knowledge Discovery - interpretability and insight
2) Curse of Dimensionality (the amount of data needed grows exponentially, 2ⁿ, with the number of features)

Filtering versus Wrapping - Answers Filtering - features are filtered before the learning algorithm
- Ignores the learner
- Fast
Wrapping - the search for features is wrapped around the learning algorithm
- Takes the model's bias into account
- Slow

Wrapping Methods of Feature Selection - Answers - Hill Climbing
- Randomized Optimization
- Forward/Backward Selection

Strongly relevant - Answers Removing xᵢ degrades the Bayes Optimal Classifier (BOC)
{Recall that the BOC is the classifier that takes the weighted average of all hypotheses based on their probability of being the correct hypothesis}

Usefulness - Answers Measures the effect on a particular predictor (minimizing error, given a learner)

Feature Transformation - Answers Pre-processing a set of features to create a new feature set, while retaining as much (relevant/useful) information as possible

Principal Component Analysis - Answers Finds directions that maximize variance and that are mutually orthogonal

Independent Components Analysis - Answers Finds a linear transformation of the feature space into a new feature space such that the new features are mutually independent (mutual information I between pairs of new features is 0; I between the old and new features is as high as possible)

Random Components Analysis - Answers Projects the data onto random directions

Linear Discriminant Analysis - Answers Finds a projection that discriminates based on the label

Reinforcement Learning - Answers An agent must learn behavior through trial-and-error interactions with a dynamic environment

Markov Decision Process (MDP) - Answers States S, actions A, a transition model T(s,a,s') = P(s'|s,a), and rewards R(s); the Markov property means only the current state matters. Its solution is a policy.

Variable Length Encoding - Answers A code which maps source symbols to a variable number of bits (fewer bits for more frequent symbols)

kMeans properties - Answers - Each iteration is polynomial
- Finite (exponential) number of iterations
- Error usually decreases monotonically
- Can get stuck

EM Properties - Answers - Monotonically non-decreasing likelihood
- Does not converge in theory (but does practically) because there are infinitely many configurations of the probabilities
- Will not diverge
- Can get stuck
- Works with any distribution

Richness - Answers For any assignment of objects to clusters, there is some distance matrix D such that the clustering scheme P_D returns that clustering

Scale invariance - Answers Scaling all distances by a positive value doesn't change the clustering

Consistency - Answers Shrinking intracluster distances and expanding intercluster distances does not change the clustering

Best Reconstruction - Answers Minimize the L₂ (squared) error when going from N down to m dimensions

Policy - Answers A function from state to action

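A minimal sketch of the hill-climbing loop and the randomized-restart wrapper described in the cards above, assuming a discrete search space given as a neighbor function N(x). The fitness function, neighborhood, and restart count below are made-up illustrations, not from the notes.

```python
import random

def hill_climb(f, neighbors, x):
    """Greedy hill climbing: move to the best neighbor until no neighbor improves f."""
    while True:
        best = max(neighbors(x), key=f, default=None)
        if best is None or f(best) <= f(x):
            return x                      # local optimum reached
        x = best

def random_restart_hill_climb(f, neighbors, random_start, restarts=20):
    """Run hill climbing from several random starting points; keep the best local optimum found."""
    best = None
    for _ in range(restarts):
        candidate = hill_climb(f, neighbors, random_start())
        if best is None or f(candidate) > f(best):
            best = candidate
    return best

if __name__ == "__main__":
    # Toy example (assumption, not from the notes): maximize a bumpy function on 0..99.
    f = lambda x: -(x - 70) ** 2 + 30 * (x % 7 == 0)
    neighbors = lambda x: [n for n in (x - 1, x + 1) if 0 <= n < 100]
    start = lambda: random.randrange(100)
    print(random_restart_hill_climb(f, neighbors, start))
```
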
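A small sketch computing the information-theoretic quantities defined above (entropy, joint entropy, conditional entropy, mutual information) from a joint probability table, using base-2 logarithms so the results are in bits. The joint distribution is a made-up example; the conditional entropy is obtained via the chain rule H(Y|X) = H(X,Y) - H(X), which is equivalent to the definition in the card.

```python
import numpy as np

def entropy(p):
    """H = -sum p * log2(p), ignoring zero-probability entries."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Made-up joint distribution P(X, Y): rows index X, columns index Y.
p_xy = np.array([[0.25, 0.25],
                 [0.40, 0.10]])

p_x = p_xy.sum(axis=1)          # marginal P(X)
p_y = p_xy.sum(axis=0)          # marginal P(Y)

H_x = entropy(p_x)
H_y = entropy(p_y)
H_xy = entropy(p_xy.ravel())    # joint entropy H(X, Y)
H_y_given_x = H_xy - H_x        # chain rule: H(Y|X) = H(X,Y) - H(X)
I_xy = H_y - H_y_given_x        # mutual information I(X;Y) = H(Y) - H(Y|X)

print(f"H(X)={H_x:.3f}  H(Y)={H_y:.3f}  H(X,Y)={H_xy:.3f}  "
      f"H(Y|X)={H_y_given_x:.3f}  I(X;Y)={I_xy:.3f}")
```
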
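A sketch of the KL divergence between two discrete distributions, illustrating the properties listed in the card above: non-negative, zero when P equals Q, and not symmetric. The two distributions are made-up.

```python
import numpy as np

def kl_divergence(p, q):
    """D(P||Q) = sum p * log2(p / q) over entries with p > 0 (assumes q > 0 wherever p > 0)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float((p[mask] * np.log2(p[mask] / q[mask])).sum())

p = np.array([0.5, 0.3, 0.2])   # made-up distributions
q = np.array([0.4, 0.4, 0.2])

print(kl_divergence(p, q))      # > 0: P differs from Q
print(kl_divergence(p, p))      # 0.0: identical distributions
print(kl_divergence(p, q) == kl_divergence(q, p))  # False: KL is not symmetric
```
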
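A sketch of the single-linkage procedure listed above: start with every object in its own cluster and repeatedly merge the two clusters whose closest members are nearest, stopping when k clusters remain. The naive scan over cluster pairs at each of the ~n merges mirrors the running-time note; the sample points are made-up.

```python
import numpy as np

def single_linkage(points, k):
    """Naive SLC: merge the two clusters with the smallest closest-pair distance until k remain."""
    points = np.asarray(points, dtype=float)
    clusters = [[i] for i in range(len(points))]        # each object starts as its own cluster
    dist = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    while len(clusters) > k:
        best = None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                # single-link distance: closest pair of points across the two clusters
                d = min(dist[i, j] for i in clusters[a] for j in clusters[b])
                if best is None or d < best[0]:
                    best = (d, a, b)
        _, a, b = best
        clusters[a].extend(clusters.pop(b))              # merge the two closest clusters
    return clusters

# Made-up 2-D example: two obvious groups.
pts = [[0.0, 0.0], [0.2, 0.1], [0.1, -0.1], [5.0, 5.0], [5.1, 4.9]]
print(single_linkage(pts, k=2))
```
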
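A minimal k-means sketch following the four steps in the card above (pick random centers, assign points, re-average, repeat until convergence). The data, k, and seed are made-up, and edge cases such as empty clusters are not handled.

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Plain k-means: random initial centers, assign points to the nearest center, recompute means."""
    rng = np.random.default_rng(seed)
    X = np.asarray(X, dtype=float)
    centers = X[rng.choice(len(X), size=k, replace=False)]   # pick k centers at random
    for _ in range(iters):
        # each center claims its closest points
        labels = np.argmin(np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=-1), axis=1)
        # recompute the centers by averaging the clustered points
        new_centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new_centers, centers):                # converged
            break
        centers = new_centers
    return labels, centers

# Made-up data: two well-separated blobs.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(3, 0.3, (20, 2))])
labels, centers = kmeans(X, k=2)
print(centers)
```
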
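A sketch of the E-step and M-step written in the Expectation Maximization card above, for a 1-D mixture of k Gaussians with a fixed shared variance, so only the means are re-estimated (the same simplification the card's formulas use). The data, sigma, and seed are made-up.

```python
import numpy as np

def em_means(x, k, sigma=1.0, iters=50, seed=0):
    """EM with fixed shared sigma: E-step gives soft memberships E[Z_ij]; M-step re-averages the means."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    mu = rng.choice(x, size=k, replace=False)                 # initial guesses for the means
    for _ in range(iters):
        # E-step: likelihood that point i came from cluster j, normalized over clusters
        lik = np.exp(-((x[:, None] - mu[None, :]) ** 2) / (2 * sigma ** 2))
        z = lik / lik.sum(axis=1, keepdims=True)              # E[Z_ij]
        # M-step: means as the soft-membership-weighted average of the points
        mu = (z * x[:, None]).sum(axis=0) / z.sum(axis=0)
    return mu

# Made-up data: two overlapping 1-D clusters.
rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(-2, 1, 100), rng.normal(3, 1, 100)])
print(em_means(x, k=2))
```
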
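A sketch of wrapper-style forward selection, one of the wrapping methods listed above: greedily add the feature whose inclusion most improves the wrapped learner's score, and stop when nothing helps. The `score(features)` callback is a hypothetical placeholder standing in for "train the learner on these features and return its validation score"; the feature names and toy scores are made-up.

```python
def forward_selection(all_features, score):
    """Greedy forward selection: repeatedly add the feature whose inclusion raises score() the most."""
    selected = []
    best_score = score(selected)
    while True:
        candidates = [f for f in all_features if f not in selected]
        scored = [(score(selected + [f]), f) for f in candidates]
        if not scored:
            break
        new_score, best_f = max(scored)
        if new_score <= best_score:        # no remaining feature improves the wrapped learner
            break
        selected.append(best_f)
        best_score = new_score
    return selected

# Toy stand-in (assumption) for the wrapped learner's validation score.
def toy_score(features):
    useful = {"x1": 0.3, "x2": 0.2}        # made-up: only x1 and x2 help
    return 0.5 + sum(useful.get(f, -0.01) for f in features)

print(forward_selection(["x1", "x2", "x3", "x4"], toy_score))
```
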
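A sketch of PCA as characterized above: find mutually orthogonal directions of maximum variance (here via the eigenvectors of the covariance matrix, one standard way to do it), project onto the top m of them, and reconstruct to check the L₂ (squared) reconstruction error from the Best Reconstruction card. The data and m are made-up.

```python
import numpy as np

def pca(X, m):
    """Project X onto the m orthogonal directions of maximum variance and reconstruct."""
    X = np.asarray(X, dtype=float)
    mean = X.mean(axis=0)
    Xc = X - mean
    cov = np.cov(Xc, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)            # ascending eigenvalues, orthonormal eigenvectors
    components = eigvecs[:, ::-1][:, :m]              # top-m principal directions
    projected = Xc @ components                       # N-dimensional data expressed in m dimensions
    reconstructed = projected @ components.T + mean   # back to the original space
    return projected, reconstructed

# Made-up 3-D data that mostly varies along a single direction.
rng = np.random.default_rng(3)
t = rng.normal(size=(200, 1))
X = t @ np.array([[2.0, 1.0, 0.5]]) + rng.normal(scale=0.1, size=(200, 3))

_, X_hat = pca(X, m=1)
print("squared reconstruction error:", float(((X - X_hat) ** 2).sum()))
```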