Questions and Answers Written Correctly
Four optimization approaches - correct answer: 1) Generate and test
2) Calculus
3) Newton's Method
4) Randomized Optimization
Hill Climbing Algorithm - correct answer: Guess x ∈ X
Repeat the following:
Let n* = argmax_{n ∈ N(x)} f(n)
If f(n*) > f(x): x = n*
Else: stop
Disadvantage:
- Gets stuck in local optima
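The pseudocode above can be sketched in Python; the quadratic objective and integer neighbor function are assumptions for illustration:

```python
def hill_climb(f, neighbors, x):
    """Greedy hill climbing: move to the best neighbor until none improves f."""
    while True:
        n_star = max(neighbors(x), key=f)   # n* = argmax over N(x)
        if f(n_star) > f(x):
            x = n_star
        else:
            return x                        # local optimum reached

# Toy problem (an assumption for illustration): maximize over integers 0..100.
f = lambda x: -(x - 63) ** 2
neighbors = lambda x: [n for n in (x - 1, x + 1) if 0 <= n <= 100]
print(hill_climb(f, neighbors, 10))  # climbs to the peak at 63
```

Because this toy f is unimodal, every start climbs to the global peak; on a multi-modal landscape the same loop stops at whichever local peak it reaches first.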
Randomized Restart Hill Climbing - correct answer: Same as Hill Climbing, but
once a local optimum is reached, restart from a different random starting x
Advantage:
- Much less likely to end up stuck in a local optimum
- Not much more expensive than HC (constant factor)
Disadvantage:
- May not do better than enumeration (depends on the size of the attraction basin around the global optimum)
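The restart wrapper can be sketched as below; the two-peak objective and sampling range are assumptions for illustration:

```python
import random

def random_restart_hill_climb(f, neighbors, sample, restarts=20, seed=0):
    """Hill climbing from several random starts; return the best optimum found."""
    rng = random.Random(seed)
    best = None
    for _ in range(restarts):
        x = sample(rng)                       # fresh random starting point
        while True:                           # ordinary hill climbing from x
            n_star = max(neighbors(x), key=f)
            if f(n_star) > f(x):
                x = n_star
            else:
                break                         # local optimum for this restart
        if best is None or f(x) > f(best):
            best = x                          # keep the best optimum seen so far
    return best

# Toy landscape (assumed): global peak at 63, plus a poorer local peak at 17.
f = lambda x: -abs(x - 63) + (10 if x == 17 else 0)
neighbors = lambda x: [n for n in (x - 1, x + 1) if 0 <= n <= 100]
sample = lambda rng: rng.randint(0, 100)
print(random_restart_hill_climb(f, neighbors, sample))
```

Starts at or below 17 climb to the inferior peak, but with 20 restarts the best result found is the global peak at 63.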
Entropy - correct answer: H(X) = -∑ p(s) log₂ p(s)
Expected number of bits per symbol (probability of each symbol × number of bits to describe that symbol)
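A minimal sketch of the formula; the coin and uniform distributions are just illustrative inputs:

```python
from math import log2

def entropy(p):
    """H = -sum p(s) log2 p(s), in bits; zero-probability terms contribute 0."""
    return -sum(ps * log2(ps) for ps in p if ps > 0)

print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit
print(entropy([1.0]))        # certain outcome: 0.0 bits
print(entropy([0.25] * 4))   # uniform over 4 symbols: 2.0 bits
```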
Joint Entropy - correct answer: H(X,Y) = -∑ p(x,y) log₂ p(x,y)
Randomness contained in two variables together
Conditional Entropy - correct answer: H(Y|X) = -∑ p(x,y) log₂ p(y|x)
Randomness of one variable given the other variable
Entropy if X and Y are independent - correct answer: H(Y|X) = H(Y): X provides no information about Y
H(X,Y) = H(X) + H(Y): joint entropy is the sum of the marginals
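The joint/conditional definitions and both independence identities above can be checked numerically; the toy marginals are assumptions:

```python
from math import log2

def H(dist):
    """Entropy in bits of a dict mapping outcomes to probabilities."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Independent X and Y (toy marginals): p(x,y) = p(x)p(y).
px = {0: 0.5, 1: 0.5}
py = {'a': 0.25, 'b': 0.75}
pxy = {(x, y): px[x] * py[y] for x in px for y in py}

# Conditional entropy H(Y|X) = -sum p(x,y) log2 p(y|x), with p(y|x) = p(x,y)/p(x).
H_y_given_x = -sum(p * log2(p / px[x]) for (x, y), p in pxy.items())

print(abs(H(pxy) - (H(px) + H(py))))   # H(X,Y) = H(X) + H(Y): ~0
print(abs(H_y_given_x - H(py)))        # H(Y|X) = H(Y): ~0
```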
Mutual Information - correct answer: I(X,Y) = H(Y) - H(Y|X) = I(Y,X)
Measure of the reduction in randomness of one variable given knowledge of another variable.
Special case of KL Divergence
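A quick numeric check of I(X,Y) = H(Y) - H(Y|X) on a small correlated joint distribution (the probabilities are assumptions):

```python
from math import log2

def H(dist):
    """Entropy in bits of a dict mapping outcomes to probabilities."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Correlated toy joint distribution over (X, Y); both marginals are uniform.
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
px = {0: 0.5, 1: 0.5}
py = {0: 0.5, 1: 0.5}

# H(Y|X) = -sum p(x,y) log2 p(y|x), with p(y|x) = p(x,y)/p(x).
H_y_given_x = -sum(p * log2(p / px[x]) for (x, y), p in pxy.items())
I = H(py) - H_y_given_x   # knowing X removes ~0.278 bits of uncertainty about Y
print(round(I, 4))
```

Since X and Y are correlated here, I is strictly positive; for the independent case it would be exactly zero.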
Kullback-Leibler Divergence - correct answer: D_KL(P‖Q) = ∑ p(x) log₂(p(x)/q(x))
Always non-negative
Zero when P is equal to Q
Measures a distance between any two distributions, though not a true metric (it is not symmetric)
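A minimal sketch of D_KL(P‖Q) demonstrating the properties above, including the asymmetry; the two example distributions are assumptions:

```python
from math import log2

def kl(p, q):
    """D_KL(P || Q) in bits; assumes q[i] > 0 wherever p[i] > 0."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl(p, p))   # zero when P equals Q
print(kl(p, q))   # always non-negative
print(kl(q, p))   # differs from kl(p, q): KL is not symmetric
```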
Supervised Learning - correct answer: Use labeled training data to generalize
labels to new instances (function approximation)