
Summary EECS 127 / 227AT Final Cheat Sheet | UC Berkeley | Optimization Models in Engineering


Comprehensive 4-page final exam cheat sheet for EECS 127 / EECS 227AT (Optimization Models in Engineering) at UC Berkeley, based on the textbook Optimization Models by Calafiore and El Ghaoui. Densely formatted in LaTeX with worked examples, formula summaries, and problem-solving patterns for every major topic in the course.

Topics covered:

- Linear algebra: vector spaces, norms, projections, Gram-Schmidt, eigenvalues, spectral theorem, PSD/PD matrices, SVD, low-rank approximation
- Least squares: overdetermined and underdetermined cases, normal equations, ridge regression, LASSO, condition number
- Convexity: convex sets and functions, sublevel sets, separating hyperplanes, composition rules, common convex functions
- Optimization problems: LP, QP, QCQP, SOCP, SDP hierarchy, standard-form conversions, epigraph reformulations, Weierstrass theorem and coercivity
- KKT conditions: Slater's condition, complementary slackness, stationarity, worked examples for verifying and finding optima
- Duality: dual function, weak and strong duality, LP/QP/QCQP dual derivations, infeasibility certificates, sensitivity analysis, Farkas' lemma
- Convex relaxations: IP to LP, sparse recovery, low-rank recovery, matrix completion, robust PCA, compressed sensing
- Numerical methods: gradient descent, Newton's method, SGD, projected gradient, Frank-Wolfe, Armijo condition, convergence rates
- Applications: hard- and soft-margin SVM, hinge loss, dual SVM, LASSO subdifferential, resource allocation
- Quick-reference gradient and Hessian formulas, 2x2 matrix tricks, and problem-solving patterns for "prove convex," "verify optimality," "LP reformulation," and "prove infeasible"

Includes 15+ fully worked examples covering KKT verification, dual derivation, infeasibility proofs, sensitivity analysis, Newton steps, and LP vertex finding, plus a formula summary section for fast lookup during the exam. Ideal for final exam prep, midterm review, or as a permanent reference for graduate optimization coursework. Written by a UC Berkeley EECS student after taking the course with Professor Somayeh Sojoudi, and distilled from 158 pages of course reader content into 4 pages of exam-ready reference.


Content preview

EECS 127 Final Cheat Sheet

1 Linear Algebra

1.1 Vector Spaces
Subspace: closed under +, scalar ×; contains 0
Affine set: {x0 + v : v ∈ S}, S a subspace
Range: R(A) = {Ax : x ∈ R^n}
Nullspace: N(A) = {x : Ax = 0}
Rank: rank(A) = dim(R(A))
Fundamental Thm: N(A) ⊕ R(A⊤) = R^n, so dim(R(A)) + dim(N(A)) = n

1.2 Norms
ℓp: ∥x∥p = (Σi |xi|^p)^{1/p}
∥x∥1 = Σi |xi| (sparsity promoting)
∥x∥2 = √(Σi xi²) = √(x⊤x)
∥x∥∞ = maxi |xi|
Matrix norms:
Frobenius: ∥A∥F = √(Σij aij²) = √(tr(A⊤A)) = √(Σi σi²)
Spectral: ∥A∥2 = σmax(A) = σ1
Nuclear: ∥A∥∗ = Σi σi (trace norm)
Inequalities: ∥x∥∞ ≤ ∥x∥2 ≤ ∥x∥1; ∥x∥2 ≤ √n ∥x∥∞; ∥x∥1 ≤ √n ∥x∥2
Triangle: ∥x + y∥ ≤ ∥x∥ + ∥y∥
Cauchy-Schwarz: |x⊤y| ≤ ∥x∥2 ∥y∥2

1.3 Orthogonality
x ⊥ y ⇔ x⊤y = 0
Orthonormal: {q1, . . . , qk} with qi⊤qj = δij
Orthogonal matrix: Q⊤Q = QQ⊤ = I

1.4 Projections
Onto subspace R(A) (full col rank): Π_{R(A)} = A(A⊤A)⁻¹A⊤
Properties: Π² = Π, Π⊤ = Π
Projection of y: ŷ = Πy
Orthogonal complement: Π⊥ = I − Π
Onto affine set {x : Ax = b}: Π(y) = y − A⊤(AA⊤)⁻¹(Ay − b)

1.5 Gram-Schmidt
Input: {a1, . . . , ak}. Output: orthonormal {q1, . . . , qk}
q1 = a1/∥a1∥
q̃i = ai − Σ_{j<i} (qj⊤ai) qj,  qi = q̃i/∥q̃i∥

1.6 Eigenvalues
Av = λv, v ≠ 0 (eigenvector)
Characteristic poly: det(A − λI) = 0
Properties: tr(A) = Σi λi, det(A) = Πi λi
λ(A⁻¹) = 1/λ(A), λ(A + cI) = λ(A) + c, λ(A^k) = λ(A)^k
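Not part of the sheet itself: a minimal numpy sketch of the Gram-Schmidt recursion in 1.5, assuming linearly independent input columns; the function name and tolerance are illustrative.

```python
import numpy as np

def gram_schmidt(A, tol=1e-12):
    """Orthonormalize the columns of A via classical Gram-Schmidt.

    Implements q1 = a1/||a1||, then q_i = (a_i - sum_{j<i} (q_j^T a_i) q_j),
    normalized. Assumes the columns of A are linearly independent.
    """
    Q = []
    for a in A.T:                     # iterate over columns a_1, ..., a_k
        q = a.astype(float)
        for qj in Q:                  # subtract projections onto earlier q_j
            q = q - (qj @ a) * qj
        norm = np.linalg.norm(q)
        if norm < tol:                # column depends on earlier ones
            raise ValueError("columns are not linearly independent")
        Q.append(q / norm)
    return np.column_stack(Q)

A = np.array([[1.0, 1.0], [0.0, 1.0], [1.0, 0.0]])
Q = gram_schmidt(A)
print(np.allclose(Q.T @ Q, np.eye(2)))   # True: Q has orthonormal columns
```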
1.7 Spectral Theorem
For symmetric A = A⊤: A = UΛU⊤ = Σ_{i=1}^n λi ui ui⊤
U orthogonal, Λ = diag(λ1, . . . , λn), λi ∈ R; eigenvectors orthogonal

1.8 PSD/PD Matrices
PSD (A ⪰ 0), any of:
1. All λi ≥ 0
2. x⊤Ax ≥ 0 ∀x
3. A = B⊤B for some B
4. All principal minors ≥ 0
PD (A ≻ 0): strict inequalities
2×2 [[a, b], [b, d]]: ⪰ 0 iff a ≥ 0, d ≥ 0, ad ≥ b²; ≻ 0 iff a > 0, ad > b²
Check PSD: for 2×2, verify det ≥ 0 and tr ≥ 0

1.9 SVD
A = UΣV⊤ = Σ_{i=1}^r σi ui vi⊤
U ∈ R^{m×m}: left singular vectors (eigenvecs of AA⊤)
V ∈ R^{n×n}: right singular vectors (eigenvecs of A⊤A)
σ1 ≥ σ2 ≥ · · · ≥ σr > 0, r = rank(A), σi = √(λi(A⊤A))
Relationships: R(A) = span{u1, . . . , ur}, N(A) = span{v_{r+1}, . . . , vn},
R(A⊤) = span{v1, . . . , vr}, N(A⊤) = span{u_{r+1}, . . . , um}
Pseudoinverse: A† = VΣ†U⊤, Σ†ii = 1/σi if σi > 0, else 0
Full col rank: A† = (A⊤A)⁻¹A⊤; full row rank: A† = A⊤(AA⊤)⁻¹

1.10 Low-Rank Approximation
Eckart-Young: best rank-k is Ak = Σ_{i=1}^k σi ui vi⊤ = Uk Σk Vk⊤
Errors: ∥A − Ak∥F² = Σ_{i>k} σi², ∥A − Ak∥2 = σ_{k+1}
Relative: ∥A − Ak∥F² / ∥A∥F² = (Σ_{i>k} σi²)/(Σi σi²)
Unique iff σk ≠ σ_{k+1}

2 Least Squares

2.1 Problem
minx ∥Ax − y∥2²
Normal equations: A⊤Ax = A⊤y
Gradient: ∇∥Ax − y∥2² = 2A⊤(Ax − y)

2.2 Cases
Overdetermined (tall A, m > n), full col rank: x̂ = (A⊤A)⁻¹A⊤y (unique)
Underdetermined (fat A, m < n), full row rank: x̂ = A⊤(AA⊤)⁻¹y (min-norm)
General: x̂ = A†y; solution set S = A†y + N(A)

2.3 Geometry
ŷ = Ax̂ = Π_{R(A)} y
Residual: y − ŷ ⊥ R(A)
∥y − Ax̂∥2 = distance from y to R(A)

2.4 Example
A = [[1, 0], [0, 1], [1, 1]], y = (1, 1, 0)
A⊤A = [[2, 1], [1, 2]], A⊤y = (1, 1)
(A⊤A)⁻¹ = (1/3)[[2, −1], [−1, 2]]
x̂ = (1/3)(1, 1)
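A quick numerical check of the worked example in 2.4 (a sketch, not from the sheet, using numpy's solver and pseudoinverse):

```python
import numpy as np

# The overdetermined example from 2.4: tall A with full column rank.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
y = np.array([1.0, 1.0, 0.0])

# Normal-equations solution: x_hat = (A^T A)^{-1} A^T y
x_hat = np.linalg.solve(A.T @ A, A.T @ y)
print(x_hat)                        # [1/3, 1/3], matching the sheet

# Same answer via the pseudoinverse (general case: x_hat = A† y)
print(np.linalg.pinv(A) @ y)

# Geometry check from 2.3: residual orthogonal to R(A), A^T (y - A x_hat) = 0
print(np.allclose(A.T @ (y - A @ x_hat), 0))   # True
```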

2.5 Ridge Regression
minx ∥Ax − y∥2² + λ∥x∥2²
x̂ = (A⊤A + λI)⁻¹A⊤y
A⊤A + λI always invertible; shrinks x̂ toward 0; bias-variance tradeoff

2.6 LASSO
minx ∥Ax − y∥2² + λ∥x∥1
Promotes sparsity (ℓ1 ball has corners on the axes)
QP form: min_{x,t} ∥Ax − y∥2² + λ1⊤t s.t. −t ≤ x ≤ t (i.e., |xi| ≤ ti)

2.7 Condition Number
κ(A) = ∥A∥2 ∥A⁻¹∥2 = σ1/σn
∥∆x∥/∥x∥ ≤ κ(A) ∥∆y∥/∥y∥
κ(A⊤A) = κ(A)²
Large κ = ill-conditioned
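A sketch (not from the sheet) of the ridge closed form in 2.5, reusing the matrix from the example in 2.4 to show the shrinkage toward 0 as λ grows:

```python
import numpy as np

def ridge(A, y, lam):
    """Closed-form ridge solution x_hat = (A^T A + lam*I)^{-1} A^T y."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)

A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = np.array([1.0, 1.0, 0.0])

for lam in [0.0, 0.1, 10.0]:
    print(lam, ridge(A, y, lam))   # growing lam shrinks x_hat toward 0
```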
3 Convexity

3.1 Convex Sets
Def: ∀x, y ∈ S, ∀θ ∈ [0, 1]: θx + (1 − θ)y ∈ S
Convex combination: Σi θi xi with θi ≥ 0, Σi θi = 1
Convex hull: smallest convex set containing S

3.2 Examples of Convex Sets
Hyperplane: {x : a⊤x = b}
Halfspace: {x : a⊤x ≤ b}
Ball: {x : ∥x − c∥ ≤ r}
Ellipsoid: {x : (x − c)⊤P⁻¹(x − c) ≤ 1}, P ≻ 0
Polyhedron: {x : Ax ≤ b}
Cone C: x ∈ C ⇒ θx ∈ C ∀θ ≥ 0
PSD cone: S^n_+ = {X : X ⪰ 0}
Norm ball: {x : ∥x∥p ≤ r}, p ≥ 1

3.3 Operations Preserving Convexity
Intersection of convex sets
Affine image: f(S) = {f(x) : x ∈ S}
Affine preimage: f⁻¹(S)
Scaling, translation
Sum: S1 + S2 = {x + y : x ∈ S1, y ∈ S2}

3.4 Sublevel Sets
Sα = {x : f(x) ≤ α}
f convex ⇒ Sα convex ∀α (converse false!)

3.5 Separating Hyperplanes
If S1, S2 convex and disjoint: ∃a ≠ 0, b with a⊤x ≤ b ∀x ∈ S1 and a⊤x ≥ b ∀x ∈ S2
If one set is compact and the other closed: strict separation

3.6 Convex Functions - Definitions
0th order: f(θx + (1 − θ)y) ≤ θf(x) + (1 − θ)f(y)
1st order (diff'able): f(y) ≥ f(x) + ∇f(x)⊤(y − x); tangent plane is a global underestimator
2nd order (twice diff'able): ∇²f(x) ⪰ 0 ∀x
Strictly convex: strict inequalities (unique min)
Strongly convex: ∇²f ⪰ µI for some µ > 0

3.7 Common Convex Functions
Linear: a⊤x + b
Quadratic: x⊤Px + q⊤x + r (P ⪰ 0)
Norms: ∥x∥p, p ≥ 1
Exponential: e^{ax}
Neg log: −log x on x > 0
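A small sketch (not from the sheet) of the second-order condition in 3.6 applied to the quadratic from 3.7; the example matrix is illustrative:

```python
import numpy as np

# f(x) = x^T P x + q^T x + r has Hessian 2P, so f is convex iff P is PSD.
P = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# A symmetric matrix is PSD iff its smallest eigenvalue is >= 0.
eigvals = np.linalg.eigvalsh(P)
print(eigvals)                 # [1., 3.] -> P is PD, so f is strictly convex
print(eigvals.min() >= 0)      # True: second-order condition holds

# 2x2 shortcut from 1.8: [[a, b], [b, d]] PSD iff a >= 0, d >= 0, ad >= b^2
a, b, d = P[0, 0], P[0, 1], P[1, 1]
print(a >= 0 and d >= 0 and a * d >= b**2)   # True
```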

Document information

Summarized whole book: Yes
Uploaded on: April 26, 2026
Number of pages: 4
Written in: 2025/2026
Type: Summary


Get to know the seller

anahij, University of California - Berkeley
Berkeley EECS Study Materials

UC Berkeley EECS cheat sheets and study guides. Comprehensive, exam-tested, made by a current student.

