Math 191 Spring 2023 Homework 8
due Tues, April 4, 11:59 PM
(Upload your solutions to Gradescope)
1. (5 points) Suppose A is a real, m × n matrix of rank r. In lecture 23, we stated that the two “defining properties” of the pseudo-inverse A+ are that
(1) AA+ is the orthogonal projection onto R(A).
(2) R(A+) ⊂ N(A)⊥.
Derive from these two properties that A+A is the orthogonal projection onto N(A)⊥. (Hint: consider A+Ax for x ∈ N(A) and x ∈ N(A)⊥ separately.)
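As a numerical sanity check (not a substitute for the derivation), one can verify the claimed projection behavior of A+A on a small example. The matrix below is hypothetical, not from the assignment:

```python
import numpy as np

# Hypothetical 4x3 rank-2 matrix: row 2 = 2*(row 1), and
# row 4 = row 1 - 2*(row 3), so the nullspace is 1-dimensional.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
Ap = np.linalg.pinv(A)        # Moore-Penrose pseudo-inverse A+

P = Ap @ A                    # claimed: orthogonal projection onto N(A)^perp
assert np.allclose(P @ P, P)  # idempotent
assert np.allclose(P, P.T)    # symmetric, hence an orthogonal projection

# x in N(A) is sent to 0; x in N(A)^perp = R(A^T) is left fixed.
U, s, Vt = np.linalg.svd(A)
x_null = Vt[-1]               # right singular vector for sigma = 0 spans N(A)
assert np.allclose(A @ x_null, 0)
assert np.allclose(P @ x_null, 0)
x_row = A.T @ np.ones(4)      # lies in R(A^T) = N(A)^perp
assert np.allclose(P @ x_row, x_row)
```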
2. (5 points) (Variant of I.9.6, page 80, Strang.) Suppose A = QΛQT is a real, symmetric n × n matrix with Q = (q1,…,qn) orthogonal and Λ = diag(λ1,…,λn). Suppose |λ1| ≥ |λ2| ≥ ··· ≥ |λn|. For what values of a1 is
B = a1 q1 q1T
a closest rank-1 approximation of A in the 2-norm? Justify your answer, and keep in mind that λ1
can be positive or negative.
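To build intuition (this is not a justification), one can measure the 2-norm error of B = a1 q1 q1T for various a1 on a toy symmetric matrix. The eigenvalues −3, 2, 1 below are hypothetical, not from the text:

```python
import numpy as np

# Toy symmetric A = Q diag(-3, 2, 1) Q^T with random orthogonal Q,
# so |lambda_1| >= |lambda_2| >= |lambda_3| and lambda_1 is negative.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
lam = np.array([-3.0, 2.0, 1.0])
A = Q @ np.diag(lam) @ Q.T
q1 = Q[:, 0]

def err(a1):
    """2-norm error of the rank-1 candidate B = a1 q1 q1^T."""
    return np.linalg.norm(A - a1 * np.outer(q1, q1), 2)

# A - B = Q diag(lam1 - a1, lam2, lam3) Q^T, so the error is
# max(|lam1 - a1|, |lam2|, |lam3|); the best possible error is |lam2| = 2.
print(err(-3.0), err(-4.0), err(-5.5))
```

The scan shows the error stays at its minimum for a whole range of a1, which is the point the problem is probing.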
3. (5 points) Compute the polar decomposition A = Q|A| of the 3 × 3 matrix
A = [ −2  3  −2 ]
    [  0  2  −4 ]
Here Q is a partial isometry with the same nullspace as A and |A| is positive semi-definite. Notes: to
compute a fully reduced SVD by hand, one can either (1) solve the eigenvalue problem ATA = V ΛV T, discard the columns of V and the rows/columns of Λ corresponding to zero eigenvalues, and set S = √Λ, U = AV S−1; or (2) start with a CR factorization, which I'll call CFT here, and then compute the QR factorizations of C and F:
A = CFT = (Q1R1)(Q2R2)T = Q1(R1R2T)Q2T.
You can now compute the SVD of A3 = R1R2T = U3S3V3T, which involves a smaller eigenvalue problem. Eventually, A = Q1U3S3V3TQ2T. In this problem, both approaches are somewhat tedious, so feel free to use a computer to compute the SVD of A or carry out any of the steps described above. Express your answer with rational numbers in the matrix entries of Q and |A|.
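Since the problem permits computer assistance, here is one way to carry out the first recipe (reduced SVD, then Q = U V^T and |A| = V S V^T) with NumPy. The 2 × 3 demo matrix is a stand-in; substitute the full 3 × 3 matrix from the problem statement:

```python
import numpy as np

def polar(A, tol=1e-12):
    """Polar decomposition A = Q |A|, with Q a partial isometry having
    the same nullspace as A and |A| = V S V^T positive semi-definite,
    built from the reduced SVD A = U S V^T."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    r = int(np.sum(s > tol * s[0]))        # numerical rank
    U, s, Vt = U[:, :r], s[:r], Vt[:r]     # keep nonzero singular values only
    Q = U @ Vt                             # partial isometry
    absA = Vt.T @ np.diag(s) @ Vt          # |A| = (A^T A)^(1/2)
    return Q, absA

# Hypothetical stand-in: replace with the 3x3 matrix from the problem.
A = np.array([[-2.0, 3.0, -2.0],
              [ 0.0, 2.0, -4.0]])
Q, absA = polar(A)
print(np.allclose(Q @ absA, A))   # True: A = Q|A|
```

To get rational entries as requested, one can rationalize the numerical output by hand (or use `sympy`'s exact SVD machinery instead).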
4. (5 points) (Variant of I.11.2, page 96, Strang.) Prove the Cauchy-Schwarz inequality
|⟨u, v⟩| ≤ ∥u∥∥v∥,  u, v ∈ Cn,
where ⟨u, v⟩ = u∗v (with u∗ the conjugate transpose of u) and ∥u∥ = √⟨u, u⟩. Hint: if v ≠ 0, the projection of u onto span(v)⊥ is w = u − (⟨v, u⟩/⟨v, v⟩)v. Simplify the inequality ⟨w, w⟩ ≥ 0 to obtain the Cauchy-Schwarz inequality when v ≠ 0. (Be careful: ⟨v, u⟩ is a complex number.) Treat the v = 0 case separately.
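A quick numerical check of the hint's geometry on random complex vectors, using NumPy's `vdot` (which conjugates its first argument, matching the convention ⟨u, v⟩ = u∗v). This illustrates the setup, not the proof:

```python
import numpy as np

rng = np.random.default_rng(1)
u = rng.standard_normal(4) + 1j * rng.standard_normal(4)
v = rng.standard_normal(4) + 1j * rng.standard_normal(4)

# w = u - (<v, u> / <v, v>) v: the component of u orthogonal to v.
w = u - (np.vdot(v, u) / np.vdot(v, v)) * v
print(abs(np.vdot(v, w)))    # ~0: w is orthogonal to v

# The inequality itself, with the modulus on the complex inner product.
assert abs(np.vdot(u, v)) <= np.linalg.norm(u) * np.linalg.norm(v)
```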
5. (5 points) In problem 2 of homework 2, we proved that the 1-norm of an m × n matrix A is the maximum absolute column sum of A. Now show that the ∞-norm of an m × n matrix is the maximum absolute row sum, i.e., ∥A∥∞ = C with C = max_{1≤i≤m} ∑_{j=1}^{n} |aij|. You just have to (i) show that ∥Ax∥∞ ≤ C∥x∥∞ for all x ∈ Cn; and (ii) produce an x ≠ 0 (tailored to A) such that ∥Ax∥∞ ≥ C∥x∥∞.
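For intuition on part (ii): in the real case, a natural candidate matches the signs of the maximizing row (complex entries would use conjugate phases instead). A quick NumPy check on an arbitrary random matrix, not a proof:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 5))    # arbitrary test matrix

row_sums = np.abs(A).sum(axis=1)
C = row_sums.max()                 # max absolute row sum
i = row_sums.argmax()              # row where the max is attained

x = np.sign(A[i])                  # |x_j| = 1, chosen so (Ax)_i = C
x[x == 0] = 1.0                    # guard: keep |x_j| = 1 even for zero entries
print(np.linalg.norm(A @ x, np.inf), C)
```

Here ∥x∥∞ = 1 and (Ax)_i = ∑_j |aij| = C, so ∥Ax∥∞ ≥ C∥x∥∞, and NumPy's built-in `np.linalg.norm(A, np.inf)` returns the same value C.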
6. (5 points) The space of real m × n matrices Mm×n is commonly given the operator 2-norm, ∥B∥2 = σ1(B), where σ1(B) is the largest singular value of B. The most general linear functional on this space is obtained by taking a “double dot product” of B with a fixed matrix A, namely
fA(B) = A : B = ∑_{k=1}^{m} ∑_{j=1}^{n} akj bkj = tr(ATB).
Note that fA : Mm×n → R is a linear mapping. Show that the operator norm of fA is given by
∥fA∥ = ∥A∥N, where
∥A∥N = ∑_{j=1}^{μ} σj(A),  μ = min(m, n).
∥A∥N is known as the nuclear norm of A. At this point it is just a formula involving the singular values of A; this exercise shows that it is actually a norm. (The only difficult property to check for the nuclear norm is the triangle inequality, which automatically holds for operator norms: ∥A1 + A2∥N = ∥fA1+A2∥ = ∥fA1 + fA2∥ ≤ ∥fA1∥ + ∥fA2∥ = ∥A1∥N + ∥A2∥N.) Fixing A, you just have to (i) show that |fA(B)| ≤ ∥A∥N ∥B∥2 for all m × n matrices B, and (ii) find a particular matrix B ≠ 0 such that |fA(B)| = ∥A∥N ∥B∥2.
(Hint: for part (i), let A = USVT be the full SVD of A. Then
|tr(ATB)| = |tr(V STUTB)| = |tr(ST B̃)| ≤ ∑_{j=1}^{μ} σj(A) |b̃jj|,
where B̃ = UTBV. Explain the last inequality and show that maxj |b̃jj| ≤ σ1(B). Once you understand part (i), part (ii) is straightforward.)
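A numerical check of the bound in part (i) and of the triangle inequality on random matrices (sizes and seed are arbitrary); this only tests the claims, it does not prove them:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 3))
nuc_of = lambda M: np.linalg.svd(M, compute_uv=False).sum()  # nuclear norm
nuc = nuc_of(A)

def fA(B):
    """The linear functional f_A(B) = tr(A^T B)."""
    return np.trace(A.T @ B)

# (i) |f_A(B)| <= ||A||_N * ||B||_2 on random test matrices B.
worst = 0.0
for _ in range(200):
    B = rng.standard_normal((4, 3))
    worst = max(worst, abs(fA(B)) / np.linalg.norm(B, 2))
print(worst, nuc)    # worst observed ratio never exceeds the nuclear norm

# Triangle inequality for the nuclear norm, checked directly.
A2 = rng.standard_normal((4, 3))
assert nuc_of(A + A2) <= nuc_of(A) + nuc_of(A2) + 1e-10
```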