1. Diagonal matrices
A matrix A is a diagonal matrix if it is a square matrix with Aij=0 whenever i≠j.
- Prove or disprove: If A and B are diagonal matrices of the same size, so is AB.
- Let p(A) = ∏i Aii. Prove or disprove: If A and B are diagonal matrices as above, then p(AB) = p(A)p(B).
We need to show that (AB)ij=0 for i≠j (we don't care what happens when i=j). Let i≠j, and compute (AB)ij = ∑k AikBkj = AiiBij + AijBjj = 0, where the first simplification uses the fact that Aik=0 unless i=k (and similarly for Bkj), and the second uses the assumption that i≠j.
Now we care what happens to (AB)ii. Compute (AB)ii = ∑k AikBki = AiiBii, since Aik = 0 unless k = i. So p(AB) = ∏i (AB)ii = ∏i (AiiBii) = (∏i Aii)(∏i Bii) = p(A)p(B).
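Both claims are easy to check numerically. Here is a small sketch using numpy (an assumption; the problem itself does not mention any library):

```python
import numpy as np

# Two diagonal matrices of the same size.
A = np.diag([2.0, 3.0, 5.0])
B = np.diag([7.0, 11.0, 13.0])

AB = A @ B

# Every off-diagonal entry of AB is zero, so AB is diagonal.
assert np.all(AB[~np.eye(3, dtype=bool)] == 0)

# p(M) = product of the diagonal entries of M.
def p(M):
    return np.prod(np.diag(M))

# p(AB) = p(A) p(B), matching the proof above.
assert p(AB) == p(A) * p(B)
```

The product matrix is simply diag(2·7, 3·11, 5·13), which is exactly the entrywise statement (AB)ii = AiiBii proved above.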
2. Matrix square roots
- Show that there exists a matrix A such that A≠0 but A²=0.
- Show that if A²=0, there exists a matrix B such that B²=I+A. Hint: What is (I+A)²?
Here is a simple example of a nonzero matrix whose square is 0: take A = [0 1; 0 0]. Then A ≠ 0, but every entry of A² is a product involving at least one 0, so A² = 0.
For the second part, the hint suggests looking at (I+A)² = I² + IA + AI + A² = I + 2A (since IA=AI=A and it is given that A²=0). So I+A is almost right, but there is that annoying 2 there. We can get rid of the 2 by setting B instead to I+½A, which gives B² = (I+½A)² = I+A+¼A² = I+A.
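The algebra in both parts can be verified directly. A minimal numeric sketch, using numpy and the 2×2 example above (the choice of library and example matrix are assumptions):

```python
import numpy as np

# A nonzero matrix whose square is zero.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
I = np.eye(2)

assert np.any(A != 0)          # A is not the zero matrix
assert np.all(A @ A == 0)      # but A² = 0

# B = I + ½A satisfies B² = I + A + ¼A² = I + A.
B = I + A / 2
assert np.allclose(B @ B, I + A)
```

Note that B is a square root of I + A precisely because the cross terms contribute A (not 2·½A twice over) and the ¼A² term vanishes.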
3. Dimension reduction
Let A be an n×m random matrix obtained by setting each entry Aij independently to ±1 with equal probability.
Let x be an arbitrary vector of dimension m.
Compute E[||Ax||²], as a function of ||x||, n, and m, where ||x|| = (x⋅x)^(1/2) is the usual Euclidean length.
Mostly this is just expanding definitions: E[||Ax||²] = E[∑i (∑j Aijxj)²] = ∑i ∑j ∑k xjxk E[AijAik] = ∑i ∑j xj² = n||x||².
The second-to-last step follows because E[AijAik] = E[Aij]E[Aik] = 0 when Aij and Aik are independent (i.e., when j≠k), and E[AijAij] = E[(±1)²] = 1.
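The identity E[||Ax||²] = n||x||² can be sanity-checked by Monte Carlo simulation. A sketch, assuming numpy and an arbitrary choice of n, m, seed, and trial count:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 50, 10
x = rng.standard_normal(m)     # an arbitrary fixed vector

# Average ||Ax||² over many independent random ±1 matrices A.
trials = 2000
total = 0.0
for _ in range(trials):
    A = rng.choice([-1.0, 1.0], size=(n, m))
    total += np.sum((A @ x) ** 2)
estimate = total / trials

# Should be close to n * ||x||².
expected = n * np.dot(x, x)
assert abs(estimate - expected) < 0.05 * expected
```

The variance of each trial is O(n||x||⁴), so a few thousand trials put the empirical average well within a few percent of n||x||².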
4. Non-invertible matrices
Let A be a square matrix.
- Prove that if Ax=0 for some column vector x≠0, then A⁻¹ does not exist.
- Prove that if the columns of A are not linearly independent, then A⁻¹ does not exist.
- Prove that if the rows of A are not linearly independent, then A⁻¹ does not exist.
Suppose A⁻¹ exists and that Ax=0 for some nonzero x. Then x = (A⁻¹A)x = A⁻¹(Ax) = A⁻¹0 = 0, a contradiction.
Let A⋅i represent the i-th column of A. If the columns of A are not linearly independent, there exist coefficients xi, not all zero, such that ∑ xiA⋅i = 0. But then Ax = ∑ xiA⋅i = 0, where x is the (nonzero) vector of these coefficients. It follows from the previous case that A is not invertible.
Observe that if A has an inverse, then so does its transpose A', since if A⁻¹ exists we have (A⁻¹)'A' = (AA⁻¹)' = I and A'(A⁻¹)' = (A⁻¹A)' = I. If the rows of A are not linearly independent, then neither are the columns of A'; it follows that A' has no inverse, and thus neither does A.
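The chain of implications can be seen concretely on a small example. A sketch using numpy (an assumption), with a 2×2 matrix whose second column is twice its first:

```python
import numpy as np

# Columns are linearly dependent: column 2 = 2 * column 1.
A = np.array([[1.0, 2.0],
              [3.0, 6.0]])

# So Ax = 0 for the nonzero coefficient vector x = (2, -1).
x = np.array([2.0, -1.0])
assert np.all(A @ x == 0)

# Consequently A has no inverse; numpy raises LinAlgError on singular input.
try:
    np.linalg.inv(A)
    invertible = True
except np.linalg.LinAlgError:
    invertible = False
assert not invertible
```

The vector x here is exactly the coefficient vector from the column-dependence argument above, so this example walks through the first two parts of the proof in order.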