How to transform the following matrix into reduced row echelon form:

0 3 2 2
1 0 3 1
2 1 0 1
3 2 1 2



Add (-2) times row 2 to row 3 and (-3) times row 2 to row 4:

0  3  2  2
1  0  3  1
0  1 -6 -1
0  2 -8 -1

Swap rows 1 and 2, then swap rows 2 and 3:

1  0  3  1
0  1 -6 -1
0  3  2  2
0  2 -8 -1

Add (-3) times row 2 to row 3 and (-2) times row 2 to row 4:

1  0  3  1
0  1 -6 -1
0  0 20  5
0  0  4  1

Row 3 equals 5 times row 4, so subtracting 5 times row 4 from row 3 leaves a zero row; swap rows 3 and 4 and divide the new row 3 by 4:

1  0  3  1
0  1 -6 -1
0  0  1 1/4
0  0  0  0

Finally add (-3) times row 3 to row 1 and 6 times row 3 to row 2 to reach the reduced row echelon form:

1  0  0 1/4
0  1  0 1/2
0  0  1 1/4
0  0  0  0
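The reduction can be double-checked mechanically; a minimal sketch using sympy (assuming it is installed):

```python
import sympy as sp

# The matrix from the question; rref() carries out the same row reduction.
A = sp.Matrix([[0, 3, 2, 2],
               [1, 0, 3, 1],
               [2, 1, 0, 1],
               [3, 2, 1, 2]])

R, pivots = A.rref()
print(R)       # reduced row echelon form
print(pivots)  # indices of the pivot columns: (0, 1, 2)
```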



Matrices and transformations
1. Let λ be an eigenvalue of a matrix A; prove that λ^2 is an eigenvalue of A^2.
2. If A^2 = A, prove that every eigenvalue of A is 0 or 1.


λ is an eigenvalue of the matrix A,
so there is a nonzero vector p with
Ap = λp.
Applying A to both sides:
A^2·p = A(Ap) = A(λp) = λ(Ap) = λ^2·p,
so
λ^2 is an eigenvalue of A^2 (with the same eigenvector p).

For part 2: if A^2 = A, then λ^2·p = A^2·p = Ap = λp, so (λ^2 - λ)p = 0; since p ≠ 0, λ^2 = λ, and λ = 0 or 1.
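This can be sanity-checked numerically; a small sketch with numpy (the random matrix is just an illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

lam = np.linalg.eigvals(A)        # eigenvalues of A
lam2 = np.linalg.eigvals(A @ A)   # eigenvalues of A^2

# each lambda^2 appears among the eigenvalues of A^2
for mu in lam ** 2:
    assert np.min(np.abs(lam2 - mu)) < 1e-8
```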



Finding a similarity transformation matrix

Let

A = 2 0 0     B = 1  0 0
    0 0 1         0 -1 0
    0 1 0         0 -6 2

where A and B are similar; find M such that B = M^(-1)·A·M.

Is there any simple way to calculate M, instead of first calculating eigenvalues, then finding eigenvectors, and then multiplying matrices?


In short, no: without the eigenvalue/eigenvector method you would have to find a similarity transformation directly, and such a transformation is difficult to control. The eigenvector route is the systematic one: if P_A and P_B diagonalize A and B to the same diagonal matrix D (eigenvalues in the same order), then M = P_A·P_B^(-1) satisfies M^(-1)·A·M = P_B·D·P_B^(-1) = B.
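As a sketch of that route with sympy (the helper `eig_basis` and the sort-by-eigenvalue convention are my own choices, not part of the original answer):

```python
import sympy as sp

A = sp.Matrix([[2, 0, 0], [0, 0, 1], [0, 1, 0]])
B = sp.Matrix([[1, 0, 0], [0, -1, 0], [0, -6, 2]])

def eig_basis(X):
    # columns: eigenvectors of X, ordered by eigenvalue
    pairs = []
    for val, mult, vecs in X.eigenvects():
        for v in vecs:
            pairs.append((val, v))
    pairs.sort(key=lambda p: p[0])
    return sp.Matrix.hstack(*[v for _, v in pairs])

PA = eig_basis(A)   # PA^-1 * A * PA = diag(-1, 1, 2)
PB = eig_basis(B)   # PB^-1 * B * PB = diag(-1, 1, 2)

M = PA * PB.inv()   # then M^-1 * A * M = B
assert sp.simplify(M.inv() * A * M - B) == sp.zeros(3, 3)
```

Sorting both eigenvector bases by eigenvalue is what guarantees P_A and P_B correspond to the same diagonal matrix; any consistent ordering would do.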



Let the real matrix A be positive definite. Prove that for any positive integer k, A^k is also a positive definite matrix.
Please make the notation clear.


Is A^k the k-th power of A? Assuming so:
If λ is an eigenvalue of A,
then λ^k is an eigenvalue of A^k (a standard fact).
A is a positive definite matrix,
so every eigenvalue λ of A is > 0,
hence λ^k > 0.
Moreover A is symmetric, so A^k is also symmetric: (A^k)' = (A')^k = A^k.
So all eigenvalues of A^k are greater than 0 and A^k is symmetric,
so A^k is a positive definite matrix.
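A quick numerical illustration of the argument (numpy; the random positive definite matrix is just for the demo):

```python
import numpy as np

rng = np.random.default_rng(1)
G = rng.standard_normal((4, 4))
A = G @ G.T + np.eye(4)   # a random symmetric positive definite matrix

for k in range(1, 6):
    Ak = np.linalg.matrix_power(A, k)
    assert np.allclose(Ak, Ak.T)               # A^k stays symmetric
    assert np.all(np.linalg.eigvalsh(Ak) > 0)  # all eigenvalues of A^k positive
```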



Let A and B be real symmetric matrices of order n and m respectively, with B positive definite. Prove that there exists a nonzero m×n matrix H such that B - HAH' is a positive definite matrix.


Proof sketch: B is a real symmetric positive definite matrix of order m, so its eigenvalues are all positive real numbers; let b1 > 0 be the smallest, so that x'Bx >= b1·x'x for every m-dimensional vector x. Let am be the largest eigenvalue of A, and take H = t·H0 for any fixed nonzero m×n matrix H0 with spectral norm 1. If am <= 0 then x'HAH'x <= 0 and any t > 0 works; otherwise x'HAH'x <= t^2·am·x'x, so choosing t^2 < b1/am gives

x'(B - HAH')x >= b1·x'x - t^2·am·x'x > 0 for all x ≠ 0.

So B - HAH' is a positive definite matrix.
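A numerical sketch of the scaling argument (the choice of H0 with a single unit entry is just one convenient instance):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 3, 4
A = rng.standard_normal((n, n)); A = (A + A.T) / 2        # real symmetric, order n
G = rng.standard_normal((m, m)); B = G @ G.T + np.eye(m)  # positive definite, order m

b1 = np.linalg.eigvalsh(B)[0]    # smallest eigenvalue of B (> 0)
am = np.linalg.eigvalsh(A)[-1]   # largest eigenvalue of A

H0 = np.zeros((m, n)); H0[0, 0] = 1.0           # fixed nonzero m*n matrix, spectral norm 1
t = 1.0 if am <= 0 else 0.5 * np.sqrt(b1 / am)  # shrink so that t^2 * am < b1
H = t * H0

C = B - H @ A @ H.T
assert np.all(np.linalg.eigvalsh(C) > 0)   # B - HAH' is positive definite
```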



Let the m×r matrix F have full column rank and the r×n matrix G have full row rank. Prove that rank(FG) = r.


Gaussian elimination gives the equivalence normal form: for any matrix A there exist invertible matrices P, Q such that PAQ = [I_r 0; 0 0]. In particular, there are invertible matrices P1, Q1 of orders m, r such that F = P1·[I_r; 0]·Q1, and invertible matrices P2, Q2 of orders r, n such that G = P2·[I_r 0]·Q2. Hence

FG = P1·[I_r; 0]·(Q1P2)·[I_r 0]·Q2 = P1·[Q1P2 0; 0 0]·Q2.

Since Q1P2 is an invertible r×r matrix and P1, Q2 are invertible, rank(FG) = rank([Q1P2 0; 0 0]) = r.
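The conclusion rank(FG) = r is easy to spot-check numerically (random Gaussian matrices have full rank with probability 1):

```python
import numpy as np

rng = np.random.default_rng(2)
m, r, n = 5, 3, 4
F = rng.standard_normal((m, r))   # full column rank with probability 1
G = rng.standard_normal((r, n))   # full row rank with probability 1

assert np.linalg.matrix_rank(F) == r
assert np.linalg.matrix_rank(G) == r
assert np.linalg.matrix_rank(F @ G) == r   # rank(FG) = r
```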



If A and B are matrices of order n with AB = BA, prove that if A and B are each similar to a diagonal matrix, then there is a single invertible matrix C such that C^(-1)AC and C^(-1)BC are both diagonal matrices.


Note that the converse is immediate: if C^(-1)AC and C^(-1)BC are both diagonal matrices, then (C^(-1)AC)(C^(-1)BC) = (C^(-1)BC)(C^(-1)AC) because diagonal matrices commute, and multiplying out gives C^(-1)ABC = C^(-1)BAC, so AB = BA.
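A small numpy demonstration of this converse: two matrices diagonalized by the same C necessarily commute.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 3
C = rng.standard_normal((n, n))   # invertible with probability 1
Ci = np.linalg.inv(C)
D1 = np.diag([1.0, 2.0, 3.0])
D2 = np.diag([4.0, 5.0, 6.0])

# A and B share the diagonalizing matrix C ...
A = C @ D1 @ Ci
B = C @ D2 @ Ci

# ... so they commute, because the diagonal matrices D1, D2 commute
assert np.allclose(A @ B, B @ A)
```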



Let A and B be matrices of order n. Prove that the partitioned matrix [A B; B A] is invertible if and only if A + B and A - B are both invertible.


Using properties of determinants (add the second block column to the first, then subtract the first block row from the second):

|A B|   |A+B B|   |A+B  B |
|B A| = |A+B A| = | 0  A-B| = |A+B|·|A-B|

Since a matrix is invertible if and only if its determinant is nonzero, the block matrix is invertible exactly when |A+B| ≠ 0 and |A-B| ≠ 0, i.e. when A + B and A - B are both invertible, so the proposition holds.
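The determinant identity can be spot-checked numerically:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

M = np.block([[A, B], [B, A]])   # the partitioned matrix [A B; B A]
lhs = np.linalg.det(M)
rhs = np.linalg.det(A + B) * np.linalg.det(A - B)
assert np.isclose(lhs, rhs)      # det[A B; B A] = det(A+B) * det(A-B)
```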



Let A and B be square matrices of order n, and r(A) + r(B)


If AMB = 0 with M invertible:
by the Frobenius rank inequality, r(AMB) >= r(AM) + r(MB) - r(M). Since M is invertible, r(AM) = r(A), r(MB) = r(B), and r(M) = n, so 0 >= r(A) + r(B) - n, that is, r(A) + r(B) <= n.
If r(A) + r(B) = 1,
then again from the Frobenius inequality r(A) + r(B)
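The Frobenius rank inequality used above can be spot-checked numerically (the random matrices of prescribed rank are chosen just for the demo):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 4
A = rng.standard_normal((n, n)) @ np.diag([1, 1, 0, 0])  # rank 2 with probability 1
B = np.diag([1, 0, 0, 0]) @ rng.standard_normal((n, n))  # rank 1 with probability 1
M = rng.standard_normal((n, n))                          # invertible with probability 1

r = np.linalg.matrix_rank
# Frobenius rank inequality: r(AMB) >= r(AM) + r(MB) - r(M)
assert r(A @ M @ B) >= r(A @ M) + r(M @ B) - r(M)
```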



Let A be an m×n matrix and B an n×m matrix, where n


There seems to be something wrong with the problem statement; the missing condition is presumably n < m. In that case AB, an m×m matrix, is not invertible, since r(AB) <= r(A) <= n < m.