Prove that if an n-th order matrix A satisfies AA^T = E and |A| = -1, then A must have the eigenvalue -1



It suffices to prove that the determinant |A + E| equals 0.
|A+E| = |A + AA^T| = |A(E + A^T)| = |A| |E + A^T| = -|(A+E)^T| = -|A+E|,
using E = AA^T, |A| = -1, and |M^T| = |M|.
Moving the right-hand side over gives 2|A + E| = 0, so |A + E| = 0; that is, A must have the eigenvalue -1.
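As a quick numerical sanity check (a sketch, not part of the proof), the code below builds a random orthogonal matrix with determinant -1 and confirms that -1 appears among its eigenvalues and that |A + E| vanishes. The QR construction and the dimension n = 5 are arbitrary choices made for illustration.

```python
import numpy as np

# Sketch: construct a random orthogonal A with det(A) = -1, then check that
# -1 is an eigenvalue, equivalently that |A + E| = 0.
rng = np.random.default_rng(0)
n = 5
A, _ = np.linalg.qr(rng.standard_normal((n, n)))  # A is orthogonal: A A^T = E
if np.linalg.det(A) > 0:
    A[:, 0] *= -1                                  # flip one column so det(A) = -1

print(np.linalg.det(A))                            # approximately -1
print(np.linalg.det(A + np.eye(n)))                # approximately 0, so -1 is an eigenvalue
print(np.min(np.abs(np.linalg.eigvals(A) + 1)))    # approximately 0
```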



Let A be a second-order matrix, and let α1 and α2 be linearly independent two-dimensional column vectors with Aα1 = 2α1 and Aα2 = 2α1 + α2. Find the eigenvalues of the matrix A.


Since α1, α2 are linearly independent, the matrix P = (α1, α2) is invertible.
From Aα1 = 2α1 and Aα2 = 2α1 + α2 we get AP = PB, where B is the matrix
B =
[ 2  2 ]
[ 0  1 ]
So A is similar to B and they have the same eigenvalues. The eigenvalues of B are 2 and 1, so the eigenvalues of A are 2 and 1.
----------
Further, α1 is an eigenvector corresponding to 2, and 2α1 - α2 is an eigenvector corresponding to 1, since A(2α1 - α2) = 4α1 - (2α1 + α2) = 2α1 - α2.
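A concrete numerical check of this result is sketched below; the specific vectors α1 = (1, 1) and α2 = (1, -2) are arbitrary choices made for illustration and are not part of the original problem.

```python
import numpy as np

# Sketch: pick specific independent alpha_1, alpha_2, form A = P B P^{-1}
# with B = [[2, 2], [0, 1]], and verify the eigenvalues 2, 1 and the
# eigenvectors alpha_1 (for 2) and 2*alpha_1 - alpha_2 (for 1).
a1 = np.array([1.0, 1.0])      # arbitrary choice of alpha_1 (assumption)
a2 = np.array([1.0, -2.0])     # arbitrary independent alpha_2 (assumption)
P = np.column_stack([a1, a2])
B = np.array([[2.0, 2.0],
              [0.0, 1.0]])
A = P @ B @ np.linalg.inv(P)   # AP = PB, so A = P B P^{-1}

print(sorted(np.linalg.eigvals(A).real))   # [1.0, 2.0]
print(np.allclose(A @ a1, 2 * a1))         # True: A alpha_1 = 2 alpha_1
v = 2 * a1 - a2
print(np.allclose(A @ v, v))               # True: A(2 alpha_1 - alpha_2) = 1 * v
```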



If the vector x is an eigenvector corresponding to a non-zero eigenvalue λ of the matrix A, then x is a linear combination of the column vectors of A.


That's right.
Because Ax = λx, writing A = (a1, ..., an) by columns gives
x1·a1 + x2·a2 + ... + xn·an = λx,
so λx is a linear combination of the column vectors of A.
Since λ ≠ 0, x = (x1/λ)a1 + ... + (xn/λ)an is also a linear combination of the column vectors of A.
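The same argument can be checked numerically. In the sketch below the matrix and the chosen eigenpair are arbitrary; the coefficients of the combination are the entries of x divided by λ.

```python
import numpy as np

# Sketch: for an eigenpair (lam, x) with lam != 0, x equals A @ (x / lam),
# i.e. x is the combination sum_i (x_i / lam) * a_i of the columns a_1, ..., a_n of A.
rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
vals, vecs = np.linalg.eig(A)
k = np.argmax(np.abs(vals))          # pick an eigenvalue that is safely non-zero
lam, x = vals[k], vecs[:, k]

coeffs = x / lam                     # coefficients of the linear combination
print(np.allclose(A @ coeffs, x))    # True: x = sum_i coeffs[i] * A[:, i]
```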