Problem: Let A be an n-th order square matrix satisfying A^2 = A. Prove that every eigenvalue of A is 0 or 1.
Proof: Let λ be an eigenvalue of A and x a corresponding nonzero eigenvector.
Then Ax = λx, and
λx = Ax = A^2x = A(Ax) = A(λx) = λ(Ax) = λ(λx) = λ^2x,
so (λ^2 - λ)x = 0.
Because x is a nonzero vector,
λ^2 - λ = λ(λ - 1) = 0,
hence λ = 0 or λ = 1.
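The claim can also be checked numerically. The sketch below uses a hypothetical idempotent matrix P of my own choosing (not part of the original problem) and confirms with NumPy that its eigenvalues are 0 and 1:

```python
import numpy as np

# A hypothetical idempotent matrix: P @ P == P.
P = np.array([[1.0, 1.0],
              [0.0, 0.0]])
assert np.allclose(P @ P, P)  # confirm P^2 = P

# The proof predicts every eigenvalue is 0 or 1.
eigvals = np.sort(np.linalg.eigvals(P).real)
print(eigvals)  # -> [0. 1.]
```

Any projection matrix (for instance P = X(X^T X)^-1 X^T for a full-column-rank X) would serve equally well as a test case.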
RELATED QUESTIONS
- 1. Let A be a 3rd-order square matrix with eigenvalues 1, 2, -3. Find the eigenvalues of A^2 - 3A + A^-1 + 2E and the determinant |A^2 - 3A + A^-1 + 2E|; please show the process.
- 2. Let the n-th order square matrix A satisfy A^2 = A, and let E be the identity matrix of order n. Prove that r(A) + r(A - E) = n.
- 3. Let A be a square matrix of order n satisfying AA^T = E and |A| = -1; prove that the determinant |E + A| = 0. My question is why |A||E + A^T| = |A||(E + A)^T| = |A||E + A|.
- 4. Let A be a square matrix of order n satisfying A^2 = A, and prove that r(A) + r(A - E) = n.
- 5. On "Let the square matrix A satisfy A^2 - A - 2E = 0; prove that A and A + 2E are invertible, and find the inverse of A and the inverse of (A + 2E)": when a teacher or classmate does it the following way, I can't see the big mistake. A^2 - E = A + E; factoring the left side as a difference of squares gives (A + E)(A - E) = A + E; multiplying both sides by the inverse of (A + E) yields A = 2E, so the inverse of (A + E) equals E/3. There is another approach: from the given equation, A(A - E) = 2E, so A[(1/2)(A - E)] = E, so A is invertible and A^-1 = (1/2)(A - E).
- 6. Let the square matrix A satisfy A^2 - 2A + 4E = O; prove that A + E and A - 3E are invertible, and find their inverses. I know how to do it, but I don't understand why. Could you please explain: list the process and explain each step? Thank you!
- 7. Let the n-th order matrix A satisfy A^2 + 2A - 3E = 0; prove that A + 4E is invertible, and find its inverse.
- 8. Given that the eigenvalues of the 3rd-order matrix A are 1, 2, 3, find the eigenvalues of B = (1/2)A* + 3E.
- 9. If A is a square matrix of order n, E is the identity matrix of order n, and A^3 = O, prove that A - E is an invertible matrix!
- 10. Let the n-th order square matrix A satisfy A^2 - A + E = 0, and prove that A is an invertible matrix.
- 11. Let the eigenvalues of the third-order square matrix A be -1, -2, -3, and find A* and A^2 + 3A + E.
- 12. Let the eigenvalues of the fourth-order square matrix A be 1/2, 1/3, 1/4, 1/5; then |A^-1 - E| = ?
- 13. It is known that the fourth-order square matrix A satisfies |A - E| = 0, the square matrix B = A^3 - 3A^2, and BB^T = 2E; find |B|.
- 14. Let A be a 4 × 3 matrix and C = AA^T; then |C| = ?
- 15. Given that A is a matrix of order n and satisfies A^2 + 2A = 0, prove that the eigenvalues of A can only be 0 or -2.
- 16. Given that one eigenvalue of the matrix A = (1 a; 2 3) is -1, find the other eigenvalue of A and an eigenvector belonging to λ.
- 17. It is known that A = (α, β, γ) and B = (α + β + γ, α + 2β + 4γ, α + 3β + 9γ), where α, β, and γ are 3-dimensional column vectors and |A| = m; find |B|.
- 18. If A is a 3rd-order square matrix, α is a 3-dimensional column vector, the vectors α, Aα, A^2α are linearly independent, and A^3α = 5Aα - 3A^2α, prove that the matrix B = (α, Aα, A^4α) is invertible. My idea is to restate the problem as: given that α, Aα, A^2α are linearly independent and Aα, A^2α, A^3α are linearly dependent, prove that α, Aα, A^4α are linearly independent. But I still haven't managed it.
- 19. Show that there exists a matrix with a double eigenvalue that has only one linearly independent eigenvector corresponding to that eigenvalue.
- 20. Prove that if the n-th order matrix A satisfies AA^T = E and |A| = -1, then A must have an eigenvalue of -1.
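As a numerical cross-check of question 5 above, the sketch below uses a hypothetical 2×2 matrix of my own choosing that satisfies A^2 - A - 2E = 0. It verifies the inverse formulas from the second (correct) approach, and shows why the first approach's cancellation of (A + E) was the mistake:

```python
import numpy as np

# Hypothetical example: this A satisfies A^2 - A - 2E = 0 (eigenvalues 2 and -1).
A = np.array([[0.0, 2.0],
              [1.0, 1.0]])
E = np.eye(2)
assert np.allclose(A @ A - A - 2 * E, 0)  # A satisfies the given equation

# Correct route: A(A - E) = 2E, so A^-1 = (1/2)(A - E).
A_inv = 0.5 * (A - E)
assert np.allclose(A @ A_inv, E)

# Similarly (A + 2E)(A - 3E) = A^2 - A - 6E = -4E, so (A + 2E)^-1 = (1/4)(3E - A).
assert np.allclose((A + 2 * E) @ (0.25 * (3 * E - A)), E)

# The flawed step multiplied both sides by "the inverse of (A + E)", but A + E
# need not be invertible: here det(A + E) = 0, so that cancellation is invalid.
print(np.linalg.det(A + E))  # -> 0.0 (singular)
```

Since the eigenvalues of any such A are roots of λ^2 - λ - 2 = 0, i.e. 2 or -1, the matrix A + E can have 0 as an eigenvalue, which is exactly why dividing by (A + E) is not allowed in general.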