Let A be a square matrix of order n. If there exists a square matrix B of order n with B ≠ 0 such that AB = 0, prove that the rank of A is less than n.



Because AB = 0,
each column vector of B is a solution of AX = 0.
Because B ≠ 0, AX = 0 has a nonzero solution,
so r(A) < n.
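A quick numerical check of this argument (a small numpy sketch; the particular matrices A and B below are my own illustrative choices, with the columns of B taken from the null space of A):

```python
import numpy as np

# A is singular: its third row equals the sum of the first two,
# and its null space is spanned by (1, 1, 1).
A = np.array([[1.0, -1.0,  0.0],
              [0.0,  1.0, -1.0],
              [1.0,  0.0, -1.0]])

# Each column of B is a (possibly zero) null-space vector of A,
# so AB = 0 even though B != 0.
B = np.array([[1.0, 2.0, 0.0],
              [1.0, 2.0, 0.0],
              [1.0, 2.0, 0.0]])

print(np.allclose(A @ B, 0))      # True: AB = 0
print(np.linalg.matrix_rank(A))   # 2, i.e. less than n = 3
```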



Let A be a square matrix of order n. Prove that det(A) = 0 is equivalent to the existence of a square matrix B of order n with B ≠ 0 such that AB = 0.





Let A* be the adjoint (adjugate) matrix of the square matrix A of order n. Prove that if det(A) = 0, then det(A*) = 0.


det(A) = 0 indicates r(A) ≤ n - 1, and then r(A*) ≤ 1 (in fact r(A*) = 1 when r(A) = n - 1, and r(A*) = 0 when r(A) < n - 1). For n ≥ 2 this gives r(A*) < n, hence det(A*) = 0.
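This can be checked numerically on a concrete instance (a sketch; the singular matrix A is my own example, and `adjugate` is an illustrative helper, not a numpy builtin):

```python
import numpy as np

def adjugate(A):
    """Transpose of the cofactor matrix (illustrative helper)."""
    n = A.shape[0]
    C = np.zeros_like(A, dtype=float)
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T

# Singular 3x3 example: third row = first row + second row, so det(A) = 0.
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 4.0],
              [1.0, 3.0, 7.0]])

print(np.isclose(np.linalg.det(A), 0.0))            # True
print(np.isclose(np.linalg.det(adjugate(A)), 0.0))  # True
```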



Let A be a nonzero real square matrix of order n, A* the adjoint matrix of A, and A^T the transpose of A. Given A* = A^T, prove that |A| ≠ 0. (I don't understand the proof below.)
Proof: From the hypothesis A* = A^T,
we have AA^T = AA* = |A|E.
Since A is a nonzero real square matrix, some entry a_ij ≠ 0.
Comparing the (i, i) entries on both sides of AA^T = |A|E gives
|A| = a_i1^2 + ... + a_ij^2 + ... + a_in^2 > 0
(because a_i1, ..., a_ij, ..., a_in are all real numbers and a_ij ≠ 0).
So |A| ≠ 0.
Comparing the (i, i) entries on both sides of AA^T = |A|E gives |A| = a_i1^2 + ... + a_ij^2 + ... + a_in^2 > 0. How did this come about?


|A|E = AA^T, so the (i, i) entry of |A|E is the i-th row of A dotted with the i-th column of A^T, entry by entry:
[entry by entry means the (i, 1) entry of A times the (1, i) entry of A^T, plus the (i, 2) entry of A times the (2, i) entry of A^T, ..., plus the (i, n) entry of A times the (n, i) entry of A^T.
But the i-th column of A^T is exactly the i-th row of A,
so this sum is the (i, i) entry of AA^T, that is, the (i, i) entry of |A|E.]
In other words, the (i, i) entry of |A|E, which is |A| itself, equals a_i1^2 + ... + a_ij^2 + ... + a_in^2.
Since a_ij ≠ 0 by assumption, |A| > 0.
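The step in question, that the (i, i) entry of AA^T is the sum of squares of the i-th row of A, can be verified directly (a minimal sketch; the random 4x4 matrix is just an arbitrary real example):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))   # any real matrix will do

G = A @ A.T
i = 2
# (i, i) entry of A A^T = row i of A dotted with itself = sum of squares of row i
print(np.isclose(G[i, i], np.sum(A[i] ** 2)))   # True
```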



What is the relationship between the rank of a matrix and the number of its nonzero eigenvalues?
Please explain why.


What was given above is a mistake many people make.
In fact, the rank of the matrix is greater than or equal to the number of nonzero eigenvalues (counted with multiplicity); equality holds when the matrix is diagonalizable.



Does a matrix of rank r necessarily have exactly r nonzero eigenvalues?


Not necessarily. For example,
【0 1
0 0】
has rank 1, but its eigenvalues are all 0.
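The counterexample above is easy to confirm with numpy (a short sketch, using the same nilpotent matrix):

```python
import numpy as np

# Rank 1, yet every eigenvalue is 0 (the matrix is nilpotent).
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])

print(np.linalg.matrix_rank(A))              # 1
print(np.allclose(np.linalg.eigvals(A), 0))  # True: both eigenvalues are 0
```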



Let A be a third-order square matrix with A² = 0. How do we find the rank of A and the rank of the adjoint matrix of A?


A is a third-order matrix with
A² = 0.
By Sylvester's inequality, r(A) + r(A) - 3 ≤ r(A²) = 0, i.e. 2r(A) ≤ 3,
so r(A) ≤ 1,
hence r(A) = 0 or 1.
If r(A) = 0,
then A = 0 and r(A*) = 0.
If r(A) = 1 < n - 1 = 2,
then r(A*) = 0 as well.
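Both conclusions can be checked on a concrete nilpotent matrix (a sketch; the rank-1 matrix A is my own example, and `adjugate` is an illustrative helper rather than a numpy builtin):

```python
import numpy as np

def adjugate(A):
    """Transpose of the cofactor matrix (illustrative helper)."""
    n = A.shape[0]
    C = np.zeros_like(A, dtype=float)
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T

# A nonzero 3x3 matrix with A^2 = 0; its rank is forced to be 1.
A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0],
              [0.0, 0.0, 0.0]])

print(np.allclose(A @ A, 0))        # True: A^2 = 0
print(np.linalg.matrix_rank(A))     # 1
print(np.allclose(adjugate(A), 0))  # True: r(A) = 1 < n - 1, so A* = 0
```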



An n-order square matrix A satisfies A² = A. Using the full-rank decomposition of a matrix, prove that r(A) + r(A - E) ≥ n, and then prove that in fact r(A) + r(A - E) = n.


A² = A
⇒ A(A - E) = 0.
By Sylvester's inequality, 0 = r[A(A - E)] ≥ r(A) + r(A - E) - n, so r(A) + r(A - E) ≤ n.
On the other hand, r(A) + r(A - E) = r(A) + r(E - A) ≥ r(A + (E - A)) = r(E) = n.
Therefore r(A) + r(A - E) = n.
It can also be done with block matrices.
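A quick sanity check of the identity (a numpy sketch; the idempotent matrix P below is a hand-picked example of mine):

```python
import numpy as np

# A hand-picked idempotent matrix: P @ P == P.
P = np.array([[1.0, 1.0, 0.0],
              [0.0, 0.0, 0.0],
              [0.0, 0.0, 1.0]])

n = P.shape[0]
E = np.eye(n)
print(np.allclose(P @ P, P))                                     # True
print(np.linalg.matrix_rank(P) + np.linalg.matrix_rank(P - E))   # 3, i.e. n
```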



Find all second-order matrices whose square equals themselves (A² = A).


Don't mislead people, poster above: besides 0 and E, every matrix similar to
1 0
0 0
satisfies the condition, so there are infinitely many of them; you cannot list them all.
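The similarity construction above is easy to demonstrate (a sketch; the invertible matrix P is an arbitrary example of mine):

```python
import numpy as np

D = np.array([[1.0, 0.0],
              [0.0, 0.0]])

# Any invertible P gives another solution A = P D P^{-1}, so there are
# infinitely many 2x2 matrices with A^2 = A.
P = np.array([[2.0, 1.0],
              [1.0, 1.0]])          # det = 1, invertible
A = P @ D @ np.linalg.inv(P)

print(np.allclose(A @ A, A))   # True
```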



Linear algebra problem: let A be a matrix of order 2005 satisfying A^T = -A. Show that the determinant of A is 0.


In fact, this is a basic theorem: the determinant of an antisymmetric matrix of odd order (A^T = -A) is 0. The proof is as follows. By a property of determinants, if one row of A is multiplied by x, then |A| is also multiplied by x. Now multiply each row of A by -1; since the order n is odd, an odd number of factors -1 are multiplied in, so |-A| = (-1)^n |A| = -|A|. But -A = A^T and |A^T| = |A|, so |A| = -|A|, which gives |A| = 0.
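The theorem can be illustrated numerically on a small odd order (a sketch; I use order 5 and a random skew-symmetric matrix built as M - M^T, since 2005 behaves the same way):

```python
import numpy as np

n = 5                           # any odd order works the same as 2005
rng = np.random.default_rng(2)
M = rng.standard_normal((n, n))
A = M - M.T                     # guarantees A^T = -A

print(np.allclose(A.T, -A))             # True
print(abs(np.linalg.det(A)) < 1e-9)     # True: odd-order antisymmetric => det 0
```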