Are there any techniques for performing elementary transformations on a matrix? And how do you determine whether a square matrix is invertible (i.e., has an inverse matrix)?



Generally speaking, reducing a matrix to standard (row echelon) form follows this method: first use the first row to eliminate the first entry of every row below it, that is, use a11 to cancel a21, a31, ..., an1 to 0; then cancel to 0 the second entry of every row below the second row; then in turn the third entry of every row below the third row, and so on, until nothing more can be cancelled.
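
As an illustration, here is a minimal Python sketch of this forward elimination (a sketch only, assuming floating-point entries; it swaps a lower row into place whenever the current pivot is zero):

```python
import numpy as np

def row_echelon(A):
    """Reduce a copy of A to row echelon form by forward elimination."""
    A = A.astype(float).copy()
    m, n = A.shape
    row = 0
    for col in range(n):
        if row >= m:
            break
        # Find a pivot in this column at or below the current row.
        pivot = next((r for r in range(row, m) if A[r, col] != 0), None)
        if pivot is None:
            continue  # nothing to eliminate in this column
        A[[row, pivot]] = A[[pivot, row]]  # swap the pivot row into place
        # Use the pivot row to cancel the entries below it.
        for r in range(row + 1, m):
            A[r] -= (A[r, col] / A[row, col]) * A[row]
        row += 1
    return A

print(row_echelon(np.array([[1, 2, 3],
                            [2, 4, 7],
                            [3, 7, 9]])))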



Why is it said that an elementary row transformation is equivalent to multiplying a matrix on the left by an invertible matrix?
For instance, the inversion method (A | E) → (E | A^(-1)) rests on this, together with A·A^(-1) = E.
Also, what is the relationship between elementary row transformations and the adjugate matrix (the matrix of algebraic cofactors)?
Finally, please explain the principle behind these, or give me a hyperlink where I can read the principle.
Please don't just give me examples, because imitating an example may not prove anything at all.
These may all be elementary questions, but I just can't get through them. It would be best to answer both of the questions above; answering only one of them won't get the points.


Let me answer the second question. Because any invertible matrix is row-equivalent to the identity matrix, any invertible matrix can be written as a product of elementary matrices. It is easy to verify that multiplying a matrix on the left by an elementary matrix performs the corresponding elementary row operation on it. Therefore, multiplying a matrix on the left by an invertible matrix amounts to multiplying it on the left by a sequence of elementary matrices, which in turn amounts to performing a sequence of elementary row operations on the original matrix.
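
As a quick numerical check of the "easy to verify" step, here is a small sketch (assuming numpy): an elementary matrix is obtained by applying a row operation to the identity, and left-multiplying by it applies that same operation.

```python
import numpy as np

A = np.array([[1., 2.],
              [3., 4.],
              [5., 6.]])

# Elementary matrix: apply "row 2 -> row 2 - 3 * row 0" to the 3x3 identity.
E = np.eye(3)
E[2] -= 3 * E[0]

# Applying the same row operation directly to A ...
A_direct = A.copy()
A_direct[2] -= 3 * A_direct[0]

# ... gives the same result as left-multiplying A by E.
print(np.allclose(E @ A, A_direct))  # True
```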



For example, suppose we do not require the divisibility condition but only the cancellation condition, that is, we require that x = y can be derived from a·x = a·y, and that x = y can be derived from x·a = y·a. Then G is not necessarily a group. What if G is finite?


For a finite G, if it is already a monoid (with identity element e), then it must be a group. Take any x ∈ G and consider the powers x, x², x³, ...; since G is finite, this seemingly infinite set must in fact be finite, so there must be repetition: there exist m and n ≥ 1 such that x^m = x^(m+n). Then x^m · e = x^m · x^n, and cancelling x^m gives x^n = e. The inverse of x can therefore be defined as x^(n-1) ∈ G, so every element is invertible and G is a group.
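
A small illustration of this argument (my own sketch, using the nonzero residues mod 7 under multiplication, which form a finite cancellative monoid): for each x the powers must repeat, some power equals the identity, and the previous power is the inverse.

```python
# The nonzero residues mod 7 under multiplication form a finite
# cancellative monoid, so by the argument above it is a group.
p = 7
identity = 1

for x in range(1, p):
    # Walk through the powers x, x^2, x^3, ... until one equals the identity.
    power, n = x % p, 1
    while power != identity:
        power = (power * x) % p
        n += 1
    inverse = pow(x, n - 1, p)  # x^(n-1) is the inverse of x
    assert (x * inverse) % p == identity
    print(f"x = {x}: x^{n} = 1, inverse = x^{n-1} = {inverse}")
```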



A and B are both m × n matrices; try to prove r(A + B) ≤ r(A) + r(B).


There are many ways to prove this inequality: by the formula method, or by the representation of vector groups. Write the matrices by columns: A = (a1, a2, ..., an) and B = (b1, b2, ..., bn), so A + B = (a1 + b1, a2 + b2, ..., an + bn). Every column of A + B is then a linear combination of the columns of A and B taken together, so r(A + B) ≤ r(A, B) ≤ r(A) + r(B).
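
A quick numerical sanity check of the inequality on random matrices (a sketch, not a proof):

```python
import numpy as np
from numpy.linalg import matrix_rank

rng = np.random.default_rng(0)
for _ in range(1000):
    m, n = rng.integers(1, 6, size=2)
    # Small integer matrices often have low rank, which makes the check non-trivial.
    A = rng.integers(-2, 3, size=(m, n))
    B = rng.integers(-2, 3, size=(m, n))
    assert matrix_rank(A + B) <= matrix_rank(A) + matrix_rank(B)
print("r(A + B) <= r(A) + r(B) held in all trials")
```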



Let A and B be m × n and n × s matrices respectively, with B of full row rank. Prove that r(AB) = r(A). A detailed solution, please.


Proof: First, r(AB) ≤ min(r(A), r(B)) ≤ r(A).
Since B has full row rank, r(B) = n.
So B can be transformed into the block form (E_n, B1) by elementary row transformations (rearranging columns first if necessary; a column permutation changes neither r(B) nor r(AB)).
So there is an invertible matrix P such that PB = (E_n, B1), and r(AP^(-1)) = r(A) since P^(-1) is invertible.
So r(AB) = r((AP^(-1))(PB)) = r((AP^(-1))(E_n, B1))
= r(AP^(-1), AP^(-1)B1) ≥ r(AP^(-1)) = r(A).
In conclusion, r(AB) = r(A).
This problem uses the block matrix method and quite a few knowledge points; it takes patience to understand!
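
A numerical illustration of the result (my own sketch): generate a random full-row-rank B and confirm the ranks agree.

```python
import numpy as np
from numpy.linalg import matrix_rank

rng = np.random.default_rng(1)
m, n, s = 4, 3, 5

A = rng.integers(-2, 3, size=(m, n))
# A random Gaussian n x s matrix has full row rank with probability 1.
B = rng.standard_normal((n, s))
assert matrix_rank(B) == n  # B has full row rank

print(matrix_rank(A @ B) == matrix_rank(A))  # True
```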



Let A be a matrix of m × n and B a matrix of n × m, with m > n; prove that AB = 0.


The statement should be about the determinant: |AB| = 0.
Because A is an m × n matrix with m > n,
r(A) ≤ n < m, so r(AB) ≤ r(A) < m. But AB is an m × m square matrix, and a square matrix whose rank is less than its order has determinant zero, so |AB| = 0.
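
A numerical illustration (my own sketch): with m > n, the product AB is singular no matter how A and B are chosen.

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 4, 2  # m > n

A = rng.standard_normal((m, n))
B = rng.standard_normal((n, m))

AB = A @ B  # m x m, but rank(AB) <= n < m
print(np.linalg.matrix_rank(AB))            # at most 2
print(np.isclose(np.linalg.det(AB), 0.0))   # True
```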



Let A be an m × n matrix and B an n × s matrix. If AB = O, prove that r(A) + r(B) ≤ n.


Because AB = O, every column vector of B is a solution of Ax = 0.
(1) If rank(A) = n, then Ax = 0 has only the zero solution, so rank(B) = 0 and the inequality holds;
(2) If rank(A) = r < n, then the solution space of Ax = 0 has dimension n − r. The columns of B all lie in this space, so rank(B) ≤ n − r, and hence rank(A) + rank(B) ≤ r + (n − r) = n.
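
A numerical illustration (my own sketch): build B from a basis of the null space of A, so that AB = O by construction, and check the rank inequality.

```python
import numpy as np
from numpy.linalg import matrix_rank, svd

# A 3x4 matrix of rank 2 (the third row is the sum of the first two).
A = np.array([[1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [1., 1., 1., 1.]])
n = A.shape[1]

# Columns of B = a basis of the null space of A, taken from the SVD:
# the right singular vectors beyond rank(A) span the null space.
r = matrix_rank(A)
_, _, Vt = svd(A)
B = Vt[r:].T  # n x (n - r), and AB = O by construction

print(np.allclose(A @ B, 0))                 # True
print(matrix_rank(A) + matrix_rank(B) <= n)  # True (here with equality)
```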



Prove the matrix rank identity r(E_m − AB) + n = r(E_n − BA) + m, where A is m × n and B is n × m.


To be honest, I'm not sure, but you can take a look at this website:
Imitating the method of its Example 10 should be enough.
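
For reference, here is a sketch of a standard block-elimination argument (my own, and not necessarily the method on that page): apply block row and column operations, which are multiplications by invertible matrices and so preserve rank, to the same (m + n) × (m + n) matrix.

```latex
\[
\begin{pmatrix} E_m & A \\ B & E_n \end{pmatrix}
\xrightarrow{R_2 \,\to\, R_2 - B R_1}
\begin{pmatrix} E_m & A \\ O & E_n - BA \end{pmatrix}
\xrightarrow{C_2 \,\to\, C_2 - C_1 A}
\begin{pmatrix} E_m & O \\ O & E_n - BA \end{pmatrix},
\]
\[
\begin{pmatrix} E_m & A \\ B & E_n \end{pmatrix}
\xrightarrow{R_1 \,\to\, R_1 - A R_2}
\begin{pmatrix} E_m - AB & O \\ B & E_n \end{pmatrix}
\xrightarrow{C_1 \,\to\, C_1 - C_2 B}
\begin{pmatrix} E_m - AB & O \\ O & E_n \end{pmatrix}.
\]
```

The first chain shows the middle matrix has rank m + r(E_n − BA); the second shows it has rank n + r(E_m − AB). Equating the two gives the identity.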



How to prove r(A) = r(A^T A) if A is an m × n matrix?


The proposition requires A to be a real matrix.
Proof:
(1) Let x1 be a solution of Ax = 0, so Ax1 = 0.
Then A^T A x1 = A^T (A x1) = A^T · 0 = 0,
so x1 is a solution of A^T A x = 0.
Therefore every solution of Ax = 0 is a solution of A^T A x = 0.
(2) Let x2 be a solution of A^T A x = 0, so A^T A x2 = 0.
Multiply both sides on the left by x2^T to get x2^T A^T A x2 = 0,
that is, (A x2)^T (A x2) = 0.
Since A is real, this says the squared length of Ax2 is zero, so Ax2 = 0.
So x2 is a solution of Ax = 0,
and every solution of A^T A x = 0 is a solution of Ax = 0.
In conclusion, the homogeneous linear systems Ax = 0 and A^T A x = 0 have the same solution set,
so their fundamental systems of solutions contain the same number of vectors,
i.e., n − r(A) = n − r(A^T A), and therefore r(A) = r(A^T A).
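
A quick numerical check on random real matrices (a sketch, not a proof):

```python
import numpy as np
from numpy.linalg import matrix_rank

rng = np.random.default_rng(3)
for _ in range(100):
    m, n = rng.integers(1, 6, size=2)
    A = rng.integers(-2, 3, size=(m, n))
    assert matrix_rank(A.T @ A) == matrix_rank(A)
print("r(A^T A) = r(A) held in all trials")
```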



Are the three ranks of a matrix always equal? Why? (Row rank, column rank, and matrix rank.)


The most fundamental idea behind a matrix is a system of equations. The so-called rank means that after the system has been reduced to its simplest form, you can see at a glance which equations are redundant; the number of remaining, non-redundant equations is the rank.
For example:
4x + y = 3
8x + 2y = 6
3x + y = 2
The second equation is just twice the first, so one equation is redundant; the rank is 2, and the row rank and column rank are both 2.
If you really understand this point, you will be able to work out the relationship between rank and solutions without rote memorization. This is what I came to understand in my own study; I find it very reliable in practice, though no textbook writes it this way, so judge it for yourself.
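
A numerical illustration of the example above (my own sketch): the coefficient matrix has rank 2, and the rank of its transpose is the same, which is exactly the statement that row rank equals column rank.

```python
import numpy as np
from numpy.linalg import matrix_rank

# Coefficient matrix of the system above; the second row is twice the first.
A = np.array([[4, 1],
              [8, 2],
              [3, 1]])

print(matrix_rank(A))    # 2 : one equation is redundant
print(matrix_rank(A.T))  # 2 : row rank equals column rank
```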