Matrix and transformation
Given the matrix A =
a b
c d
suppose α = (1, 1)^T is an eigenvector of A belonging to the eigenvalue 3, and β = (1, -1)^T is an eigenvector belonging to the eigenvalue -1. Find the matrix A.


Simply put: Aα = 3α gives a + b = 3 and c + d = 3; Aβ = -β gives a - b = -1 and c - d = 1. Solving, a = 1, b = 2, c = 2, d = 1, so
A =
1 2
2 1
A more systematic way:
The eigenvector equations can be written together as A[α, β] = [3α, -β], so A = [3α, -β][α, β]^(-1).
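The answer can be checked numerically; a minimal NumPy sketch using the matrix found above:

```python
import numpy as np

# The matrix solved from a+b=3, c+d=3, a-b=-1, c-d=1
A = np.array([[1.0, 2.0],
              [2.0, 1.0]])

alpha = np.array([1.0, 1.0])    # eigenvector for eigenvalue 3
beta = np.array([1.0, -1.0])    # eigenvector for eigenvalue -1

print(A @ alpha)  # [3. 3.]  = 3 * alpha
print(A @ beta)   # [-1. 1.] = -1 * beta
```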



On elementary transformations and matrices
When reducing a matrix to row echelon form or to its simplest (reduced) echelon form, may we use only row transformations, never column transformations?
Is it correct that an "echelon matrix" is also called a "row echelon matrix", and that the "simplest echelon matrix" is also called the "reduced row echelon matrix"?
Under what circumstances may we apply both row and column transformations?


You may not know how row and column transformations actually act.
A row transformation multiplies the matrix by an invertible matrix on the left; a column transformation multiplies it by an invertible matrix on the right.
For example, adding the first row of A to the second row is the same as multiplying A on the left by the invertible matrix
1 0 0 ...0
1 1 0 ...0
0 0 1 ...0
...
0 0 0 ...1
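This left multiplication can be checked numerically; a quick NumPy sketch using the 3×3 case of the matrix above (A is an arbitrary illustrative matrix, not from the text):

```python
import numpy as np

# Elementary matrix: adds row 1 to row 2 when multiplied on the left
E = np.array([[1, 0, 0],
              [1, 1, 0],
              [0, 0, 1]])

A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

print(E @ A)
# [[1 2 3]
#  [5 7 9]   <- row 2 became row 2 + row 1
#  [7 8 9]]
```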
Now, to your question:
In fact, purely as an operation, either row transformations alone or row and column transformations together can reduce the matrix to its simplest form; that much is easy to see. But the two differ in what they give you.
To illustrate, suppose the original matrix is A and its simplest form is the identity matrix I.
If you use only row transformations to reach I, that is the same as multiplying A on the left by a sequence of invertible matrices; multiply them together and call the product P, then PA = I.
If you use both row and column transformations to reach I, that is the same as multiplying A by invertible matrices on both sides; call the product of the left factors P and the product of the right factors Q, then PAQ = I.
So the real question is: what is the purpose of your transformations?
Suppose you want the inverse of A. Then clearly you may use only row transformations, which gives PA = I, so P is the inverse of A. If you mix in column transformations, you get PAQ = I, and you cannot read off the inverse of A from that equation.
Suppose you want the rank of A. Then both row and column transformations are allowed, because neither changes the rank. You still get PAQ = I, but the P and Q are irrelevant when computing the rank, so you need not worry about them.
Do you see what I mean? Remember: row transformation is left multiplication, column transformation is right multiplication. Then you will know when both are allowed.
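The "row transformations only" route to an inverse can be sketched in NumPy. This is a minimal Gauss-Jordan elimination on the block matrix (A | I) with partial pivoting; the helper name `inverse_by_row_reduction` is illustrative, not from the text:

```python
import numpy as np

def inverse_by_row_reduction(A):
    """Row-reduce (A | I); when the left half becomes I, the right
    half is the product P of all the row operations, and PA = I."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])
    for col in range(n):
        # partial pivoting: bring a row with the largest pivot into place
        pivot = col + int(np.argmax(np.abs(M[col:, col])))
        M[[col, pivot]] = M[[pivot, col]]
        M[col] /= M[col, col]                 # scale the pivot row
        for r in range(n):
            if r != col:
                M[r] -= M[r, col] * M[col]    # clear the other entries
    return M[:, n:]

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
P = inverse_by_row_reduction(A)
print(P @ A)   # identity matrix (up to rounding)
```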



If an elementary matrix E is multiplied on the left of another matrix A, is EA equal to AE?


No, they are not equal in general!
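A concrete NumPy check (E and A below are illustrative choices): left multiplication acts on the rows, right multiplication acts on the columns, and the two products differ.

```python
import numpy as np

E = np.array([[1, 0],
              [1, 1]])   # on the left: row 2 += row 1; on the right: column 1 += column 2
A = np.array([[1, 2],
              [3, 4]])

print(E @ A)  # [[1 2]
              #  [4 6]]  -- row operation
print(A @ E)  # [[3 2]
              #  [7 4]]  -- column operation; not equal to E @ A
```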



Let P and Q be invertible matrices such that PA and AQ are defined. Then R(PA) = R(AQ) = R(A).


Since P and Q are invertible, each can be expressed as a product of elementary matrices.
Forming PA and AQ therefore amounts to performing a series of elementary transformations on A, so the rank is unchanged.
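A quick numerical check of the rank statement (the matrices A, P, Q below are illustrative choices, not from the text):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [2, 4, 6],      # = 2 * row 1, so A has rank 2
              [1, 0, 1]])

P = np.array([[1, 1, 0],      # invertible (det = 1)
              [0, 1, 0],
              [0, 0, 1]])
Q = np.array([[1, 0, 0],      # invertible (det = 1)
              [0, 1, 0],
              [2, 0, 1]])

print(np.linalg.matrix_rank(A))      # 2
print(np.linalg.matrix_rank(P @ A))  # 2
print(np.linalg.matrix_rank(A @ Q))  # 2
```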



If the rank R(A) = 3 and P is an invertible matrix of order n, what is the rank R(PA)? Explain why.


R(PA) = 3. Multiplying a matrix by an invertible matrix amounts to performing elementary transformations, so the rank is unchanged.



Prove that a necessary and sufficient condition for the m×n matrices A and B to be row equivalent is the existence of an invertible matrix P of order m such that PA = B.


By the theory of elementary transformations, a row transformation equals left multiplication by an elementary matrix, and a column transformation equals right multiplication by an elementary matrix.
Row equivalence means A can be turned into B by k row transformations, i.e., by left multiplication by k elementary matrices.
Each elementary matrix is invertible, and so is their product; that product is the invertible matrix P, giving PA = B. Conversely, since any invertible P is a product of elementary matrices, PA = B says B is obtained from A by row transformations, so A and B are row equivalent.



Find an invertible matrix P so that PA is the reduced row echelon form of A
Let
A =
1 2 3
2 3 4
3 4 5
Find an invertible matrix P so that PA is the reduced row echelon form of A.


(A,E)=
1 2 3 1 0 0
2 3 4 0 1 0
3 4 5 0 0 1
r2-2r1,r3-3r1
1 2 3 1 0 0
0 -1 -2 -2 1 0
0 -2 -4 -3 0 1
r1+2r2,r3-2r2
1 0 -1 -3 2 0
0 -1 -2 -2 1 0
0 0 0 1 -2 1
r2*(-1)
1 0 -1 -3 2 0
0 1 2 2 -1 0
0 0 0 1 -2 1
Take P =
-3 2 0
2 -1 0
1 -2 1
Then P is invertible, and PA =
1 0 -1
0 1 2
0 0 0
which is the reduced row echelon form of A.
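The result can be verified numerically with NumPy:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [2, 3, 4],
              [3, 4, 5]])
P = np.array([[-3,  2, 0],
              [ 2, -1, 0],
              [ 1, -2, 1]])

print(np.linalg.det(P))  # approximately -1, nonzero, so P is invertible
print(P @ A)
# [[ 1  0 -1]
#  [ 0  1  2]
#  [ 0  0  0]]
```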



What is a necessary and sufficient condition for a matrix A to be invertible?
Just the answer, please.


I am not sure what you need this for, but as we just learned: a matrix A is invertible if and only if A is nondegenerate, that is, |A| ≠ 0.



Prove that a necessary and sufficient condition for matrix A to be invertible is |A| ≠ 0.


Let A* denote the adjugate (adjoint) matrix and A' the transpose. Suppose the real n-order matrix A is not invertible; then |A| = 0. Since A* = A', we have AA' = AA* = |A|E, where E is the identity matrix, so AA' = 0. Let a_ij denote the entry in row i, column j of A; then the k-th diagonal entry of AA' is ∑_j (a_kj)^2, j = 1, 2, ..., n. Since AA' = 0, each of these sums is 0, so every entry of A is 0.



A necessary and sufficient condition for a matrix A of order n to be invertible is ( )
A. Every row vector is a nonzero vector  B. Every column vector is a nonzero vector  C. AX = B has a solution  D. Whenever x ≠ 0, Ax ≠ 0, where x = (x1, ..., xn)^T


For options (A) and (B): take the counterexample
A =
1 2
1 2
Every row vector and every column vector is nonzero, but A is not invertible; this rules out (A) and (B). For option (C): take a counterexample in which A is a square matrix of order n and (A, B) is the augmented matrix; when R(A) = R(A, B) < n, AX = B has infinitely many solutions, yet A is not invertible. For option (D): if Ax ≠ 0 whenever x ≠ 0, then the homogeneous system Ax = 0 has only the zero solution, so R(A) = n and A is invertible; conversely, if A is invertible and x ≠ 0, then Ax ≠ 0. So the answer is (D).
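The counterexample for options (A) and (B) can be checked directly in NumPy:

```python
import numpy as np

# Every row and column is nonzero, yet the rows are linearly dependent,
# so A is singular (not invertible).
A = np.array([[1, 2],
              [1, 2]])

print(np.linalg.matrix_rank(A))          # 1, less than 2, so A is not invertible
print(np.isclose(np.linalg.det(A), 0))   # True
```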