Let A and B be symmetric matrices of the same order. Prove that AB is symmetric if and only if A and B commute.


Since A and B are symmetric matrices, A^T = A and B^T = B.
By the transpose rule, (AB)^T = B^T A^T = BA.
So (AB)^T = AB if and only if AB = BA,
that is, AB is a symmetric matrix if and only if A and B commute.
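A quick numerical sanity check of both directions (using NumPy; the matrices are arbitrarily chosen examples, not taken from the text):

```python
import numpy as np

# Two symmetric matrices that commute (B = A + I, so AB = BA)
A = np.array([[2., 1.], [1., 2.]])
B = np.array([[3., 1.], [1., 3.]])
assert np.allclose(A, A.T) and np.allclose(B, B.T)
assert np.allclose(A @ B, B @ A)           # they commute
assert np.allclose(A @ B, (A @ B).T)       # so AB is symmetric

# Two symmetric matrices that do NOT commute
C = np.array([[1., 0.], [0., 2.]])
D = np.array([[0., 1.], [1., 0.]])
assert not np.allclose(C @ D, D @ C)       # they do not commute
assert not np.allclose(C @ D, (C @ D).T)   # and CD is not symmetric
```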



Let A and B be symmetric matrices of the same order. Prove that AB + BA is symmetric and AB − BA is antisymmetric.


(AB + BA)^T = (AB)^T + (BA)^T = B^T A^T + A^T B^T = BA + AB = AB + BA,
so AB + BA is a symmetric matrix;
(AB − BA)^T = (AB)^T − (BA)^T = B^T A^T − A^T B^T = BA − AB = −(AB − BA),
so AB − BA is an antisymmetric matrix.
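The identities can be verified numerically; a minimal sketch with randomly generated symmetric matrices (the symmetrization M + M^T is just a convenient way to build test inputs):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = M + M.T                # symmetric by construction
N = rng.standard_normal((4, 4))
B = N + N.T                # symmetric by construction

S = A @ B + B @ A          # claimed symmetric
K = A @ B - B @ A          # claimed antisymmetric
assert np.allclose(S, S.T)
assert np.allclose(K, -K.T)
```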



Let the n-th order square matrix A satisfy A^2 = A. Prove that A is either the identity matrix or a singular (non-invertible) matrix.


Suppose A is invertible, with inverse matrix B, so that
E = AB.
Multiplying both sides on the left by A and using A^2 = A:
A = A(AB) = (AA)B = AB = E,
therefore
A = E.
So A is either singular or the identity matrix E.
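Both branches of the conclusion can be illustrated numerically; the projection matrix below is an arbitrarily chosen idempotent example:

```python
import numpy as np

# A projection (A @ A == A) that is not the identity: it must be singular
A = np.array([[1., 0.], [0., 0.]])
assert np.allclose(A @ A, A)
assert abs(np.linalg.det(A)) < 1e-12     # determinant 0: not invertible

# The identity is the only invertible idempotent
E = np.eye(2)
assert np.allclose(E @ E, E) and np.linalg.det(E) != 0
```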



Prove that if A^2 = I and A ≠ I, then A + I is singular (not invertible).


Proof: from A^2 = I we get (A + I)(A − I) = O.
Suppose A + I were invertible.
Left-multiplying both sides of the equation by (A + I)^(−1) gives A − I = O,
so A = I.
This contradicts the hypothesis A ≠ I.
So A + I is not invertible.
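A concrete check, using an arbitrarily chosen involution (A^2 = I with A ≠ I):

```python
import numpy as np

A = np.diag([1., -1.])                     # A @ A == I, but A != I
I = np.eye(2)
assert np.allclose(A @ A, I) and not np.allclose(A, I)
assert abs(np.linalg.det(A + I)) < 1e-12   # A + I is singular, as proved
```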



On proving a matrix statement in linear algebra:
Let A be an m × n matrix, let B and C be n × l matrices, and let r(A) = n.
Prove that if AB = AC, then B = C.


r(A) = n means the columns of A are linearly independent, i.e. Ax = 0 has only the zero solution. AB = AC gives A(B − C) = O, so every column of B − C solves Ax = 0; hence B − C = O, that is, B = C.
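A numerical illustration of the cancellation property, with an arbitrary full-column-rank A and an arbitrary perturbation of B (a sketch, not a proof):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))             # m = 5, n = 3
assert np.linalg.matrix_rank(A) == 3        # full column rank: r(A) = n

B = rng.standard_normal((3, 2))
C = B + rng.standard_normal((3, 2))         # some C != B
# Since A(C - B) != 0 whenever C - B != 0, AB and AC must differ:
assert not np.allclose(A @ B, A @ C)
```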



If A and B are diagonalizable and commute, how to prove that A + B is also diagonalizable?


Since A and B commute and each is diagonalizable, they can be diagonalized simultaneously: there exists an invertible P such that P^(−1)AP and P^(−1)BP are both diagonal. Then P^(−1)(A + B)P = P^(−1)AP + P^(−1)BP is diagonal, so A + B is diagonalizable. This theorem also extends to a family {A1, A2, ..., Ak} with AiAj = AjAi (i, j = 1, 2, ..., k) in which each Ai is similar to a diagonal matrix: all of them can be diagonalized by one common P.
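The simultaneous-diagonalization argument can be checked on constructed examples: pick one (arbitrary) invertible P and two diagonal matrices, so A and B share an eigenbasis by construction:

```python
import numpy as np

rng = np.random.default_rng(2)
P = rng.standard_normal((3, 3))            # generic, hence invertible here
Pinv = np.linalg.inv(P)
A = P @ np.diag([1., 2., 3.]) @ Pinv
B = P @ np.diag([4., 5., 6.]) @ Pinv

assert np.allclose(A @ B, B @ A)           # commuting, as the theorem requires
D = Pinv @ (A + B) @ P                     # the same P diagonalizes the sum
assert np.allclose(D, np.diag(np.diag(D))) # D is diagonal
```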



How to prove |AB| = |A||B|


I can only give the general steps:
Construct the 2n × 2n determinant (A and B both of order n)
| A   O |
| −E  B |
whose value, by the rule for block-triangular determinants, is |A||B|.
With column operations (add b1j times column 1 + ... + bnj times column n to column n + j), it transforms into
| A   C |
| −E  O |    (where C = AB).
Interchanging the two block rows (n^2 row swaps) gives
(−1)^n | −E  O |
       |  A  C |  = (−1)^n |−E| |C| = (−1)^n (−1)^n |AB| = |AB|.
Hence |AB| = |A||B|.
It has been proved in Tongji's textbook, and presumably in most standard textbooks.
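Both determinant identities used in the construction can be sanity-checked numerically on random matrices:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

# The 2n x 2n block determinant | A O ; -E B | equals |A||B|
M = np.block([[A, np.zeros((n, n))], [-np.eye(n), B]])
assert np.isclose(np.linalg.det(M), np.linalg.det(A) * np.linalg.det(B))

# ...and the identity being proved: |AB| = |A||B|
assert np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))
```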



If AB = BA, then B is said to commute with A. Find all matrices B that commute with
A = | 1  1 |
    | 0  0 |


Use the method of undetermined coefficients: write B = (bij) and compare the entries of AB and BA. The answer is
B = | a+b  a |
    |  0   b |
where a and b are arbitrary real numbers.
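A quick check that the derived family really commutes with A (the helper function and its arguments are just for illustration):

```python
import numpy as np

A = np.array([[1., 1.], [0., 0.]])

def commutes_with_A(a, b):
    """B of the derived form [[a+b, a], [0, b]] should satisfy AB = BA."""
    B = np.array([[a + b, a], [0., b]])
    return np.allclose(A @ B, B @ A)

assert commutes_with_A(2.0, 5.0)
assert commutes_with_A(-1.5, 0.0)

# A matrix outside that family generally fails:
C = np.array([[1., 0.], [1., 1.]])
assert not np.allclose(A @ C, C @ A)
```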



Given a matrix, how do you find the matrices that commute with it?
Given the matrix
A = | 0  1  0 |
    | 0  0  1 |
    | 0  0  0 |
how do you find all matrices commuting with A?


Let B = (bij) commute with A, so AB = BA. Comparing the corresponding entries on both sides gives b11 = b22 = b33, b12 = b23, and b21 = b31 = b32 = 0 (b13 is arbitrary). Therefore the matrices commuting with A have the form
| a  b  c |
| 0  a  b |
| 0  0  a |
where a, b and c are arbitrary real numbers.
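This family can be verified numerically; note that the form is exactly aI + bA + cA^2, and polynomials in A always commute with A (the helper name below is just illustrative):

```python
import numpy as np

A = np.array([[0., 1., 0.],
              [0., 0., 1.],
              [0., 0., 0.]])

def upper_toeplitz(a, b, c):
    """The derived commuting form, equal to a*I + b*A + c*A^2."""
    return np.array([[a, b, c],
                     [0., a, b],
                     [0., 0., a]])

B = upper_toeplitz(2., 3., 5.)
assert np.allclose(A @ B, B @ A)

# A generic diagonal matrix outside this family does not commute with A
C = np.diag([1., 2., 3.])
assert not np.allclose(A @ C, C @ A)
```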



Let A be an n × n matrix over P. Prove that the set of all matrices commuting with A constitutes a subspace of the space of n × n matrices over P.
Just write out the proof.


Let V be the set of all matrices commuting with A; V is nonempty, since for example the zero matrix commutes with A. Let B and C be any two elements of V and λ any scalar. Then (λB)A = λ(BA) = λ(AB) = A(λB), so λB also belongs to V; and (B + C)A = BA + CA = AB + AC = A(B + C), so B + C also belongs to V. That is, V is closed under the linear operations, so V is a subspace.
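The closure steps of the proof can be mirrored numerically; the two sample elements of V below are polynomials in A, chosen because any polynomial in A commutes with A:

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((3, 3))

# Two elements of V (polynomials in A commute with A)
B = 2 * np.eye(3) + A
C = A @ A - 3 * A
assert np.allclose(A @ B, B @ A) and np.allclose(A @ C, C @ A)

# Closure under the linear operations, as in the proof
lam = 0.7
assert np.allclose(A @ (lam * B), (lam * B) @ A)   # lam*B is in V
assert np.allclose(A @ (B + C), (B + C) @ A)       # B + C is in V
```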