Let A and B be matrices of order n, with A symmetric. Prove that B^T A B is also symmetric.


Since A^T = A,
(B^T A B)^T = B^T A^T (B^T)^T = B^T A^T B = B^T A B,
so B^T A B is a symmetric matrix.
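A quick numerical sanity check of this identity (a minimal sketch, assuming NumPy is available; the size n = 5 is arbitrary):

    import numpy as np

    n = 5
    A = np.random.rand(n, n)
    A = A + A.T                    # make A symmetric
    B = np.random.rand(n, n)       # B is arbitrary
    M = B.T @ A @ B
    print(np.allclose(M, M.T))     # True: B^T A B is symmetric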



Why is it necessary to prove symmetry when proving that a matrix is positive definite?
Since a positive definite matrix can be decomposed as A = P^T P, isn't it automatically a symmetric matrix?


If you prove that A = P P^T with P invertible, then of course A is both symmetric and positive definite.
But if you only prove that x^T A x > 0 for every non-zero real vector x, then A is positive definite (this is the definition used for general, not necessarily symmetric, positive definite matrices), but it need not be symmetric. For example,
A =
 1  1
-1  1
is a positive definite matrix that is not symmetric.
Your course may not even define non-symmetric positive definite matrices, but the point is that x^T A x > 0 alone is not enough to deduce symmetry. So if you want to prove that a matrix is symmetric positive definite, you essentially have to verify the two properties separately.
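A quick check of this example (a minimal sketch, assuming NumPy is available):

    import numpy as np

    A = np.array([[ 1.0, 1.0],
                  [-1.0, 1.0]])
    print(np.allclose(A, A.T))     # False: A is not symmetric
    for _ in range(5):
        x = np.random.randn(2)     # a random (almost surely non-zero) vector
        print(x @ A @ x > 0)       # True: x^T A x = x1^2 + x2^2 > 0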



How to construct a symmetric positive definite matrix
How can I construct a large symmetric positive definite matrix, say 1000 x 1000, for testing?
I have studied linear algebra, but it does not say how to construct one. What I want is a matrix of order several thousand!


If a diagonal matrix is too special for your purpose,
take any matrix A whose determinant is not zero;
then the product A^T A of its transpose with itself is symmetric positive definite.
The invertible matrix A can simply be generated at random (a random matrix is almost always invertible),
or you can take any special matrix, such as a tridiagonal or Vandermonde matrix.
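A minimal sketch of this recipe (assuming NumPy is available; 1000 matches the size in the question):

    import numpy as np

    n = 1000
    A = np.random.randn(n, n)      # a random matrix is invertible with probability 1
    M = A.T @ A                    # symmetric, and positive definite when A is invertible
    print(np.allclose(M, M.T))                 # symmetric
    print(np.all(np.linalg.eigvalsh(M) > 0))   # all eigenvalues positive

If you worry about A being numerically close to singular, M = A.T @ A + eps * np.eye(n) with a small eps > 0 keeps the matrix safely positive definite.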



Let A and B be square matrices of order n, both nonzero, and suppose AB = 0. What must be true of A and B?


AB = 0,
so |AB| = 0,
so |A| * |B| = 0,
so |A| = 0 or |B| = 0.
(In fact, since B is nonzero, AB = 0 forces A to be singular, and since A is nonzero, B must be singular as well, so both determinants are zero.)
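A concrete instance (a minimal sketch, assuming NumPy is available): both matrices are nonzero, both are singular, and their product is the zero matrix.

    import numpy as np

    A = np.array([[1, 0],
                  [0, 0]])
    B = np.array([[0, 0],
                  [0, 1]])
    print(A @ B)                                 # the zero matrix
    print(np.linalg.det(A), np.linalg.det(B))    # both determinants are 0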



Every real symmetric matrix is congruent to a diagonal matrix, and every real symmetric matrix can be diagonalized by a similarity transformation. Are these two diagonal matrices the same?


Generally speaking they are not the same; the ordering of the diagonal entries is only a minor issue.
The diagonal matrix obtained by congruent diagonalization has a great deal of freedom, while the diagonal matrix obtained by similarity diagonalization is unique up to a permutation of the diagonal entries (the eigenvalues). For example, a non-zero diagonal matrix A and 2A are certainly congruent, but they have different eigenvalues, so they cannot be similar. Put another way: if two real symmetric matrices are similar, then they are congruent, but not the other way around.
Since you are asking such a question, you should also know an important conclusion, the spectral theorem: every real symmetric matrix is orthogonally similar to a diagonal matrix.
An orthogonal similarity transformation is both a similarity transformation and a congruence transformation, so the spectral theorem connects similarity and congruence.
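A concrete check of the A versus 2A example (a minimal sketch, assuming NumPy is available):

    import numpy as np

    A = np.diag([1.0, 2.0])
    C = np.sqrt(2.0) * np.eye(2)                 # invertible
    print(np.allclose(C.T @ A @ C, 2 * A))       # True: A and 2A are congruent
    print(np.linalg.eigvals(A))                  # eigenvalues 1 and 2
    print(np.linalg.eigvals(2 * A))              # eigenvalues 2 and 4: different, so not similar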



Is it only a symmetric matrix that has congruent matrices?
If not, please give an example!


No. The correct statement is that a matrix congruent to a symmetric matrix is symmetric; it does not say that only symmetric matrices have congruent matrices. Take any matrix A and any invertible matrix C; then C^T A C (the transpose of C times A times C) is a matrix congruent to A, and A need not be symmetric. Try it, and pay closer attention to the precise concept from now on.
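A small illustration (a minimal sketch, assuming NumPy is available): a non-symmetric A still has congruent matrices.

    import numpy as np

    A = np.array([[1.0, 2.0],
                  [0.0, 1.0]])       # not symmetric
    C = np.array([[1.0, 1.0],
                  [0.0, 1.0]])       # invertible (determinant 1)
    B = C.T @ A @ C                  # a matrix congruent to A
    print(B)                         # [[1. 3.] [1. 4.]]
    print(np.allclose(B, B.T))       # False: B is not symmetric either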



Why is the congruent matrix of a symmetric matrix a symmetric matrix?


If A is symmetric and C is invertible, the matrix congruent to A is B = C^T A C, and B^T = C^T A^T (C^T)^T = C^T A C = B, so B is symmetric. The typical case is the congruence transformation Q^T A Q with Q orthogonal (Q^T is the transpose of Q and also its inverse), which turns the real symmetric matrix A into a diagonal matrix; a diagonal matrix equals its own transpose, so it is obviously symmetric. Hence a matrix congruent to a symmetric matrix must be a symmetric matrix.



The precise concept of a reduced row echelon matrix
Is this matrix in reduced row echelon form?
| 0 0 0 4 |
| 1 2 0 4 |
| 0 0 2 1 |
| 0 0 0 0 |
The concept of a reduced row echelon matrix is that the matrix is a row echelon matrix, the first non-zero entry of every non-zero row is 1, and all the other entries in the column containing that leading 1 are zero.
So for every non-zero row, its first non-zero entry is 1,
and all the other entries in the column of that leading entry are zero.
Or is the requirement only that all the entries above the 1 are 0?


No, it is not: the leading entry of row 2 lies to the left of the leading entry of row 1, so the matrix is not even in echelon form, and the leading entries are not all 1.
As for the definition: yes, the first non-zero entry from the left of each non-zero row is 1,
and the remaining entries above that 1 in its column are all 0 (the entries below it are already 0 because the matrix is in echelon form).
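A quick way to check this with SymPy (a minimal sketch, assuming SymPy is available):

    from sympy import Matrix

    M = Matrix([[0, 0, 0, 4],
                [1, 2, 0, 4],
                [0, 0, 2, 1],
                [0, 0, 0, 0]])
    R, pivots = M.rref()     # R is the reduced row echelon form of M
    print(R)                 # differs from M, so M was not in reduced row echelon form
    print(R == M)            # False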



Reduce this augmented matrix to row-reduced echelon form
 1  1 -1  1  0
 0  0  1  1  1
 2  2 -1  0  1
 0  0  2 -4  2
-1 -1  2 -3  1
(the leading entry of each non-zero row should be 1)


The general method is: 1. Use only row operations; the reason is that when solving a system of equations, row operations let you write down an equivalent system directly. 2. Fix one row, usually the first row, and it is best if its first entry is 1; if that is not already the case, you can arrange it by swapping rows and multiplying a row by a suitable number. 3. Keeping that row fixed, use it to eliminate the entries below its leading 1, then repeat the process on the remaining rows.
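The result of the reduction can be checked with SymPy (a minimal sketch, assuming SymPy is available; the matrix is the one from the question):

    from sympy import Matrix

    M = Matrix([[ 1,  1, -1,  1, 0],
                [ 0,  0,  1,  1, 1],
                [ 2,  2, -1,  0, 1],
                [ 0,  0,  2, -4, 2],
                [-1, -1,  2, -3, 1]])
    R, pivots = M.rref()     # R is the row-reduced echelon form: every leading entry is 1
    print(R)
    print(pivots)            # the pivot columns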



How to prove that the product of two upper triangular matrices of order n is still an upper triangular matrix


Let A = (a_ij) and B = (b_ij) be upper triangular square matrices of order n, so a_ij = b_ij = 0 whenever i > j. Write C = AB = (c_ij). Then
c_ij = a_i1 b_1j + ... + a_i,i-1 b_i-1,j + a_ii b_ij + ... + a_in b_nj.
When i > j, every term in the first part is zero because a_ik = 0 for k < i, and every term in the second part is zero because b_kj = 0 for k >= i > j. Hence c_ij = 0 whenever i > j, so C = AB is also an upper triangular matrix.
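A quick numerical check of the statement (a minimal sketch, assuming NumPy is available):

    import numpy as np

    n = 4
    A = np.triu(np.random.rand(n, n))    # a random upper triangular matrix
    B = np.triu(np.random.rand(n, n))
    C = A @ B
    print(np.allclose(C, np.triu(C)))    # True: the product is upper triangular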