If a real symmetric matrix is diagonalized by a non-orthogonal matrix, are the diagonal elements of the resulting diagonal matrix its eigenvalues?


As long as it is a similarity diagonalization, the diagonal entries of the resulting diagonal matrix are the eigenvalues.
Orthogonal diagonalization is mainly used for quadratic forms, where Q^(-1)AQ = Q^T AQ.
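The point above can be checked numerically: scale the columns of an orthogonal eigenvector matrix so it is no longer orthogonal, and the similarity transform still produces the eigenvalues on the diagonal. The matrix and the scaling factors below are made-up examples.

```python
import numpy as np

# A real symmetric matrix (a made-up example).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh gives orthonormal eigenvectors; rescaling the columns makes
# P invertible but NOT orthogonal.
w, Q = np.linalg.eigh(A)
P = Q @ np.diag([2.0, 5.0])

# Similarity transform with the non-orthogonal P.
D = np.linalg.inv(P) @ A @ P
# D is still diagonal, and its diagonal entries are the eigenvalues.
print(np.round(D, 10))
```

Scaling each eigenvector changes P but not the eigenspaces, which is why the diagonal entries are unaffected.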



Why are the eigenvalues of a diagonal matrix the elements on its diagonal?


Why are the eigenvalues of an upper triangular matrix its diagonal elements? Let A be an upper triangular matrix of order n with eigenvalue λ. By the equation for computing eigenvalues, |A - λE| = 0. Since A - λE is still upper triangular, its determinant is the product of its diagonal entries, so (a11 - λ)(a22 - λ)…(ann - λ) = 0, and the eigenvalues are exactly the diagonal elements a11, a22, …, ann. A diagonal matrix is a special case of an upper triangular matrix, so the same argument applies to it.
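A quick numerical check of the claim, using a made-up upper triangular matrix:

```python
import numpy as np

# An upper triangular matrix (entries are a made-up example);
# its eigenvalues should be exactly its diagonal entries.
T = np.array([[3.0, 5.0, -2.0],
              [0.0, 7.0,  1.0],
              [0.0, 0.0, -4.0]])

eigs = np.linalg.eigvals(T)
print(np.sort(eigs))
print(np.sort(np.diag(T)))
```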



Prove that a real symmetric matrix must be similar to a diagonal matrix
As in the title.


For an n-order real symmetric matrix A:
Compute the eigenvalues; from them you can find n linearly independent eigenvectors.
Let P be the matrix with these n eigenvectors as column vectors.
Then A = PΛP^(-1), where Λ is the similar diagonal matrix whose diagonal entries are the eigenvalues.
This is the concrete method; a rigorous proof uses base changes from the theory of quadratic forms (the spectral theorem for real symmetric matrices) and can be found in any advanced algebra textbook for mathematics majors.
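The recipe above can be sketched numerically; the symmetric matrix below is a made-up example.

```python
import numpy as np

# A real symmetric matrix (a made-up example).
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])

# Columns of P are eigenvectors; Lam holds the eigenvalues.
lam, P = np.linalg.eigh(A)
Lam = np.diag(lam)

# A = P Lam P^(-1); for a symmetric A, P from eigh is even
# orthogonal, so P^(-1) = P^T, but inv(P) works in general.
A_rebuilt = P @ Lam @ np.linalg.inv(P)
print(np.round(A_rebuilt, 10))
```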



Prove that this real symmetric matrix is similar to a diagonal matrix
The matrix is:
Row 1: 2, 1/n, 1/n, …, 1/n
Row 2: 1/n, 4, 1/n, …, 1/n
…
Row n: 1/n, 1/n, …, 1/n, 2n
How to prove that this matrix is similar to a diagonal matrix?


Computing the characteristic polynomial |A - λE| of this matrix directly is troublesome, and it is also unnecessary: the matrix is real symmetric (it equals its own transpose), and every real symmetric matrix is orthogonally similar to a diagonal matrix, so it is similar to a diagonal matrix.
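A numerical illustration of the argument, building the matrix for one hypothetical value of n:

```python
import numpy as np

# Build the matrix from the question for a hypothetical n = 5:
# diagonal entries 2, 4, ..., 2n, every off-diagonal entry 1/n.
n = 5
A = np.full((n, n), 1.0 / n)
np.fill_diagonal(A, 2.0 * np.arange(1, n + 1))

# A is real symmetric, so eigh returns an orthogonal Q with
# Q^T A Q diagonal -- i.e. A is similar to a diagonal matrix.
lam, Q = np.linalg.eigh(A)
D = Q.T @ A @ Q
print(np.round(D, 6))
```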



What is simultaneous diagonalization of two matrices?


It means there is a single invertible matrix P
such that P^(-1)AP and P^(-1)BP are both diagonal matrices.
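A small sketch of one case where this is possible: when B is a polynomial in A, the two matrices commute and share eigenvectors, so one P diagonalizes both. The matrices here are made-up examples.

```python
import numpy as np

# A made-up pair that can be simultaneously diagonalized:
# B = A^2 - 3I is a polynomial in A, so A and B commute
# and share eigenvectors.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
B = A @ A - 3.0 * np.eye(2)

# One invertible P works for both matrices.
_, P = np.linalg.eigh(A)
DA = np.linalg.inv(P) @ A @ P
DB = np.linalg.inv(P) @ B @ P
print(np.round(DA, 10))
print(np.round(DB, 10))
```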



Let A and B be m × n matrices. Prove that A and B are equivalent if and only if R(A) = R(B).


Proof. (Necessity) If A and B are equivalent, then B can be obtained from A by a finite sequence of elementary transformations, and elementary transformations do not change the rank of a matrix, so R(A) = R(B). (Sufficiency) If R(A) = R(B) = r, then A and B have the same standard form (Er O; O O), the block matrix with the identity Er in the top-left corner and zeros elsewhere. Both A and B are equivalent to this standard form, so A and B are equivalent to each other.
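The reduction to the standard form can be sketched via the SVD: scaling away the nonzero singular values of A = U S V^T leaves exactly the (Er O; O O) block form. The helper `standard_form` and the two matrices are illustrations I am introducing, not part of the original proof.

```python
import numpy as np

def standard_form(A, tol=1e-10):
    """Return invertible P, Q with P @ A @ Q = [[E_r, O], [O, O]].

    Sketch via the SVD A = U S V^T: dividing out the r nonzero
    singular values turns S into the rank-r standard form.
    """
    U, s, Vt = np.linalg.svd(A)
    r = int(np.sum(s > tol))
    scale = np.ones(A.shape[0])
    scale[:r] = 1.0 / s[:r]
    P = np.diag(scale) @ U.T   # invertible (product of invertibles)
    Q = Vt.T                   # orthogonal, hence invertible
    return P, Q, r

# Two made-up 2x3 matrices of equal rank 1.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])
B = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0]])

PA, QA, rA = standard_form(A)
PB, QB, rB = standard_form(B)
# Equal rank => both reduce to the same standard form,
# hence A and B are equivalent.
print(np.round(PA @ A @ QA, 8))
print(np.round(PB @ B @ QB, 8))
```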



A is an s × n matrix of rank n and AB = AC. Prove that B = C.


When A has full column rank,
the homogeneous linear system Ax = 0 has only the zero solution.
Because AB = AC,
A(B - C) = O,
so the column vectors of B - C are solutions of Ax = 0.
Hence B - C = O, that is, B = C.
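The cancellation can be illustrated numerically: with A of full column rank, the product AB determines B uniquely. The sizes and entries below are made up.

```python
import numpy as np

# A has full column rank (rank n = 2), so Ax = 0 has only the
# zero solution (a made-up 3x2 example).
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

B = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Recover B from the product AB: the least-squares solution of
# A X = AB is unique exactly because A has full column rank.
B_rec = np.linalg.lstsq(A, A @ B, rcond=None)[0]
print(np.round(B_rec, 10))
```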



Let A and B be n × n matrices. Prove that if AB = O, then rank(A) + rank(B) ≤ n.


Write B = (b1, b2, …, bn) by columns. Because AB = O, each column b1, b2, …, bn is a solution of Ax = 0.
Every solution of a homogeneous linear system can be expressed linearly by a fundamental system of solutions, and for Ax = 0 that system contains n - rank(A) vectors.
So b1, b2, …, bn can be expressed linearly by these n - rank(A) vectors, which gives rank(B) ≤ n - rank(A), i.e. rank(A) + rank(B) ≤ n.
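A concrete check of the inequality, with B built from the null space of A (both matrices are made-up examples):

```python
import numpy as np

# Made-up example: the columns of B lie in the null space of A,
# so AB = O and rank(A) + rank(B) <= n must hold.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0]])  # rank 2, null space spanned by e3
B = np.array([[0.0, 0.0, 0.0],
              [0.0, 0.0, 0.0],
              [0.0, 0.0, 1.0]])  # rank 1

n = 3
print(np.allclose(A @ B, 0))
print(np.linalg.matrix_rank(A) + np.linalg.matrix_rank(B), "<=", n)
```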



Prove: B is a matrix of order n and C is an n × m matrix of rank n. If BC = O, then B = O.


By column transformations, there exists an invertible Q such that CQ = (C1 | O), where C1 is n × n with rank n.
From BC = O we get BCQ = O, so B(C1 | O) = O and hence BC1 = O.
For square matrices B and C1 of order n with BC1 = O, R(B) + R(C1) ≤ n; since R(C1) = n, this forces R(B) = 0, so B = O.
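The same conclusion can be seen via a right inverse of C: since C has full row rank, CC^T is invertible and B = (BC) C^T (CC^T)^(-1), so BC = O forces B = O. A numerical sketch with made-up entries:

```python
import numpy as np

# C is 2x4 with rank 2 (full row rank); entries are made up.
C = np.array([[1.0, 0.0, 2.0, 1.0],
              [0.0, 1.0, 1.0, 3.0]])
B = np.array([[5.0, -1.0],
              [2.0,  7.0]])

# Full row rank => C C^T is invertible, giving a right inverse of C,
# so B can be reconstructed from the product BC alone.
right_inv = C.T @ np.linalg.inv(C @ C.T)
B_rec = (B @ C) @ right_inv
print(np.round(B_rec, 10))
```

In particular, if BC were the zero matrix, the same formula would return B = O.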



A and B are real symmetric matrices. If A and B are similar, can we deduce that the eigenvalues of A and B are the same?


On the premise that A and B are real symmetric matrices, A and B are similar if and only if their eigenvalues are the same.
If they are similar, the eigenvalues are the same; that direction holds for any matrices, since similar matrices share eigenvalues.
Conversely, if A and B have the same eigenvalues, then because both are real symmetric, each is similar to the same diagonal matrix of those eigenvalues, so A and B are similar to each other.
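The converse direction can be sketched by construction: build two symmetric matrices with the same eigenvalues from different orthogonal matrices, and exhibit the similarity explicitly. The eigenvalues and the random seed below are arbitrary choices.

```python
import numpy as np

# Two real symmetric matrices with the same eigenvalues, built as
# Q Lam Q^T for two different orthogonal Q (a constructed example).
Lam = np.diag([1.0, 2.0, 5.0])
rng = np.random.default_rng(0)
Q1, _ = np.linalg.qr(rng.standard_normal((3, 3)))
Q2, _ = np.linalg.qr(rng.standard_normal((3, 3)))
A = Q1 @ Lam @ Q1.T
B = Q2 @ Lam @ Q2.T

# Both are orthogonally similar to the same diagonal matrix Lam,
# so P = Q1 Q2^T satisfies P^(-1) A P = B.
P = Q1 @ Q2.T
print(np.round(np.linalg.inv(P) @ A @ P - B, 10))
```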