Are the eigenvectors of a matrix belonging to different eigenvalues necessarily orthogonal?


No, they are only linearly independent.
The eigenvectors of a real symmetric matrix belonging to different eigenvalues must be orthogonal.
The eigenvectors belonging to different eigenvalues of a general matrix are linearly independent, but it is meaningless to orthogonalize them, because after orthogonalization they are no longer eigenvectors.
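A small numerical illustration of the difference (the matrices below are my own examples, assuming numpy is available):

# Symmetric case: eigenvectors of distinct eigenvalues come out orthogonal.
# General case: they are only linearly independent.
import numpy as np
S = np.array([[2.0, 1.0],
              [1.0, 3.0]])              # real symmetric, distinct eigenvalues
M = np.array([[2.0, 1.0],
              [0.0, 3.0]])              # not symmetric, eigenvalues 2 and 3
wS, VS = np.linalg.eigh(S)              # symmetric eigen-solver
wM, VM = np.linalg.eig(M)
print(VS[:, 0] @ VS[:, 1])              # ~0: orthogonal
print(VM[:, 0] @ VM[:, 1])              # about 0.707 here: not orthogonal
print(np.linalg.matrix_rank(VM))        # 2: still linearly independent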



Let λ1 and λ2 be two different eigenvalues of the matrix A, with corresponding eigenvectors α1 and α2 respectively. Prove that α1 and A(α1 + α2) are linearly independent.
In the end it comes down to: this holds if and only if λ2 is not equal to 0.


I will only provide the idea; the original poster should work it out carefully on their own.
1: Eigenvectors corresponding to different eigenvalues are linearly independent; this is the first fact to use.
2: Write k1 and k2 for the coefficients in front of α1 and A(α1 + α2).
3: Aα1 = λ1α1, Aα2 = λ2α2.
4: Apply the definition of linear independence; there is not much more to say (a check is sketched below).
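A hedged numerical illustration of where λ2 ≠ 0 enters (the diagonal matrices are my own toy choice, not the poster's A): expanding k1·α1 + k2·A(α1 + α2) = (k1 + k2·λ1)·α1 + k2·λ2·α2, independence of α1 and α2 forces k1 + k2·λ1 = 0 and k2·λ2 = 0, which gives k1 = k2 = 0 exactly when λ2 ≠ 0.

import numpy as np
a1 = np.array([1.0, 0.0])
a2 = np.array([0.0, 1.0])
A_zero    = np.diag([3.0, 0.0])   # lambda1 = 3, lambda2 = 0
A_nonzero = np.diag([3.0, 5.0])   # lambda1 = 3, lambda2 = 5
# Stack a1 and A(a1 + a2) as columns and check the rank.
print(np.linalg.matrix_rank(np.column_stack([a1, A_zero    @ (a1 + a2)])))   # 1: dependent
print(np.linalg.matrix_rank(np.column_stack([a1, A_nonzero @ (a1 + a2)])))   # 2: independent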



Let all the elements of the n-th order matrix A be 1. What are the n eigenvalues of A?


Obviously 0 is an eigenvalue, and the system Ax = 0 has a fundamental system of solutions containing n-1 vectors, so the multiplicity of 0 is n-1.
And because each row of A contains n ones, each row of A - nE consists of n-1 ones and one entry 1-n, whose sum is (n-1)·1 + (1-n) = 0; hence A - nE is singular and n is also an eigenvalue.
In fact, for the latter eigenvalue you can also use the trace: the sum of the eigenvalues equals the trace of the matrix, which is n, so the eigenvalues are n-1 zeros and one n.
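A quick numerical check of this count (illustrative only; n = 4 is an arbitrary choice):

import numpy as np
n = 4
A = np.ones((n, n))
print(np.linalg.eigvalsh(A))   # approximately [0, 0, 0, 4]: n-1 zeros and one n
print(np.trace(A))             # 4.0 = n, matching the sum of the eigenvalues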



It is known that all elements of the n-th order matrix A are 1. Find the eigenvectors of A belonging to the eigenvalue λ = n.
I can find the eigenvalues of A, but when looking for the eigenvectors belonging to λ = n I can only reduce A - nE to row echelon form, not to the reduced row echelon form.


The sum of the elements in each row is n, so A(1,1,...,1)' = n(1,1,...,1)', and the eigenvectors are k(1,1,...,1)' with k ≠ 0.
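A hedged illustration of the reduction the asker mentions (n = 4 here, and sympy is used only to do the row reduction exactly):

import sympy as sp
n = 4
A = sp.ones(n, n)
M = A - n * sp.eye(n)          # the matrix A - nE whose null space we want
print(M.rref()[0])             # reduced row echelon form: rank n-1
print(M.nullspace())           # a single basis vector, proportional to (1, 1, ..., 1)'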



If the matrices A and B are similar, are their eigenvalues and eigenvectors the same?


If they are similar, the characteristic polynomials are the same, so the eigenvalues are the same
But the eigenvectors are not necessarily the same
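A quick numerical check of both statements (A and P below are my own choices): if B = P^(-1)AP and Aξ = λξ, then B(P^(-1)ξ) = λ(P^(-1)ξ), so B keeps the eigenvalues but its eigenvectors are P^(-1)ξ rather than ξ.

import numpy as np
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])
Pinv = np.linalg.inv(P)
B = Pinv @ A @ P
print(np.sort(np.linalg.eigvals(A)))   # [2. 3.]
print(np.sort(np.linalg.eigvals(B)))   # [2. 3.]: same eigenvalues
w, V = np.linalg.eig(A)
v = V[:, 0]                             # an eigenvector of A
print(np.allclose(B @ (Pinv @ v), w[0] * (Pinv @ v)))   # True: P^(-1)v is an eigenvector of B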



Why is the similarity transformation matrix P in the similarity diagonalization of a non-symmetric matrix necessarily a matrix whose columns are eigenvectors corresponding to the eigenvalues of the matrix?
For example, suppose a non-symmetric third-order matrix A can be diagonalized by similarity, i.e., there exists an invertible matrix P such that P^(-1)AP = diag(a, b, c). Why must the columns of P be eigenvectors corresponding to the three eigenvalues (possibly with repeated roots)?


Let P = (p1, p2, p3).
Then AP = (Ap1, Ap2, Ap3) = P·diag(a, b, c) = (a·p1, b·p2, c·p3),
so Ap1 = a·p1,
Ap2 = b·p2,
Ap3 = c·p3.
In this way we can see the relationship between the eigenvalues, the eigenvectors, the invertible matrix P, and the diagonal matrix diag(a, b, c).
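A numerical sketch of the same column-by-column reading (the 3×3 matrix is my own example, not the asker's A):

import numpy as np
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])          # three distinct eigenvalues, so diagonalizable
w, P = np.linalg.eig(A)                  # the columns of P are eigenvectors
D = np.diag(w)
print(np.allclose(A @ P, P @ D))                    # True: AP = P diag(a, b, c)
print(np.allclose(np.linalg.inv(P) @ A @ P, D))     # True: P^(-1)AP is diagonal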



Linear algebra: the matrix A = [[1, -1, 1], [x, 4, y], [-3, -3, 5]] has three linearly independent eigenvectors, and λ = 2 is a double eigenvalue of A. What are the values of x and y?
x = 2, y = -2


Since λ = 2 must have two linearly independent eigenvectors, R(A - 2E) = 1, so the second row of A - 2E = [[-1, -1, 1], [x, 2, y], [-3, -3, 3]] must be proportional to the first: -1 : x = -1 : 2 = 1 : y, which gives x = 2, y = -2.
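Checking the stated answer numerically (x = 2 and y = -2 substituted into the matrix from the question):

import numpy as np
x, y = 2, -2
A = np.array([[ 1, -1,  1],
              [ x,  4,  y],
              [-3, -3,  5]], dtype=float)
print(np.linalg.matrix_rank(A - 2 * np.eye(3)))        # 1, so lambda = 2 has 3 - 1 = 2 independent eigenvectors
print(np.round(np.sort(np.linalg.eigvals(A).real)))    # [2. 2. 6.]: 2 is indeed a double eigenvalue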



A second-order matrix has only one linearly independent eigenvector. Why must its eigenvalue be a double root?


This has nothing to do with A being of order 2.
As long as the number of linearly independent eigenvectors of A is less than n (the order of A),
A must have a repeated eigenvalue, since n distinct eigenvalues would already give n linearly independent eigenvectors.
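A concrete illustration (the Jordan block below is my own example):

import numpy as np
J = np.array([[2.0, 1.0],
              [0.0, 2.0]])              # only one independent eigenvector
print(np.linalg.eigvals(J))                          # [2. 2.]: the eigenvalue is a double root
print(np.linalg.matrix_rank(J - 2 * np.eye(2)))      # 1, so the eigenspace is only 2 - 1 = 1 dimensional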



An eigenvalue problem with a known matrix and eigenvector!
It is known that the matrix A = [[4, 2, 1], [x, 1, 2], [3, y, -1]] has the eigenvector a = [1, -2, 3]^T. What are the values of x and y?


Let the eigenvalue be r. Then
A·a = [[4, 2, 1], [x, 1, 2], [3, y, -1]] · [1, -2, 3]^T = [3, x + 4, -2y]^T = r·[1, -2, 3]^T.
Comparing components gives r = 3, x + 4 = -2r, -2y = 3r.
The solution is x = -10, y = -9/2.
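A quick numerical check of this solution (the matrix and eigenvector from the question, with r = 3):

import numpy as np
x, y = -10, -9/2
A = np.array([[4, 2, 1],
              [x, 1, 2],
              [3, y, -1]], dtype=float)
a = np.array([1.0, -2.0, 3.0])
print(A @ a)          # [ 3. -6.  9.] = 3 * a, so a is an eigenvector with eigenvalue r = 3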



How to find the eigenvalues when the matrix A and its eigenvectors are known.
Can we still find the eigenvalue first and then the corresponding eigenvector? There should be a formula for this.


By the definition of an eigenvalue, as soon as Ax is computed, the eigenvalue can be read off as the factor that separates out in front of x.
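A minimal sketch of that method (the 2×2 matrix and eigenvector below are my own example, not from the question):

import numpy as np
A = np.array([[4.0, 2.0],
              [1.0, 3.0]])
x = np.array([2.0, 1.0])     # a known eigenvector of this particular A
Ax = A @ x
print(Ax)                    # [10.  5.] = 5 * x
print(Ax[0] / x[0])          # 5.0: the eigenvalue separates out as the common ratio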