Proof in Linear Algebra (rank of a matrix): A is a real square matrix of order n. Prove that R(AA^T) = R(A^TA) = R(A).



On the one hand, if Ax = 0 then A^T A x = 0, so every solution of Ax = 0 is a solution of A^T A x = 0.
On the other hand, if A^T A x = 0 then x^T A^T A x = (Ax)^T (Ax) = 0; since A is real, this forces Ax = 0, so every solution of A^T A x = 0 is a solution of Ax = 0.
From these two aspects, the systems Ax = 0 and A^T A x = 0 have the same solution space, hence R(A^T A) = R(A).
Similarly, R(A A^T) = R(A^T) = R(A).
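A quick numerical check of this rank identity (a minimal NumPy sketch; the specific test matrix, a random 5x5 matrix of rank 3, is my own assumption):

import numpy as np

rng = np.random.default_rng(0)
# Build a real square matrix of order n = 5 with rank 3 (assumed test case)
n = 5
A = rng.standard_normal((n, 3)) @ rng.standard_normal((3, n))

r = np.linalg.matrix_rank
print(r(A), r(A.T @ A), r(A @ A.T))   # all three ranks come out equal (here 3)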



Linear Algebra: let the rank of the third-order real symmetric matrix A be 2, and let λ1 = λ2 = 6 be a double eigenvalue of A
Let the rank of the third-order real symmetric matrix A be 2, and let λ1 = λ2 = 6 be a double eigenvalue of A. If α1 = (1,1,0)^T, α2 = (2,1,1)^T, α3 = (-1,2,-3)^T are all eigenvectors of A belonging to the eigenvalue 6, (1) find the other eigenvalue of A and the corresponding eigenvectors; (2) find the matrix A.


Since the rank of A is 2, the other eigenvalue is 0. Eigenvectors belonging to different eigenvalues of a real symmetric matrix are orthogonal. From the given data, α1 = (1,1,0)^T and α2 - α1 = (1,0,1)^T are two linearly independent eigenvectors of 6, so the cross product (1,1,0) × (1,0,1) = (1,-1,-1) is an eigenvector of 0 (as is any nonzero multiple of it). For (2), A is then determined by its eigenvalues and eigenvectors, as in the sketch below.
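For part (2), A can be assembled from the spectral data established above: eigenvalue 6 on the plane orthogonal to β = (1,-1,-1)^T and eigenvalue 0 on β itself. A minimal NumPy sketch of this construction (the numerical check is my own addition):

import numpy as np

beta = np.array([1.0, -1.0, -1.0])                       # eigenvector of the eigenvalue 0
P = np.eye(3) - np.outer(beta, beta) / (beta @ beta)     # projector onto the plane orthogonal to beta
A = 6 * P
print(A)                                                 # [[4, 2, 2], [2, 4, -2], [2, -2, 4]]
print(A @ np.array([1.0, 1.0, 0.0]))                     # (6, 6, 0) = 6 * (1, 1, 0): eigenvalue 6
print(A @ beta)                                          # zero vector: eigenvalue 0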



In linear algebra, it is known that one eigenvalue of a third-order symmetric matrix A is λ = 2, the corresponding eigenvector is α = (1,2,-1)^T, and all the elements on the main diagonal of A are 0. Find A.
Given that one eigenvalue of a third-order symmetric matrix A is λ = 2, the corresponding eigenvector is α = (1,2,-1)^T, and all the elements on the main diagonal of A are 0, find A.


Let A be
0 a b
a 0 c
b c 0
From Aα = λα we get:
2a - b = 2
a - c = 4
b + 2c = -2
The solution is a = 2, b = 2, c = -2.
So A =
0 2 2
2 0 -2
2 -2 0
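A quick check of this result (a minimal NumPy sketch):

import numpy as np

A = np.array([[0, 2, 2],
              [2, 0, -2],
              [2, -2, 0]])
alpha = np.array([1, 2, -1])
print(A @ alpha)      # [2, 4, -2] = 2 * alpha, so alpha is an eigenvector for the eigenvalue 2
print(np.diag(A))     # the main diagonal is all zeros, as required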



Linear Algebra: let the eigenvalues of the third-order real symmetric matrix A be λ1 = -1, λ2 = λ3 = 1, and it is known that the eigenvector of A belonging to λ1 = -1 is P1 = (0,1,1)^T
Find the eigenvectors of A belonging to λ2 = λ3 = 1, and find the symmetric matrix A.
When we set an eigenvector x = (x1, x2, x3)^T, why should x1 be taken as 1 and 0 respectively to obtain the two eigenvectors? What is the reason?
One of the solutions takes P2 = (1,0,0)^T and P3 = (0,1,-1)^T. Why can't we take P2 = (1,1,-1)^T instead? Isn't it also linearly independent? Besides, with only two vectors, how do we judge linear dependence? I only know how to do it for three vectors.


The first problem: since eigenvectors belonging to different eigenvalues of a real symmetric matrix are orthogonal to each other, the eigenvectors belonging to 1 are orthogonal to the eigenvector belonging to -1. Suppose an eigenvector belonging to 1 is (x, y, z)^T; then y + z = 0 and x is arbitrary. The basic solution system of this equation, α = (1,0,0)^T and β = (0,1,-1)^T, can be taken as the eigenvectors belonging to 1. (As for the second question: (1,1,-1)^T = α + β also satisfies y + z = 0, so it is an equally valid eigenvector of 1; any two linearly independent solutions may be chosen. Two vectors are linearly dependent exactly when one is a scalar multiple of the other.)
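The symmetric matrix A asked for in the question is likewise forced by the spectral data: eigenvalue -1 on the span of P1 = (0,1,1)^T and eigenvalue +1 on its orthogonal complement. A minimal NumPy sketch (the numerical check is my own addition):

import numpy as np

p1 = np.array([0.0, 1.0, 1.0])                 # eigenvector of the eigenvalue -1
P = np.outer(p1, p1) / (p1 @ p1)               # projector onto span(p1)
A = 1 * (np.eye(3) - P) + (-1) * P             # +1 on the orthogonal plane, -1 on span(p1)
print(A)                                       # [[1, 0, 0], [0, 0, -1], [0, -1, 0]]
print(np.linalg.eigvalsh(A))                   # eigenvalues -1, 1, 1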



It is known that the eigenvalues of the third-order real symmetric matrix A are λ1 = -1, λ2 = λ3 = 1, and (0,1,1)^T is an eigenvector belonging to -1
Let an eigenvector belonging to the eigenvalue 1 be (x1, x2, x3)^T. After we obtain x2 + x3 = 0, how do we find the corresponding eigenvectors?


This is a homogeneous system of linear equations (here the single equation x2 + x3 = 0).
The basic solution system (1,0,0)^T, (0,-1,1)^T is obtained by taking the free variables x1 and x3 to be 1,0 and 0,1 respectively.
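The same basic solution system can be checked mechanically: the equation x2 + x3 = 0 has coefficient matrix (0 1 1), and any basis of its null space will do. A sketch (SciPy's null_space is used only as a cross-check; the hand-picked basis above is equally valid):

import numpy as np
from scipy.linalg import null_space

C = np.array([[0.0, 1.0, 1.0]])        # coefficient matrix of x2 + x3 = 0
N = null_space(C)                      # columns form an orthonormal basis of the solution space
print(N.shape)                         # (3, 2): the eigenspace of 1 is two-dimensional
print(C @ np.array([1, 0, 0]), C @ np.array([0, -1, 1]))   # both give [0.]: the hand-picked basis also works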



A problem about the orthogonal eigenvectors of real symmetric matrices.
Suppose a third-order real symmetric matrix has eigenvalues 3, 3, 1, and the eigenvector (1,1,2)^T corresponding to the eigenvalue 1 is known. In this case, to find the eigenvectors with eigenvalue 3 we can directly use orthogonality to write the equation x1 + x2 + 2x3 = 0, and the basic solution system obtained is exactly the set of eigenvectors corresponding to the eigenvalue 3. But if the three eigenvalues are all different, say 3, 5, 1, what does the basic solution system obtained by writing the equation this way give? I don't quite understand, and I hope you can explain it in detail. I'm so confused that I've gotten this wrong n times today.
There is also a question about quadratic forms, from Li Yongle's online course; I won't write it out in full. The thing I don't understand: a quadratic form is reduced to the standard form f = 5y2^2 + 6y3^2, and under the condition x^Tx = 2 the maximum value of f is required. The reference answer says that x^Tx = y^Ty = 2, so x^TAx = 5y2^2 + 6y3^2.


A real symmetric matrix has the following properties: ① eigenvectors belonging to different eigenvalues are orthogonal to each other; ② the algebraic multiplicity and the geometric multiplicity of each eigenvalue are equal.
So the eigenspace V of the eigenvalue 1 is one-dimensional, and the eigenspace U of the eigenvalue 3 is two-dimensional.
By ①, R³ = V ⊕ U (direct sum), that is, U is the orthogonal complement of V. Since V is known and the orthogonal complement is unique, U is determined.
So the two vectors you find in that way form a basis of the orthogonal complement of V, and hence a basis of U.
As for the case where the eigenvalues are 3, 5, 1: knowing only the eigenvector of 1 is not enough, because by the original method you only obtain the orthogonal complement of the eigenspace V of 1, and that orthogonal complement is the direct sum of the eigenspace U of 3 and the eigenspace W of 5.
Therefore, in this case we must know the eigenvectors of two of the eigenvalues before the eigenvectors of the third eigenvalue can be determined, and then the fundamental solution system contains only one vector.
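A concrete illustration of this point, as a sketch (the symmetric matrix below, with eigenvalues 3, 5, 1 and eigenvector (1,1,2)^T for the eigenvalue 1, is my own assumed example):

import numpy as np

q1 = np.array([1.0, -1.0, 0.0]) / np.sqrt(2)   # eigenvector for 3
q2 = np.array([1.0, 1.0, -1.0]) / np.sqrt(3)   # eigenvector for 5
q3 = np.array([1.0, 1.0, 2.0]) / np.sqrt(6)    # eigenvector for 1
A = 3 * np.outer(q1, q1) + 5 * np.outer(q2, q2) + 1 * np.outer(q3, q3)

v = q1 + q2                     # solves x1 + x2 + 2*x3 = 0, i.e. lies in the orthogonal complement of q3
print(np.isclose(v @ q3, 0))    # True: v satisfies the orthogonality equation
print((A @ v) / v)              # the componentwise ratios differ, so v is NOT an eigenvector of A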
[As for the other question: please post it as a separate question, and state it clearly so that others can help you.]



Is there a relationship between the rank of a matrix and its eigenvalues?
It doesn't feel like there is one, is there?


There is some connection, but it is not very close.
1. A square matrix A not being of full rank is equivalent to 0 being an eigenvalue of A.
2. The rank of A is not less than the number of nonzero eigenvalues of A (counted with algebraic multiplicity).
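Point 2 can be a strict inequality. A small sketch (the nilpotent example matrix is my own choice):

import numpy as np

N = np.array([[0, 1],
              [0, 0]])
print(np.linalg.matrix_rank(N))    # 1
print(np.linalg.eigvals(N))        # [0., 0.]: no nonzero eigenvalues, yet the rank is 1
# For a real symmetric (or any diagonalizable) matrix, the rank equals the number of nonzero eigenvalues.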



Why are the eigenvalues of a matrix after an elementary transformation different from those of the matrix before the transformation?


Elementary transformations have no essential connection with eigenvalues. For example, take a square matrix A: its eigenvalues are the roots of the equation |xE - A| = 0. But if we reduce A to E by elementary transformations (which is possible whenever A is invertible), the eigenvalues become the roots of |xE - E| = 0, i.e. all equal to 1, which in general differ from the original ones.
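A small numerical illustration (the matrix A and the elementary row operation below are my own assumed example; the last line contrasts this with a similarity transformation, which does preserve eigenvalues):

import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
E = np.array([[1.0, 0.0],
              [1.0, 1.0]])                           # elementary matrix: add row 1 to row 2
print(np.linalg.eigvals(A))                          # [2., 3.]
print(np.linalg.eigvals(E @ A))                      # roughly [1.27, 4.73]: the eigenvalues have changed
print(np.linalg.eigvals(E @ A @ np.linalg.inv(E)))   # same spectrum {2, 3}: similarity preserves eigenvalues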



Linear Algebra: let A be an m × n matrix and B an n × m matrix. Prove that the determinant |Em - AB| is equal to |En - BA|.
As stated in the title.


Consider the block determinant
| En   B  |
| A    Em |
Using a block column transformation, subtract the first block column multiplied by B from the second block column; the determinant becomes
| En   0       |
| A    Em - AB |
which equals |Em - AB|.
Similarly, using a block row transformation, subtract the second block row multiplied by B from the first block row; the determinant becomes
| En - BA   0  |
| A         Em |
which equals |En - BA|.
Therefore |Em - AB| = |En - BA|.
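A quick numerical spot-check of the identity (a NumPy sketch; the sizes m = 4, n = 2 and the random matrices are my own assumptions):

import numpy as np

rng = np.random.default_rng(1)
m, n = 4, 2
A = rng.standard_normal((m, n))
B = rng.standard_normal((n, m))
d1 = np.linalg.det(np.eye(m) - A @ B)
d2 = np.linalg.det(np.eye(n) - B @ A)
print(np.isclose(d1, d2))            # True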



A linear algebra problem: the entries of an n-th order determinant are a_ij = |i - j| (i, j = 1, 2, ..., n). Find the value of the determinant.
I have only just started learning this, and it is already very difficult.


Let me help you solve it. The answer is (-1)^(n+1) multiplied by (n-1) multiplied by 2^(n-2). (Since this is a web reply I cannot use a formula editor, but you understand what I mean.) The specific solution is as follows. From the problem, it is a symmetric determinant whose entries are
0     1     2     ...  n-1
1     0     1     ...  n-2
2     1     0     ...  n-3
...   ...   ...   ...  ...
n-1   n-2   n-3   ...  0
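The stated formula can be spot-checked numerically (a NumPy sketch comparing the determinant with (-1)^(n+1) * (n-1) * 2^(n-2) for small n; this is only a check, not the derivation):

import numpy as np

def dist_det(n):
    # determinant of the matrix with entries a_ij = |i - j|
    i = np.arange(n)
    A = np.abs(i[:, None] - i[None, :])
    return np.linalg.det(A.astype(float))

for n in range(2, 8):
    formula = (-1) ** (n + 1) * (n - 1) * 2 ** (n - 2)
    print(n, int(round(dist_det(n))), formula)    # the two values agree for each n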