Let A be a matrix of order 2, and let α₁ and α₂ be two linearly independent two-dimensional vectors with Aα₁ = 0 and Aα₂ = 2α₁ + α₂. Find the non-zero eigenvalue of A.

In fact, A²α₂ = A(2α₁ + α₂) = 2Aα₁ + Aα₂ = Aα₂, so we can rewrite this as A(Aα₂) = 1 · (Aα₂).
Since α₁ and α₂ are linearly independent, Aα₂ = 2α₁ + α₂ ≠ 0, so this shows that 1 is an eigenvalue of A, with corresponding eigenvector Aα₂ = 2α₁ + α₂.
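The argument can be checked numerically on a concrete instance. As an illustrative sketch (the choice α₁ = e₁, α₂ = e₂ is an assumption; any linearly independent pair works), build A from its action on the basis and verify that 1 is an eigenvalue with eigenvector 2α₁ + α₂:

```python
import numpy as np

# Assumed concrete instance: alpha1, alpha2 taken as the standard basis
# (any linearly independent pair would do).
alpha1 = np.array([1.0, 0.0])
alpha2 = np.array([0.0, 1.0])

# Build A from its action: A alpha1 = 0 and A alpha2 = 2 alpha1 + alpha2.
# With alpha1, alpha2 as the standard basis, these vectors are A's columns.
A = np.column_stack([0 * alpha1, 2 * alpha1 + alpha2])

# The eigenvalues are 0 and 1, matching the derivation.
print(sorted(np.linalg.eigvals(A).real))

# 2 alpha1 + alpha2 is an eigenvector for the non-zero eigenvalue 1.
v = 2 * alpha1 + alpha2
assert np.allclose(A @ v, 1 * v)
```

Note that 0 is also an eigenvalue here (with eigenvector α₁, since Aα₁ = 0), which is why the problem asks specifically for the non-zero one.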