How to judge whether a matrix is invertible?


A square matrix of order n is invertible if and only if its determinant is not equal to 0



How do you judge whether a matrix is invertible?
Is it that the determinant is not equal to 0?


Yes, a square matrix is invertible exactly when its determinant is not 0.



Methods for judging whether a square matrix A is invertible


1. The determinant of A is not equal to 0
2. The system AX = 0 has only the zero solution
3. The rank of A equals its order
4. All eigenvalues of A are nonzero
5. The row vectors of A are linearly independent
6. The column vectors of A are linearly independent
7. There exists a matrix B such that AB = BA = E (the definition)
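Several of these criteria are easy to check numerically. Below is a minimal sketch with NumPy, using an arbitrary example matrix of my own (not one from the questions below):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

det = np.linalg.det(A)            # criterion 1: determinant not 0
rank = np.linalg.matrix_rank(A)   # criterion 3: rank equals the order
eigvals = np.linalg.eigvals(A)    # criterion 4: no eigenvalue is 0

# All three checks agree on invertibility
print(det != 0, rank == A.shape[0], bool(np.all(np.abs(eigvals) > 1e-12)))
```

For a singular matrix all three checks fail together, since the criteria are equivalent.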



Judge whether the matrix
A =
1 0 1
2 1 0
-3 2 -5
is invertible. If it is invertible, find its inverse matrix.
1. Solve the linear system x1 + x2 - 3x3 - x4 = 1
3x1 - x2 - 3x3 + 4x4 = 4
x1 + 5x2 - 9x3 - 8x4 = 0
2. Let the probability density of the continuous random variable X be f(x) = kx² for 0 < x < 1 (and 0 otherwise). Find k and the distribution function F(x).


2. By definition, the probability density must integrate to 1:
∫(-∞,+∞) f(x)dx = ∫(0,1) kx² dx = k/3 = 1
∴ k = 3
F(x) = ∫(0,x) f(t)dt = ∫(0,x) 3t² dt = x³ for 0 ≤ x ≤ 1, with F(x) = 0 for x < 0 and F(x) = 1 for x > 1.
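As a quick numerical sanity check of this answer (a sketch using a midpoint Riemann sum, not part of the original solution): with k = 3, f(x) = 3x² should integrate to 1 over (0, 1), and F(0.5) should equal 0.5³.

```python
# Verify k = 3 normalizes f(x) = 3x^2 on (0, 1), and F(x) = x^3.
n = 100_000
dx = 1.0 / n

# Midpoint Riemann sum of f over (0, 1)
total = sum(3 * ((i + 0.5) * dx) ** 2 for i in range(n)) * dx

# F(0.5) should equal 0.5^3 = 0.125
x = 0.5
Fx = sum(3 * ((i + 0.5) * dx) ** 2 for i in range(int(n * x))) * dx

print(round(total, 6), round(Fx, 6))  # 1.0 0.125
```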



For what value of k does there exist an invertible matrix P such that P^(-1)AP is a diagonal matrix? Find P and the corresponding diagonal matrix, where
A = (first row 3 2 -2; second row -k -1 k; third row 4 2 -3).


|A-λE| =
3-λ 2 -2
-k -1-λ k
4 2 -3-λ
r3 - r1:
3-λ 2 -2
-k -1-λ k
1+λ 0 -1-λ
c1 + c3:
1-λ 2 -2
0 -1-λ k
0 0 -1-λ
= (1-λ)(1+λ)²
So the eigenvalues of A are 1, -1, -1.
Therefore A is diagonalizable if and only if the eigenvalue -1 has two linearly independent eigenvectors,
that is, if and only if R(A+E) = 3-2 = 1.
A+E =
4 2 -2
-k 0 k
4 2 -2
This has rank 1 only when the row (-k, 0, k) is a multiple of (4, 2, -2), which forces k = 0.
From here the rest of the solution (finding P and the diagonal matrix) is routine.
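With k = 0 the conclusion can be verified numerically; a sketch with NumPy (not part of the original answer):

```python
import numpy as np

k = 0
A = np.array([[3.0, 2.0, -2.0],
              [-k, -1.0, float(k)],
              [4.0, 2.0, -3.0]])

eigvals, P = np.linalg.eig(A)    # columns of P are eigenvectors
D = np.linalg.inv(P) @ A @ P     # should be diagonal up to rounding error

print(np.round(sorted(eigvals.real), 6))  # eigenvalues -1, -1, 1
print(bool(np.allclose(D, np.diag(np.diag(D)), atol=1e-8)))
```

That the off-diagonal entries of D vanish confirms that A is diagonalizable when k = 0.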



Let the vector α = (a1, a2, ..., an) ≠ 0 and let A = α^T α. Prove that there is a constant m such that A^k = mA, and find an invertible matrix P such that P^(-1)AP is a diagonal matrix.
Also, how do you find |λE - A|, the characteristic determinant?


To simplify notation, write α' for the transpose of α.
The vector α can be regarded as a 1×n matrix, and α' is then an n×1 matrix.
By the associative law of matrix multiplication, A² = (α'α)(α'α) = α'(αα')α.
Now αα' is a 1×1 matrix, i.e. a constant; let b = αα'.
Then A² = α'(αα')α = b·α'α = bA.
It follows by induction that for any positive integer k, A^k = b^(k-1)·A, so m = b^(k-1).
Since α ≠ 0, R(A) = R(α) = 1, so the fundamental solution system of the linear system αx = 0 contains n-1 vectors.
They all satisfy Ax = α'αx = 0, so they are eigenvectors of A belonging to the eigenvalue 0.
On the other hand, Aα' = (α'α)α' = α'(αα') = bα', so α' (≠ 0) is an eigenvector of A belonging to the eigenvalue b.
Since b = a1² + a2² + ... + an² ≠ 0, α' is linearly independent of the eigenvectors belonging to the eigenvalue 0.
Hence the matrix P whose columns are the fundamental solution system of αx = 0 together with α' is invertible, and P^(-1)AP is diagonal.
By the above, the eigenvalues of A are 0 (with multiplicity n-1) and b.
Therefore the characteristic polynomial of A is |λE - A| = (λ - b)λ^(n-1).
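The identities A^k = b^(k-1)·A and the eigenvalue structure can be checked for a sample α (the vector below is an arbitrary example of mine):

```python
import numpy as np

alpha = np.array([[1.0, 2.0, 3.0]])    # alpha as a 1 x n matrix
A = alpha.T @ alpha                    # A = alpha' alpha, rank 1
b = (alpha @ alpha.T).item()           # b = 1 + 4 + 9 = 14

# A^3 should equal b^2 * A
A3 = A @ A @ A
print(bool(np.allclose(A3, b**2 * A)))

# Eigenvalues: 0 with multiplicity n-1, plus b
print(np.round(sorted(np.linalg.eigvals(A).real), 6))  # ≈ [0, 0, 14]
```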



Let the matrix
A =
-1 -1 2
3 -5 6
2 -2 2
Find an invertible matrix P such that P^(-1)AP is a diagonal matrix.


|A-λE| =
-1-λ -1 2
3 -5-λ 6
2 -2 2-λ
c1 + c2:
-2-λ -1 2
-2-λ -5-λ 6
0 -2 2-λ
r2 - r1:
-2-λ -1 2
0 -4-λ 4
0 -2 2-λ
= (-2-λ)[(-4-λ)(2-λ) + 8] = (-2-λ)(λ² + 2λ) = -λ(λ+2)²
So the eigenvalues of A are 0, -2, -2.
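The eigenvalues and the diagonalization can be confirmed numerically (a sketch, not the original worked solution):

```python
import numpy as np

A = np.array([[-1.0, -1.0, 2.0],
              [3.0, -5.0, 6.0],
              [2.0, -2.0, 2.0]])

eigvals, P = np.linalg.eig(A)    # columns of P are eigenvectors
D = np.linalg.inv(P) @ A @ P     # diagonal up to rounding error

print(np.round(sorted(eigvals.real), 6))  # eigenvalues -2, -2, 0
print(bool(np.allclose(D, np.diag(np.diag(D)), atol=1e-8)))
```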



What are the conditions for a matrix to be invertible?
It would be best to have more than ten. Thank you, everyone!


The following are each necessary and sufficient conditions for a square matrix to be invertible:
1. Its rank equals the number of its rows
2. Its determinant is not 0
3. Its row vectors (or column vectors) form a linearly independent group
4. There is a matrix whose product with it is the identity matrix
That's all I can think of...
Racking my brains~~
5. As the coefficient matrix of a system of linear equations, the system has a unique solution
6. It has full rank
7. It can be transformed into the identity matrix by elementary row transformations
8. Its adjoint matrix is invertible
9. It can be expressed as a product of elementary matrices
10. Its transpose is invertible
11. Left (or right) multiplying another matrix by it does not change that matrix's rank
It wasn't easy looking all these up in the books!
Oh, your 5-point bounty is too small for this, please add more points~
Good luck to you
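Condition 7 (reducibility to the identity by elementary row operations) can be tested directly; below is a sketch of Gauss-Jordan elimination, with example matrices of my own choosing:

```python
import numpy as np

def reduce_to_identity(A):
    """Apply elementary row operations; return True if A reduces to I."""
    A = A.astype(float).copy()
    n = A.shape[0]
    for i in range(n):
        p = i + int(np.argmax(np.abs(A[i:, i])))  # partial pivoting
        if abs(A[p, i]) < 1e-12:
            return False                          # no pivot: singular
        A[[i, p]] = A[[p, i]]                     # swap rows (elementary op)
        A[i] /= A[i, i]                           # scale pivot row to 1
        for j in range(n):
            if j != i:
                A[j] -= A[j, i] * A[i]            # eliminate column i
    return bool(np.allclose(A, np.eye(n)))

print(reduce_to_identity(np.array([[1, 2], [3, 4]])))   # True (invertible)
print(reduce_to_identity(np.array([[1, 2], [2, 4]])))   # False (singular)
```

The same row operations, applied to the identity in parallel, would produce the inverse, which is condition 9 in matrix form.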



If A is an invertible matrix and B is an invertible matrix, what is the inverse of A + B?


A + B does not necessarily have an inverse matrix.
=========
Let En be the identity matrix of order n,
and take A = En, B = -En.
Then A and B are both invertible
(the inverse of A is En, and the inverse of B is -En),
but A + B = O, which is not invertible.
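The counterexample is easy to confirm numerically (a sketch; np.eye(n) plays the role of En):

```python
import numpy as np

n = 3
A = np.eye(n)    # invertible: its inverse is itself
B = -np.eye(n)   # invertible: its inverse is -En

S = A + B        # the zero matrix

print(np.linalg.det(S))          # 0.0, so A + B is singular
print(np.linalg.matrix_rank(S))  # 0
```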



Are the eigenvalues of a matrix's inverse the reciprocals of the matrix's eigenvalues?
The exact problem is:
The eigenvalues of A are 1/2, 1/3, 1/4, B is similar to A, and I need |B^(-1) - E|.
I want to know the eigenvalues of B^(-1) - E.


The problem statement is not very clear!
The eigenvalues of a matrix and of its inverse are reciprocals of each other: each corresponding pair multiplies to 1.
Since similar matrices have equal eigenvalues, B has the same eigenvalues as A: 1/2, 1/3, 1/4. So the eigenvalues of B^(-1) are 2, 3, 4, the eigenvalues of E are 1, 1, 1, and the eigenvalues of B^(-1) - E are 2-1, 3-1, 4-1, that is 1, 2, 3. Hence |B^(-1) - E| = 1·2·3 = 6.
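A concrete check: build any B similar to diag(1/2, 1/3, 1/4) (the matrix P below is an arbitrary invertible example of mine) and compute |B^(-1) - E|:

```python
import numpy as np

D = np.diag([1/2, 1/3, 1/4])
P = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])   # any invertible matrix works
B = P @ D @ np.linalg.inv(P)      # B is similar to D

M = np.linalg.inv(B) - np.eye(3)
print(np.round(sorted(np.linalg.eigvals(M).real), 6))  # eigenvalues 1, 2, 3
print(round(float(np.linalg.det(M)), 6))               # determinant 6.0
```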