ax + b = cx + b (x is the unknown, a - c ≠ 0). Solve for x.


(a - c)x = b - b
(a - c)x = 0
Since a - c ≠ 0,
x = 0
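
A quick symbolic check (a sketch with sympy; the symbol names are just for illustration):

from sympy import Eq, solve, symbols

a, b, c, x = symbols('a b c x')

# Solving a*x + b = c*x + b for x gives the single root x = 0
# (under the standing assumption a - c != 0; otherwise every x works).
print(solve(Eq(a*x + b, c*x + b), x))   # [0]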



y = (ax + b)/(cx + d) (c ≠ 0): for which values of y does this equation have a solution in x?


This can be answered by finding the range of y.
(1) When a/c is not equal to b/d, the range of y is all real numbers except a/c. Then as long as y ≠ a/c, there is an x corresponding to it and the equation has a solution; there is no solution when y = a/c.
(2) When a/c equals b/d, the function is constant and equals a/c, so when y = a/c there are infinitely many x corresponding to it, and when y ≠ a/c there is no solution.
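
The claim about the range follows by solving for x (a short derivation, assuming cx + d ≠ 0 so the equation may be multiplied through):
y(cx + d) = ax + b
(cy - a)x = b - dy
x = (b - dy)/(cy - a), which is defined exactly when cy - a ≠ 0, i.e. y ≠ a/c.
So every y ≠ a/c has a corresponding x. When y = a/c the equation becomes 0 · x = b - dy, which has a solution only if b - dy = 0, i.e. b/d = a/c; this is exactly the split between cases (1) and (2).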



If a < 0, is it true that |a| - a = 2a?


No, it isn't. For a < 0, |a| = -a, so |a| - a = -2a, not 2a.
For example, with a = -1: |-1| - (-1) = 1 + 1 = 2 = -2a.
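
A quick symbolic confirmation (a sketch with sympy; declaring the symbol negative encodes the assumption a < 0):

from sympy import Abs, Symbol, simplify

a = Symbol('a', negative=True)

# With a < 0, Abs(a) evaluates to -a, so |a| - a simplifies to -2*a, not 2*a.
print(simplify(Abs(a) - a))   # -2*a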



Let f(x) be defined near x = 0 and satisfy |f(x)| ≤ x^2. Prove that f(x) is differentiable at x = 0 and f'(0) = 0.


Since |f(x)| ≤ x^2 near x = 0, we have |f(0)| ≤ 0^2 = 0, so f(0) = 0. Then for x ≠ 0,
|(f(x) - f(0))/x| = |f(x)/x| ≤ x^2/|x| = |x|,
which tends to 0 as x → 0. Hence the limit of (f(x) - f(0))/x as x → 0 exists and equals 0, that is, f(x) is differentiable at x = 0 and f'(0) = 0.
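
A numerical illustration (a sketch; f(x) = x^2 * sin(1/x) with f(0) = 0 is just one concrete function satisfying |f(x)| ≤ x^2):

import math

def f(x):
    # One function with |f(x)| <= x^2.
    return 0.0 if x == 0 else x * x * math.sin(1.0 / x)

# The difference quotient (f(h) - f(0)) / h is bounded by |h| and tends to 0.
for h in [1e-1, 1e-3, 1e-5, 1e-7]:
    print(h, (f(h) - f(0)) / h)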



Let f(x) be differentiable in (a, b) with |f'(x)| ≤ M. Prove that f(x) is bounded in (a, b).


Since f(x) is differentiable in (a, b), f(x) is continuous in (a, b).
Take ε > 0 small enough that a + 3ε < b - 3ε. Since f(x) is continuous on the closed interval [a + ε, b - ε], there exists M1 > 0 such that |f(x)| ≤ M1 on [a + ε, b - ε].
For any x0 ∈ (a, a + ε), the point x0 + ε lies in [a + ε, b - ε], and by the mean value theorem |f(x0) - f(x0 + ε)| = |f'(ξ)| · ε ≤ Mε, so |f(x0)| ≤ M1 + Mε. The same argument with x0 - ε handles any x0 ∈ (b - ε, b).
Hence |f(x)| ≤ M1 + Mε on all of (a, b), so f(x) is bounded in (a, b).
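
The same mean value theorem idea also gives a shorter argument (a sketch): fix any point c ∈ (a, b); then for every x ∈ (a, b) there is some ξ between c and x with f(x) - f(c) = f'(ξ)(x - c), so
|f(x)| ≤ |f(c)| + |f'(ξ)| · |x - c| ≤ |f(c)| + M(b - a),
and f(x) is bounded in (a, b).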



Let f(x) be twice differentiable on [0, a] with |f''(x)| ≤ M for all x in [0, a], and suppose f(x) attains its maximum at an interior point of (0, a).
Prove that |f'(0)| + |f'(a)| ≤ Ma.


Let f(c) be the maximum, with 0 < c < a. Since c is an interior maximum point of the differentiable function f, f'(c) = 0.
Apply the mean value theorem to f'(x) on [0, c] and on [c, a]:
|f'(0)| = |f'(c) - f'(0)| = |f''(ξ1)| · c ≤ Mc,
|f'(a)| = |f'(a) - f'(c)| = |f''(ξ2)| · (a - c) ≤ M(a - c).
Adding the two inequalities gives |f'(0)| + |f'(a)| ≤ Mc + M(a - c) = Ma.
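
A quick sanity check with a concrete function (an illustrative example only, with a > 0): take f(x) = x(a - x) on [0, a]. Then f''(x) = -2, so M = 2; the maximum is attained at the interior point x = a/2; f'(0) = a and f'(a) = -a, so |f'(0)| + |f'(a)| = 2a = Ma, and the bound is attained with equality.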



Prove that for every real symmetric matrix A there is a real symmetric matrix B such that A = B^3.


Do the spectral decomposition A = QΛQ^T with Q orthogonal and Λ diagonal.
Then take the diagonal matrix D such that D^3 = Λ (take the real cube root of each diagonal entry of Λ).
B = QDQ^T is real symmetric and satisfies B^3 = QD^3Q^T = QΛQ^T = A.
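
A numerical check of this construction (a sketch with numpy; the random symmetric matrix is only for illustration):

import numpy as np

rng = np.random.default_rng(0)

# A random real symmetric matrix A.
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2

# Spectral decomposition A = Q diag(w) Q^T, then D = diag(cbrt(w)) so that D^3 = diag(w).
w, Q = np.linalg.eigh(A)
B = Q @ np.diag(np.cbrt(w)) @ Q.T

print(np.allclose(B, B.T))        # True: B is real symmetric
print(np.allclose(B @ B @ B, A))  # True: B^3 = A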



It is known that an n-th order symmetric matrix A (not necessarily invertible) satisfies A^2 = 2A. Prove that A - I is an orthogonal matrix.


A^2 = 2A means that the eigenvalues of A can only be 0 or 2, so the eigenvalues of A - I are 1 or -1.
Since a real symmetric matrix is orthogonally similar to a diagonal matrix, A - I = Q diag(±1) Q^T with Q orthogonal, and such a matrix is orthogonal, so A - I is an orthogonal matrix.
Another method is to compute (A - I)(A - I)^T = (A - I)^2 = A^2 - 2A + I = I directly, but the method above should also be mastered.
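
A numerical check (a sketch with numpy; taking A = 2P for a random orthogonal projection P is just one way to produce a symmetric A with A^2 = 2A):

import numpy as np

rng = np.random.default_rng(1)

# P = V V^T is the orthogonal projection onto a random 2-dimensional subspace of R^4.
V, _ = np.linalg.qr(rng.standard_normal((4, 2)))
A = 2 * (V @ V.T)
I = np.eye(4)

print(np.allclose(A, A.T))                  # True: A is symmetric
print(np.allclose(A @ A, 2 * A))            # True: A^2 = 2A
print(np.allclose((A - I) @ (A - I).T, I))  # True: A - I is orthogonal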



If A is a symmetric matrix and B is an antisymmetric matrix, is AB - BA a symmetric matrix?
Many thanks to the expert below.
My own attempt ended up with -AB + BA.
How did you get the first step, (AB - BA)^T = B^T A^T - A^T B^T?


Proof:
∵ A is a symmetric matrix
∴ A^T = A
∵ B is an antisymmetric matrix
∴ B^T = -B
∴ (AB - BA)^T = (AB)^T - (BA)^T = B^T A^T - A^T B^T = (-B)A - A(-B) = -BA + AB = AB - BA
∴ AB - BA is a symmetric matrix
That's all.
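
A numerical spot check (a sketch with numpy; the random matrices are only for illustration):

import numpy as np

rng = np.random.default_rng(2)

M = rng.standard_normal((4, 4))
N = rng.standard_normal((4, 4))
A = (M + M.T) / 2    # symmetric: A^T = A
B = (N - N.T) / 2    # antisymmetric: B^T = -B

C = A @ B - B @ A
print(np.allclose(C, C.T))   # True: AB - BA is symmetric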



Let A and B be symmetric matrices of the same order, and prove that AB + BA is also a symmetric matrix.


(AB + BA)^T
= (AB)^T + (BA)^T
= B^T A^T + A^T B^T
= BA + AB
= AB + BA
So AB + BA is also a symmetric matrix.
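
A numerical spot check (a sketch with numpy; the random symmetric matrices are only for illustration):

import numpy as np

rng = np.random.default_rng(3)

A = rng.standard_normal((4, 4)); A = (A + A.T) / 2   # symmetric
B = rng.standard_normal((4, 4)); B = (B + B.T) / 2   # symmetric

S = A @ B + B @ A
print(np.allclose(S, S.T))   # True: AB + BA is symmetric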