Let f(x) = ax^2 + bx + c (a, b, c ∈ R), with f(1) = -a/2 and a > 2b > c. (1) Prove that a > 0 and c < 0. (2) Prove that f(x) = 0 has at least one real root in the interval (0, 2).


(1) f(1) = a + b + c = -a/2,
which gives 3a/2 + b + c = 0.
Since a > 2b > c, we have a > c and b > c/2, so
0 = 3a/2 + b + c > 3c/2 + c/2 + c = 3c, hence c < 0.
Likewise b < a/2 and c < a, so
0 = 3a/2 + b + c < 3a/2 + a/2 + a = 3a, hence a > 0.
Therefore a is positive and c is negative.
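As a quick numerical check (not part of the proof), here is a minimal Python sketch: it samples coefficients with f(1) = -a/2 enforced via c = -3a/2 - b, keeps only samples satisfying a > 2b > c, and confirms a > 0 and c < 0 on each one. The sampling ranges and counts are arbitrary choices.

```python
import random

random.seed(0)
checked = 0
for _ in range(100_000):
    a = random.uniform(-10, 10)
    b = random.uniform(-10, 10)
    c = -1.5 * a - b            # enforces f(1) = a + b + c = -a/2
    if a > 2 * b > c:           # the given chain of inequalities
        assert a > 0 and c < 0  # the claim of part (1)
        checked += 1
print(f"verified a > 0 and c < 0 on {checked} admissible samples")
```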
(2) Since a > 0, f(1) = -a/2 < 0.
From 3a/2 + b + c = 0 we have b = -c - 3a/2, so
f(2) = 4a + 2b + c = 4a + 2(-c - 3a/2) + c = a - c > 0, because a > 0 > c.
Since f is continuous and f(1) < 0 < f(2), f(x) = 0 has at least one real root in (1, 2),
and therefore at least one real root in the interval (0, 2).
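A similar sanity check can be run for part (2). The sketch below uses a hypothetical bisect_root helper (simple bisection on [1, 2], valid because f(1) < 0 < f(2)) and confirms the located root lies in (0, 2) for every admissible sample; it illustrates the argument rather than replacing it.

```python
import random

def f(x, a, b, c):
    return a * x * x + b * x + c

def bisect_root(a, b, c, lo=1.0, hi=2.0, iters=60):
    # bisection works here since f(lo) < 0 < f(hi)
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if f(mid, a, b, c) < 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

random.seed(1)
for _ in range(10_000):
    a = random.uniform(-10, 10)
    b = random.uniform(-10, 10)
    c = -1.5 * a - b                      # f(1) = -a/2
    if not (a > 2 * b > c):
        continue
    assert f(1, a, b, c) < 0 < f(2, a, b, c)
    root = bisect_root(a, b, c)
    assert 0 < root < 2                   # root claimed by the problem
print("root located in (0, 2) for every admissible sample")
```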



Given the quadratic function f(x) = ax^2 + bx + c with f(1) = -a and 3a > 2b > c, find the range of values of b/a.


f(1) = a + b + c = -a, so 2a + b + c = 0 => c = -2a - b.
The condition 3a > 2b > c gives 3a > 2b, 3a > c = -2a - b, 2b > c = -2a - b;
that is, a > 2b/3, 5a > -b, 3b > -2a;
=> a > 2b/3, a > -b/5, a > -3b/2.
For any b, at least one of 2b/3 and -3b/2 is nonnegative, so a > 0.
Dividing by a: b/a < 3/2, b/a > -5, b/a > -2/3.
=> -2/3 < b/a < 3/2.
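A rough numerical confirmation of this range, assuming nothing beyond the stated conditions (the sampling bounds and sample count are arbitrary): with c = -2a - b forced by f(1) = -a, all admissible ratios b/a should stay inside (-2/3, 3/2) and approach both endpoints.

```python
import random

random.seed(2)
ratios = []
for _ in range(200_000):
    a = random.uniform(-10, 10)
    b = random.uniform(-10, 10)
    c = -2 * a - b                 # enforces f(1) = a + b + c = -a
    if 3 * a > 2 * b > c:
        r = b / a                  # a > 0 on admissible samples, so this is safe
        assert -2 / 3 < r < 3 / 2  # the derived range
        ratios.append(r)
print(f"min b/a = {min(ratios):.3f}, max b/a = {max(ratios):.3f}")  # near -0.667 and 1.5
```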