Let the probability density of the population X be f(x, θ), where θ is an unknown parameter, and E(X) = 2θ. Let X1, X2, ..., Xn be a sample from the population X and x̄ the sample mean. If Cx̄ (the constant C multiplied by the sample mean x̄) is an unbiased estimator of θ, what must the constant C equal?


By the definition of an unbiased estimator, the mathematical expectation of the statistic must equal the parameter being estimated:
E(Cx̄) = θ
By the linearity of expectation,
E(Cx̄) = C·E(x̄) = θ
so
E(x̄) = θ / C
But E(x̄) equals the population mean E(X), which is given as 2θ.
So θ / C = 2θ, and therefore C = 1/2.
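A quick numerical sanity check of this result (a minimal sketch: the original problem never specifies f(x, θ), so the exponential population with mean 2θ, the value θ = 3, and the sample size n = 5 below are assumptions chosen purely for illustration):

import numpy as np

# Assumed setup: any population with E(X) = 2*theta works; an exponential
# distribution with mean 2*theta is used here only for illustration.
rng = np.random.default_rng(0)
theta = 3.0   # hypothetical true parameter value
n = 5         # sample size
C = 0.5       # the constant derived above

# Approximate E(C * x_bar) by averaging over many simulated samples.
reps = 200_000
samples = rng.exponential(scale=2 * theta, size=(reps, n))
estimates = C * samples.mean(axis=1)

print(estimates.mean())  # should be close to theta = 3.0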



Let the probability density of population X be f(x; θ) = e^(-(x - θ)) for x ≥ θ, and f(x; θ) = 0 for x < θ.


Because those symbols are too hard to type, the solution is shown in the picture.



Let X1, X2, ..., Xn be a sample taken from the population X ~ E(λ) (the exponential distribution with parameter λ). Find the joint probability density of the sample X1, X2, ..., Xn and the moment estimator of the population parameter λ.


First of all, the population should be E(λ).
Each Xi has density f_Xi(xi) = λ e^(-λ xi) for xi > 0, i ∈ {1, 2, ..., n}.
By independence, multiply them all together and write the joint density as P (for all xi > 0):
P(x1, x2, ..., xn) = λ^n e^(-λ(x1 + ... + xn)) = λ^n e^(-λ n x̄)
For the moment estimator, note that E(X) below denotes the population expectation:
E(X) = 1/λ
Equating the sample mean to the population mean gives (x1 + ... + xn)/n = x̄ = 1/λ
so the moment estimator is λ̂ = 1/x̄.
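A small numerical illustration of this estimator (a minimal sketch: the true value λ = 2 and the sample size n = 1000 are assumptions chosen only for demonstration):

import numpy as np

# Assumed example values: true lambda = 2.0, sample size n = 1000.
rng = np.random.default_rng(0)
true_lambda = 2.0
n = 1000

# For X ~ E(lambda), numpy's exponential generator takes the scale 1/lambda.
sample = rng.exponential(scale=1 / true_lambda, size=n)

# Moment estimator derived above: lambda_hat = 1 / (sample mean).
lambda_hat = 1 / sample.mean()
print(lambda_hat)  # should be close to true_lambda = 2.0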