How to calculate the standard deviation of a lognormal distribution

If a sample {x1, x2, ..., xn} is drawn from a lognormal distribution, then the logarithms ln xi follow a normal distribution, and the two parameters are computed from the log-transformed data.
The mean of the logarithms (the parameter usually written μ) is:
E = (ln x1 + ln x2 + ... + ln xn) / n
and the standard deviation of the logarithms (the parameter σ) is:
σ = √{ Σ(i=1→n) [ln xi - E]² / n }
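
As a minimal sketch of these formulas in Python (the function name lognormal_params and the test parameters are illustrative, not from the original), the estimate is just the mean and population standard deviation of the logs:

import math
import random

def lognormal_params(xs):
    # Log-transform the sample, then apply the formulas above.
    logs = [math.log(x) for x in xs]
    n = len(logs)
    e = sum(logs) / n                                        # E = (ln x1 + ... + ln xn) / n
    sigma = math.sqrt(sum((v - e) ** 2 for v in logs) / n)   # population (divide-by-n) form
    return e, sigma

# Usage: draw a lognormal sample with known mu = 1.0, sigma = 0.5
# and check that the estimates come out close to those values.
random.seed(0)
sample = [random.lognormvariate(1.0, 0.5) for _ in range(100000)]
mu_hat, sigma_hat = lognormal_params(sample)
print(mu_hat, sigma_hat)   # approximately 1.0 and 0.5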



What is the difference between the mean deviation and the standard deviation? Which one better reflects the degree of dispersion?
I want to do a performance analysis of how spread out the scores of several classes are. Is it better to use the mean deviation or the standard deviation? What are their advantages and disadvantages?
In my opinion, the mean deviation treats all data points equally, while the standard deviation seems to be "biased" toward data points with larger deviations.
Example 1:
1,1,1,1,1,-1,-1,-1,-1,-1
Standard deviation: 1
Mean deviation: 1
Example 2:
5,0,0,0,0,0,0,0,0,-5
Standard deviation: √5 ≈ 2.236067977
Mean deviation: 1
Which better reflects the degree of dispersion?
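
Both examples can be verified with a short Python sketch (the helper names std_dev and mean_abs_dev are illustrative); it uses the population, divide-by-n form of both statistics, which matches the numbers above:

import math

def mean_abs_dev(xs):
    # Average absolute distance from the arithmetic mean.
    m = sum(xs) / len(xs)
    return sum(abs(x - m) for x in xs) / len(xs)

def std_dev(xs):
    # Square root of the average squared distance from the mean.
    m = sum(xs) / len(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))

ex1 = [1, 1, 1, 1, 1, -1, -1, -1, -1, -1]
ex2 = [5, 0, 0, 0, 0, 0, 0, 0, 0, -5]

print(std_dev(ex1), mean_abs_dev(ex1))   # 1.0 1.0
print(std_dev(ex2), mean_abs_dev(ex2))   # 2.2360679... 1.0

Note how the mean deviation is identical for both data sets, while the standard deviation flags the second set as more dispersed.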


The mean deviation is the average of the absolute differences between each data value and the arithmetic mean, so it reflects the average distance of the data from the mean. The standard deviation is the square root of the average of the squared deviations from the mean; because it squares the deviations, it gives more weight to large deviations and therefore reflects the degree of dispersion of a data set better.
The standard deviation is the measure most widely used in statistics, especially when the sample size is large enough.



If the nth power of 3 equals 2, what is the value of log3(8) - log3(36)?


Since the nth power of 3 is 2, we have log3(2) = n. Then:
log3(8) - log3(36) = log3(8/36) = log3(2/9)
= log3(2) + log3(1/9)
= n - 2log3(3)
= n - 2
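
As a numeric sanity check (a hypothetical Python sketch, not part of the original answer), math.log(x, 3) computes the base-3 logarithm:

import math

n = math.log(2, 3)                        # defined so that 3**n == 2
lhs = math.log(8, 3) - math.log(36, 3)    # the expression in the question
print(lhs, n - 2)                         # both print approximately -1.3690702...

Both values agree, confirming that the answer is n - 2.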