By: Neil E. Cotter

Probability

Chebyshev's theorem

Example 1
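For reference, the standard statement of Chebyshev's inequality assumed throughout this example can be written as:

```latex
% Chebyshev's inequality: for any random variable X with mean \mu and
% standard deviation \sigma, and for any constant c > 0,
P\bigl(\,|X - \mu| < c\sigma\,\bigr) \;\geq\; 1 - \frac{1}{c^{2}}
```

Equivalently, the probability of falling more than cσ from the mean is at most 1/c².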
Ex:           Are the bounds given by Chebyshev's inequality more accurate when f(x) is a uniform distribution or when f(x) is a Gaussian distribution? Justify your answer.

Sol'n:      Chebyshev's inequality is more helpful when a distribution has long tails. The probability density for a uniform distribution drops to zero for x more than a certain number of σ's from the mean, μ. For a uniform distribution from 0 to 1, for example, σ² = 1/12, and σ = 1/√12. The probability density drops to zero for values farther than 1/2 from μ = 1/2. Solving cσ = 1/2, we find that c = √12/2 = √3. Thus, for a uniform distribution, we have

P(|X − μ| < cσ) = 1 for c ≥ √3.
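The uniform case above can be checked numerically. The following is a minimal sketch in plain Python (no external libraries), using the exact CDF of the uniform distribution on [0, 1]:

```python
import math

# Uniform distribution on [0, 1]: mu = 1/2, sigma^2 = 1/12.
mu = 0.5
sigma = math.sqrt(1.0 / 12.0)    # sigma = 1/sqrt(12) ≈ 0.2887

# The density is zero beyond 1/2 from the mean, so solve c*sigma = 1/2.
c = 0.5 / sigma                  # c = sqrt(3) ≈ 1.732

def uniform_cdf(x):
    """CDF of the uniform distribution on [0, 1]."""
    return min(max(x, 0.0), 1.0)

# Exact probability that X lies within c*sigma of the mean.
p_exact = uniform_cdf(mu + c * sigma) - uniform_cdf(mu - c * sigma)
print(c, p_exact)                # c ≈ 1.732, p_exact = 1 (to rounding)
```

The probability comes out to 1, confirming that the entire uniform distribution lies within √3 standard deviations of its mean.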

In this case, Chebyshev's inequality only guarantees a probability of

P(|X − μ| < cσ) ≥ 1 − 1/c² = 1 − 1/3 = 2/3 for c = √3.

Thus, Chebyshev's inequality is of little use for a uniform density function.

If we consider a standard Gaussian (with μ = 0 and σ = 1), the probability density never drops to zero as we move away from the mean. If, for example, we consider c = √3 ≈ 1.73, we can use a table of areas under a standard Gaussian to find P(X ≤ μ + cσ) = P(X ≤ √3) ≈ P(X ≤ 1.73) = 0.9582. We subtract from this P(X ≤ μ − cσ) = 0.0418 to obtain

P(|X − μ| < cσ) ≈ 0.9582 − 0.0418 = 0.9164 for c = √3.
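The table lookup can be reproduced with Python's math.erf, which yields the standard normal CDF; a sketch (the small difference from 0.9164 comes from rounding z to 1.73 in the table):

```python
import math

def phi(x):
    """Standard normal CDF, computed from the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

c = math.sqrt(3.0)               # c ≈ 1.73, the same c as the uniform case

# P(mu - c*sigma < X < mu + c*sigma) for a standard Gaussian:
p_exact = phi(c) - phi(-c)       # ≈ 0.917 (table at z = 1.73 gives 0.9164)

# Chebyshev's guaranteed lower bound for the same c:
p_bound = 1.0 - 1.0 / c**2       # = 2/3 ≈ 0.6667

print(p_exact, p_bound)
```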

In this case, Chebyshev's inequality guarantees a probability of

P(|X − μ| < cσ) ≥ 1 − 1/c² = 1 − 1/3 ≈ 0.6667 for c = √3.

This bound is closer to the true probability than it was for the uniform density function (0.6667 versus 0.9164, rather than 0.6667 versus 1), although it still seems rather conservative.
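The conservatism of the bound can be seen by tabulating it against the exact Gaussian probability for several values of c (a sketch reusing the erf-based CDF; the particular c values are illustrative):

```python
import math

def phi(x):
    """Standard normal CDF, computed from the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

print(f"{'c':>6} {'Chebyshev bound':>16} {'Gaussian exact':>15}")
for c in (1.5, math.sqrt(3.0), 2.0, 3.0):
    bound = 1.0 - 1.0 / c**2     # guaranteed minimum for ANY distribution
    exact = phi(c) - phi(-c)     # true P(|X - mu| < c*sigma), Gaussian
    print(f"{c:6.2f} {bound:16.4f} {exact:15.4f}")
```

The gap narrows as c grows, but the Chebyshev bound must hold for every distribution with the given mean and variance, which is why it stays well below the Gaussian value.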