The Gaussian probability distribution with Mean $\mu$ and Standard Deviation $\sigma$ is a Gaussian Function of the form

P(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x-\mu)^2/(2\sigma^2)},    (1)

where $P(x)\,dx$ gives the probability that a variate takes on a value in the range $[x, x+dx]$. The corresponding Distribution Function is

D(x) = \frac{1}{\sigma\sqrt{2\pi}} \int_{-\infty}^x e^{-(t-\mu)^2/(2\sigma^2)}\,dt    (2)

     = \frac{1}{2}\left[1 + \mathrm{erf}\!\left(\frac{x-\mu}{\sigma\sqrt{2}}\right)\right].    (3)

Gaussian distributions have many convenient properties, so random variates with unknown distributions are often assumed to be Gaussian, especially in physics and astronomy. Although this can be a dangerous assumption, it is often a good approximation due to a surprising result known as the Central Limit Theorem. This theorem states that the Mean of any set of variates with any distribution having a finite Mean and Variance tends to a Gaussian distribution as the number of variates grows. Many common attributes such as test scores and height follow roughly Gaussian distributions, with few members at the high and low ends and many in the middle.
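As a quick illustration of the Central Limit Theorem, the following sketch (standard-library Python; the sample sizes are arbitrary choices, not from the text) averages uniform variates, which are far from Gaussian, and checks that the averages have the mean and standard deviation the theorem predicts:

```python
import random
import statistics

random.seed(42)  # fixed seed so the check is reproducible

# Uniform(0,1) has mean 1/2 and variance 1/12, so the CLT predicts the
# average of n such variates is approximately Gaussian with mean 1/2
# and standard deviation sqrt(1/(12 n)).
n = 48            # variates per average (arbitrary choice)
trials = 20000    # number of averages (arbitrary choice)
means = [sum(random.random() for _ in range(n)) / n for _ in range(trials)]

sample_mean = statistics.fmean(means)
sample_sd = statistics.pstdev(means)
predicted_sd = (1 / (12 * n)) ** 0.5
```

A histogram of `means` would show the familiar bell shape even though each underlying variate is uniform.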

Making the transformation

z \equiv \frac{x-\mu}{\sigma},    (4)

so that $dz = dx/\sigma$, gives the standard normal distribution

P(z)\,dz = \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2}\,dz.    (5)

The Normal Distribution Function $\Phi(z)$ gives the probability that a standard normal variate assumes a value in the interval $[0, z]$,

\Phi(z) \equiv \frac{1}{\sqrt{2\pi}} \int_0^z e^{-t^2/2}\,dt = \frac{1}{2}\,\mathrm{erf}\!\left(\frac{z}{\sqrt{2}}\right).    (6)
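Since $\Phi(z)$ reduces to the error function, it can be evaluated directly with `math.erf`; a minimal sketch (the helper name and the test point 1.96, the familiar two-sided 95% point, are illustrative choices):

```python
from math import erf, sqrt

def normal_distribution_function(z: float) -> float:
    """Phi(z): probability that a standard normal variate lies in [0, z],
    computed as (1/2) erf(z / sqrt(2))."""
    return 0.5 * erf(z / sqrt(2.0))

phi_196 = normal_distribution_function(1.96)  # approximately 0.4750
```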

The Gaussian distribution is also a special case of the Chi-Squared Distribution, since substituting

z \equiv \frac{(x-\mu)^2}{\sigma^2},    (7)

so that

dz = \frac{2(x-\mu)}{\sigma^2}\,dx = \frac{2\sqrt{z}}{\sigma}\,dx    (8)

(noting that the two points $\pm(x-\mu)$ map onto each $z$, contributing a factor of 2), gives

P(z)\,dz = \frac{1}{\sqrt{2\pi}}\, z^{-1/2} e^{-z/2}\,dz,    (9)

which is a Chi-Squared Distribution in $z$ with $r = 1$ degree of freedom (i.e., a Gamma Distribution with $\alpha = 1/2$ and $\theta = 2$).
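This correspondence can be spot-checked by a seeded Monte Carlo sketch (standard library only; the evaluation point is arbitrary): square standard normal variates and compare the empirical CDF against the closed-form $\chi^2_1$ CDF, $P(z \le a) = \mathrm{erf}(\sqrt{a/2})$.

```python
import random
from math import erf, sqrt

random.seed(0)

# Squares of N(0,1) variates should follow a chi-squared distribution
# with one degree of freedom.
samples = [random.gauss(0.0, 1.0) ** 2 for _ in range(100000)]

a = 1.5  # arbitrary evaluation point
empirical = sum(z <= a for z in samples) / len(samples)
exact = erf(sqrt(a / 2.0))  # chi-squared(1) CDF at a
```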

Cramér showed in 1936 that if $X$ and $Y$ are Independent variates and $X + Y$ has a Gaussian distribution, then both $X$ and $Y$ must be Gaussian (Cramér's Theorem).

The ratio $u = x/y$ of independent Gaussian-distributed variates with zero Mean is distributed with a Cauchy Distribution. This can be seen as follows. Let $x$ and $y$ both have Mean 0 and standard deviations of $\sigma_x$ and $\sigma_y$, respectively; then the joint probability density function is the Gaussian Bivariate Distribution with $\rho = 0$,

P(x,y) = \frac{1}{2\pi\sigma_x\sigma_y}\, e^{-[x^2/(2\sigma_x^2) + y^2/(2\sigma_y^2)]}.    (10)

The density of the ratio follows by integrating along the line $x = uy$,

P(u) = \int_{-\infty}^{\infty} |y|\, P(uy, y)\,dy = \frac{1}{2\pi\sigma_x\sigma_y} \int_{-\infty}^{\infty} |y|\, e^{-y^2[u^2/(2\sigma_x^2) + 1/(2\sigma_y^2)]}\,dy.    (11)

But

\int_{-\infty}^{\infty} |y|\, e^{-a y^2}\,dy = 2\int_0^{\infty} y\, e^{-a y^2}\,dy = \frac{1}{a},    (12)

so

P(u) = \frac{1}{2\pi\sigma_x\sigma_y} \cdot \frac{2\sigma_x^2\sigma_y^2}{\sigma_y^2 u^2 + \sigma_x^2} = \frac{\sigma_x\sigma_y}{\pi(\sigma_y^2 u^2 + \sigma_x^2)},    (13)

which is a Cauchy Distribution with Mean $\mu = 0$ and full width

\Gamma = \frac{2\sigma_x}{\sigma_y}.    (14)
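The Cauchy limit can be checked by simulation: a zero-median Cauchy variate has its quartiles at plus and minus the half width, so the interquartile range of the ratio samples should approach the full width $2\sigma_x/\sigma_y$. A seeded sketch (the particular $\sigma$ values are arbitrary choices):

```python
import random

random.seed(1)

sigma_x, sigma_y = 2.0, 1.0   # arbitrary choices
n = 200000
ratios = sorted(random.gauss(0.0, sigma_x) / random.gauss(0.0, sigma_y)
                for _ in range(n))

# Quartiles of a zero-median Cauchy variate lie at +/- (half width),
# so the interquartile range estimates the full width 2*sigma_x/sigma_y.
iqr = ratios[3 * n // 4] - ratios[n // 4]
full_width = 2.0 * sigma_x / sigma_y  # predicted: 4.0
```

Note that the sample mean of `ratios` would be useless here: a Cauchy distribution has no mean, which is why the quartiles are compared instead.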

The Characteristic Function for the Gaussian distribution is

\phi(t) = \int_{-\infty}^{\infty} e^{itx} P(x)\,dx    (15)

        = \frac{1}{\sigma\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-(x-\mu)^2/(2\sigma^2) + itx}\,dx.    (16)

Completing the Square in the exponent,

-\frac{(x-\mu)^2}{2\sigma^2} + itx = -\frac{1}{2\sigma^2}\left[x^2 - 2(\mu + i\sigma^2 t)x + \mu^2\right]    (17)

= -\frac{1}{2\sigma^2}\left\{\left[x - (\mu + i\sigma^2 t)\right]^2 + \mu^2 - (\mu + i\sigma^2 t)^2\right\}    (18)

= -\frac{[x - (\mu + i\sigma^2 t)]^2}{2\sigma^2} + \frac{2i\mu\sigma^2 t - \sigma^4 t^2}{2\sigma^2}    (19)

= -\frac{[x - (\mu + i\sigma^2 t)]^2}{2\sigma^2} + i\mu t - \frac{\sigma^2 t^2}{2}.    (20)

The integral then becomes

\phi(t) = e^{i\mu t - \sigma^2 t^2/2} \cdot \frac{1}{\sigma\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-[x - (\mu + i\sigma^2 t)]^2/(2\sigma^2)}\,dx = e^{i\mu t - \sigma^2 t^2/2},    (21)

so

\phi'(t) = (i\mu - \sigma^2 t)\, e^{i\mu t - \sigma^2 t^2/2}    (22)

\phi''(t) = \left[(i\mu - \sigma^2 t)^2 - \sigma^2\right] e^{i\mu t - \sigma^2 t^2/2},    (23)
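The closed form $\phi(t) = e^{i\mu t - \sigma^2 t^2/2}$ is easy to verify numerically by averaging $e^{itx}$ over simulated variates; a seeded sketch (all parameter values are arbitrary choices):

```python
import cmath
import random

random.seed(7)

mu, sigma, t = 1.0, 2.0, 0.5   # arbitrary parameters
n = 200000
samples = [random.gauss(mu, sigma) for _ in range(n)]

# Monte Carlo estimate of the characteristic function E[e^{itx}],
# compared against the closed form exp(i mu t - sigma^2 t^2 / 2).
estimate = sum(cmath.exp(1j * t * x) for x in samples) / n
exact = cmath.exp(1j * mu * t - sigma**2 * t**2 / 2)
```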

and the Mean and Variance follow as

\langle x \rangle = -i\,\phi'(0) = \mu    (24)

\mathrm{var}(x) = -\phi''(0) - \langle x \rangle^2 = (\mu^2 + \sigma^2) - \mu^2 = \sigma^2.    (25)

These can also be computed using

R(t) = \ln \phi(t) = i\mu t - \tfrac{1}{2}\sigma^2 t^2    (26)

R'(t) = i\mu - \sigma^2 t    (27)

R''(t) = -\sigma^2,    (28)

yielding, as before,

\langle x \rangle = -i\, R'(0) = \mu    (29)

\mathrm{var}(x) = -R''(0) = \sigma^2.    (30)

The moments can also be computed directly by computing the Moments about the origin $\mu_n' = \langle x^n \rangle$,

\mu_n' = \frac{1}{\sigma\sqrt{2\pi}} \int_{-\infty}^{\infty} x^n e^{-(x-\mu)^2/(2\sigma^2)}\,dx.    (31)

Making the substitution

u \equiv \frac{x-\mu}{\sqrt{2}\,\sigma}    (32)

du = \frac{dx}{\sqrt{2}\,\sigma}    (33)

x = \mu + \sqrt{2}\,\sigma u,    (34)

giving

\mu_n' = \frac{1}{\sqrt{\pi}} \int_{-\infty}^{\infty} (\mu + \sqrt{2}\,\sigma u)^n e^{-u^2}\,du    (35)

\mu_0' = 1    (36)

\mu_1' = \mu    (37)

\mu_2' = \mu^2 + \sigma^2    (38)

\mu_3' = \mu(\mu^2 + 3\sigma^2)    (39)

\mu_4' = \mu^4 + 6\mu^2\sigma^2 + 3\sigma^4,    (40)
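The closed forms for the second and fourth raw moments are easy to spot-check by simulation; a seeded sketch (the values of $\mu$, $\sigma$, and the sample size are arbitrary choices):

```python
import random

random.seed(3)

mu, sigma = 1.5, 0.7   # arbitrary parameters
n = 400000
xs = [random.gauss(mu, sigma) for _ in range(n)]

# Empirical raw moments versus the closed forms above.
moment2 = sum(x**2 for x in xs) / n   # should be near mu^2 + sigma^2
moment4 = sum(x**4 for x in xs) / n   # near mu^4 + 6 mu^2 sigma^2 + 3 sigma^4

exact2 = mu**2 + sigma**2                              # = 2.74
exact4 = mu**4 + 6 * mu**2 * sigma**2 + 3 * sigma**4   # = 12.3978
```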

where the quantities $I_n \equiv \int_{-\infty}^{\infty} u^n e^{-u^2}\,du$ arising in the binomial expansion of (35) are Gaussian Integrals (the odd-$n$ integrals vanish by symmetry).

Now find the Moments about the Mean,

\mu_1 = 0    (41)

\mu_2 = \sigma^2    (42)

\mu_3 = 0    (43)

\mu_4 = 3\sigma^4,    (44)

so the Variance, Standard Deviation, Skewness, and Kurtosis are given by

\mathrm{var}(x) = \mu_2 = \sigma^2    (45)

\sigma = \sqrt{\mathrm{var}(x)}    (46)

\gamma_1 = \frac{\mu_3}{\sigma^3} = 0    (47)

\gamma_2 = \frac{\mu_4}{\sigma^4} - 3 = 0.    (48)

The Variance of the Sample Variance $s^2$ (defined with a $1/N$ normalization) for a sample taken from a population with a Gaussian distribution is

\mathrm{var}(s^2) = \frac{2(N-1)\sigma^4}{N^2}.    (49)

If $N \gg 1$, this expression simplifies to

\mathrm{var}(s^2) \approx \frac{2\sigma^4}{N},    (50)

so the standard deviation of the sample variance is

\sqrt{\mathrm{var}(s^2)} \approx \sigma^2\sqrt{\frac{2}{N}}.    (51)
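The variance of the sample variance can be checked with a seeded Monte Carlo sketch, assuming the divide-by-$N$ convention for $s^2$ (the sample size and trial count are arbitrary choices):

```python
import random

random.seed(11)

sigma = 1.0
N = 10          # sample size (arbitrary)
trials = 40000  # number of independent samples (arbitrary)

def biased_sample_variance(xs):
    # s^2 = (1/N) * sum (x - xbar)^2, the divide-by-N convention
    xbar = sum(xs) / len(xs)
    return sum((x - xbar) ** 2 for x in xs) / len(xs)

s2s = [biased_sample_variance([random.gauss(0.0, sigma) for _ in range(N)])
       for _ in range(trials)]

mean_s2 = sum(s2s) / trials
var_s2 = sum((v - mean_s2) ** 2 for v in s2s) / trials
predicted = 2 * (N - 1) * sigma**4 / N**2   # = 0.18 for these choices
```

The mean of `s2s` also illustrates the familiar bias of this estimator: it lands near $(N-1)\sigma^2/N$ rather than $\sigma^2$.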

The Cumulant-Generating Function for a Gaussian distribution is

K(h) = \ln\left(e^{\mu h + \sigma^2 h^2/2}\right) = \mu h + \tfrac{1}{2}\sigma^2 h^2,    (52)

so the Cumulants are

\kappa_1 = \mu    (53)

\kappa_2 = \sigma^2    (54)

\kappa_r = 0 \quad \text{for } r > 2.    (55)

For Gaussian variates, $\kappa_r = 0$ for $r > 2$, so the variance of the *k*-Statistic $k_3$ is

\mathrm{var}(k_3) = \frac{6\kappa_2^3 N}{(N-1)(N-2)}.    (56)

Also,

\mathrm{var}(g_1) = \frac{6N(N-1)}{(N-2)(N+1)(N+3)}    (57)

\mathrm{var}(g_2) = \frac{24N(N-1)^2}{(N-3)(N-2)(N+3)(N+5)},    (58)

\mathrm{var}(g_1) \approx \frac{6}{N}, \qquad \mathrm{var}(g_2) \approx \frac{24}{N} \qquad (N \gg 1),    (59)

where

g_1 = \frac{k_3}{k_2^{3/2}}    (60)

g_2 = \frac{k_4}{k_2^2}    (61)

are the sample Skewness and Kurtosis expressed in terms of *k*-Statistics.

If $P(x)$ is a Gaussian distribution, then

D(x) = \frac{1}{2}\left[1 + \mathrm{erf}\!\left(\frac{x-\mu}{\sigma\sqrt{2}}\right)\right],    (62)

so variates $x_i$ with a Gaussian distribution can be generated from variates $y_i$ having a uniform distribution in $(0,1)$ by inverting (62),

x_i = \sigma\sqrt{2}\,\mathrm{erf}^{-1}(2y_i - 1) + \mu.    (63)
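Python's standard library provides `math.erf` but no inverse, so a sketch of this generation scheme can invert erf by bisection (the `erfinv` helper here is hypothetical, not a library function; the seed and parameters are arbitrary choices):

```python
import random
from math import erf, sqrt

def erfinv(y: float, tol: float = 1e-12) -> float:
    """Invert erf on (-1, 1) by bisection; erf is increasing, so this converges."""
    lo, hi = -10.0, 10.0   # erf(+/-10) is indistinguishable from +/-1
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if erf(mid) < y:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

def gaussian_from_uniform(y: float, mu: float = 0.0, sigma: float = 1.0) -> float:
    """Map a uniform (0,1) variate to a Gaussian variate via equation (63)."""
    return sigma * sqrt(2.0) * erfinv(2.0 * y - 1.0) + mu

random.seed(5)
xs = [gaussian_from_uniform(random.random(), mu=2.0, sigma=3.0)
      for _ in range(10000)]
```

In practice one would use a library routine (or the Box-Muller transform) rather than bisection, but this makes the inverse-CDF idea explicit.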

The Gaussian distribution is an approximation to the Binomial Distribution in the limit of large numbers,

P(n_1) = \frac{1}{\sqrt{2\pi Npq}}\, e^{-(n_1 - Np)^2/(2Npq)},    (64)

where $n_1$ is the number of successes in $N$ trials with success probability $p$ and $q \equiv 1 - p$.
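The quality of this approximation is easy to inspect numerically; a deterministic sketch (the parameter choices are arbitrary):

```python
from math import comb, exp, pi, sqrt

N, p = 100, 0.4   # arbitrary choices
q = 1.0 - p
n1 = 45           # evaluate near the mean Np = 40

# Exact binomial probability versus the Gaussian approximation above.
binomial = comb(N, n1) * p**n1 * q**(N - n1)
gaussian = exp(-(n1 - N * p) ** 2 / (2 * N * p * q)) / sqrt(2 * pi * N * p * q)
```

For these values the two agree to within a few percent; the agreement improves as $N$ grows and degrades in the tails.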

The differential equation having a Gaussian distribution as its solution is

\frac{dy}{dx} = \frac{y(\mu - x)}{\sigma^2},    (65)

since separating variables and integrating gives

\frac{dy}{y} = \frac{\mu - x}{\sigma^2}\,dx    (66)

\ln\left(\frac{y}{y_0}\right) = -\frac{(x - \mu)^2}{2\sigma^2}    (67)

y = y_0\, e^{-(x-\mu)^2/(2\sigma^2)}.    (68)
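The Gaussian solution can be verified against the differential equation numerically with a central difference; a deterministic sketch (the parameters and evaluation point are arbitrary choices):

```python
from math import exp

mu, sigma, y0 = 1.0, 2.0, 3.0   # arbitrary parameters

def y(x: float) -> float:
    """The Gaussian solution y = y0 * exp(-(x - mu)^2 / (2 sigma^2))."""
    return y0 * exp(-(x - mu) ** 2 / (2 * sigma**2))

x = 0.25   # arbitrary evaluation point
h = 1e-6
dydx = (y(x + h) - y(x - h)) / (2 * h)   # central-difference derivative
rhs = y(x) * (mu - x) / sigma**2         # right-hand side of the ODE
```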

© 1996-9

1999-05-25