The Gaussian probability distribution with Mean $\mu$ and Standard Deviation $\sigma$ is a Gaussian
Function of the form

P(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x-\mu)^2/(2\sigma^2)},

where $P(x)\,dx$ gives the probability that a variate takes on a value in the range $[x, x+dx]$. The distribution is properly normalized, since

\int_{-\infty}^{\infty} P(x)\,dx = 1.
Gaussian distributions have many convenient properties, so random variates with unknown distributions are often assumed to be Gaussian, especially in physics and astronomy. Although this can be a dangerous assumption, it is often a good approximation due to a surprising result known as the Central Limit Theorem. This theorem states that the Mean of any set of variates with any distribution having a finite Mean and Variance tends to the Gaussian distribution as the number of variates becomes large. Many common attributes such as test scores, height, etc., follow roughly Gaussian distributions, with few members at the high and low ends and many in the middle.
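As a quick numerical illustration of the Central Limit Theorem (a minimal sketch assuming NumPy is available; the sample sizes are arbitrary):

    import numpy as np

    rng = np.random.default_rng(0)

    # Average n uniform variates (a decidedly non-Gaussian parent distribution)
    # many times and inspect the distribution of the resulting means.
    n, trials = 50, 100_000
    means = rng.uniform(0.0, 1.0, size=(trials, n)).mean(axis=1)

    # The parent has mean 1/2 and variance 1/12, so the means should be roughly
    # Gaussian with mean 1/2 and standard deviation sqrt(1/(12 n)).
    print(means.mean(), means.std())                                  # ~0.5, ~0.041
    print(np.mean(np.abs(means - 0.5) < 2 * np.sqrt(1 / (12 * n))))   # ~0.95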
Making the transformation

z \equiv \frac{x-\mu}{\sigma},

so that $dz = dx/\sigma$, gives the standard form

P(z)\,dz = \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2}\,dz.
The Normal Distribution Function $\Phi(z)$ gives the probability that a standard normal variate assumes a value in the interval $[0, z]$,

\Phi(z) \equiv \frac{1}{\sqrt{2\pi}} \int_0^z e^{-t^2/2}\,dt = \frac{1}{2}\operatorname{erf}\!\left(\frac{z}{\sqrt{2}}\right).
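A small numerical check of this identity (a sketch assuming SciPy is available for quadrature and the Erf):

    import numpy as np
    from scipy.integrate import quad
    from scipy.special import erf

    # Compare the integral definition of Phi(z) with the closed form erf(z/sqrt(2))/2.
    for z in (0.5, 1.0, 2.0):
        integral, _ = quad(lambda t: np.exp(-t**2 / 2) / np.sqrt(2 * np.pi), 0.0, z)
        closed_form = 0.5 * erf(z / np.sqrt(2))
        print(z, integral, closed_form)   # the two values agree to quadrature accuracy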
The Gaussian distribution is also a special case of the Chi-Squared Distribution, since substituting

z \equiv \frac{(x-\mu)^2}{\sigma^2},

so that

dz = \frac{2(x-\mu)}{\sigma^2}\,dx = \frac{2\sqrt{z}}{\sigma}\,dx,

gives (counting both signs of $x-\mu$, so that $z$ runs over $[0,\infty)$)

P(z)\,dz = \frac{1}{\sqrt{2\pi}}\, z^{-1/2} e^{-z/2}\,dz,

which is a Chi-Squared Distribution with one degree of freedom.
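This can also be seen by simulation, testing squared standardized Gaussian variates against a chi-squared distribution with one degree of freedom (a sketch assuming SciPy; the parameter values are arbitrary):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    mu, sigma = 2.0, 3.0

    x = rng.normal(mu, sigma, size=100_000)
    z = ((x - mu) / sigma) ** 2   # the substitution z = (x - mu)^2 / sigma^2

    # Kolmogorov-Smirnov test against chi^2 with 1 degree of freedom:
    # a large p-value indicates the samples are consistent with chi2(1).
    print(stats.kstest(z, stats.chi2(df=1).cdf))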
Cramer showed in 1936 that if $x$ and $y$ are Independent variates and $x+y$ has a Gaussian distribution, then both $x$ and $y$ must be Gaussian (Cramer's Theorem).
The ratio $x/y$ of independent Gaussian-distributed variates with zero Mean is distributed with a Cauchy
Distribution. This can be seen as follows. Let $x$ and $y$ both have Mean 0 and standard deviations of $\sigma_x$ and
$\sigma_y$, respectively. Then the joint probability density function is the Gaussian Bivariate Distribution with
$\rho = 0$,

f(x,y) = \frac{1}{2\pi\sigma_x\sigma_y}\, e^{-[x^2/(2\sigma_x^2) + y^2/(2\sigma_y^2)]}.

From the Ratio Distribution, the distribution of $u \equiv x/y$ is

P(u) = \int_{-\infty}^{\infty} |y|\, f(uy, y)\,dy
     = \frac{1}{2\pi\sigma_x\sigma_y} \int_{-\infty}^{\infty} |y|\, e^{-y^2[u^2/(2\sigma_x^2) + 1/(2\sigma_y^2)]}\,dy
     = \frac{1}{\pi\sigma_x\sigma_y} \int_0^{\infty} y\, e^{-y^2(u^2\sigma_y^2 + \sigma_x^2)/(2\sigma_x^2\sigma_y^2)}\,dy
     = \frac{\sigma_x\sigma_y}{\pi\,(u^2\sigma_y^2 + \sigma_x^2)}
     = \frac{1}{\pi}\,\frac{\sigma_x/\sigma_y}{u^2 + (\sigma_x/\sigma_y)^2},

which is a Cauchy Distribution with Median 0 and half width $\sigma_x/\sigma_y$.
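A Monte Carlo check of this result (a sketch assuming SciPy; the standard deviations below are arbitrary):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    sx, sy = 1.5, 0.5

    x = rng.normal(0.0, sx, size=200_000)
    y = rng.normal(0.0, sy, size=200_000)
    u = x / y

    # Compare against a Cauchy distribution with median 0 and scale sx/sy:
    # a large p-value means the ratio is consistent with that Cauchy distribution.
    print(stats.kstest(u, stats.cauchy(loc=0.0, scale=sx / sy).cdf))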
The Characteristic Function for the Gaussian distribution is

\phi(t) = e^{i\mu t - \sigma^2 t^2/2},

and the Moment-Generating Function is

M(t) = \langle e^{tx}\rangle = \frac{1}{\sigma\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{tx}\, e^{-(x-\mu)^2/(2\sigma^2)}\,dx.

Completing the square in the exponent and using $\int_{-\infty}^{\infty} e^{-w^2/(2\sigma^2)}\,dw = \sigma\sqrt{2\pi}$ gives

M(t) = e^{\mu t + \sigma^2 t^2/2},

so

M'(t) = (\mu + \sigma^2 t)\, e^{\mu t + \sigma^2 t^2/2}
M''(t) = \left[\sigma^2 + (\mu + \sigma^2 t)^2\right] e^{\mu t + \sigma^2 t^2/2},

and the Mean and Variance follow as

\mu = M'(0)
\sigma^2 = M''(0) - [M'(0)]^2.

Equivalently, in terms of $R(t) \equiv \ln M(t) = \mu t + \tfrac{1}{2}\sigma^2 t^2$,

R'(t) = \mu + \sigma^2 t
R''(t) = \sigma^2,

so $\mu = R'(0)$ and $\sigma^2 = R''(0)$.
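The completing-the-square step can be verified symbolically (a minimal sketch assuming SymPy is available; the symbolic integration may take a few seconds):

    import sympy as sp

    x, t = sp.symbols('x t', real=True)
    mu = sp.Symbol('mu', real=True)
    sigma = sp.Symbol('sigma', positive=True)

    pdf = sp.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * sp.sqrt(2 * sp.pi))

    # Moment-generating function M(t) = <exp(t x)>.
    M = sp.simplify(sp.integrate(sp.exp(t * x) * pdf, (x, -sp.oo, sp.oo)))
    print(M)   # equivalent to exp(mu*t + sigma**2*t**2/2)

    # Mean and variance from derivatives of M at t = 0.
    M1 = sp.diff(M, t).subs(t, 0)
    M2 = sp.diff(M, t, 2).subs(t, 0)
    print(sp.simplify(M1), sp.simplify(M2 - M1**2))   # mu, sigma**2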
The moments can also be computed directly by computing the Moments about the origin $\mu_n' \equiv \langle x^n\rangle$,

\mu_n' = \frac{1}{\sigma\sqrt{2\pi}} \int_{-\infty}^{\infty} x^n\, e^{-(x-\mu)^2/(2\sigma^2)}\,dx.

Evaluating this integral for $n = 1, 2, 3, 4$ gives

\mu_1' = \mu
\mu_2' = \mu^2 + \sigma^2
\mu_3' = \mu^3 + 3\mu\sigma^2
\mu_4' = \mu^4 + 6\mu^2\sigma^2 + 3\sigma^4.
Now find the Moments about the Mean,

\mu_1 = 0
\mu_2 = \mu_2' - (\mu_1')^2 = \sigma^2
\mu_3 = \mu_3' - 3\mu_1'\mu_2' + 2(\mu_1')^3 = 0
\mu_4 = \mu_4' - 4\mu_1'\mu_3' + 6(\mu_1')^2\mu_2' - 3(\mu_1')^4 = 3\sigma^4,

so the Skewness and Kurtosis are

\gamma_1 = \frac{\mu_3}{\mu_2^{3/2}} = 0
\gamma_2 = \frac{\mu_4}{\mu_2^2} - 3 = 0.
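These raw and central moments can be checked by simulation (a sketch assuming NumPy; the parameter values below are arbitrary):

    import numpy as np

    rng = np.random.default_rng(3)
    mu, sigma = 1.0, 2.0
    x = rng.normal(mu, sigma, size=1_000_000)

    # Raw moments about the origin (empirical vs. formula).
    print(np.mean(x), mu)                                                # mu_1'
    print(np.mean(x**2), mu**2 + sigma**2)                               # mu_2'
    print(np.mean(x**3), mu**3 + 3 * mu * sigma**2)                      # mu_3'
    print(np.mean(x**4), mu**4 + 6 * mu**2 * sigma**2 + 3 * sigma**4)    # mu_4'

    # Central moments: mu_2 = sigma^2, mu_3 = 0, mu_4 = 3 sigma^4.
    c = x - x.mean()
    print(np.mean(c**2), sigma**2)
    print(np.mean(c**3), 0.0)
    print(np.mean(c**4), 3 * sigma**4)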
The Variance of the Sample Variance $s^2 = \frac{1}{N}\sum_{i=1}^{N}(x_i - \bar{x})^2$ for a sample taken from a population with a Gaussian distribution is

\operatorname{var}(s^2) = \frac{(N-1)\left[(N-1)\mu_4 - (N-3)\mu_2^2\right]}{N^3}
                        = \frac{(N-1)\left[3(N-1) - (N-3)\right]\sigma^4}{N^3}
                        = \frac{2(N-1)\sigma^4}{N^2},

using $\mu_4 = 3\sigma^4$ and $\mu_2 = \sigma^2$.
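A Monte Carlo check of this result (a sketch assuming NumPy, with $s^2$ taken as the biased form above; $N$ and $\sigma$ are arbitrary):

    import numpy as np

    rng = np.random.default_rng(4)
    N, sigma, trials = 10, 2.0, 200_000

    samples = rng.normal(0.0, sigma, size=(trials, N))
    s2 = samples.var(axis=1, ddof=0)       # s^2 = (1/N) * sum (x_i - xbar)^2

    print(s2.var())                        # empirical var(s^2)
    print(2 * (N - 1) * sigma**4 / N**2)   # 2 (N-1) sigma^4 / N^2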
The Cumulant-Generating Function for a Gaussian distribution is

K(h) = \ln\!\left(e^{\mu h + \sigma^2 h^2/2}\right) = \mu h + \tfrac{1}{2}\sigma^2 h^2,

so

\kappa_1 = \mu
\kappa_2 = \sigma^2,

and $\kappa_r = 0$ for $r > 2$.
For Gaussian variates, $\kappa_r = 0$ for $r > 2$, so the variance of the k-Statistic $k_3$ reduces to

\operatorname{var}(k_3) = \frac{6N\kappa_2^3}{(N-1)(N-2)} = \frac{6N\sigma^6}{(N-1)(N-2)},

and the variance of $k_4$ reduces to

\operatorname{var}(k_4) = \frac{24N(N+1)\kappa_2^4}{(N-1)(N-2)(N-3)} = \frac{24N(N+1)\sigma^8}{(N-1)(N-2)(N-3)}.
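The reduction for $k_3$ can be checked by simulation (a sketch assuming SciPy, whose stats.kstat computes k-statistics; the sample size is arbitrary):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    N, sigma, trials = 10, 1.0, 50_000

    # Compute k_3 for many independent Gaussian samples of size N.
    k3 = np.array([stats.kstat(rng.normal(0.0, sigma, size=N), n=3)
                   for _ in range(trials)])

    print(k3.var())                                # empirical var(k_3)
    print(6 * N * sigma**6 / ((N - 1) * (N - 2)))  # 6 N kappa_2^3 / ((N-1)(N-2))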
If $P(x)$ is a Gaussian distribution, then its Distribution Function is

D(x) = \frac{1}{2}\left[1 + \operatorname{erf}\!\left(\frac{x-\mu}{\sigma\sqrt{2}}\right)\right],

so variates $x_i$ with a Gaussian distribution can be generated from variates $y_i$ having a uniform distribution in $(0,1)$ by computing

x_i = \sigma\sqrt{2}\,\operatorname{erf}^{-1}(2y_i - 1) + \mu.
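A minimal sketch of this recipe, assuming SciPy is available for the inverse Erf:

    import numpy as np
    from scipy.special import erfinv

    rng = np.random.default_rng(6)
    mu, sigma = 5.0, 1.5

    # Uniform variates on (0, 1) mapped through the inverse distribution function.
    y = rng.uniform(0.0, 1.0, size=100_000)
    x = sigma * np.sqrt(2) * erfinv(2 * y - 1) + mu

    print(x.mean(), x.std())   # ~5.0, ~1.5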
The Gaussian distribution is an approximation to the Binomial Distribution in the limit of large numbers,

P(n_1) = \frac{1}{\sqrt{2\pi N p q}}\, e^{-(n_1 - Np)^2/(2Npq)},

where $n_1$ is the number of successes in $N$ trials and $p$ and $q \equiv 1-p$ are the probabilities of success and failure.
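A quick numerical comparison of the two (a sketch assuming SciPy for the exact Binomial probabilities; $N$ and $p$ are arbitrary):

    import numpy as np
    from scipy import stats

    N, p = 1000, 0.3
    q = 1 - p
    n1 = np.arange(250, 351)   # values around the mean N p = 300

    binomial = stats.binom.pmf(n1, N, p)
    gaussian = np.exp(-(n1 - N * p)**2 / (2 * N * p * q)) / np.sqrt(2 * np.pi * N * p * q)

    print(np.max(np.abs(binomial - gaussian)))   # small for large N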
The differential equation having a Gaussian distribution as its solution is

\frac{dy}{dx} = \frac{y(\mu - x)}{\sigma^2},

since

\frac{dy}{y} = \frac{\mu - x}{\sigma^2}\,dx

\ln\!\left(\frac{y}{y_0}\right) = -\frac{(x-\mu)^2}{2\sigma^2}

y = y_0\, e^{-(x-\mu)^2/(2\sigma^2)}.
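The separable solution can be cross-checked by integrating the differential equation numerically (a sketch assuming SciPy's solve_ivp):

    import numpy as np
    from scipy.integrate import solve_ivp

    mu, sigma = 0.0, 1.0
    y0 = 1.0 / (sigma * np.sqrt(2 * np.pi))   # start at the peak, x = mu

    # dy/dx = y (mu - x) / sigma^2, integrated from x = mu out to x = mu + 4 sigma.
    sol = solve_ivp(lambda x, y: y * (mu - x) / sigma**2,
                    (mu, mu + 4 * sigma), [y0], dense_output=True)

    xs = np.linspace(mu, mu + 4 * sigma, 5)
    numeric = sol.sol(xs)[0]
    exact = y0 * np.exp(-(xs - mu)**2 / (2 * sigma**2))
    print(np.max(np.abs(numeric - exact)))    # small integration error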
See also Binomial Distribution, Central Limit Theorem, Erf, Gaussian Bivariate Distribution, Logit Transformation, Normal Distribution, Normal Distribution Function, Pearson System, Ratio Distribution, z-Score