Given $n$ sets of variates denoted $x_1$, ..., $x_n$, a quantity called the Covariance Matrix is defined by

\begin{displaymath}
V_{ij}\equiv \mathop{\rm cov}\nolimits (x_i,x_j)\equiv \left\langle{(x_i-\mu_i)(x_j-\mu_j)}\right\rangle
\end{displaymath}
(1)
\begin{displaymath}
=\left\langle{x_ix_j-\mu_ix_j-\mu_jx_i+\mu_i\mu_j}\right\rangle
\end{displaymath}
(2)
\begin{displaymath}
=\left\langle{x_ix_j}\right\rangle -\mu_i\mu_j,
\end{displaymath}
(3)

where $\mu_i$ and $\mu_j$ are the Means of $x_i$ and $x_j$, respectively. An individual element $V_{ij}$ of the Covariance Matrix is called the covariance of the two variates $x_i$ and $x_j$, and provides a measure of how strongly correlated these variables are. In fact, the derived quantity
\begin{displaymath}
\mathop{\rm cor}\nolimits (x_i,x_j)\equiv {\mathop{\rm cov}\nolimits (x_i,x_j)\over\sigma_i\sigma_j},
\end{displaymath}
(4)
where $\sigma_i$, $\sigma_j$ are the Standard Deviations of $x_i$ and $x_j$, is called the Correlation of $x_i$ and $x_j$. Note that if $x_i$ and $x_j$ are taken from the same set of variates (say, $x$), then
\begin{displaymath}
\mathop{\rm cov}\nolimits (x,x) = \left\langle{x^2}\right\rangle -\left\langle{x}\right\rangle ^2=\mathop{\rm var}\nolimits (x),
\end{displaymath}
(5)
giving the usual Variance $\mathop{\rm var}\nolimits (x)=\sigma_x^2$. The covariance is also symmetric, since
\begin{displaymath}
\mathop{\rm cov}\nolimits (x,y) = \mathop{\rm cov}\nolimits (y,x).
\end{displaymath}
(6)
For two variables, the covariance is related to the Variance by
\begin{displaymath}
\mathop{\rm var}\nolimits (x+y) = \mathop{\rm var}\nolimits (x)+\mathop{\rm var}\nolimits (y)+2\mathop{\rm cov}\nolimits (x,y).
\end{displaymath}
(7)
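Identity (7) can likewise be checked on sample data; a sketch using population forms throughout (the vectors are illustrative, not from the article):

```python
import numpy as np

# Check var(x+y) = var(x) + var(y) + 2 cov(x,y) on made-up data.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 1.0, 4.0, 3.0, 6.0])

cov_xy = np.mean((x - x.mean()) * (y - y.mean()))
lhs = np.var(x + y)
rhs = np.var(x) + np.var(y) + 2.0 * cov_xy
print(lhs, rhs)  # equal (up to rounding)
```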
For two independent variates $x$ and $y$,
\begin{displaymath}
\mathop{\rm cov}\nolimits (x,y)=\left\langle{xy}\right\rangle -\mu_x\mu_y=\left\langle{x}\right\rangle \left\langle{y}\right\rangle -\mu_x\mu_y=0,
\end{displaymath}
(8)
so the covariance is zero. However, if the variables are correlated in some way, then their covariance will be Nonzero. In fact, if $\mathop{\rm cov}\nolimits (x,y)>0$, then $y$ tends to increase as $x$ increases, and if $\mathop{\rm cov}\nolimits (x,y)<0$, then $y$ tends to decrease as $x$ increases.
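A quick simulation illustrates equation (8): for independently drawn samples the empirical covariance is not exactly zero, but shrinks toward zero as the sample size grows (the seed and sample size here are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(size=100_000)
y = rng.uniform(size=100_000)  # drawn independently of x

# Empirical <xy> - mu_x mu_y; nearly zero for independent variates.
cov_xy = np.mean(x * y) - np.mean(x) * np.mean(y)
print(cov_xy)
```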
The covariance obeys the identity

\begin{displaymath}
\mathop{\rm cov}\nolimits (x+z,y)=\left\langle{(x+z)y}\right\rangle -\left\langle{x+z}\right\rangle \left\langle{y}\right\rangle =\mathop{\rm cov}\nolimits (x,y)+\mathop{\rm cov}\nolimits (z,y).
\end{displaymath}
(9)

By induction, it therefore follows that

\begin{displaymath}
\mathop{\rm cov}\nolimits \left({\sum_{i=1}^n x_i,\ y}\right)=\sum_{i=1}^n \mathop{\rm cov}\nolimits (x_i,y).
\end{displaymath}
(10)
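This additivity in the first argument can also be verified numerically; a sketch using the population covariance on made-up vectors:

```python
import numpy as np

def cov(a, b):
    """Population covariance <(a - <a>)(b - <b>)>."""
    return np.mean((a - a.mean()) * (b - b.mean()))

# Illustrative vectors (assumed for the example).
x1 = np.array([1.0, 2.0, 3.0, 4.0])
x2 = np.array([2.0, 2.0, 5.0, 1.0])
y = np.array([0.0, 1.0, 1.0, 3.0])

lhs = cov(x1 + x2, y)
rhs = cov(x1, y) + cov(x2, y)
print(lhs, rhs)  # cov(x1 + x2, y) = cov(x1, y) + cov(x2, y), up to rounding
```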
See also Correlation (Statistical), Covariance Matrix, Variance
© 1996-9 Eric W. Weisstein
1999-05-25