For two variables $x$ and $y$,

\begin{displaymath}
\mathop{\rm cor}\nolimits (x,y) \equiv {{\rm cov}(x,y)\over \sigma_x\sigma_y},
\end{displaymath}
(1)
where $\sigma_x$ denotes the Standard Deviation of $x$ and ${\rm cov}(x,y)$ is the Covariance of these two variables. For
the general case of variables $x_i$ and $x_j$, where $i, j = 1$, 2, ..., $n$,

\begin{displaymath}
\mathop{\rm cor}\nolimits (x_i,x_j) = {{\rm cov}(x_i,x_j)\over \sqrt{V_{ii}V_{jj}}},
\end{displaymath}
(2)
where $V_{ij}$ are elements of the Covariance Matrix. In general, a correlation gives the strength of the
relationship between variables. The variance of any quantity is always Nonnegative by
definition, so

\begin{displaymath}
\mathop{\rm var}\nolimits \left({{x\over\sigma_x} + {y\over\sigma_y}}\right)\geq 0.
\end{displaymath}
(3)
From a property of Variances, the sum can be expanded

\begin{displaymath}
\mathop{\rm var}\nolimits \left({x\over\sigma_x}\right)+\mathop{\rm var}\nolimits \left({y\over\sigma_y}\right)+2\mathop{\rm cov}\nolimits \left({{x\over\sigma_x} , {y\over\sigma_y}}\right)\geq 0
\end{displaymath}
(4)
\begin{displaymath}
{1\over{\sigma_x}^2} \mathop{\rm var}\nolimits (x)+{1\over{\sigma_y}^2} \mathop{\rm var}\nolimits (y)+{2\over\sigma_x\sigma_y}\mathop{\rm cov}\nolimits (x,y) \geq 0
\end{displaymath}
(5)
\begin{displaymath}
1 + 1 + {2\over\sigma_x\sigma_y}\mathop{\rm cov}\nolimits (x,y) \geq 0
\qquad\Longrightarrow\qquad
1 + {1\over\sigma_x\sigma_y}\mathop{\rm cov}\nolimits (x,y) \geq 0.
\end{displaymath}
(6)
Therefore,

\begin{displaymath}
\mathop{\rm cor}\nolimits (x,y) = {\mathop{\rm cov}\nolimits (x,y)\over\sigma_x\sigma_y} \geq -1.
\end{displaymath}
(7)
Similarly,

\begin{displaymath}
\mathop{\rm var}\nolimits \left({{x\over\sigma_x} - {y\over\sigma_y}}\right)\geq 0
\end{displaymath}
(8)
\begin{displaymath}
\mathop{\rm var}\nolimits \left({x\over\sigma_x}\right)+\mathop{\rm var}\nolimits \left({y\over\sigma_y}\right)+2\mathop{\rm cov}\nolimits \left({{x\over\sigma_x} , -{y\over\sigma_y}}\right)\geq 0
\end{displaymath}
(9)
\begin{displaymath}
{1\over{\sigma_x}^2} \mathop{\rm var}\nolimits (x) + {1\over{\sigma_y}^2} \mathop{\rm var}\nolimits (y) - {2\over\sigma_x\sigma_y} \mathop{\rm cov}\nolimits (x,y) \geq 0
\end{displaymath}
(10)
\begin{displaymath}
1 + 1 - {2\over\sigma_x\sigma_y} \mathop{\rm cov}\nolimits (x,y) \geq 0
\qquad\Longrightarrow\qquad
1 - {1\over\sigma_x\sigma_y} \mathop{\rm cov}\nolimits (x,y) \geq 0.
\end{displaymath}
(11)
Therefore,

\begin{displaymath}
\mathop{\rm cor}\nolimits (x,y) = {\mathop{\rm cov}\nolimits (x,y)\over\sigma_x\sigma_y} \leq 1,
\end{displaymath}
(12)
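The bound established by (7) and (12) can be spot-checked numerically. This is an illustrative sketch, not part of the original entry; the helper `cor` and the random-trial setup are assumptions:

```python
# Spot-check of the bound -1 <= cor(x, y) <= 1 on random data.
# Helper name and data are illustrative, not from the original article.
import random
from math import sqrt

def cor(x, y):
    """Correlation coefficient cor(x, y) = cov(x, y) / (sigma_x sigma_y)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / sqrt(sxx * syy)

random.seed(0)
for _ in range(1000):
    x = [random.gauss(0, 1) for _ in range(10)]
    y = [random.gauss(0, 1) for _ in range(10)]
    r = cor(x, y)
    assert -1.0 - 1e-12 <= r <= 1.0 + 1e-12
print("all correlations within [-1, 1]")
```

The small tolerance only guards against floating-point rounding; the inequality itself is exact, as the derivation above shows.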
so $-1 \leq \mathop{\rm cor}\nolimits (x,y) \leq 1$. For a linear combination of two variables,

\begin{displaymath}
\mathop{\rm var}\nolimits (y-bx) = \mathop{\rm var}\nolimits (y)+b^2\mathop{\rm var}\nolimits (x)-2b\mathop{\rm cov}\nolimits (x,y) = {\sigma_y}^2+b^2{\sigma_x}^2-2b\sigma_x\sigma_y\mathop{\rm cor}\nolimits (x,y).
\end{displaymath}
(13)

Examine the cases where $\mathop{\rm cor}\nolimits (x,y) = \pm 1$,

\begin{displaymath}
\mathop{\rm cor}\nolimits (x,y) \equiv {\mathop{\rm cov}\nolimits (x,y)\over\sigma_x\sigma_y} = \pm 1
\end{displaymath}
(14)
\begin{displaymath}
\mathop{\rm var}\nolimits (y-bx) = b^2{\sigma_x}^2+{\sigma_y}^2\mp 2b\sigma_x\sigma_y = (b\sigma_x\mp\sigma_y)^2.
\end{displaymath}
(15)
The Variance will be zero if $b = \pm\sigma_y/\sigma_x$, which requires that the argument of the
Variance is a constant. Therefore, $y-bx = a$, so $y = a+bx$. If $\mathop{\rm cor}\nolimits (x,y) = \pm 1$, $y$ is either perfectly
correlated ($\mathop{\rm cor}\nolimits (x,y) = 1$) or perfectly anticorrelated ($\mathop{\rm cor}\nolimits (x,y) = -1$) with $x$.
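The perfect-correlation case can be verified directly: if $y = a + bx$ exactly, the correlation comes out as $+1$ for $b > 0$ and $-1$ for $b < 0$. This sketch is illustrative, not part of the original entry; the helper `cor` and the constants are assumptions:

```python
# If y = a + b*x exactly, cor(x, y) = sign(b), matching the derivation above.
# Helper name and constants are illustrative, not from the original article.
from math import sqrt

def cor(x, y):
    """Correlation coefficient cor(x, y) = cov(x, y) / (sigma_x sigma_y)."""
    n = len(x)
    mx, my = sum(p for p in x) / n, sum(q for q in y) / n
    sxy = sum((p - mx) * (q - my) for p, q in zip(x, y))
    return sxy / sqrt(sum((p - mx) ** 2 for p in x) * sum((q - my) ** 2 for q in y))

x = [0.0, 1.0, 2.0, 3.0, 4.0]
a, b = 5.0, 2.0
print(cor(x, [a + b * xi for xi in x]))  # perfectly correlated: 1.0
print(cor(x, [a - b * xi for xi in x]))  # perfectly anticorrelated: -1.0
```

The intercept $a$ drops out entirely, since covariance and variance are unchanged by shifting a variable by a constant.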
See also Covariance, Covariance Matrix, Variance
© 1996-9 Eric W. Weisstein
1999-05-25