In probability theory and statistics, a cross-covariance matrix is a matrix whose element in the $(i, j)$ position is the covariance between the $i$-th element of a random vector and the $j$-th element of another random vector.
When the two random vectors are the same, the cross-covariance matrix is referred to as the covariance matrix.
A random vector is a random variable with multiple dimensions. Each element of the vector is a scalar random variable. Each element has either a finite number of observed empirical values or a finite or infinite number of potential values. The potential values are specified by a theoretical joint probability distribution. Intuitively, the cross-covariance matrix generalizes the notion of covariance to multiple dimensions.
The cross-covariance matrix of two random vectors $\mathbf{X}$ and $\mathbf{Y}$ is typically denoted by $\operatorname{K}_{\mathbf{X}\mathbf{Y}}$ or $\Sigma_{\mathbf{X}\mathbf{Y}}$.
Definition
For random vectors $\mathbf{X}=(X_1,X_2,\ldots,X_m)^{\rm T}$ and $\mathbf{Y}=(Y_1,Y_2,\ldots,Y_n)^{\rm T}$, each containing random elements whose expected value and variance exist, the cross-covariance matrix of $\mathbf{X}$ and $\mathbf{Y}$ is defined by[1]: 336

$$\operatorname{K}_{\mathbf{X}\mathbf{Y}}=\operatorname{cov}(\mathbf{X},\mathbf{Y})\ {\stackrel{\mathrm{def}}{=}}\ \operatorname{E}[(\mathbf{X}-\mathbf{\mu_X})(\mathbf{Y}-\mathbf{\mu_Y})^{\rm T}] \qquad \text{(Eq.1)}$$
where $\mathbf{\mu_X}=\operatorname{E}[\mathbf{X}]$ and $\mathbf{\mu_Y}=\operatorname{E}[\mathbf{Y}]$ are vectors containing the expected values of $\mathbf{X}$ and $\mathbf{Y}$. The vectors $\mathbf{X}$ and $\mathbf{Y}$ need not have the same dimension, and either might be a scalar value.
The cross-covariance matrix is the $m \times n$ matrix whose $(i, j)$ entry is the covariance

$$\operatorname{K}_{X_i Y_j}=\operatorname{cov}[X_i,Y_j]=\operatorname{E}[(X_i-\operatorname{E}[X_i])(Y_j-\operatorname{E}[Y_j])]$$

between the $i$-th element of $\mathbf{X}$ and the $j$-th element of $\mathbf{Y}$. This gives the following component-wise definition of the cross-covariance matrix:

$$\operatorname{K}_{\mathbf{X}\mathbf{Y}}={\begin{bmatrix}\operatorname{E}[(X_1-\operatorname{E}[X_1])(Y_1-\operatorname{E}[Y_1])]&\operatorname{E}[(X_1-\operatorname{E}[X_1])(Y_2-\operatorname{E}[Y_2])]&\cdots&\operatorname{E}[(X_1-\operatorname{E}[X_1])(Y_n-\operatorname{E}[Y_n])]\\\operatorname{E}[(X_2-\operatorname{E}[X_2])(Y_1-\operatorname{E}[Y_1])]&\operatorname{E}[(X_2-\operatorname{E}[X_2])(Y_2-\operatorname{E}[Y_2])]&\cdots&\operatorname{E}[(X_2-\operatorname{E}[X_2])(Y_n-\operatorname{E}[Y_n])]\\\vdots&\vdots&\ddots&\vdots\\\operatorname{E}[(X_m-\operatorname{E}[X_m])(Y_1-\operatorname{E}[Y_1])]&\operatorname{E}[(X_m-\operatorname{E}[X_m])(Y_2-\operatorname{E}[Y_2])]&\cdots&\operatorname{E}[(X_m-\operatorname{E}[X_m])(Y_n-\operatorname{E}[Y_n])]\end{bmatrix}}$$
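The matrix form and the component-wise form can be checked against each other numerically. The sketch below uses a small, made-up discrete joint distribution (four equally likely outcomes of a 2-dimensional $\mathbf{X}$ and a 2-dimensional $\mathbf{Y}$; all names and values are illustrative, not from the article):

```python
import numpy as np

# Illustration only: a made-up discrete joint distribution with four
# equally likely outcomes of (X, Y), where X and Y are both 2-dimensional.
outcomes_X = np.array([[0., 0.], [1., 0.], [0., 1.], [1., 1.]])
outcomes_Y = np.array([[1., 2.], [3., 2.], [1., 4.], [3., 4.]])
p = np.full(4, 0.25)

mu_X = p @ outcomes_X  # E[X] = (0.5, 0.5)
mu_Y = p @ outcomes_Y  # E[Y] = (2.0, 3.0)

# Matrix form: K_XY = E[(X - mu_X)(Y - mu_Y)^T], a sum of outer products
K = sum(p_k * np.outer(x - mu_X, y - mu_Y)
        for p_k, x, y in zip(p, outcomes_X, outcomes_Y))

# Component-wise form: K[i, j] = E[(X_i - E[X_i])(Y_j - E[Y_j])]
K_comp = np.empty((2, 2))
for i in range(2):
    for j in range(2):
        K_comp[i, j] = p @ ((outcomes_X[:, i] - mu_X[i])
                            * (outcomes_Y[:, j] - mu_Y[j]))

assert np.allclose(K, K_comp)  # both forms give the same 2 x 2 matrix
```

For this particular distribution both forms evaluate to $\begin{bmatrix}0.5&0\\0&0.5\end{bmatrix}$.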
Example
For example, if $\mathbf{X}=\left(X_1,X_2,X_3\right)^{\rm T}$ and $\mathbf{Y}=\left(Y_1,Y_2\right)^{\rm T}$ are random vectors, then $\operatorname{cov}(\mathbf{X},\mathbf{Y})$ is a $3\times 2$ matrix whose $(i,j)$-th entry is $\operatorname{cov}(X_i,Y_j)$.
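A sample-based analogue can be sketched in NumPy, assuming a 3-dimensional $\mathbf{X}$ and a 2-dimensional $\mathbf{Y}$; the data are synthetic and the variable names are made up. The direct estimate of $\operatorname{K}_{\mathbf{X}\mathbf{Y}}$ coincides with the corresponding off-diagonal block of the joint covariance matrix returned by `np.cov`:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic data: X has three components, Y has two.
N = 200_000
X = rng.normal(size=(N, 3))
Y = np.column_stack([
    X[:, 0] + rng.normal(size=N),  # Y_1 correlated with X_1 by construction
    rng.normal(size=N),            # Y_2 independent of X
])

# Direct sample estimate of K_XY = E[(X - mu_X)(Y - mu_Y)^T]
Xc = X - X.mean(axis=0)
Yc = Y - Y.mean(axis=0)
K_XY = Xc.T @ Yc / (N - 1)  # shape (3, 2); entry (i, j) estimates cov(X_i, Y_j)

# The same block read out of the 5 x 5 joint covariance matrix
joint = np.cov(np.column_stack([X, Y]), rowvar=False)
K_XY_alt = joint[:3, 3:]

assert np.allclose(K_XY, K_XY_alt)
```

Here `K_XY[0, 0]` comes out close to 1 (since $Y_1 = X_1 + \text{noise}$ and $\operatorname{var}(X_1)=1$), while entries involving the independent components are close to 0.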
Properties
For the cross-covariance matrix, the following basic properties apply:[2]

- $\operatorname{K}_{\mathbf{X}\mathbf{Y}}=\operatorname{cov}(\mathbf{X},\mathbf{Y})=\operatorname{E}[\mathbf{X}\mathbf{Y}^{\rm T}]-\mathbf{\mu_X}\mathbf{\mu_Y}^{\rm T}$
- $\operatorname{cov}(\mathbf{X},\mathbf{Y})=\operatorname{cov}(\mathbf{Y},\mathbf{X})^{\rm T}$
- $\operatorname{cov}(\mathbf{X}_1+\mathbf{X}_2,\mathbf{Y})=\operatorname{cov}(\mathbf{X}_1,\mathbf{Y})+\operatorname{cov}(\mathbf{X}_2,\mathbf{Y})$
- $\operatorname{cov}(A\mathbf{X}+\mathbf{a},B^{\rm T}\mathbf{Y}+\mathbf{b})=A\,\operatorname{cov}(\mathbf{X},\mathbf{Y})\,B$
- If $\mathbf{X}$ and $\mathbf{Y}$ are independent (or somewhat less restrictedly, if every random variable in $\mathbf{X}$ is uncorrelated with every random variable in $\mathbf{Y}$), then $\operatorname{cov}(\mathbf{X},\mathbf{Y})=0_{m\times n}$

where $\mathbf{X}$, $\mathbf{X}_1$ and $\mathbf{X}_2$ are random $m\times 1$ vectors, $\mathbf{Y}$ is a random $n\times 1$ vector, $\mathbf{a}$ is a $q\times 1$ vector, $\mathbf{b}$ is an $r\times 1$ vector, $A$ is a $q\times m$ matrix of constants, $B$ is an $n\times r$ matrix of constants, and $0_{m\times n}$ is an $m\times n$ matrix of zeroes.
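The transpose, additivity, and affine-transformation identities hold exactly (not just in expectation) for the sample cross-covariance estimator, because that estimator is bilinear in its arguments. A NumPy sketch with arbitrary synthetic data and made-up dimensions:

```python
import numpy as np

rng = np.random.default_rng(1)

def cross_cov(X, Y):
    """Sample cross-covariance; rows of X and Y are paired observations."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    return Xc.T @ Yc / (len(X) - 1)

# Arbitrary sizes: m = 3, n = 2, with constants A (q x m) and B (n x r).
N, m, n, q, r = 500, 3, 2, 4, 2
X1 = rng.normal(size=(N, m))
X2 = rng.normal(size=(N, m))
Y  = rng.normal(size=(N, n))
A  = rng.normal(size=(q, m))
B  = rng.normal(size=(n, r))
a  = rng.normal(size=q)
b  = rng.normal(size=r)

K = cross_cov(X1, Y)  # shape (m, n)

# cov(X, Y) = cov(Y, X)^T
assert np.allclose(K, cross_cov(Y, X1).T)

# cov(X1 + X2, Y) = cov(X1, Y) + cov(X2, Y)
assert np.allclose(cross_cov(X1 + X2, Y),
                   cross_cov(X1, Y) + cross_cov(X2, Y))

# cov(A X + a, B^T Y + b) = A cov(X, Y) B
# (rows hold samples, so applying A to each sample is X @ A.T)
assert np.allclose(cross_cov(X1 @ A.T + a, Y @ B + b), A @ K @ B)
```

The identity $\operatorname{cov}(\mathbf{X},\mathbf{Y})=\operatorname{E}[\mathbf{X}\mathbf{Y}^{\rm T}]-\mathbf{\mu_X}\mathbf{\mu_Y}^{\rm T}$ is deliberately not checked this way: it is exact for true expectations but only approximate for the $1/(N-1)$ sample estimator.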
Definition for complex random vectors
If $\mathbf{Z}$ and $\mathbf{W}$ are complex random vectors, the definition of the cross-covariance matrix is slightly changed. Transposition is replaced by Hermitian transposition:

$$\operatorname{K}_{\mathbf{Z}\mathbf{W}}=\operatorname{cov}(\mathbf{Z},\mathbf{W})\ {\stackrel{\mathrm{def}}{=}}\ \operatorname{E}[(\mathbf{Z}-\mathbf{\mu_Z})(\mathbf{W}-\mathbf{\mu_W})^{\rm H}]$$
For complex random vectors, another matrix called the pseudo-cross-covariance matrix is defined as follows:

$$\operatorname{J}_{\mathbf{Z}\mathbf{W}}=\operatorname{cov}(\mathbf{Z},{\overline{\mathbf{W}}})\ {\stackrel{\mathrm{def}}{=}}\ \operatorname{E}[(\mathbf{Z}-\mathbf{\mu_Z})(\mathbf{W}-\mathbf{\mu_W})^{\rm T}]$$
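The difference between the two definitions, Hermitian versus plain transpose, can be seen on synthetic circularly symmetric complex data (an illustration with made-up variables): for such data the cross-covariance $\operatorname{K}_{\mathbf{Z}\mathbf{W}}$ is non-trivial while the pseudo-cross-covariance $\operatorname{J}_{\mathbf{Z}\mathbf{W}}$ is close to zero.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic circularly symmetric complex data: W = Z + independent noise,
# each component normalized to unit variance.
N = 10_000
Z = (rng.normal(size=(N, 2)) + 1j * rng.normal(size=(N, 2))) / np.sqrt(2)
W = Z + (rng.normal(size=(N, 2)) + 1j * rng.normal(size=(N, 2))) / np.sqrt(2)

Zc = Z - Z.mean(axis=0)
Wc = W - W.mean(axis=0)

# K_ZW = E[(Z - mu_Z)(W - mu_W)^H] -- Hermitian transpose conjugates W
K_ZW = Zc.T @ Wc.conj() / (N - 1)

# J_ZW = E[(Z - mu_Z)(W - mu_W)^T] -- plain transpose, no conjugation
J_ZW = Zc.T @ Wc / (N - 1)

# For this circular construction, K_ZW is close to the 2 x 2 identity,
# while the pseudo-cross-covariance J_ZW is close to the zero matrix.
assert np.allclose(K_ZW, np.eye(2), atol=0.1)
assert np.allclose(J_ZW, np.zeros((2, 2)), atol=0.1)
```

Note that `np.cov` on complex input conjugates the second factor, so it estimates the Hermitian version $\operatorname{K}$, not $\operatorname{J}$.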
Two random vectors $\mathbf{X}$ and $\mathbf{Y}$ are called uncorrelated if their cross-covariance matrix $\operatorname{K}_{\mathbf{X}\mathbf{Y}}$ is a zero matrix.[1]: 337
Complex random vectors $\mathbf{Z}$ and $\mathbf{W}$ are called uncorrelated if their covariance matrix and pseudo-covariance matrix are both zero, i.e. if $\operatorname{K}_{\mathbf{Z}\mathbf{W}}=\operatorname{J}_{\mathbf{Z}\mathbf{W}}=0$.
References