 # Online Estimation of the Inverse Covariance Matrix

This post is part 5 of a series of five articles.

$$\def\myX{\mathbf{x}} \def\myM{\mathbf{M}} \def\myY{\mathbf{y}} \def\mySigma{\mathbf{\Sigma}} \def\myT{\mathsf{T}}$$

In practice it is often necessary to estimate the inverse of the covariance matrix of a Gaussian distribution, for example when only a Mahalanobis distance between points has to be computed. In an online setting it can be quite expensive to recompute the inverse of the covariance matrix for each newly arriving point, since the complexity of the matrix inverse is $\mathcal{O}(n^3)$. As we will see in the following, it is not necessary to estimate the covariance matrix itself if only its inverse is needed; furthermore, the inverse can be updated incrementally in an efficient manner. The results are shown for the general case of a weighted, exponentially decaying estimator.
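To make the motivating example concrete: the squared Mahalanobis distance between two points $\myX$ and $\myY$ under a covariance matrix $\mySigma$ is

\begin{align} d^2(\myX, \myY) = (\myX - \myY)^\myT \mySigma^{-1} (\myX - \myY), \end{align}

which depends on the covariance matrix only through its inverse $\mySigma^{-1}$.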

Recall that the weighted exponentially decaying estimator can incrementally adapt the covariance matrix using the following relations:
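The relations themselves are derived in the earlier parts of this series and are not reproduced here; one common form of such an estimator – assuming a decay rate $\alpha \in (0,1)$, a running mean $\bar{\myX}_n$, and the centered point $\myY_n = \myX_n - \bar{\myX}_{n-1}$ (the exact notation may differ from the earlier posts) – reads:

\begin{align} \bar{\myX}_n &= (1-\alpha)\,\bar{\myX}_{n-1} + \alpha\,\myX_n, \\ \myM_n &= (1-\alpha)\left(\myM_{n-1} + \alpha\,\myY_n \myY_n^\myT\right). \label{eq:M} \end{align}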

Now, in order to compute the inverse, we simply apply the Sherman-Morrison formula – a special case of the Woodbury matrix identity – to incrementally update $\myM_n^{-1}$. The formula is given by

\begin{align} (A+uv^\myT)^{-1} = A^{-1} - \frac{A^{-1}uv^\myT A^{-1}}{1+v^\myT A^{-1}u}. \end{align}
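As a sanity check (not part of the original post, whose code is in R), the identity can be verified numerically; the matrix and vectors below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.normal(size=(n, n))
A = A @ A.T + n * np.eye(n)  # a well-conditioned symmetric matrix
u = rng.normal(size=(n, 1))
v = rng.normal(size=(n, 1))

A_inv = np.linalg.inv(A)
# Sherman-Morrison: rank-one update of the inverse in O(n^2)
sm = A_inv - (A_inv @ u @ v.T @ A_inv) / (1.0 + float(v.T @ A_inv @ u))
# Direct O(n^3) inversion for comparison
direct = np.linalg.inv(A + u @ v.T)

assert np.allclose(sm, direct)
print("Sherman-Morrison update matches the directly computed inverse")
```

The formula is exact whenever the denominator $1+v^\myT A^{-1}u$ is nonzero, so the two results agree up to floating-point precision.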

If we look at Eq. \eqref{eq:M}, we can identify:
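Assuming the exponentially weighted update $\myM_n = (1-\alpha)\left(\myM_{n-1} + \alpha\,\myY_n\myY_n^\myT\right)$ with $\myY_n = \myX_n - \bar{\myX}_{n-1}$ (a sketch of the form used in the earlier posts), a natural identification is $A = \myM_{n-1}$, $u = \alpha\,\myY_n$ and $v = \myY_n$, which yields

\begin{align} \myM_n^{-1} = \frac{1}{1-\alpha}\left(\myM_{n-1}^{-1} - \frac{\alpha\,\myM_{n-1}^{-1}\myY_n\myY_n^\myT\myM_{n-1}^{-1}}{1+\alpha\,\myY_n^\myT\myM_{n-1}^{-1}\myY_n}\right), \end{align}

so each update costs only $\mathcal{O}(n^2)$ instead of the $\mathcal{O}(n^3)$ of a full inversion.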

The following R code illustrates the procedure for incrementally estimating the inverse of the covariance matrix for a set of points collected in the matrix $X$.