Kalman filter toolbox
Written by Kevin Murphy, 1998.
Last updated: 19 November 1999.
This toolbox supports filtering, smoothing and parameter estimation
(using EM) for Linear Dynamical Systems.
Note: Ali Taylan Cemgil has Matlab code for the extended Kalman filter and the switching Kalman filter.
- kalman_filter - implements the Kalman filter
- kalman_smoother - implements the RTS equations
- learn_kalman - finds maximum likelihood estimates of the parameters using EM
- sample_lds - generates random samples
- AR_to_SS - converts an autoregressive model of order k to state-space form
- SS_to_AR - converts a state-space model to autoregressive form
- learn_AR - finds maximum likelihood estimates of the parameters using least squares
For an excellent web site, see Welch/Bishop's KF page.
For a brief intro, read on...
A Linear Dynamical System is a partially observed stochastic process with linear
dynamics and linear observations, both subject to Gaussian noise.
It can be defined as follows, where x(t) is the hidden state at time t, and y(t) is the observation.
x(t+1) = F*x(t) + w(t),  w ~ N(0, Q),  x(0) ~ N(x0, V0)
y(t)   = H*x(t) + v(t),  v ~ N(0, R)
Here x0 and V0 are the mean and covariance of the initial state.
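As a concrete illustration of this generative process, here is a minimal NumPy sketch (not the toolbox's Matlab sample_lds; the function name and signature are chosen here for illustration) that samples a trajectory from the two equations above:

```python
import numpy as np

# Sample a trajectory from x(t+1) = F x(t) + w(t), y(t) = H x(t) + v(t),
# with w ~ N(0, Q) and v ~ N(0, R). Names follow the equations above.
def sample_lds(F, H, Q, R, x0, T, rng):
    n, m = F.shape[0], H.shape[0]
    xs, ys = np.zeros((T, n)), np.zeros((T, m))
    x = x0
    for t in range(T):
        xs[t] = x
        ys[t] = H @ x + rng.multivariate_normal(np.zeros(m), R)
        x = F @ x + rng.multivariate_normal(np.zeros(n), Q)
    return xs, ys
```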
The conditional independence assumptions that are being made can be concisely modelled using a Bayesian network as follows.
(Circles denote random variables with linear-Gaussian distributions; clear means hidden, shaded means observed.)
The Kalman filter is an algorithm for performing filtering on this
model, i.e., computing P(X(t) | Y(1), ..., Y(t)).
The Rauch-Tung-Striebel (RTS) algorithm performs fixed-interval
offline smoothing, i.e., computing P(X(t) | Y(1), ..., Y(T)), for t <= T.
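The filtering recursion itself fits in a few lines. The following is a NumPy sketch of the standard predict/update equations (not the toolbox's Matlab kalman_filter; names follow the notation above):

```python
import numpy as np

# Kalman filter: returns the filtered means and covariances of
# P(x(t) | y(1), ..., y(t)) for t = 1..T.
def kalman_filter(F, H, Q, R, mu0, V0, ys):
    n, T = F.shape[0], len(ys)
    mus, Vs = np.zeros((T, n)), np.zeros((T, n, n))
    mu_pred, V_pred = mu0, V0                    # prior on the first state
    for t, y in enumerate(ys):
        # Update: condition on the new observation y(t).
        S = H @ V_pred @ H.T + R                 # innovation covariance
        K = V_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
        mu = mu_pred + K @ (y - H @ mu_pred)
        V = V_pred - K @ H @ V_pred
        mus[t], Vs[t] = mu, V
        # Predict: propagate the belief to time t+1.
        mu_pred, V_pred = F @ mu, F @ V @ F.T + Q
    return mus, Vs
```

The RTS smoother adds a backward pass over these filtered quantities to obtain P(x(t) | y(1), ..., y(T)).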
Here is a simple example.
Consider a particle moving in the plane at
constant velocity subject to random perturbations in its trajectory.
The new position (x1, x2) is the old position plus the velocity (dx1,
dx2) plus noise w.
[ x1(t) ] = [1 0 1 0] [ x1(t-1) ] + [ wx1 ]
[ x2(t) ] [0 1 0 1] [ x2(t-1) ] [ wx2 ]
[ dx1(t) ] [0 0 1 0] [ dx1(t-1) ] [ wdx1 ]
[ dx2(t) ] [0 0 0 1] [ dx2(t-1) ] [ wdx2 ]
We assume we only observe the position of the particle.
[ y1(t) ] = [1 0 0 0] [ x1(t) ] + [ vx1 ]
[ y2(t) ] [0 1 0 0] [ x2(t) ] [ vx2 ]
[ dx1(t) ]
[ dx2(t) ]
Suppose we start out at position (10,10) moving to the right with
velocity (1,0).
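In code, the transition and observation matrices of this example are just the following (a NumPy sketch; the state is ordered [x1, x2, dx1, dx2] as above):

```python
import numpy as np

# Constant-velocity model: position is incremented by velocity each step.
F = np.array([[1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)

# We observe only the position (x1, x2).
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)

x0 = np.array([10.0, 10.0, 1.0, 0.0])  # start at (10,10), velocity (1,0)
print(F @ x0)   # one noise-free step: [11. 10. 1. 0.]
```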
We sampled a random trajectory of length 15.
Below we show the filtered and smoothed trajectories.
The mean squared error of the filtered estimate is 4.9; for the
smoothed estimate it is 3.2.
Not only is the smoothed estimate better, but we know that
it is better, as illustrated by the smaller uncertainty ellipses;
this can help in e.g., data association problems.
Note how the smoothed ellipses are larger at the ends, because these
points have seen less data. Also, note how rapidly the filtered
ellipses reach their steady-state (Riccati) values.
(The code used to generate this picture illustrates how easy it is to use the toolkit.)
For nonlinear systems with Gaussian noise, the unscented Kalman filter (UKF), due to Julier and Uhlmann, is far superior to the extended Kalman filter (EKF), both in theory and practice.
The key idea of the UKF is that it is easier to estimate a Gaussian distribution
from a set of points than to approximate an arbitrary non-linear
function.
We start with points that are plus/minus sigma away from the mean along
each dimension, and then pipe them through the nonlinearity, and
then fit a Gaussian to the transformed points.
(No need to compute Jacobians, unlike the EKF!)
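The unscented transform at the heart of the UKF can be sketched as follows. This is a simplified NumPy version using 2n symmetric sigma points with equal weights; full UKF implementations add tuning parameters (alpha, beta, kappa) and a weighted center point:

```python
import numpy as np

# Basic unscented transform: push 2n symmetric sigma points through a
# nonlinearity g and fit a Gaussian to the transformed points.
def unscented_transform(mu, P, g):
    n = len(mu)
    S = np.linalg.cholesky(n * P)   # columns are the sigma-point offsets
    pts = np.concatenate([mu + S.T, mu - S.T])   # 2n points, covariance P
    ys = np.array([g(p) for p in pts])
    y_mean = ys.mean(axis=0)
    d = ys - y_mean
    y_cov = d.T @ d / (2 * n)
    return y_mean, y_cov
```

For a linear g the transform is exact, recovering the usual mean and covariance propagation; for a nonlinear g it captures the posterior moments to second order without computing any Jacobians.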
For systems with non-Gaussian noise, I recommend particle filtering (PF), which is a popular sequential Monte Carlo technique.
See also this discussion on the pros/cons of particle filters, and the following tutorial:
M. Arulampalam, S. Maskell, N. Gordon, T. Clapp,
"A Tutorial on Particle
Filters for Online Nonlinear/Non-Gaussian Bayesian Tracking," IEEE
Transactions on Signal Processing, Volume 50, Number 2, February 2002, pp
174-189.
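For reference, one step of the simplest ("bootstrap") particle filter, which uses the transition model itself as the proposal distribution, looks like this (a NumPy sketch; the function names transition and loglik are placeholders for the user's model):

```python
import numpy as np

# One step of a bootstrap particle filter: propagate particles through
# the transition model, weight by the observation likelihood, resample.
def pf_step(particles, y, transition, loglik, rng):
    particles = np.array([transition(p, rng) for p in particles])
    logw = np.array([loglik(y, p) for p in particles])
    w = np.exp(logw - logw.max())        # subtract max for stability
    w /= w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]                # resampled, equally weighted
```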
The EKF can be used as a proposal distribution for a PF.
This method is better than either one alone; see "The Unscented Particle Filter",
by R. van der Merwe, A. Doucet, J.F.G. de Freitas and E. Wan, May 2000.
Matlab software for the UPF is also available.
- The system identification toolbox from the Mathworks implements many classical algorithms.
- ARfit is an excellent package for autoregressive models.
- Zoubin Ghahramani has Matlab code for EM in LDS's which is similar to mine, but is partially written in C.
- KBF, an implementation of the Kalman filter-smoother in O-Matrix, a (supposedly faster) version of Matlab.
- Le Sage's econometrics toolbox contains lots of excellent Matlab time series modelling functions.
- Econometric Links of the Econometrics Journal. Most of the software is either commercial or written in Gauss, which is similar to Matlab.
- SSFpack is a set of C routines for state-space filtering.
- STAMP is a commercial package for structural time series analysis.
- Welch/Bishop's KF page
- T. Minka, "From HMMs to LDSs", tech report.
- Sam Roweis' tutorial slides
- S. Roweis and Z. Ghahramani (1997), "A Unifying Review of Linear Gaussian Models", Neural Computation.
- K. Murphy, 1998. "Filtering, Smoothing, and the Junction Tree Algorithm", tech report.
- K. Murphy, 1998. "Switching Kalman Filters", tech report.
- Ghahramani and Hinton, 1996. "Parameter Estimation for LDS", tech report.
- Bar-Shalom and Li, "Estimation and Tracking", 1993.
- A. Doucet, N. de Freitas and N.J. Gordon, "Sequential Monte Carlo Methods in Practice", Springer-Verlag, 2000 (forthcoming).