Go to the notes + code (preliminary and in progress)
One of the joys of working in academia is the time available during summers to learn new tools. One of my projects this summer has been making notes for Petris et al.'s Dynamic Linear Models with R. This text is a Bayesian treatment of dynamic linear models (DLMs) / state space models (SSMs) and a companion to the authors' dlm package for R. Currently, there is no comparable package written in Python, the language in which the bulk of my data collection and munging is done. Writing a detailed summary of the text's content, along with Python code for the estimation and simulation procedures it covers, should therefore serve both pedagogical and research purposes.
A brief overview of DLMs
Conceptually, SSMs treat time series data as the combination of two components (a toy simulation in code follows this list):
- a hidden (or latent) process that describes the true state of some system over time.
- an observation process that maps the “true” latent states onto observable measures (with error).
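To make this concrete, here is a minimal NumPy sketch of the simplest such model, a local level model: the hidden process is a random walk and the observations are that walk measured with Gaussian noise. The variance values W and V are arbitrary choices of mine for illustration, not from the text.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100      # number of time periods
W = 0.5      # state (evolution) variance -- illustrative value
V = 2.0      # observation variance -- illustrative value

# Hidden process: a random walk describing the "true" state of the system
theta = np.cumsum(rng.normal(scale=np.sqrt(W), size=n))

# Observation process: the latent states measured with error
y = theta + rng.normal(scale=np.sqrt(V), size=n)
```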
There are three major inferential tasks for these types of models:
- Recovering the probability distribution of the current latent state (aka filtering, e.g. the Kalman filter in the DLM case; a bare-bones example follows this list). This is useful for online applications that require on-demand decision making.
- Estimating the historical true latent states (aka smoothing). This is typically what economic researchers attempt to recover.
- Predicting the future states of the world (aka forecasting). This is often of interest for policy makers.
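As a taste of the filtering task, below is a bare-bones Kalman filter for the scalar local level model (a simplified special case, not the general matrix recursions in the text or the dlm package). The variance values, prior, and variable names are my own illustrative choices.

```python
import numpy as np

def local_level_filter(y, V, W, m0=0.0, C0=1e7):
    """Kalman filter for the local level model:
       y_t     = theta_t       + v_t,  v_t ~ N(0, V)
       theta_t = theta_{t-1}   + w_t,  w_t ~ N(0, W)
    Returns filtered means and variances of theta_t given y_1..y_t."""
    m, C = m0, C0
    means, variances = [], []
    for obs in y:
        a, R = m, C + W              # predict: one-step-ahead prior for theta_t
        f, Q = a, R + V              # one-step-ahead forecast of y_t and its variance
        K = R / Q                    # Kalman gain
        m = a + K * (obs - f)        # update the state mean with the new observation
        C = (1.0 - K) * R            # update the state variance
        means.append(m)
        variances.append(C)
    return np.array(means), np.array(variances)

# Example usage with simulated data (arbitrary illustrative variances)
rng = np.random.default_rng(1)
theta = np.cumsum(rng.normal(scale=np.sqrt(0.5), size=200))
y = theta + rng.normal(scale=np.sqrt(2.0), size=200)
filtered_means, filtered_vars = local_level_filter(y, V=2.0, W=0.5)
```

Smoothing and forecasting reuse the same recursions: smoothing runs a backward pass over the filtered quantities, and forecasting simply iterates the prediction step forward without new observations.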
DLMs are a special case of SSMs in which the distributional assumptions are Gaussian and the model is linear and additive, so that it can be written generically as:

Yt = Ft θt + νt,   νt ~ N(0, Vt)
θt = Gt θt-1 + wt,   wt ~ N(0, Wt)
where the first line is the observation equation (Yt is the vector of observed time series data) and the second line is the state equation (θt is the vector of latent states). Ft is the loading matrix for the observed data (how the states map into measurements) and Gt is the transition matrix that describes the evolution of the latent states over time. The evolution of the latent states can include components such as seasonality, (polynomial) non-stationary trends, autoregressive processes, random walks (noise processes), and regression components. νt and wt are the normally distributed errors for the observation and state equations, with covariances Vt and Wt, respectively.
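To make the notation concrete, the sketch below writes out Ft, Gt, Vt, and Wt for a local linear trend model (a level plus a slowly evolving slope) and simulates from the two equations. The dimensions, starting values, and variances are illustrative assumptions of mine, not taken from the text.

```python
import numpy as np

# Local linear trend as a DLM: the state is theta_t = (level_t, slope_t)'
F = np.array([[1.0, 0.0]])          # Ft: y_t measures the level only
G = np.array([[1.0, 1.0],
              [0.0, 1.0]])          # Gt: level_t = level_{t-1} + slope_{t-1}; slope is a random walk
V = np.array([[1.5]])               # Vt: observation variance (illustrative)
W = np.diag([0.1, 0.01])            # Wt: state evolution covariance (illustrative)

rng = np.random.default_rng(2)
n = 50
state = np.array([10.0, 0.2])       # initial level and slope (arbitrary)
theta = np.zeros((n, 2))
y = np.zeros(n)
for t in range(n):
    # State equation: theta_t = G theta_{t-1} + w_t
    state = G @ state + rng.multivariate_normal(np.zeros(2), W)
    # Observation equation: y_t = F theta_t + v_t
    y[t] = (F @ state).item() + rng.normal(scale=np.sqrt(V[0, 0]))
    theta[t] = state
```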
From a signal-processing perspective, the ratio of Wt to Vt is the signal-to-noise ratio. One can think of DLMs as a method of ascribing the change in observed data over time partly to changes in the true latent state and partly to measurement noise, after controlling for "known" measurement and state transition processes.
In the physical sciences, SSMs often have clear choices for model parameters. For example, the position of a ballistic object equipped with sensors can be predicted fairly accurately given the laws of motion (the state transition process). Moreover, the measurement instrument often has well-known precision specifications (the observation equation has a known Vt).
In economic settings, we often do not have very precise theory about either the state or the measurement process. The specification of the transition process, though "guided" by theory, is largely ad hoc and flexible; it is often chosen by comparing the out-of-sample predictive performance of various model specifications. Furthermore, in economic applications we often need to estimate many unknown parameters from the data. These typically include, at a minimum, the observation and state covariance matrices, adding a layer of complexity to the application of DLMs in social science settings. It is in the estimation of these unknown parameters that Bayesian simulation methods are often useful (or even necessary).
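As a rough illustration of that idea (and not the Gibbs samplers the text develops), the sketch below estimates the two unknown variances of a local level model by random-walk Metropolis on their logs, using the Kalman-filter likelihood and a flat prior on the log scale. The proposal scale, iteration count, and starting values are arbitrary tuning choices of mine.

```python
import numpy as np

def loglik_local_level(y, V, W, m0=0.0, C0=1e7):
    """Gaussian log-likelihood of a local level model via the Kalman filter."""
    m, C, ll = m0, C0, 0.0
    for obs in y:
        a, R = m, C + W                          # one-step-ahead prior for theta_t
        f, Q = a, R + V                          # one-step-ahead forecast of y_t
        ll += -0.5 * (np.log(2 * np.pi * Q) + (obs - f) ** 2 / Q)
        K = R / Q                                # Kalman gain
        m, C = a + K * (obs - f), (1.0 - K) * R
    return ll

def metropolis_variances(y, n_iter=2000, step=0.2, seed=0):
    """Random-walk Metropolis on (log V, log W) with a flat prior on the log scale."""
    rng = np.random.default_rng(seed)
    log_params = np.log([1.0, 1.0])              # starting values (arbitrary)
    ll = loglik_local_level(y, *np.exp(log_params))
    draws = np.empty((n_iter, 2))
    for i in range(n_iter):
        proposal = log_params + rng.normal(scale=step, size=2)
        ll_prop = loglik_local_level(y, *np.exp(proposal))
        if np.log(rng.uniform()) < ll_prop - ll:  # Metropolis accept/reject
            log_params, ll = proposal, ll_prop
        draws[i] = np.exp(log_params)
    return draws                                  # draws of (V, W)

# Example: simulate local level data, then sample the two variances
rng = np.random.default_rng(3)
theta = np.cumsum(rng.normal(scale=np.sqrt(0.5), size=150))
y = theta + rng.normal(scale=np.sqrt(2.0), size=150)
draws = metropolis_variances(y)
print(draws[1000:].mean(axis=0))                  # rough posterior means after burn-in
```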
So what are some applicable economic settings? SSMs are used to…
- model business cycles given various economic indicators (à la Diebold & Rudebusch).
- model the evolution of asset prices (e.g. Engle-type ARCH models)
- online forecasting in digital commerce applications (e.g. Scott & Varian)
- spotting brand trends using search data (Du & Kamakura)
Though the modeling contexts of these scenarios are quite diverse, the models all center on the assumption that there is some latent truth in each context that is measured with noise. In the above four examples, the latent states are roughly: the current state of the business cycle, an asset's underlying value as a function of market factors, expected clicks/sales, and consumer interest in a brand, respectively.
Have thoughts of novel applications of DLMs/SSMs? I’d love to hear from you.
Notes and Code (preliminary, mostly for my access, use with caution)
10/22/17 (Update #2)
Added dynamic factor model estimation notes and code, adapted from Rex Du.
7/20/17 (Update #1)
I am currently about two-thirds of the way through the text, with much of the estimation code already written in a series of Jupyter notebooks. I will update this Dropbox folder as I complete this project.