Importance Sampling for Continuous Time Bayesian Networks (2010)

by Yu Fan, Jing Xu, and Christian R. Shelton

Abstract: A continuous time Bayesian network (CTBN) uses a structured representation to describe a dynamic system with a finite number of states which evolves in continuous time. Exact inference in a CTBN is often intractable as the state space of the dynamic system grows exponentially with the number of variables. In this paper, we first present an approximate inference algorithm based on importance sampling. We then extend it to continuous-time particle filtering and smoothing algorithms. These three algorithms can estimate the expectation of any function of a trajectory, conditioned on any evidence set constraining the values of subsets of the variables over subsets of the time line. We present experimental results on both synthetic networks and a network learned from a real data set on people's life history events. We show the accuracy as well as the time efficiency of our algorithms, and compare them to other approximate algorithms: expectation propagation and Gibbs sampling.
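To make the sampling idea concrete, here is a minimal sketch of likelihood-weighted importance sampling for a single two-state continuous-time Markov chain, not the paper's actual CTBN algorithm. Trajectories are forward-sampled from the prior and each is weighted by the likelihood of a noisy observation of the state at the end time; the rate matrix Q, the observation model, and the query (expected time spent in state 1) are all hypothetical toy choices used only for illustration.

import random

# Hypothetical 2-state rate matrix: Q[i][j] is the transition intensity i -> j (i != j).
Q = [[-1.0, 1.0],
     [ 2.0, -2.0]]
T = 5.0             # length of the time interval [0, T]
OBS_STATE = 1       # noisy observation of the state at time T
OBS_ACCURACY = 0.9  # P(observe s | true state is s)

def sample_trajectory(q, t_end, x0=0):
    """Forward-sample (Gillespie-style) a trajectory as a list of (time, state) pairs."""
    t, x = 0.0, x0
    traj = [(0.0, x)]
    while True:
        dwell = random.expovariate(-q[x][x])  # exponential dwell time with rate -q[x][x]
        if t + dwell >= t_end:
            return traj
        t += dwell
        x = 1 - x  # with two states, the next state is simply the other one
        traj.append((t, x))

def time_in_state(traj, t_end, state):
    """Total time the trajectory spends in `state` over [0, t_end]."""
    total = 0.0
    for i, (t, x) in enumerate(traj):
        t_next = traj[i + 1][0] if i + 1 < len(traj) else t_end
        if x == state:
            total += t_next - t
    return total

def evidence_weight(traj):
    """Likelihood of the noisy end-time observation given the sampled trajectory."""
    final_state = traj[-1][1]
    return OBS_ACCURACY if final_state == OBS_STATE else 1.0 - OBS_ACCURACY

def estimate(num_samples=20000):
    """Self-normalized importance-sampling estimate of E[time in state 1 | evidence]."""
    weighted_sum, weight_sum = 0.0, 0.0
    for _ in range(num_samples):
        traj = sample_trajectory(Q, T)
        w = evidence_weight(traj)
        weighted_sum += w * time_in_state(traj, T, 1)
        weight_sum += w
    return weighted_sum / weight_sum

if __name__ == "__main__":
    print("Estimated E[time in state 1 | noisy observation] =", estimate())

The paper's algorithms go well beyond this sketch: they operate on a factored CTBN rather than a single chain and handle evidence that constrains variables over intervals of the time line; the code above only illustrates the basic weight-and-normalize idea for soft evidence at a single time point.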

Download Information

Yu Fan, Jing Xu, and Christian R. Shelton (2010). "Importance Sampling for Continuous Time Bayesian Networks." Journal of Machine Learning Research, 11(Aug), 2115–2140.

BibTeX citation

@article{FanXuShe10,
   author = "Yu Fan and Jing Xu and Christian R. Shelton",
   title = "Importance Sampling for Continuous Time {B}ayesian Networks",
   journal = "Journal of Machine Learning Research",
   volume = 11,
   number = "Aug",
   year = 2010,
   pages = "2115--2140",
}
