In this thesis, we consider a deep Neyman-Scott process, in which the building components of the network are all Poisson processes and the number of hidden variables is itself a random variable.
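To make the generative structure concrete, here is a minimal sketch of sampling a two-layer Neyman-Scott process on a time interval: latent "parent" events come from a homogeneous Poisson process, and each parent spawns a Poisson-distributed cluster of observed "offspring" events. The rate values, the exponential delay kernel, and the function names are illustrative assumptions, not the thesis's exact model.

```python
import math
import random

def poisson_sample(lam, rng):
    """Draw a Poisson(lam) count using Knuth's multiplication method."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def sample_poisson_process(rate, t_end, rng):
    """Event times of a homogeneous Poisson process on [0, t_end)."""
    times, t = [], 0.0
    while True:
        t += rng.expovariate(rate)
        if t >= t_end:
            return times
        times.append(t)

def sample_neyman_scott(parent_rate, offspring_mean, kernel_width, t_end, rng):
    """Two-layer Neyman-Scott process: latent parents are Poisson;
    each parent emits a Poisson number of offspring at exponentially
    distributed delays after it (an assumed kernel choice)."""
    parents = sample_poisson_process(parent_rate, t_end, rng)
    events = []
    for p in parents:
        for _ in range(poisson_sample(offspring_mean, rng)):
            e = p + rng.expovariate(1.0 / kernel_width)
            if e < t_end:
                events.append(e)
    return parents, sorted(events)
```

Stacking further Poisson layers above the parents, so that each layer's events act as cluster centers for the layer below, gives the "deep" hierarchy the abstract describes.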
We develop an efficient posterior sampling algorithm via Markov chain Monte Carlo and use it for likelihood-based inference. Our method allows for inference in sophisticated hierarchical point processes. We show in experiments that more hidden Poisson processes yield better performance for both likelihood fitting and event-type prediction. We also compare our method with recently developed neural-network-based models on real-world temporal datasets and demonstrate competitive performance for both data fitting and prediction.
We also show that our likelihood-based inference algorithm can be used to learn approximate posterior point processes that are close to the true posteriors. At prediction time, we no longer need to run Markov chain Monte Carlo to approximate the true posterior; instead, we can sample directly from the approximate posterior point processes conditioned on the observed data. When only a limited amount of running time is allowed, these direct samples yield better predictive performance than samples from Markov chain Monte Carlo.
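The contrast between the two prediction routes can be sketched as follows: an MCMC route that must burn in a chain before yielding usable latent samples, versus an amortized route that maps the observed data straight to a sampling distribution for the latents. The constant-rate amortized map, its parameters, and all function names here are hypothetical placeholders for illustration only, not the learned posterior networks of the thesis.

```python
import random

def sample_poisson_process(rate, t_end, rng):
    """Event times of a homogeneous Poisson process on [0, t_end)."""
    times, t = [], 0.0
    while True:
        t += rng.expovariate(rate)
        if t >= t_end:
            return times
        times.append(t)

def approximate_posterior_rate(observed, t_end, base_rate=0.5, boost=0.3):
    """Hypothetical amortized map from observed events to a constant
    rate for the latent parent process. A learned approximate posterior
    would condition far more richly; here the rate simply grows with
    the empirical event density."""
    return base_rate + boost * len(observed) / t_end

def sample_approximate_posterior(observed, t_end, rng):
    """Draw latent parent events directly, with no Markov chain:
    one pass from observations to a posterior sample."""
    rate = approximate_posterior_rate(observed, t_end)
    return sample_poisson_process(rate, t_end, rng)
```

The point of the amortized route is that each posterior sample costs one forward pass, whereas an MCMC chain must first mix before its samples are trustworthy, which is why the abstract reports an advantage under a fixed time budget.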
Chengkuan Hong (2022). Deep Neyman-Scott Processes. Doctoral dissertation, University of California at Riverside.
@phdthesis{Hon22,
  author     = "Chengkuan Hong",
  title      = "Deep {N}eyman-{S}cott Processes",
  school     = "University of California at Riverside",
  schoolabbr = "UC Riverside",
  year       = 2022,
  month      = sep,
}