Void Probabilities and Cauchy-Schwarz Divergence for Generalized Labeled Multi-Bernoulli Models
Date: 2017
Abstract
The generalized labeled multi-Bernoulli (GLMB) is a family of tractable models that alleviates the limitations of the Poisson family in dynamic Bayesian inference of point processes. In this paper, we derive closed-form expressions for the void probability functional and the Cauchy-Schwarz divergence for GLMBs. The proposed analytic void probability functional is a necessary and sufficient statistic that uniquely characterizes a GLMB, while the proposed analytic Cauchy-Schwarz divergence provides a tractable measure of similarity between GLMBs. We demonstrate the use of both results on a partially observed Markov decision process for GLMBs, with a Cauchy-Schwarz divergence based reward and a void probability constraint.
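The paper's closed-form GLMB expressions are not reproduced on this page, but the underlying quantity can be illustrated with the general Cauchy-Schwarz divergence between two probability densities p and q, D_CS(p, q) = -ln(⟨p, q⟩ / √(⟨p, p⟩⟨q, q⟩)). The sketch below evaluates it numerically on a 1-D grid for Gaussian densities; the grid, the `cs_divergence` helper, and the example densities are illustrative assumptions, not the paper's analytic GLMB result.

```python
import numpy as np

def cs_divergence(p, q, dx):
    """Numerical Cauchy-Schwarz divergence for densities sampled on a uniform grid.

    D_CS(p, q) = -ln( <p, q> / sqrt(<p, p> <q, q>) ), with inner products
    approximated by Riemann sums with spacing dx.
    """
    inner_pq = np.sum(p * q) * dx
    inner_pp = np.sum(p * p) * dx
    inner_qq = np.sum(q * q) * dx
    return -np.log(inner_pq / np.sqrt(inner_pp * inner_qq))

# Illustrative 1-D grid and Gaussian densities (assumed setup, not from the paper).
x = np.linspace(-10.0, 10.0, 2001)
dx = x[1] - x[0]

def gauss(mean, std):
    return np.exp(-0.5 * ((x - mean) / std) ** 2) / (std * np.sqrt(2.0 * np.pi))

p = gauss(0.0, 1.0)
print(cs_divergence(p, gauss(0.0, 1.0), dx))  # identical densities -> 0
print(cs_divergence(p, gauss(3.0, 1.0), dx))  # separated densities -> larger value
```

The divergence is zero exactly when the two densities are equal and grows as they separate, which is what makes it usable as a similarity-based reward in the sensor-control setting the abstract describes.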
Related items
Showing items related by title, author, creator and subject.
- Beard, M.; Vo, Ba-Ngu; Vo, Ba Tuong; Arulampalam, S. (2015) In this paper, we propose a method for optimal stochastic sensor control, where the goal is to minimise the estimation error in multi-object tracking scenarios. Our approach is based on an information theoretic divergence ...
- Hoang, Hung Gia; Vo, Ba-Ngu; Vo, Ba Tuong; Mahler, R. (2014) Information theoretic divergences are fundamental tools used to measure the difference between the information conveyed by two random processes. In this paper, we show that the Cauchy-Schwarz divergence between two Poisson ...
- Liu, Y.; Hoang, Hung Gia (2015) In this paper, we present a novel sensor selection technique for multi-target tracking where the sensor selection criterion is the Cauchy-Schwarz divergence between the predicted and updated densities. The proposed approach ...