Derivation of the PHD and CPHD Filters Based on Direct Kullback–Leibler Divergence Minimization



Garcia-Fernandez, Angel F. (ORCID: 0000-0002-6471-8455) and Vo, Ba-Ngu (2015) Derivation of the PHD and CPHD Filters Based on Direct Kullback–Leibler Divergence Minimization. IEEE Transactions on Signal Processing, 63 (21), pp. 5812-5820.

PHD_CPHD_derivation_accepted1.pdf - Author Accepted Manuscript (751kB)

Abstract

In this paper, we provide novel derivations of the probability hypothesis density (PHD) and cardinalized PHD (CPHD) filters without using probability generating functionals or functional derivatives. We show that both the PHD and CPHD filters fit in the context of assumed density filtering and implicitly perform Kullback-Leibler divergence (KLD) minimizations after the prediction and update steps. We perform the KLD minimizations directly on the multitarget prediction and posterior densities.
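As a brief sketch of the criterion the abstract refers to (the notation below is illustrative and not reproduced from the paper): given a multitarget predicted or posterior density p, the assumed-density approximation q* is chosen from a tractable family Q by minimizing the KLD, defined with a set integral over finite target sets X,

\[
  q^{*} = \operatorname*{arg\,min}_{q \in \mathcal{Q}} \; D(p \,\|\, q),
  \qquad
  D(p \,\|\, q) = \int p(X) \, \log \frac{p(X)}{q(X)} \, \delta X .
\]

For the PHD filter, Q is taken as the family of Poisson multitarget densities, and the minimizer matches the PHD (intensity) of p; for the CPHD filter, Q is the family of i.i.d. cluster densities, and the minimizer matches both the cardinality distribution and the PHD of p.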

Item Type: Article
Uncontrolled Keywords: Random finite sets, PHD filter, CPHD filter, multiple target tracking, Kullback-Leibler divergence
Depositing User: Symplectic Admin
Date Deposited: 03 Jan 2018 11:43
Last Modified: 15 Mar 2024 13:55
DOI: 10.1109/TSP.2015.2468677
Related URLs:
URI: https://livrepository.liverpool.ac.uk/id/eprint/3015342