Adaptive particle filtering approach to audio-visual tracking

Volkan Kilic, Mark Barnard, Wenwu Wang, Josef Kittler

    Research output: Contribution to conference › Paper › peer-review

    Abstract

    Particle filtering has emerged as a useful tool for tracking problems. However, the efficiency and accuracy of the filter usually depend on the number of particles and on the noise variance used in the estimation and propagation functions for re-allocating these particles at each iteration. Both of these parameters are specified beforehand and kept fixed in the regular implementation of the filter, which makes the tracker unstable in practice. In this paper we are interested in the design of a particle filtering algorithm that is able to adapt the number of particles and the noise variance. The new filter, which is applied to audio-visual (AV) tracking, uses information from the tracking errors to modify the number of particles and the noise variance used. Its performance is compared with a previously proposed audio-visual particle filtering algorithm with a fixed number of particles and with an existing adaptive particle filtering algorithm, using the AV 16.3 dataset with single- and multi-speaker sequences. Our proposed approach demonstrates good tracking performance with a significantly reduced number of particles. © 2013 EURASIP.
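The error-driven adaptation described in the abstract can be illustrated with a minimal sketch. The following is not the paper's actual algorithm: the 1-D random-walk model, the error thresholds (`err_lo`, `err_hi`), and the multiplicative update rules are all assumptions made for illustration. It shows the general idea of a bootstrap particle filter that grows the particle count and propagation noise when the tracking error is large, and shrinks them when the track is stable.

```python
import numpy as np

def adaptive_particle_filter(observations, n_init=100, n_min=50, n_max=500,
                             sigma_init=1.0, sigma_min=0.2, sigma_max=3.0,
                             err_lo=0.5, err_hi=2.0, obs_noise=0.5, seed=0):
    """Bootstrap particle filter for a 1-D random-walk state whose particle
    count and propagation noise adapt to the tracking error.
    Illustrative sketch only: thresholds and update rules are assumptions,
    not the scheme from the paper."""
    rng = np.random.default_rng(seed)
    n, sigma = n_init, sigma_init
    particles = rng.normal(observations[0], sigma, n)
    estimates = []
    for z in observations:
        # Propagate: random-walk dynamics with the current noise std sigma.
        particles = particles + rng.normal(0.0, sigma, particles.size)
        # Weight each particle by the likelihood of the observation
        # under assumed Gaussian observation noise.
        w = np.exp(-0.5 * ((z - particles) / obs_noise) ** 2)
        w_sum = w.sum()
        w = w / w_sum if w_sum > 0 else np.full(particles.size,
                                                1.0 / particles.size)
        est = float(np.dot(w, particles))
        estimates.append(est)
        # Tracking error drives the adaptation of n and sigma.
        err = abs(z - est)
        if err > err_hi:            # losing track: more particles, wider spread
            n = min(n_max, int(n * 1.5))
            sigma = min(sigma_max, sigma * 1.5)
        elif err < err_lo:          # locked on: save computation
            n = max(n_min, int(n * 0.9))
            sigma = max(sigma_min, sigma * 0.9)
        # Resample to the (possibly updated) particle count.
        idx = rng.choice(particles.size, size=n, p=w)
        particles = particles[idx]
    return np.array(estimates), n, sigma

# Hypothetical usage on a synthetic 1-D trajectory.
rng = np.random.default_rng(1)
truth = np.cumsum(rng.normal(0.0, 0.3, 80))
obs = truth + rng.normal(0.0, 0.5, 80)
est, n_final, sigma_final = adaptive_particle_filter(obs)
```

In the AV setting of the paper the state would be a speaker position and the likelihood would fuse audio and visual cues, but the adaptation loop follows the same pattern: a large error triggers more particles and a wider search, while a small error lets the filter run cheaply.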
    Original language: English
    Publication status: Published - Sept 2013
    Event: 21st European Signal Processing Conference - Marrakech, Morocco
    Duration: 9 Sept 2013 – 13 Sept 2013

    Conference

    Conference: 21st European Signal Processing Conference
    Period: 9/09/13 – 13/09/13

    Keywords

    • Computer science and informatics
