Audio constrained particle filter based visual tracking

  • Volkan Kilic
  • Mark Barnard
  • Wenwu Wang
  • Josef Kittler
    Research output: Contribution to conference › Paper › peer-review

    Abstract

    We present a robust and efficient audio-visual (AV) approach to speaker tracking in a room environment. A challenging problem in visual tracking is dealing with occlusions, caused by the limited field of view of cameras or by other speakers. Another challenge is associated with the particle filtering (PF) algorithm, commonly used for visual tracking, which requires a large number of particles to ensure the distribution is well modelled. In this paper, we propose a new method of fusing audio into PF-based visual tracking. We use the direction-of-arrival (DOA) angles of the audio sources to reshape the typical Gaussian noise distribution of particles in the propagation step and to weight the observation model in the measurement step. Experiments on the AV16.3 dataset show the advantage of our proposed method over the baseline PF method for tracking occluded speakers with a significantly reduced number of particles. © 2013 IEEE.
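    The two audio-fusion steps described above can be sketched in code. The following is a minimal illustration, not the paper's exact formulation: it assumes the DOA angle has already been mapped to a horizontal image coordinate `doa_x`, and the pull strength `alpha` and likelihood width `kappa` are hypothetical parameters chosen for the sketch.

    ```python
    import numpy as np

    def propagate_with_doa(particles, doa_x, sigma=5.0, alpha=0.3, rng=None):
        """Propagation step: Gaussian random walk whose noise mean is
        shifted toward the horizontal position implied by the audio DOA.
        `doa_x`, `alpha`, and the DOA-to-pixel mapping are illustrative
        assumptions, not the method as published."""
        rng = np.random.default_rng() if rng is None else rng
        drift = np.zeros_like(particles)
        drift[:, 0] = alpha * (doa_x - particles[:, 0])  # pull x toward the DOA line
        return particles + drift + rng.normal(0.0, sigma, particles.shape)

    def reweight_with_doa(weights, particles, doa_x, kappa=50.0):
        """Measurement step: down-weight particles whose x position is far
        from the DOA-implied position, i.e. multiply the visual observation
        weights by an audio likelihood term (again, a sketch)."""
        audio_lik = np.exp(-0.5 * ((particles[:, 0] - doa_x) / kappa) ** 2)
        w = weights * audio_lik
        return w / w.sum()
    ```

    Because the audio term concentrates the particle cloud around the speaker's bearing, fewer particles are needed to cover the posterior, which is consistent with the reduced particle count reported in the abstract.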
    Original language: English
    Publication status: Published - Oct 2013
    Event: 2013 IEEE International Conference on Acoustics, Speech and Signal Processing - Vancouver, Canada
    Duration: 26 May 2013 - 31 May 2013

    Conference

    Conference: 2013 IEEE International Conference on Acoustics, Speech and Signal Processing
    Period: 26/05/13 - 31/05/13

    Keywords

    • Computer science and informatics
