Calibration of Audio-Video Sensors for Multi-Modal Event Indexing
This paper addresses the coordinated use of audio and video cues to capture and index surveillance events with multimodal labels. Its focus is the development of a joint-sensor calibration technique that uses audio-visual observations to improve the calibration process. A significant feature of this approach is the ability to continuously check and update the calibration status of the sensor suite, making it resilient to independent drift in the individual sensors. We present scenarios in which this system is used to enhance surveillance.
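The abstract describes continuously checking calibration by comparing audio and video observations of the same events. As a minimal illustrative sketch (not the paper's actual method), the idea could look like the following, where each jointly detected event yields a bearing estimate from both the camera and the microphone array, and the residual between them both updates a slowly tracked offset and flags drift; all class names, parameters, and thresholds here are hypothetical.

```python
# Hypothetical sketch of continuous audio-visual calibration monitoring.
# Assumes each detected event yields a bearing (in degrees) from both the
# camera and the microphone array; names and thresholds are illustrative.

class CalibrationMonitor:
    def __init__(self, drift_threshold_deg=5.0, smoothing=0.1):
        self.offset_deg = 0.0              # estimated audio-to-video bearing offset
        self.drift_threshold = drift_threshold_deg
        self.smoothing = smoothing         # EMA weight for new observations

    def observe(self, video_bearing_deg, audio_bearing_deg):
        """Update the offset estimate from one jointly detected event.

        Returns True if the residual before this update exceeds the drift
        threshold, i.e. recalibration appears warranted."""
        residual = video_bearing_deg - (audio_bearing_deg + self.offset_deg)
        # An exponential moving average lets the offset track slow drift
        # without reacting sharply to any single noisy observation.
        self.offset_deg += self.smoothing * residual
        return abs(residual) > self.drift_threshold

monitor = CalibrationMonitor()
# Simulate a microphone array whose bearing estimates drift by 8 degrees.
flags = [monitor.observe(v, v - 8.0) for v in (10.0, 40.0, 75.0)]
print(monitor.offset_deg, flags)
```

In this toy run every event exceeds the 5-degree threshold, so each observation flags drift while the offset estimate gradually converges toward the simulated 8-degree bias.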
Showing items related by title, author, creator and subject.
Kühnapfel, Thorsten (2009) For humans, hearing is the second most important sense, after sight. Therefore, acoustic information greatly contributes to observing and analysing an area of interest. For this reason combining audio and video cues for ...
Walmsley, Corrin; Williams, Sian; Grisbrook, Tiffany; Elliott, Catherine; Imms, C.; Campbell, Amity (2018) Background: Wearable sensors are portable measurement tools that are becoming increasingly popular for the measurement of joint angle in the upper limb. With many brands emerging on the market, each with variations in ...
McAtee, Brendon Kynnie (2003) Remote sensing of land surface temperature (LST) is a complex task. From a satellite-based perspective the radiative properties of the land surface and the atmosphere are inextricably linked. Knowledge of both is required ...