Independent multimodal background subtraction

BLOISI, Domenico Daniele;
2012-01-01

Abstract

Background subtraction is a common method for detecting moving objects from static cameras that is able to achieve real-time performance. However, it is highly dependent on a good background model, particularly when dealing with dynamic scenes. In this paper, a novel real-time algorithm for creating a robust and multimodal background model is presented. The proposed approach is based on an on-line clustering algorithm to create the model and on a novel conditional update mechanism that allows an accurate foreground mask to be obtained. A quantitative comparison of the algorithm with several state-of-the-art methods on a well-known benchmark dataset is provided, demonstrating the effectiveness of the approach. © 2012 Taylor & Francis Group.
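
The abstract only outlines the approach, so the following is a minimal illustrative sketch of per-pixel multimodal background modelling with an on-line cluster update and a conditional model update; it is not the authors' IMBS implementation. The class name MultimodalBackground and the parameters K, DIST and ALPHA are assumptions introduced for the example.

import numpy as np

K = 3          # clusters kept per pixel (assumed)
DIST = 20.0    # colour distance below which a pixel matches a cluster (assumed)
ALPHA = 0.05   # learning rate for the matched cluster (assumed)

class MultimodalBackground:
    """Per-pixel background model made of K colour clusters (a sketch)."""

    def __init__(self, first_frame):
        f = first_frame.astype(np.float32)
        # Every pixel starts with K identical clusters equal to the first frame.
        self.centres = np.stack([f.copy() for _ in range(K)], axis=0)  # (K, H, W, 3)
        self.weights = np.zeros(self.centres.shape[:3], dtype=np.float32)
        self.weights[0] = 1.0

    def apply(self, frame):
        f = frame.astype(np.float32)
        dists = np.linalg.norm(self.centres - f[None], axis=-1)   # (K, H, W)
        which = dists.argmin(axis=0)                               # closest cluster per pixel
        matched = dists.min(axis=0) < DIST                         # background if close enough
        foreground = ~matched

        # Conditional update: only pixels classified as background refresh
        # their matched cluster, so foreground objects do not corrupt the model.
        for k in range(K):
            upd = matched & (which == k)
            self.centres[k][upd] = ((1 - ALPHA) * self.centres[k][upd]
                                    + ALPHA * f[upd])
            self.weights[k][upd] += 1.0

        # On-line clustering step: an unmatched colour overwrites the weakest
        # cluster, so a persistent scene change eventually becomes background.
        weakest = self.weights.argmin(axis=0)
        for k in range(K):
            new = foreground & (weakest == k)
            self.centres[k][new] = f[new]
            self.weights[k][new] = 1.0

        return (foreground * 255).astype(np.uint8)

# Example usage on a synthetic sequence (shapes are assumptions):
if __name__ == "__main__":
    frames = np.random.randint(0, 255, (10, 120, 160, 3), dtype=np.uint8)
    model = MultimodalBackground(frames[0])
    masks = [model.apply(fr) for fr in frames[1:]]
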
2012
ISBN: 9780203075371
Files in this record:
  • independent-multimodal-background-subtraction.pdf (pre-print, Adobe PDF, 1.9 MB). License: DRM not defined. Not available online; a copy can be requested.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11563/137519
Citations
  • PMC: not available
  • Scopus: 54
  • ISI Web of Science: 43