Assessment of the exportability of the use of Robust Satellite Technique (RST) for seismic areas monitoring
LISI, MARIANO
2011-01-01
Abstract
Space-time fluctuations of the Earth's emitted Thermal Infrared (TIR) radiation have been observed from satellite, months to weeks before earthquake occurrence, and reported in several studies. Among the various genetic models, an increase in greenhouse gas (e.g., CO2, CH4) emission rates has been suggested to explain the appearance of anomalous TIR signal transients in some relation with the place and time of earthquake occurrence. Among others, a Robust Satellite data analysis Technique (RST) has been proposed to discriminate normal TIR signal fluctuations (i.e., those related to changes in natural factors and/or observation conditions) from anomalous signal transients possibly associated with earthquake occurrence, and has already been implemented on different satellite sensors to study various geotectonic contexts. In this paper, the RST approach has been applied to 7 years of GOES-10/IMAGER thermal infrared observations over the western part of the United States of America at the time of the Hector Mine earthquake (16 October 1999, MW ~7.1), in order to verify the effective exportability of the methodology to a different geographic context and when applied to data from a different satellite sensor. Moreover, for the first time in this work, the Abruzzo earthquake (6 April 2009, ML ~5.8) has been analyzed by applying the RST approach to Land Surface Temperature (LST) data obtained from a geostationary satellite. Both tests have been accompanied by a control analysis performed over the same periods as the validation steps but in different years, in order to verify the absence of TIR anomalies in relatively seismically unperturbed periods (confutation phase).
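The core idea described above can be illustrated with a minimal sketch: each pixel's current TIR value is compared against the pixel-wise mean and standard deviation computed from multi-year historical imagery acquired under homogeneous observation conditions, and pixels deviating by more than a chosen number of standard deviations are flagged as anomalous. This is an illustrative simplification, not the authors' exact implementation; the function name `tir_anomaly_index`, the 2-sigma threshold, and the synthetic data are assumptions made for demonstration only.

```python
import numpy as np

def tir_anomaly_index(historical, current):
    """Normalized anomaly index per pixel: (current - mean) / std,
    where mean and std are computed over the historical time axis."""
    mu = historical.mean(axis=0)    # pixel-wise multi-year mean
    sigma = historical.std(axis=0)  # pixel-wise multi-year std
    return (current - mu) / sigma

# Synthetic example: 7 "years" of 3x3 TIR scenes (in kelvin) for one
# calendar date, with one artificially perturbed pixel in the current scene.
rng = np.random.default_rng(0)
historical = 290.0 + rng.normal(0.0, 1.0, size=(7, 3, 3))

current = historical.mean(axis=0).copy()
current[1, 1] += 5.0 * historical.std(axis=0)[1, 1]  # inject one hot pixel

index = tir_anomaly_index(historical, current)
anomalies = index > 2.0  # flag pixels more than 2 sigma above normal
print(int(anomalies.sum()))  # prints 1: only the injected pixel is flagged
```

In the published RST literature the normalization is performed on co-located, same-calendar-time, same-sensor records precisely so that "normal" variability (seasonal cycle, observation geometry) is absorbed into the reference mean and standard deviation, leaving genuine transients to stand out.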
File | Size | Format
---|---|---
Lisi - 2011 - Assessment of the exportability of the use of Robust Satellite Technique (RST) for seismic areas monitoring.pdf (type: post-print document; license: DRM not defined; not available for download) | 301.5 kB | Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.