1D elastic full-waveform inversion and uncertainty estimation by means of a hybrid genetic algorithm-Gibbs sampler approach

Aleardi, Mattia; Mazzotti, Alfredo
2017-01-01

Abstract

Stochastic optimization methods, such as genetic algorithms, search for the global minimum of the misfit function within a given parameter range and do not require any calculation of the gradients of the misfit surfaces. More importantly, these methods collect a series of models and associated likelihoods that can be used to estimate the posterior probability distribution. However, because genetic algorithms are not a Markov chain Monte Carlo method, the direct use of the genetic-algorithm-sampled models and their associated likelihoods produces a biased estimate of the posterior probability distribution. In contrast, Markov chain Monte Carlo methods, such as the Metropolis-Hastings algorithm and the Gibbs sampler, provide accurate posterior probability distributions but at considerable computational cost. In this paper, we use a hybrid method that combines the speed of a genetic algorithm to find an optimal solution with the accuracy of a Gibbs sampler to obtain a reliable estimate of the posterior probability distributions. First, we test this method on an analytical function and show that the genetic algorithm alone cannot recover the true probability distributions and tends to underestimate the true uncertainties. Conversely, combining the genetic algorithm optimization with a Gibbs sampler step enables us to recover the true posterior probability distributions. Then, we demonstrate the applicability of this hybrid method by performing one-dimensional elastic full-waveform inversions on synthetic and field data. We also discuss how an appropriate genetic algorithm implementation is essential to attenuate the "genetic drift" effect and to maximize the exploration of the model space. In fact, a wide and efficient exploration of the model space is important not only to avoid entrapment in local minima during the genetic algorithm optimization but also to ensure a reliable estimation of the posterior probability distributions in the subsequent Gibbs sampler step.
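The two-stage idea described in the abstract can be sketched on a toy analytical target. Everything here is an illustrative assumption, not the paper's actual method: the target is a simple bivariate Gaussian log-posterior (standing in for the paper's analytical test function), the genetic algorithm settings are arbitrary, and the Gibbs step exploits the exact Gaussian conditionals, which a real inversion would not have in closed form.

```python
import random
import math

# Hypothetical 2D Gaussian "posterior" (an analytical test function, not the
# paper's misfit): mean (1, -2), standard deviations (1.0, 0.5), correlation 0.8.
M1, M2, S1, S2, R = 1.0, -2.0, 1.0, 0.5, 0.8

def log_post(x1, x2):
    """Log of the (unnormalized) bivariate Gaussian density."""
    z1, z2 = (x1 - M1) / S1, (x2 - M2) / S2
    return -(z1 * z1 - 2 * R * z1 * z2 + z2 * z2) / (2 * (1 - R * R))

# Stage 1: a toy genetic algorithm locates the mode (fast global search).
def ga_optimize(pop_size=40, gens=60, seed=0):
    rng = random.Random(seed)
    pop = [(rng.uniform(-10, 10), rng.uniform(-10, 10)) for _ in range(pop_size)]
    for _ in range(gens):
        def tournament():
            a, b = rng.sample(pop, 2)
            return a if log_post(*a) > log_post(*b) else b
        children = []
        for _ in range(pop_size):
            p, q = tournament(), tournament()
            w = rng.random()  # arithmetic crossover between two parents
            c = [w * p[i] + (1 - w) * q[i] for i in (0, 1)]
            if rng.random() < 0.2:  # Gaussian mutation on one coordinate
                c[rng.randrange(2)] += rng.gauss(0, 0.5)
            children.append(tuple(c))
        pop = children
    return max(pop, key=lambda m: log_post(*m))

# Stage 2: Gibbs sampling started from the GA solution, drawing each variable
# from its exact conditional (available here only because the target is Gaussian).
def gibbs(start, n=20000, seed=1):
    rng = random.Random(seed)
    x1, x2 = start
    samples = []
    for _ in range(n):
        x1 = rng.gauss(M1 + R * S1 / S2 * (x2 - M2), S1 * math.sqrt(1 - R * R))
        x2 = rng.gauss(M2 + R * S2 / S1 * (x1 - M1), S2 * math.sqrt(1 - R * R))
        samples.append((x1, x2))
    return samples

best = ga_optimize()
chain = gibbs(best)
mean1 = sum(s[0] for s in chain) / len(chain)
std1 = (sum((s[0] - mean1) ** 2 for s in chain) / len(chain)) ** 0.5
print(mean1, std1)  # should land near the true mean 1.0 and std 1.0
```

The GA population alone would cluster around the mode and understate the spread (the bias the abstract describes); the Gibbs chain, being a proper Markov chain Monte Carlo sampler, recovers the full posterior width.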
Files in this record:

File: Geoph Prosp 2016 Aleardi Mazzotti.pdf (authorized users only)
Description: final published version
Type: Final published version
License: NOT PUBLIC - Private/restricted access
Size: 9.73 MB
Format: Adobe PDF
File: paper_GA_GS_postprint.pdf (open access)
Description: post-print
Type: Post-print document
License: All rights reserved
Size: 3.23 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11568/810500
Citations
  • Scopus: 61
  • Web of Science: 45