PAPER LIST: Bayesian inverse problem
-
Projection-based model reduction:
1.1 Parameter and State Model Reduction for Large-Scale Statistical Inverse Problems
General idea: Use projection-based model reduction (built on a greedy algorithm) to reduce the dimension of both the parameter space and the state space simultaneously. The reduced basis serves as a surrogate model, and the same basis is used to parameterize the unknown random field. Model construction can be regarded as an offline stage, since it is completed before the sampling step.
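As a concrete reference for the offline basis-construction step, here is a minimal sketch of reduced-basis compression. Note the paper uses a greedy algorithm; this sketch substitutes plain POD (truncated SVD of a snapshot matrix), and all names, sizes, and the 99% energy threshold are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical snapshot matrix: each column is a full-order state at one
# parameter sample. Built here as low-rank-plus-noise purely for illustration.
n_dof, n_snap, true_rank = 500, 40, 5
snapshots = (rng.standard_normal((n_dof, true_rank))
             @ rng.standard_normal((true_rank, n_snap))
             + 0.01 * rng.standard_normal((n_dof, n_snap)))

# POD / truncated SVD stands in for the paper's greedy basis construction.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.99)) + 1  # smallest basis with 99% energy
V = U[:, :r]                                # reduced basis, orthonormal columns

# Offline/online split: V is built once offline; during sampling, a state is
# represented by r coefficients instead of n_dof values.
x_full = snapshots[:, 0]
x_reduced = V.T @ x_full        # project into the reduced space
x_approx = V @ x_reduced        # lift back to the full space
rel_err = np.linalg.norm(x_full - x_approx) / np.linalg.norm(x_full)
```

The same orthonormal basis `V` can parameterize the unknown field, which is how both the state and parameter dimensions get reduced at once.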
1.2 Data-driven model reduction for the Bayesian solution of inverse problems
Motivation: For the purpose of inversion, the surrogate does not need high numerical accuracy over the whole parameter region. On the contrary, much computational cost is saved if the surrogate is accurate near the ground truth and less accurate elsewhere, and if it is constructed adaptively, online, as the data are harvested. Besides saving cost, the basis obtained this way also has lower dimension.
General idea: They first propose a full-target algorithm, essentially a modification of the delayed-rejection (DR) MCMC algorithm: in the first stage of DR, instead of generating a proposal state from the proposal distribution, they run a subchain whose MH ratio is defined via the reduced-order model. The outer MH ratio is computed via the full model to ensure ergodicity. The modified algorithm relaxes this point and, under some circumstances, lets the second-stage MH ratio also be computed from the reduced model.
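A minimal sketch of the screen-then-correct idea described above: a cheap reduced-model stage filters proposals, and an occasional full-model stage restores exactness. The 1D densities are toy stand-ins, and the sketch uses a single reduced-model accept/reject where the paper runs a whole subchain.

```python
import numpy as np

rng = np.random.default_rng(1)

def log_post_full(x):
    # Stand-in for the expensive full-model posterior (here: N(0, 1)).
    return -0.5 * x**2

def log_post_reduced(x):
    # Stand-in for the cheap reduced-order approximation (slightly biased).
    return -0.5 * (1.05 * x)**2

def two_stage_step(x, step=1.0):
    # Stage 1: screen the proposal using only the reduced model.
    y = x + step * rng.standard_normal()
    a1 = min(1.0, np.exp(log_post_reduced(y) - log_post_reduced(x)))
    if rng.random() >= a1:
        return x  # rejected cheaply, no full-model solve needed
    # Stage 2: correct with the full model so the chain targets the true
    # posterior (the reduced-model terms cancel the stage-1 screening bias).
    a2 = min(1.0, np.exp(log_post_full(y) - log_post_full(x)
                         + log_post_reduced(x) - log_post_reduced(y)))
    return y if rng.random() < a2 else x

chain = [0.0]
for _ in range(2000):
    chain.append(two_stage_step(chain[-1]))
```

Only proposals that survive stage 1 ever trigger a full-model evaluation, which is where the savings come from.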
MCMC algorithms
2.1 Classical MCMC algorithms:
Metropolis-Hastings MCMC
Adaptive Metropolis MCMC
Delayed Rejection MCMC
DRAM MCMC
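As a concrete reference point for this list, a minimal Adaptive Metropolis sketch (Haario-style covariance adaptation) on a toy 2D Gaussian target; the target, burn-in length, and constants are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

def log_target(x):
    # Toy target: standard 2D Gaussian (purely illustrative).
    return -0.5 * float(x @ x)

# Adaptive Metropolis: after a burn-in with a fixed proposal, the proposal
# covariance tracks the chain's running sample covariance.
d, n_steps, eps = 2, 3000, 1e-6
sd = 2.4**2 / d                      # classic AM scaling factor
x = np.zeros(d)
samples = [x.copy()]
for i in range(n_steps):
    if i < 100:
        cov = 0.1 * np.eye(d)        # fixed proposal during burn-in
    else:
        cov = sd * (np.cov(np.array(samples).T) + eps * np.eye(d))
    y = rng.multivariate_normal(x, cov)
    if np.log(rng.random()) < log_target(y) - log_target(x):
        x = y                        # Metropolis accept (symmetric proposal)
    samples.append(x.copy())
samples = np.array(samples)
```

DRAM combines this adaptation with the delayed-rejection idea: a rejected proposal gets a second, narrower try before the chain stays put.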
Two-stage / Multiscale type MCMC
3.1 Preconditioned MCMC: Preconditioning Markov Chain Monte Carlo Simulations Using Coarse-Scale Models
General idea: Coarse-grid solutions of the underlying PDE are used in a two-stage process to guide sampling and reduce the number of full-scale computations.
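The two-stage process can be summarized by its acceptance probabilities; writing $\pi$ for the fine-scale posterior, $\pi^*$ for the coarse-scale approximation, and $q$ for the proposal density, the standard form is:

```latex
% Stage 1: screen the proposal y with the coarse-scale model only.
\alpha_1(x, y) = \min\left\{ 1,\;
  \frac{\pi^*(y)\, q(x \mid y)}{\pi^*(x)\, q(y \mid x)} \right\}

% Stage 2: correct with the fine-scale model; the q-terms cancel because
% the effective stage-1 proposal already satisfies coarse detailed balance.
\alpha_2(x, y) = \min\left\{ 1,\;
  \frac{\pi(y)\, \pi^*(x)}{\pi(x)\, \pi^*(y)} \right\}
```

Proposals rejected at stage 1 never reach the fine-scale solver, which is where the preconditioning pays off.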
Learning-based parameterization method
4.1 A deep-learning-based surrogate model for data assimilation in dynamic subsurface flow problems
Their work considers 2D immiscible oil-water flow problems.
SVGD-based Bayesian inverse problems
5.1 Stein Variational Gradient Descent with local approximations
Main idea: Vanilla SVGD requires the gradient of the log target density, which can be difficult to compute in practice. Therefore, they construct a local surrogate of the target distribution via a DNN, from which the gradient is available.
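A minimal 1D SVGD sketch showing where the surrogate gradient enters the particle update. The closed-form `surrogate_grad_log_p` stands in for the paper's DNN surrogate, and the target, kernel bandwidth, and step size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

def surrogate_grad_log_p(x):
    # Stand-in for the surrogate's score; here the exact score of N(0, 1).
    # In the paper this gradient would come from a local DNN approximation.
    return -x

def svgd_step(x, eps=0.05, h=0.5):
    # RBF kernel k(a, b) = exp(-(a - b)^2 / (2 h)) and its gradient in a.
    diff = x[:, None] - x[None, :]      # (n, n) pairwise differences
    k = np.exp(-diff**2 / (2 * h))
    grad_k = -diff / h * k              # d k(x_j, x_i) / d x_j at [j, i]
    # SVGD update: kernel-weighted attraction toward high density,
    # plus a repulsive term that keeps particles spread out.
    phi = (k @ surrogate_grad_log_p(x) + grad_k.sum(axis=0)) / len(x)
    return x + eps * phi

particles = rng.uniform(-3, 3, size=50)
for _ in range(500):
    particles = svgd_step(particles)
```

Swapping the true score for a surrogate changes only `surrogate_grad_log_p`; the rest of the update is untouched, which is what makes the approach modular.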