An analysis of implicit sampling in the small-noise limit

Kevin Lin, University of Arizona
October 21st, 2015 at 3:30PM–4:30PM in 939 Evans Hall

Weighted direct samplers, also known as importance samplers, are Monte Carlo algorithms for generating independent, weighted samples from a given target probability distribution.  Such algorithms have a variety of applications in, e.g., data assimilation and state estimation problems involving stochastic and chaotic dynamics.  One challenge in designing and implementing weighted samplers is ensuring that the variance of the weights (and hence that of the resulting estimator) remains well-behaved.  In recent work, Chorin, Tu, Morzfeld, and coworkers have introduced a novel class of weighted samplers, called implicit samplers, which have been shown to possess a number of desirable properties.  In this talk, I will report on an analysis of the variance of implicit samplers in the small-noise limit, and describe a simple method (suggested by the analysis) for obtaining higher-order implicit samplers. The algorithms are compared on a number of concrete examples.  This is joint work with Jonathan Goodman and Matthias Morzfeld.
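
For readers unfamiliar with the basic idea, the sketch below illustrates generic self-normalized importance sampling: draw samples from a proposal distribution and weight each by the ratio of target to proposal densities. This is a minimal illustration only, assuming a simple Gaussian target and proposal; it is not the implicit sampler discussed in the talk.

```python
import numpy as np

# Minimal generic importance sampler (illustrative only; not the implicit
# sampler from the talk). Goal: estimate E[f(X)] for X ~ N(0, 1) using a
# deliberately wider Gaussian proposal and self-normalized weights.

rng = np.random.default_rng(0)

def target_logpdf(x):
    # Unnormalized log-density of the target, here N(0, 1)
    return -0.5 * x**2

def proposal_sample(n):
    # Proposal: N(0, 2^2), wider than the target to cover its support
    return rng.normal(0.0, 2.0, size=n)

def proposal_logpdf(x):
    # Unnormalized log-density of the proposal N(0, 2^2)
    return -0.5 * (x / 2.0) ** 2 - np.log(2.0)

n = 10_000
x = proposal_sample(n)
log_w = target_logpdf(x) - proposal_logpdf(x)
w = np.exp(log_w - log_w.max())   # subtract max for numerical stability
w /= w.sum()                      # self-normalized weights

f = x**2                          # estimate E[X^2] = 1 for N(0, 1)
estimate = np.sum(w * f)
ess = 1.0 / np.sum(w**2)          # effective sample size diagnoses weight variance
print(f"estimate of E[X^2]: {estimate:.3f}, effective sample size: {ess:.0f}")
```

The effective sample size printed at the end is one common diagnostic of the weight variance issue mentioned above: when the proposal is poorly matched to the target, a few weights dominate and the effective sample size collapses.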