Adaptive MCMC schemes for variable selection problems
Data sets with many variables (often in the hundreds, thousands, or more) are routinely collected in many disciplines. This has led to interest in variable selection in regression models with a large number of variables. A standard Bayesian approach defines a prior on the model space and uses Markov chain Monte Carlo methods to sample the posterior. Unfortunately, the size of the space (2^p if there are p variables) and the use of simple proposals in Metropolis-Hastings steps have led to samplers that mix poorly over models.
In this talk, I will describe two adaptive Metropolis-Hastings schemes which adapt an independence proposal to the posterior distribution. This leads to substantial improvements in mixing over standard algorithms on large data sets. The methods will be illustrated on simulated and real data with hundreds or thousands of possible variables. This is joint work with Krys Latuszynski (Warwick), Mark Steel (Warwick), Kitty Wan (Novartis), Doug Robinson (Novartis) and David Morris (KCL).
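To fix ideas, the following is a minimal sketch (not the speaker's actual algorithm) of an adaptive independence Metropolis-Hastings sampler over binary inclusion vectors gamma in {0,1}^p. The proposal includes each variable independently with probability A[j], and A is adapted toward the running estimate of the posterior inclusion probabilities; the names (log_post, A, pi_hat) and the toy posterior are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n_iter = 10, 2000

# Toy log-posterior over models: favours including the first three
# variables and excluding the rest (purely illustrative).
weights = np.array([2.0] * 3 + [-2.0] * (p - 3))

def log_post(gamma):
    return gamma @ weights

def log_q(gamma, A):
    # Log density of an independent Bernoulli(A) proposal.
    return np.sum(gamma * np.log(A) + (1 - gamma) * np.log(1 - A))

A = np.full(p, 0.5)                 # proposal inclusion probabilities
gamma = rng.integers(0, 2, p).astype(float)
incl_sum = np.zeros(p)

for t in range(1, n_iter + 1):
    prop = (rng.random(p) < A).astype(float)
    # Independence-sampler acceptance ratio:
    # pi(prop) q(gamma) / (pi(gamma) q(prop)), on the log scale.
    log_accept = (log_post(prop) - log_post(gamma)
                  + log_q(gamma, A) - log_q(prop, A))
    if np.log(rng.random()) < log_accept:
        gamma = prop
    incl_sum += gamma
    # Adapt the proposal toward estimated inclusion probabilities,
    # clipped away from 0 and 1 so every model stays reachable.
    A = np.clip((incl_sum + 1) / (t + 2), 0.01, 0.99)

pi_hat = incl_sum / n_iter          # estimated inclusion probabilities
```

Because the proposal probabilities converge as the inclusion estimates stabilise, the adaptation is diminishing, which is one standard way such schemes preserve ergodicity; the clipping keeps the proposal from degenerating onto a single model.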
Professor Jim Griffin