In this work, we propose a novel approximated collapsed variational Bayes approach to model selection in linear regression. The approximated collapsed variational Bayes algorithm improves on mean field variational Bayes by marginalizing over a subset of parameters and applying mean field variational Bayes to the remaining parameters, in a fashion analogous to collapsed Gibbs sampling. We show that, under typical regularity assumptions, the proposed algorithm (a) includes variables in the true underlying model at an exponential rate in the sample size, and (b) excludes variables not in the true model at least at a first-order rate in the sample size. Simulation studies show that the performance of the proposed method is close to that of a particular Markov chain Monte Carlo sampler and a path-search-based variational Bayes algorithm, while requiring an order of magnitude less time. The proposed method is also highly competitive with penalized methods, expectation propagation, stepwise AIC/BIC, BMS, and EMVS under various settings. Supplementary materials for the article are available online.
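To make the variational ingredients concrete, the following is a minimal sketch of a standard coordinate-ascent mean field variational Bayes scheme for spike-and-slab linear regression, the generic (non-collapsed) baseline the paper improves on. It is not the authors' approximated collapsed algorithm; the prior settings (`sigma2`, `sa`, `pi`) and update forms follow the common mean field treatment, and all names are illustrative.

```python
import numpy as np

def cavi_spike_slab(X, y, sigma2=1.0, sa=1.0, pi=0.5, n_iter=50):
    """Coordinate-ascent mean field VB for spike-and-slab linear regression.

    Model (illustrative): y = X b + e, e ~ N(0, sigma2 I),
    b_j = z_j * beta_j with beta_j ~ N(0, sigma2 * sa), z_j ~ Bernoulli(pi).
    Returns posterior inclusion probabilities alpha and conditional means mu.
    """
    n, p = X.shape
    xx = np.sum(X**2, axis=0)             # x_j' x_j for each column
    alpha = np.full(p, pi)                # q(z_j = 1): inclusion probability
    mu = np.zeros(p)                      # E[beta_j | z_j = 1]
    s2 = sigma2 / (xx + 1.0 / sa)         # posterior variance given inclusion
    Xr = X @ (alpha * mu)                 # current fitted values
    logit_pi = np.log(pi / (1.0 - pi))
    for _ in range(n_iter):
        for j in range(p):
            Xr -= X[:, j] * (alpha[j] * mu[j])   # remove j's contribution
            mu[j] = s2[j] / sigma2 * (X[:, j] @ (y - Xr))
            u = (logit_pi
                 + 0.5 * np.log(s2[j] / (sigma2 * sa))
                 + mu[j] ** 2 / (2.0 * s2[j]))
            alpha[j] = 1.0 / (1.0 + np.exp(-u))  # updated inclusion prob.
            Xr += X[:, j] * (alpha[j] * mu[j])   # restore contribution
    return alpha, mu
```

On simulated data with a few strong predictors, the inclusion probabilities `alpha` concentrate near 1 for active variables and near 0 for inactive ones, mirroring the inclusion/exclusion behavior the abstract describes; the collapsed variant studied in the paper additionally marginalizes a subset of parameters before applying such mean field updates.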
Keywords:
- Collapsed Gibbs sampling
- Markov chain Monte Carlo