TY - JOUR
T1 - An Approximated Collapsed Variational Bayes Approach to Variable Selection in Linear Regression
AU - You, Chong
AU - Ormerod, John T.
AU - Li, Xiangyang
AU - Pang, Cheng Heng
AU - Zhou, Xiao Hua
N1 - Publisher Copyright:
© 2023 American Statistical Association, Institute of Mathematical Statistics, and Interface Foundation of North America.
PY - 2023
Y1 - 2023
N2 - In this work, we propose a novel approximated collapsed variational Bayes approach to model selection in linear regression. The approximated collapsed variational Bayes algorithm improves on mean field variational Bayes by marginalizing over a subset of parameters and applying mean field variational Bayes to the remaining parameters, in a fashion analogous to collapsed Gibbs sampling. We show that, under typical regularity assumptions, the proposed algorithm (a) includes variables in the true underlying model at an exponential rate in the sample size, and (b) excludes variables not in the true model at least at a first-order rate in the sample size. Simulation studies show that the performance of the proposed method is close to that of a particular Markov chain Monte Carlo sampler and a path-search-based variational Bayes algorithm, but requires an order of magnitude less time. The proposed method is also highly competitive with penalized methods, expectation propagation, stepwise AIC/BIC, BMS, and EMVS under various settings. Supplementary materials for the article are available online.
AB - In this work, we propose a novel approximated collapsed variational Bayes approach to model selection in linear regression. The approximated collapsed variational Bayes algorithm improves on mean field variational Bayes by marginalizing over a subset of parameters and applying mean field variational Bayes to the remaining parameters, in a fashion analogous to collapsed Gibbs sampling. We show that, under typical regularity assumptions, the proposed algorithm (a) includes variables in the true underlying model at an exponential rate in the sample size, and (b) excludes variables not in the true model at least at a first-order rate in the sample size. Simulation studies show that the performance of the proposed method is close to that of a particular Markov chain Monte Carlo sampler and a path-search-based variational Bayes algorithm, but requires an order of magnitude less time. The proposed method is also highly competitive with penalized methods, expectation propagation, stepwise AIC/BIC, BMS, and EMVS under various settings. Supplementary materials for the article are available online.
KW - Collapsed Gibbs sampling
KW - Consistency
KW - Markov chain Monte Carlo
UR - http://www.scopus.com/inward/record.url?scp=85147678802&partnerID=8YFLogxK
U2 - 10.1080/10618600.2022.2149539
DO - 10.1080/10618600.2022.2149539
M3 - Article
AN - SCOPUS:85147678802
SN - 1061-8600
JO - Journal of Computational and Graphical Statistics
JF - Journal of Computational and Graphical Statistics
ER -