Method of Contraction-Expansion (MOCE) for Simultaneous Post-Model Selection Inference in Linear Models
Seminar/Forum
Abstract: Simultaneous inference after model selection is of critical importance for addressing scientific hypotheses involving a set of parameters. We consider a high-dimensional linear regression model in which a regularization procedure such as LASSO is applied to yield a sparse model. To establish simultaneous post-model selection inference, we propose a method of contraction and expansion (MOCE) along the line of debiased estimation that enables us to balance the bias-variance trade-off so that the super-sparsity assumption may be relaxed. We establish key theoretical results for the proposed MOCE procedure, from which the expanded model can be selected with theoretical guarantees and simultaneous confidence regions can be constructed via the joint asymptotic normal distribution. In comparison with existing methods, our proposed method exhibits stable and reliable coverage at a nominal significance level with substantially less computational burden, and thus it is trustworthy for application to real-world problems. This is joint work with Wang, Zhou and Tang.
Presenter

Professor Peter Song, University of Michigan