On the PaTh to Greatness

D. Vats and R. G. Baraniuk, “Path Thresholding: Asymptotically Tuning-Free High-Dimensional Sparse Regression,” in Proceedings of the 17th International Conference on Artificial Intelligence and Statistics (AISTATS), 2014.

Abstract: In this paper, we address the challenging problem of selecting tuning parameters for high-dimensional sparse regression. We propose a simple and computationally efficient method, called path thresholding (PaTh), which transforms any tuning parameter-dependent sparse regression algorithm into an asymptotically tuning-free sparse regression algorithm. More specifically, we prove that, as the problem size becomes large (in the number of variables and in the number of observations), PaTh performs accurate sparse regression, under appropriate conditions, without specifying a tuning parameter. In finite-dimensional settings, we demonstrate that PaTh can alleviate the computational burden of model selection algorithms by significantly reducing the search space of tuning parameters.

The figures above illustrate the advantages of using PaTh on real data. The first figure applies the forward-backward (FoBa) sparse regression algorithm to the UCI crime data; the horizontal axis shows the sparsity level and the vertical axis shows the coefficient values. The second figure applies PaTh to the solution path from the first figure, reducing the total number of solutions from 50 to 4. We observe similar trends for the gene data (last two figures).
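The core idea can be sketched in a few lines of code. The snippet below is a minimal illustration, not the paper's exact criterion: it builds a solution path with a simple greedy forward-selection routine (standing in for FoBa or any other path algorithm) and then applies a PaTh-style stopping rule that keeps adding variables only while the drop in residual energy exceeds a threshold scaled by the estimated noise level and log(p). The constant `c` and the exact form of the threshold are simplified assumptions here; see the paper for the precise rule and its guarantees.

```python
import numpy as np

def forward_selection_path(X, y, max_k):
    """Greedy forward selection: returns a list of (support, coefficients)
    pairs for supports of size 1, 2, ..., max_k."""
    support, residual, path = [], y.copy(), []
    for _ in range(max_k):
        scores = np.abs(X.T @ residual)        # correlation with residual
        scores[support] = -np.inf              # exclude chosen variables
        support.append(int(np.argmax(scores)))
        Xs = X[:, support]
        beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)  # refit on support
        residual = y - Xs @ beta
        path.append((list(support), beta.copy()))
    return path

def path_threshold(X, y, path, c=2.0):
    """PaTh-style rule (simplified sketch): walk down the solution path and
    stop at the first solution where adding one more variable reduces the
    residual energy by less than c * sigma_hat^2 * log(p)."""
    n, p = X.shape
    for k in range(len(path) - 1):
        support, beta = path[k]
        res_k = y - X[:, support] @ beta
        s_next, b_next = path[k + 1]
        res_next = y - X[:, s_next] @ b_next
        sigma2_hat = np.sum(res_k ** 2) / n    # noise estimate at this step
        drop = np.sum(res_k ** 2) - np.sum(res_next ** 2)
        if drop < c * sigma2_hat * np.log(p):  # drop looks like noise: stop
            return support, beta
    return path[-1]
```

On a synthetic problem with a few strong nonzero coefficients, the rule halts the path shortly after the true support is covered, which is exactly the pruning behavior the figures show on the crime and gene data.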

Software: http://dsp.rice.edu/software/path
