## The Horseshoe Estimator for Sparse Signals

### Carlos M. Carvalho, Nicholas G. Polson, and James G. Scott

### University of Chicago GSB and Duke University

*October 2008*

This paper proposes a new approach to sparse-signal detection called the horseshoe estimator. The horseshoe is a close cousin of other widely used Bayes rules arising from, for example, double-exponential and Cauchy priors, in that it is a member of the same family of multivariate scale mixtures of normals. Its advantage, however, is its robustness at handling unknown sparsity and large outlying signals. We give analytical results showing why the horseshoe is a good default for robust estimation of sparse normal means, along with a
new representation theorem for the posterior mean under normal scale mixtures. This theorem is related to classic results of Stein and Masreliez, and gives qualitative insight into some aspects of robust Bayesian analysis in sparse settings. Most importantly, we show that the horseshoe estimator corresponds quite closely to the answers one would get by pursuing a full Bayesian model-averaging approach using a traditional `spike and slab' prior to model signals and noise. This correspondence holds both for the posterior mean itself and
for the classification rule induced by a simple thresholding scheme, meaning that the resulting `thresholded horseshoe' can also be viewed as a novel Bayes multiple-testing procedure.
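As a concrete illustration of the scale-mixture construction described above (this sketch is not from the paper itself): under the horseshoe prior, each mean has a local scale drawn from a half-Cauchy, and the implied shrinkage weight follows a Beta(1/2, 1/2) law, whose U-shaped density gives the estimator its name, piling mass near 0 (signals left alone) and near 1 (noise shrunk to zero).

```python
import random
import math

# Hedged sketch: theta_i ~ N(0, lambda_i^2) with lambda_i ~ C+(0, 1) (half-Cauchy).
# The implied shrinkage weight kappa_i = 1 / (1 + lambda_i^2) is then
# Beta(1/2, 1/2), whose "horseshoe"-shaped density is unbounded at 0 and 1.

random.seed(0)

def sample_kappa():
    # Standard Cauchy via inverse CDF; |.| gives a half-Cauchy(0, 1) draw.
    lam = abs(math.tan(math.pi * (random.random() - 0.5)))
    return 1.0 / (1.0 + lam * lam)

draws = [sample_kappa() for _ in range(100_000)]

# Mass concentrates near the endpoints, not the middle.
near_zero = sum(k < 0.1 for k in draws) / len(draws)
near_one = sum(k > 0.9 for k in draws) / len(draws)
middle = sum(0.45 < k < 0.55 for k in draws) / len(draws)
print(near_zero, near_one, middle)
```

The Beta(1/2, 1/2) CDF is (2/pi)·arcsin(sqrt(x)), so roughly 20% of the mass lands in each 0.1-wide end bin versus about 6% in a same-width bin at the center, which is the robustness-to-both-sparsity-and-large-signals behavior the abstract describes.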

Keywords: shrinkage; lasso; Bayesian lasso; multiple testing; empirical Bayes

The manuscript is available in PDF format.

Cite as:

```
@TechReport{Carvalho:Polson:Scott:2008a,
  Author      = "Carlos M.~Carvalho and Nicholas G.~Polson and James G.~Scott",
  Title       = "The horseshoe estimator for sparse signals",
  Year        = 2008,
  Institution = "Duke University Department of Statistical Science",
  Type        = "Discussion Paper",
  Number      = "2008-31",
}
```