Felix Abramovich, Yoav Benjamini (Tel Aviv University, Israel),

David Donoho, Iain Johnstone (Stanford University, USA)

Adapting to unknown sparsity by control of the False Discovery Rate

We consider the problem of recovering a high-dimensional vector observed in white noise, where the vector is known to be sparse, but the degree of sparsity is unknown. We consider three different ways of defining sparsity of a vector: using the fraction of nonzero terms; imposing power-law decay bounds on the ordered entries; and controlling the ℓ_p norm for p small. We obtain a procedure which is asymptotically minimax for ℓ_r loss, simultaneously throughout a range of such sparsity classes.

The simultaneous asymptotic minimaxity is achieved by a data-adaptive thresholding scheme, based on controlling the False Discovery Rate (FDR). FDR control is a recent innovation in simultaneous testing, in which one seeks to ensure that at most a certain expected fraction of the rejected null hypotheses will correspond to false rejections. In our treatment, the FDR control parameter q plays an informative role in understanding how to achieve asymptotic minimaxity. Our results say that letting q = q_n → 0 with problem size n is sufficient for asymptotic minimaxity, while keeping q fixed above 1/2 prevents asymptotic minimaxity. To our knowledge, this relation between ideas in simultaneous inference and asymptotic decision theory is new.

Our work suggests insights about a class of model selection procedures which has been introduced recently by several authors. These new procedures are based on complexity penalization of the form 2σ²k log(n/k). We exhibit a close connection to FDR-controlling procedures with q tending to 0, which strongly supports a conjecture of simultaneous asymptotic minimaxity for such procedures.
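
To make the data-adaptive thresholding scheme concrete, the Python sketch below illustrates Benjamini-Hochberg-style FDR thresholding as described above: compare the ordered two-sided p-values of the observations with the line q*i/n and hard-threshold at the magnitude of the last observation crossing it. This is an illustration under our own assumptions, not the authors' implementation; the function name fdr_threshold_estimate, its parameters and the simulated example are ours.

import numpy as np
from scipy.stats import norm

def fdr_threshold_estimate(y, sigma=1.0, q=0.05):
    # Sketch of an FDR-driven hard-thresholding rule (Benjamini-Hochberg style):
    # compare the ordered two-sided p-values of |y_i|/sigma with the line q*i/n
    # and threshold at the magnitude of the last observation crossing it.
    n = len(y)
    p = 2 * norm.sf(np.abs(y) / sigma)          # p-values for H0: mu_i = 0
    p_sorted = np.sort(p)
    passes = p_sorted <= q * np.arange(1, n + 1) / n
    if not passes.any():
        return np.zeros_like(y)                 # no rejections: estimate is zero
    k = int(np.nonzero(passes)[0].max()) + 1    # largest index below the BH line
    t = np.sort(np.abs(y))[::-1][k - 1]         # k-th largest |y_i| sets the threshold
    return np.where(np.abs(y) >= t, y, 0.0)     # hard thresholding

# Illustrative use on a simulated sparse vector in white noise.
rng = np.random.default_rng(0)
mu = np.zeros(1000)
mu[:20] = 5.0
y = mu + rng.standard_normal(1000)
muhat = fdr_threshold_estimate(y, sigma=1.0, q=0.05)
print(np.count_nonzero(muhat))                  # number of retained coordinates

In the asymptotic regime discussed in the abstract, q would be taken as a sequence q_n tending to 0 with the problem size n rather than as a fixed constant.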