This paper presents a study of Sparse Linear Regression (SLR), the statistical problem of predicting a response vector as a sparse linear combination of features, up to small arbitrary noise. The authors provide evidence for the average-case hardness of SLR against all efficient algorithms, assuming the worst-case hardness of lattice problems. They do this by reducing a variant of the bounded distance decoding (BDD) problem on lattices to SLR. The work also establishes hardness of outputting any good solution in the unidentifiable regime, where many sparse solutions exist, again assuming the worst-case hardness of standard and well-studied lattice problems.
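As a brief sketch of the problem setup described above (the notation here is illustrative and assumed, not taken from the paper), SLR can be written as recovering a sparse coefficient vector from noisy linear measurements:

```latex
% Sparse Linear Regression (illustrative notation, not the paper's):
% given a design matrix X \in \mathbb{R}^{m \times n} and a response
% y \in \mathbb{R}^m, find a k-sparse \theta close to an unknown \theta^*.
y = X\theta^* + w,
\qquad \|\theta^*\|_0 \le k,
\qquad \|w\|_2 \le \epsilon,
```

where $\|\theta^*\|_0$ counts the nonzero entries of $\theta^*$ and the noise $w$ is small but otherwise arbitrary. In the unidentifiable regime mentioned above, many distinct $k$-sparse vectors $\theta$ are consistent with $(X, y)$ up to the noise level, and the hardness result concerns outputting any one of them.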


Publication date: 22 Feb 2024