Tight Lower Bounds for Locally Differentially Private Selection.
We prove a tight lower bound (up to constant factors) on the sample complexity of any non-interactive local differentially private protocol for optimizing a linear function over the simplex. This lower bound also implies a tight lower bound (again, up to constant factors) on the sample complexity of any non-interactive local differentially private protocol implementing the exponential mechanism. These results reveal that any local protocol for these problems has exponentially worse dependence on the dimension than corresponding algorithms in the central model. Previously, Kasiviswanathan et al. (FOCS 2008) proved an exponential separation between local and central model algorithms for PAC learning the class of parity functions. In contrast, our lower bounds are quantitatively tight, apply to a simple and natural class of linear optimization problems, and our techniques are arguably simpler.
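For readers unfamiliar with the central-model baseline the abstract compares against, the following is a minimal sketch of the standard exponential mechanism (McSherry–Talwar) for private selection. The function name, the unit-sensitivity default, and the toy scores are illustrative assumptions, not part of the paper.

```python
import math
import random

def exponential_mechanism(scores, epsilon, sensitivity=1.0):
    """Sample an index i with probability proportional to
    exp(epsilon * scores[i] / (2 * sensitivity)).

    This is the central-model selection primitive whose local-model
    simulation the paper lower-bounds.
    """
    # Shift by the max score for numerical stability; the shift
    # cancels in the normalization and does not change the distribution.
    m = max(scores)
    weights = [math.exp(epsilon * (s - m) / (2 * sensitivity))
               for s in scores]
    total = sum(weights)
    r = random.random() * total
    for i, w in enumerate(weights):
        r -= w
        if r <= 0:
            return i
    return len(scores) - 1  # guard against floating-point underflow

# Higher-scoring candidates are exponentially more likely to be selected.
choice = exponential_mechanism([0.1, 0.2, 5.0], epsilon=2.0)
```

In the central model this mechanism selects a near-optimal candidate with sample complexity only logarithmic in the number of candidates; the paper's result is that any non-interactive local protocol simulating it must pay an exponentially worse dependence on the dimension.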