Lower Bounds for the MMSE via Neural Network Estimation and Their Applications to Privacy
Abstract
The minimum mean-square error (MMSE) achievable by optimal estimation of a random variable $Y\in\mathbb{R}$ given another random variable $X\in\mathbb{R}^{d}$ is of much interest in a variety of statistical settings. In the context of estimation-theoretic privacy, the MMSE has been proposed as an information leakage measure that captures the ability of an adversary to estimate $Y$ upon observing $X$. In this paper, we establish provable lower bounds for the MMSE based on a two-layer neural network estimator of the MMSE and the Barron constant of an appropriate function of the conditional expectation of $Y$ given $X$. Furthermore, we derive a general upper bound for the Barron constant that, when $X\in\mathbb{R}$ is post-processed by the additive Gaussian mechanism and $Y$ is binary, produces order-optimal estimates in the large noise regime. In order to obtain numerical lower bounds for the MMSE in some concrete applications, we introduce an efficient optimization process that approximates the value of the proposed neural network estimator. Overall, we provide an effective machinery to obtain provable lower bounds for the MMSE.
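To make the setting concrete, the following sketch (not the paper's algorithm) illustrates the quantities involved: a binary $Y$ is observed through the additive Gaussian mechanism $X = Y + N(0,\sigma^2)$, a two-layer ReLU network is trained to predict $Y$ from $X$, and its mean-square error, which upper-bounds the MMSE, is compared against the true MMSE $\mathbb{E}[\mathrm{Var}(Y\mid X)]$, computable in closed form here. All network sizes, learning rates, and iteration counts are arbitrary illustrative choices.

```python
import numpy as np

# Illustrative sketch only: binary Y observed through the additive
# Gaussian mechanism X = Y + N(0, sigma^2).
rng = np.random.default_rng(0)
sigma, n = 1.0, 4000
Y = rng.integers(0, 2, size=n).astype(float)   # fair binary secret
X = Y + sigma * rng.normal(size=n)             # noisy observation

# Two-layer network: x -> relu(x W1 + b1) W2 + b2, trained by
# full-batch gradient descent on the squared error.
k = 32
W1 = rng.normal(size=(1, k)) * 0.5
b1 = np.zeros(k)
W2 = rng.normal(size=k) * 0.1
b2 = 0.0
x = X[:, None]
lr = 0.1
for _ in range(3000):
    h = np.maximum(x @ W1 + b1, 0.0)           # hidden activations, (n, k)
    pred = h @ W2 + b2
    g = 2.0 * (pred - Y) / n                   # d(mean squared error)/d(pred)
    W2 -= lr * (h.T @ g)
    b2 -= lr * g.sum()
    dh = np.outer(g, W2) * (h > 0)             # backprop through ReLU
    W1 -= lr * (x.T @ dh)
    b1 -= lr * dh.sum(axis=0)

# Any estimator's MSE upper-bounds the MMSE.
h = np.maximum(x @ W1 + b1, 0.0)
pred = h @ W2 + b2
mse = np.mean((pred - Y) ** 2)

# Ground truth for comparison: by Bayes' rule E[Y | X = x] is the logistic
# function of (2x - 1)/(2 sigma^2), and MMSE = E[p(X)(1 - p(X))],
# estimated here by Monte Carlo.
p = 1.0 / (1.0 + np.exp(-(2.0 * X - 1.0) / (2.0 * sigma**2)))
mmse_true = np.mean(p * (1.0 - p))
print(f"NN MSE ~= {mse:.3f}, true MMSE ~= {mmse_true:.3f}")
```

The gap between the network's MSE and the true MMSE is the approximation/estimation error that the paper's Barron-constant machinery controls in order to turn the upper bound into a provable lower bound.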
Publication: arXiv e-prints
Pub Date: August 2021
DOI: 10.48550/arXiv.2108.12851
arXiv: arXiv:2108.12851
Bibcode: 2021arXiv210812851D
Keywords: Computer Science - Information Theory
E-Print: 42 pages