A Smooth Binary Mechanism for Efficient Private Continual Observation
Abstract
In privacy under continual observation we study how to release differentially private estimates based on a dataset that evolves over time. The problem of releasing private prefix sums of $x_1,x_2,x_3,\dots \in\{0,1\}$ (where the value of each $x_i$ is to be private) is particularly well-studied, and a generalized form is used in state-of-the-art methods for private stochastic gradient descent (SGD). The seminal binary mechanism privately releases the first $t$ prefix sums with noise of variance polylogarithmic in $t$. Recently, Henzinger et al. and Denisov et al. showed that it is possible to improve on the binary mechanism in two ways: the variance of the noise can be reduced by a (large) constant factor, and also made more even across time steps. However, their algorithms for generating the noise distribution are not as efficient as one would like in terms of computation time and (in particular) space. We address the efficiency problem by presenting a simple alternative to the binary mechanism in which 1) generating the noise takes constant average time per value, 2) the variance is reduced by a factor of about 4 compared to the binary mechanism, and 3) the noise distribution at each step is identical. Empirically, a simple Python implementation of our approach outperforms the running time of the approach of Henzinger et al., as well as an attempt to improve their algorithm using high-performance algorithms for multiplication with Toeplitz matrices.
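For context, the classic binary mechanism that the paper improves upon can be sketched as follows. This is a minimal stdlib-only Python illustration (not the paper's smooth mechanism): each prefix sum is assembled from $O(\log t)$ noisy dyadic-interval counts, giving the polylogarithmic variance mentioned above. The per-node noise scale `levels / epsilon` is a simple composition bound and is an assumption of this sketch, not a claim from the paper.

```python
import math
import random

def laplace(scale):
    """Sample from Laplace(0, scale) via inverse CDF (stdlib only)."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def binary_mechanism(stream, epsilon):
    """Release eps-DP prefix sums of a 0/1 stream (Chan-Shi-Song style).

    Each output is a sum of O(log T) noisy dyadic-interval counts,
    so the error variance is polylogarithmic in T.
    """
    T = len(stream)
    levels = max(1, math.ceil(math.log2(T + 1)))
    scale = levels / epsilon            # per-node Laplace scale (assumed split)
    alpha = [0.0] * (levels + 1)        # exact partial sums per tree level
    alpha_hat = [0.0] * (levels + 1)    # their noisy released versions
    outputs = []
    for t in range(1, T + 1):
        x = stream[t - 1]
        i = (t & -t).bit_length() - 1   # index of the lowest set bit of t
        # the interval at level i closes now; fold in lower levels + x_t
        alpha[i] = sum(alpha[:i]) + x
        for j in range(i):
            alpha[j] = 0.0
            alpha_hat[j] = 0.0
        alpha_hat[i] = alpha[i] + laplace(scale)
        # prefix-sum estimate: noisy counts at the set bits of t
        est = sum(alpha_hat[j] for j in range(levels + 1) if (t >> j) & 1)
        outputs.append(est)
    return outputs
```

Note that the noise magnitude here varies with the number of set bits of $t$, which is exactly the unevenness across time steps that the paper's smooth mechanism removes.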
Publication: arXiv e-prints
Pub Date: June 2023
DOI: 10.48550/arXiv.2306.09666
arXiv: arXiv:2306.09666
Bibcode: 2023arXiv230609666A
Keywords: Computer Science - Machine Learning; Computer Science - Cryptography and Security; Computer Science - Data Structures and Algorithms
E-Print: Appeared at NeurIPS 2023