Distributed Matrix-Vector Multiplication with Sparsity and Privacy Guarantees
Abstract
We consider the problem of designing a coding scheme that preserves both sparsity and privacy for distributed matrix-vector multiplication. Perfect information-theoretic privacy requires encoding the input sparse matrices into matrices that are distributed uniformly at random over the considered alphabet, which destroys their sparsity. Matrix-vector multiplication is known to be fast for sparse matrices; distributing the computation over the non-sparse encoded matrices therefore maintains privacy but introduces artificial computing delays. In this work, we relax the privacy constraint and show that a certain level of sparsity can be maintained in the encoded matrices. We consider the chief/worker setting with two clusters of workers: a completely untrusted cluster, in which all workers may collude to eavesdrop on the input matrix and for which perfect privacy must be guaranteed; and a partly trusted cluster, in which at most $z$ workers may collude and to which a small amount of information about the input matrix may be revealed. We design a scheme that trades sparsity for privacy while satisfying the desired constraints, and we use cyclic task assignments of the encoded matrices to tolerate partial and full stragglers.
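To make the sparsity-versus-privacy tension concrete, the following is a minimal Python sketch, not the paper's actual construction. It pads a sparse matrix over a prime field with a uniformly random one-time pad (perfect privacy, sparsity destroyed) and, alternatively, with a sparse random pad (sparsity largely kept, but zero entries of the pad leak the corresponding entries of the input). It also shows a simple cyclic replication of encoded blocks across workers of the kind used for straggler tolerance. All parameters (field size, densities, replica count) are illustrative assumptions.

```python
import numpy as np

q = 2**31 - 1  # prime field size (assumption)
rng = np.random.default_rng(0)

def sparse_matrix(rows, cols, density=0.05):
    """Generate a random sparse matrix over GF(q) with the given density."""
    mask = rng.random((rows, cols)) < density
    return (rng.integers(0, q, (rows, cols)) * mask) % q

A = sparse_matrix(8, 8)

# Perfect privacy: pad A with a uniformly random matrix R (one-time pad).
# A + R is uniform over GF(q), so it leaks nothing about A -- but it is
# dense, so the sparse-multiplication speedup at the workers is lost.
R = rng.integers(0, q, A.shape)
share_untrusted = (A + R) % q
print("density of A:    ", np.count_nonzero(A) / A.size)
print("density of A + R:", np.count_nonzero(share_untrusted) / A.size)

# Relaxed privacy: pad with a *sparse* random matrix instead. The share
# stays mostly sparse (fast to multiply), but wherever the pad is zero the
# share equals A, i.e., a small amount of information is revealed.
R_sparse = sparse_matrix(*A.shape, density=0.2)
share_trusted = (A + R_sparse) % q
print("density of sparse share:", np.count_nonzero(share_trusted) / A.size)

# Cyclic task assignment (assumption: row-blocks of the encoded matrix are
# replicated cyclically so that non-straggling workers still cover every
# block, tolerating partial and full stragglers).
def cyclic_assignment(num_blocks, num_workers, replicas):
    """Assign block b to workers b, b+1, ..., b+replicas-1 (mod num_workers)."""
    return {b: [(b + r) % num_workers for r in range(replicas)]
            for b in range(num_blocks)}

print(cyclic_assignment(num_blocks=6, num_workers=6, replicas=3))
```

Running the sketch shows the one-time-padded share at essentially full density while the sparsely padded share stays close to the combined density of the input and the pad, which is the trade-off the relaxed privacy constraint exploits.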
- Publication: arXiv e-prints
- Pub Date: March 2022
- DOI: 10.48550/arXiv.2203.01728
- arXiv: arXiv:2203.01728
- Bibcode: 2022arXiv220301728X
- Keywords: Computer Science - Information Theory
- E-Print: 6 pages, 2 figures, submitted for review at ISIT 2022