Distributed Approximation of Functions over Fast Fading Channels with Applications to Distributed Learning and the Max-Consensus Problem
In this work, we consider the problem of distributed approximation of functions over multiple-access channels with additive noise. In contrast to previous works, we take fast fading into account and give explicit probability bounds for the approximation error, which allow us to derive bounds on the number of channel uses needed to approximate a function up to a given accuracy. Neither the fading nor the noise process is limited to Gaussian distributions. Instead, we consider sub-Gaussian random variables, which include Gaussian distributions as well as many others of practical relevance. The results are motivated by, and have immediate applications to, a) computing predictors in models for distributed machine learning and b) the max-consensus problem in ultra-dense networks.
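The core idea of approximating a function over a noisy multiple-access channel can be illustrated with a small simulation. The following is a hypothetical sketch, not the paper's scheme: all nodes transmit their pre-processed values simultaneously, the receiver observes a fading-weighted superposition plus additive noise, and averaging over repeated channel uses concentrates the estimate around the true sum. The function and parameter names (`ota_sum_estimate`, `fade_std`, `noise_std`) are illustrative assumptions; Gaussian fading and noise are used here as one example of sub-Gaussian distributions.

```python
import numpy as np

rng = np.random.default_rng(0)

def ota_sum_estimate(x, num_uses, fade_std=0.1, noise_std=0.5):
    """Estimate sum(x) over a fast-fading additive-noise MAC (illustrative sketch).

    Per channel use, node k transmits x[k]; the receiver observes
    sum_k h_k * x[k] + n, with fading h_k ~ N(1, fade_std^2) and
    noise n ~ N(0, noise_std^2), both sub-Gaussian. Averaging over
    num_uses channel uses drives the approximation error down,
    mirroring the trade-off between channel uses and accuracy.
    """
    K = len(x)
    observations = []
    for _ in range(num_uses):
        h = rng.normal(1.0, fade_std, size=K)   # fast fading: fresh coefficients each use
        n = rng.normal(0.0, noise_std)          # additive receiver noise
        observations.append(np.dot(h, x) + n)
    return float(np.mean(observations))

x = np.array([0.2, 0.5, 0.3, 0.9])
true_sum = float(x.sum())               # 1.9
est = ota_sum_estimate(x, num_uses=200)
```

More channel uses shrink the error at a rate governed by the sub-Gaussian tail bounds; the paper's contribution is making this trade-off explicit under fast fading.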
- Pub Date: July 2019
- Computer Science - Information Theory
- 2019 57th Annual Allerton Conference on Communication, Control, and Computing (Allerton), Monticello, IL, USA, September 24-27, 2019, pp. 1146-1153