Least Squares Approximation for a Distributed System
Abstract
In this work, we develop a distributed least squares approximation (DLSA) method that can solve a large family of regression problems (e.g., linear regression, logistic regression, and Cox's model) on a distributed system. By approximating each local objective function with a local quadratic form, we obtain a combined estimator as a weighted average of the local estimators. The resulting estimator is proved to be statistically as efficient as the global estimator, and it requires only one round of communication. We further conduct shrinkage estimation based on the DLSA estimator using an adaptive Lasso approach, whose solution can be easily obtained by running the LARS algorithm on the master node. We show theoretically that the resulting estimator possesses the oracle property and is selection consistent under a newly designed distributed Bayesian information criterion (DBIC). The finite-sample performance and the computational efficiency are further illustrated by an extensive numerical study and an airline dataset of 52 GB. The entire methodology has been implemented in Python for the de facto standard Spark system. On Spark, the proposed DLSA algorithm takes 26 minutes to obtain a logistic regression estimator, whereas a full likelihood algorithm takes 15 hours to obtain an inferior result.
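As a minimal sketch of the one-shot combination step described above, consider linear regression: each worker computes its local estimator and its local quadratic-form weight (here the Gram matrix X'X), and the master combines them by a weighted average. The function and variable names below are illustrative, not the authors' implementation.

```python
import numpy as np

def dlsa_combine(local_estimates, local_hessians):
    """Weighted average of local estimators, with the local quadratic
    forms (Hessians) as weights: solve (sum H_k) beta = sum H_k beta_k."""
    H = sum(local_hessians)
    b = sum(Hk @ bk for Hk, bk in zip(local_hessians, local_estimates))
    return np.linalg.solve(H, b)

# Simulate data partitioned across K workers.
rng = np.random.default_rng(0)
p, K, n_k = 5, 4, 1000
beta_true = rng.normal(size=p)

estimates, hessians = [], []
for _ in range(K):
    X = rng.normal(size=(n_k, p))
    y = X @ beta_true + rng.normal(size=n_k)
    estimates.append(np.linalg.lstsq(X, y, rcond=None)[0])  # local OLS fit
    hessians.append(X.T @ X)                                # local weight

beta_dlsa = dlsa_combine(estimates, hessians)
```

For linear regression this weighted average reproduces the global least-squares fit exactly; for nonlinear models (logistic, Cox) the same combination applied to local quadratic approximations yields the asymptotically efficient one-round estimator.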
Publication: arXiv e-prints
Pub Date: August 2019
arXiv: arXiv:1908.04904
Bibcode: 2019arXiv190804904Z
Keywords:
Statistics - Methodology;
Computer Science - Distributed, Parallel, and Cluster Computing;
Computer Science - Machine Learning;
Statistics - Computation;
Statistics - Machine Learning