We propose a distributed second-order method for general consensus. To the best of our knowledge, our approach is the fastest in the literature, outperforming state-of-the-art methods, including ADMM, by significant margins. This is achieved by exploiting the sparsity pattern of the dual Hessian and transforming the problem into one of efficiently solving a sequence of symmetric diagonally dominant (SDD) systems of equations. For the latter problem we propose two novel techniques, one fully distributed and one mixed. The fully distributed algorithm exploits the extremal properties of Chebyshev polynomials to achieve fast convergence. The mixed algorithm is based on a distributed sparsification approach and exhibits linear time complexity. We validate our guarantees for general consensus both theoretically and empirically. On the theory side, we prove that, like exact Newton, our algorithm exhibits superlinear convergence within a neighborhood of the optimal solution. Empirically, we demonstrate the superiority of the new method on a variety of machine learning problems and show that our improvements come at low communication overhead between processors.

Host: Michael Chertkov
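The abstract does not spell out the SDD solver itself; as a rough, centralized sketch of the Chebyshev idea it alludes to, the snippet below runs a textbook Chebyshev iteration on a small SDD system, assuming the extremal eigenvalue bounds are known. The function name `chebyshev_solve` and the demo matrix are illustrative assumptions, not code from the talk, and the talk's actual algorithm is distributed rather than centralized.

```python
import numpy as np

def chebyshev_solve(A, b, lam_min, lam_max, x0=None, iters=50):
    """Textbook Chebyshev iteration for A x = b, given bounds
    [lam_min, lam_max] on the spectrum of the SPD matrix A."""
    d = (lam_max + lam_min) / 2.0   # center of the eigenvalue interval
    c = (lam_max - lam_min) / 2.0   # half-width of the interval
    x = np.zeros_like(b) if x0 is None else x0.copy()
    r = b - A @ x
    p = np.zeros_like(b)
    alpha = 0.0
    for k in range(iters):
        if k == 0:
            p = r.copy()
            alpha = 1.0 / d
        else:
            # Chebyshev three-term recurrence coefficients
            beta = 0.5 * (c * alpha) ** 2 if k == 1 else (c * alpha / 2.0) ** 2
            alpha = 1.0 / (d - beta / alpha)
            p = r + beta * p
        x = x + alpha * p
        r = b - A @ x
    return x

# Demo on a hypothetical SDD matrix: a graph Laplacian plus a diagonal shift.
rng = np.random.default_rng(0)
n = 20
W = rng.random((n, n)); W = (W + W.T) / 2.0; np.fill_diagonal(W, 0.0)
L = np.diag(W.sum(axis=1)) - W   # weighted Laplacian: symmetric, diagonally dominant
A = L + np.eye(n)                # shift makes it positive definite
b = rng.standard_normal(n)
eig = np.linalg.eigvalsh(A)      # exact bounds here; in practice these are estimated
x = chebyshev_solve(A, b, eig[0], eig[-1], iters=80)
print("residual norm:", np.linalg.norm(A @ x - b))
```

The appeal of the Chebyshev recurrence in a distributed setting is that each step needs only one multiplication by A (a local neighbor exchange when A has the graph's sparsity pattern) and no global inner products, unlike conjugate gradients.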