Abstract

In this report, we address the scalability of existing secure aggregation protocols for decentralized machine learning to very large numbers of nodes. As a solution, we propose a novel decentralized aggregation protocol that can be parameterized so that the overall computation overhead scales logarithmically with the number of nodes. The parameterization also determines the protocol's input privacy, ranging from no input privacy to privacy against a collusion of up to all but two nodes; stronger privacy guarantees, however, come at the cost of increased computation overhead. The current version of the protocol does not support users dropping out. We also describe our implementation of the protocol and evaluate its performance.
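For background, secure aggregation is commonly built on pairwise additive masking: each pair of nodes shares a random mask that one node adds and the other subtracts, so individual inputs stay hidden while the masks cancel in the sum. The sketch below illustrates only this standard technique, not the protocol proposed in the report; all names and the choice of modulus are hypothetical.

```python
# Minimal sketch of secure aggregation via pairwise additive masking.
# This illustrates the general technique only; it is NOT the protocol
# proposed in this report, and all names here are illustrative.
import random

MOD = 2**32  # all arithmetic is done modulo a fixed public modulus


def masked_inputs(values, seed=0):
    """For each pair (i, j) with i < j, nodes i and j share a random
    mask r[i][j]; node i adds it, node j subtracts it. Individual
    masked values look random, but the masks cancel in the sum."""
    n = len(values)
    rng = random.Random(seed)  # stands in for pairwise key agreement
    r = [[rng.randrange(MOD) for _ in range(n)] for _ in range(n)]
    masked = []
    for i, v in enumerate(values):
        m = v % MOD
        for j in range(n):
            if i < j:
                m = (m + r[i][j]) % MOD
            elif i > j:
                m = (m - r[j][i]) % MOD
        masked.append(m)
    return masked


values = [3, 5, 7, 11]
masked = masked_inputs(values)
# The aggregator sums the masked values; the pairwise masks cancel.
assert sum(masked) % MOD == sum(values) % MOD
```

Note that this toy version already shows the dropout problem mentioned above: if any node fails to submit its masked value, its unmatched masks no longer cancel and the sum is corrupted.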
