CMStatistics 2022
B1466
Title: On the stability and generalization of privacy-preserving decentralized learning
Authors: Yafei Wang - University of Alberta (Canada) [presenting]
Abstract: Stochastic decentralized optimization, which minimizes a finite sum of expected losses over a network topology, has found tremendous success in distributed and parallel learning due to its natural fit with modern computing architectures and large-scale optimization. Since nodes communicate with one another across the network, privacy concerns have motivated the development of private variants of learning algorithms for many complex inference and training tasks. We discuss a novel formulation of operator splitting schemes that solves complicated monotone inclusions and stochastic optimization problems by alternately updating each piece of the decomposition. In particular, leveraging the decentralized learning procedure to train models under privacy constraints, we propose a general framework of privacy-preserving stochastic decentralized operator iteration algorithms and show that the proposed algorithms retain performance guarantees in terms of stability, generalization, and finite-sample performance. We further investigate, through the composition theorem, the impact of local privacy-preserving computation on global differential privacy.
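For concreteness, the decentralized objective referred to above is conventionally written as follows (a standard formulation for illustration; the symbols n, F_i, and D_i are generic and not taken from the submission):

\min_{x \in \mathbb{R}^d} f(x) = \frac{1}{n} \sum_{i=1}^{n} f_i(x), \qquad f_i(x) = \mathbb{E}_{\xi_i \sim \mathcal{D}_i}\left[ F_i(x; \xi_i) \right],

where node i draws samples \xi_i from its local distribution \mathcal{D}_i and exchanges iterates only with its neighbors in the communication graph.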
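As a rough illustration of one member of this algorithm family, the Python sketch below implements a single round of noisy decentralized SGD: each node clips its stochastic gradient and adds Gaussian noise (the usual local differential-privacy mechanism), takes a gradient step, and then gossip-averages with its neighbors through a doubly stochastic mixing matrix W. This is a minimal sketch under those assumptions; it is not the operator-splitting iteration proposed in the talk, and all names and parameters here are illustrative.

import numpy as np

def dp_decentralized_sgd_step(X, W, grads, lr, clip, sigma, rng):
    # X: (n, d) array of iterates, one row per node.
    # W: (n, n) doubly stochastic mixing matrix encoding the topology.
    # grads: (n, d) per-node stochastic gradients.
    n, d = X.shape
    # Clip each node's gradient to norm <= clip, bounding DP sensitivity.
    norms = np.linalg.norm(grads, axis=1, keepdims=True)
    clipped = grads * np.minimum(1.0, clip / np.maximum(norms, 1e-12))
    # Gaussian mechanism: noise scale proportional to the clipping bound.
    noisy = clipped + rng.normal(scale=sigma * clip, size=(n, d))
    # Local gradient step followed by gossip averaging over the graph.
    return W @ (X - lr * noisy)

rng = np.random.default_rng(0)
n, d = 4, 3
W = np.full((n, n), 1.0 / n)  # complete graph: uniform averaging
X = rng.normal(size=(n, d))
grads = rng.normal(size=(n, d))
X = dp_decentralized_sgd_step(X, W, grads, lr=0.1, clip=1.0, sigma=1.0, rng=rng)

Keeping the noise injection local (before any communication) is what makes the per-round guarantee a local one; the mixing step W only spreads already-privatized information.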
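On the last point, the standard composition theorems quantify how per-round local guarantees accumulate into a global one (stating well-known results, not a bound specific to the submission): if each of T communication rounds is (\varepsilon, \delta)-differentially private, basic composition gives (T\varepsilon, T\delta)-DP overall, while advanced composition gives, for any \delta' > 0,

\left( \sqrt{2T \ln(1/\delta')}\,\varepsilon + T\varepsilon(e^{\varepsilon} - 1), \; T\delta + \delta' \right)\text{-DP},

which grows only on the order of \sqrt{T} in the number of rounds for small \varepsilon.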