Paper accepted to IEEE PerCom 2023:
Towards Scalable Resilient Federated Learning: A Fully Decentralised Approach
Divi De Lacour (1,2), Marc Lacoste (1), Mario Südholt (2), Jacques Traoré (1)
1) Orange Innovation,
2) INRIA, IMT-Atlantique
Federated Learning (FL) collaboratively trains machine learning models on the data of local devices without moving the data itself: a central server aggregates the locally trained models, which brings privacy and performance benefits but also raises scalability and resilience challenges. In this paper, we present FDFL, a new fully decentralized FL model and architecture that improves the scalability and resilience of standard FL with no loss of convergence speed. FDFL provides an aggregator-based model for scalability and features an election process to tolerate node failures. Simulation results show that FDFL scales well with network size in terms of computation, memory, and communication compared to related FL approaches (standard FL, FL with aggregators, FL with election), while also showing good resilience to node failures.
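The abstract mentions two building blocks: averaging locally trained models at an aggregator, and electing a replacement aggregator when a node fails. The paper specifies FDFL's actual protocol; the sketch below is only a minimal toy illustration of those two ideas, with models represented as flat lists of floats and a deliberately simple lowest-id election rule (both are assumptions for illustration, not the paper's algorithm).

```python
def fed_avg(models):
    """Average model parameters element-wise (models are flat float lists)."""
    n = len(models)
    return [sum(ws) / n for ws in zip(*models)]

def elect_aggregator(nodes, alive):
    """Toy election rule: the lowest-id node still alive becomes aggregator.
    (A stand-in for FDFL's election process, which the paper defines.)"""
    return min(n for n in nodes if alive[n])

# Toy round: 4 nodes with 2-parameter "models"; node 0 has failed.
nodes = [0, 1, 2, 3]
alive = {0: False, 1: True, 2: True, 3: True}
local_models = {1: [1.0, 2.0], 2: [3.0, 4.0], 3: [5.0, 6.0]}

aggregator = elect_aggregator(nodes, alive)  # node 1 takes over from node 0
global_model = fed_avg([local_models[n] for n in nodes if alive[n]])
print(aggregator, global_model)  # 1 [3.0, 4.0]
```

In this toy run, the surviving nodes elect node 1 as aggregator and the round still completes, which mirrors (very loosely) the failure tolerance the abstract claims for FDFL.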
Workshop PeRConAI: https://perconai.iit.cnr.it/
Conference IEEE PerCom 2023: https://www.percom.org/