FLaaS: Federated Learning as a Service
- Contributors: Nicolas Kourtellis, Kleomenis Katevas and Diego Perino
- Year: 2020
- Venue: Distributed Machine Learning (DistributedML'20)
- Abstract:
Federated Learning (FL) is emerging as a promising technology to build machine learning models in a decentralized, privacy-preserving fashion. Indeed, FL enables local training on user devices, avoiding the transfer of user data to centralized servers, and can be enhanced with differential privacy mechanisms. Although FL has been recently deployed in real systems, the possibility of collaborative modeling across different 3rd-party applications has not yet been explored. In this paper, we tackle this problem and present Federated Learning as a Service (FLaaS), a system enabling different scenarios of 3rd-party application collaborative model building and addressing the consequent challenges of permission and privacy management, usability, and hierarchical model training. FLaaS can be deployed in different operational environments. As a proof of concept, we implement it in a mobile phone setting and discuss practical implications of results on simulated and real devices with respect to on-device training CPU cost, memory footprint, and power consumed per FL model round. We thus demonstrate FLaaS's feasibility in building unique or joint FL models across applications for image object detection in a few hours, across 100 devices.
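To make the FL workflow the abstract describes concrete, below is a minimal sketch of one federated averaging round: each client trains a model on its private data locally, and the server aggregates only the resulting weights. This is a generic illustration of the FL pattern, not code from FLaaS; the one-parameter linear model and all names are illustrative.

```python
# Sketch of one Federated Averaging round (generic FL pattern; not FLaaS code).
import random

def local_train(weights, data, lr=0.1, epochs=5):
    """Train a toy 1-parameter linear model y = w * x locally via SGD."""
    w = weights
    for _ in range(epochs):
        for x, y in data:
            grad = 2 * (w * x - y) * x  # d/dw of the squared error (w*x - y)^2
            w -= lr * grad
    return w

def fed_avg_round(global_w, client_datasets):
    """One FL round: clients train locally; the server averages their
    weights, weighted by dataset size. Raw data never leaves a client --
    only model parameters are shared."""
    updates = [local_train(global_w, data) for data in client_datasets]
    sizes = [len(data) for data in client_datasets]
    total = sum(sizes)
    return sum(w * n for w, n in zip(updates, sizes)) / total

random.seed(0)
# Each client holds private samples of y = 3x plus small noise.
clients = [
    [(x, 3 * x + random.gauss(0, 0.01))
     for x in (random.uniform(-1, 1) for _ in range(20))]
    for _ in range(5)
]
w = 0.0
for _ in range(10):
    w = fed_avg_round(w, clients)
print(round(w, 2))  # converges near the true slope, 3
```

In a real deployment such as the one the paper evaluates, the local step would train a full model on-device and the aggregation would run on a central server, optionally with differential privacy noise added to the shared updates.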
- Repository link: https://arxiv.org/abs/2011.09359v1