
EncCluster: Scalable Functional Encryption in Federated Learning through Weight Clustering and Probabilistic Filters / Tsouvalas, Vasileios; Mohammadi, Samaneh; Balador, Ali; Ozcelebi, Tanir; Flammini, Francesco; Meratnia, Nirvana. In: Pervasive and Mobile Computing, ISSN 1873-1589, electronic, (2024), pp. 0-0.

EncCluster: Scalable Functional Encryption in Federated Learning through Weight Clustering and Probabilistic Filters

Flammini, Francesco;
2024

Abstract

Federated Learning (FL) enables model training across decentralized devices by communicating only local model updates to an aggregation server. Although such limited data sharing makes FL more secure than centralized approaches, FL remains vulnerable to inference attacks during model update transmissions. Existing secure aggregation approaches rely on differential privacy or cryptographic schemes such as Functional Encryption (FE) to safeguard individual client data. However, these strategies can reduce performance or impose unacceptable computational and communication overheads on clients running on resource-constrained edge devices. In this work, we present EncCluster, a novel method that integrates model compression through weight clustering with recent decentralized FE and privacy-enhancing data encoding using probabilistic filters, delivering strong privacy guarantees in FL without affecting model performance or placing undue burdens on clients. We performed a comprehensive evaluation spanning various datasets and architectures to demonstrate EncCluster's scalability across encryption levels. Our findings reveal that EncCluster significantly reduces communication costs, below even those of conventional FedAvg, and accelerates encryption by more than four times over all baselines, while maintaining high model accuracy and enhanced privacy assurances.
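The weight-clustering step the abstract refers to can be illustrated with a minimal sketch: quantizing a flat weight vector to a small codebook of shared values via one-dimensional k-means, so a client only needs to transmit the compact index vector plus the tiny centroid table instead of full-precision weights. This is not the paper's exact algorithm; `cluster_weights` and its parameters (`n_clusters`, `n_iter`) are illustrative names chosen for this example.

```python
import numpy as np

def cluster_weights(weights: np.ndarray, n_clusters: int = 16,
                    n_iter: int = 20, seed: int = 0):
    """Quantize a flat weight vector to n_clusters shared values (1-D k-means).

    Returns (assignments, centroids): each weight is represented by the index
    of its nearest centroid, so only the small centroid table plus the compact
    index vector would need to be communicated.
    """
    rng = np.random.default_rng(seed)
    # Initialize centroids by sampling existing weight values.
    centroids = rng.choice(weights, size=n_clusters, replace=False)
    for _ in range(n_iter):
        # Assign each weight to its nearest centroid.
        assignments = np.argmin(np.abs(weights[:, None] - centroids[None, :]),
                                axis=1)
        # Recompute each centroid as the mean of its assigned weights.
        for k in range(n_clusters):
            members = weights[assignments == k]
            if members.size:
                centroids[k] = members.mean()
    return assignments, centroids

# Toy example: 1,000 weights compressed to 16 shared values.
w = np.random.default_rng(1).normal(size=1000)
idx, cent = cluster_weights(w, n_clusters=16)
w_hat = cent[idx]  # reconstructed (quantized) weights
```

With 16 clusters, each weight's assignment fits in 4 bits instead of 32, which is the kind of communication reduction such clustering targets; the paper additionally encodes assignments with probabilistic filters and encrypts centroids with decentralized FE.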
Files in this item:
There are no files associated with this item.

Documents in FLORE are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this resource: https://hdl.handle.net/2158/1453451