
HEXA: Heterogeneity-aware Exact Aggregation for Efficient Fine-Tuning in Federated Learning

Garofalo, Marco;
2025-01-01

Abstract

Federated Learning (FL) combined with Parameter-Efficient Fine-Tuning (PEFT) methods, such as Low-Rank Adaptation (LoRA), has emerged as a promising approach to addressing data scarcity challenges in fine-tuning Large Language Models (LLMs) while ensuring privacy and computational efficiency. However, when applying LoRA in traditional FL, separately averaging the adapters during aggregation results in non-exact aggregation. While recent research has investigated this issue, its application to heterogeneous data settings remains largely unexplored. Data heterogeneity across clients can significantly affect the effectiveness of parameter-efficient adaptations and complicate the aggregation process. In this work, we explore the concept of exact aggregation in heterogeneous federated fine-tuning settings, specifically focusing on LoRA-based approaches. We propose HEXA (Heterogeneity-aware EXact Aggregation), a novel method that mitigates the effects of data heterogeneity while preserving the benefits of exact aggregation in LoRA-enabled FL. We present a comprehensive theoretical framework for extending exact aggregation to heterogeneous settings and validate our approach through extensive empirical evaluation on the GLUE benchmark. Our results show that HEXA improves model performance in heterogeneous contexts while maintaining the computational efficiency of PEFT methods.
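The non-exact aggregation problem the abstract refers to can be illustrated numerically: averaging each client's LoRA factors B and A separately (the naive FedAvg-style approach) does not equal averaging the full low-rank updates BA, because the product of averages is not the average of products. The sketch below uses small random matrices purely for illustration; the dimensions and client count are arbitrary assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r, k = 8, 2, 3  # illustrative: model dim, LoRA rank, number of clients

# Each client i holds its own low-rank adapter pair (B_i, A_i).
Bs = [rng.normal(size=(d, r)) for _ in range(k)]
As = [rng.normal(size=(r, d)) for _ in range(k)]

# Exact aggregation: average the full low-rank updates B_i @ A_i.
exact = np.mean([B @ A for B, A in zip(Bs, As)], axis=0)

# Naive aggregation: average the factors separately, then multiply.
naive = np.mean(Bs, axis=0) @ np.mean(As, axis=0)

# The two disagree: mean(B_i @ A_i) != mean(B_i) @ mean(A_i) in general.
print(np.allclose(exact, naive))
```

Running this prints `False`, showing the discrepancy that exact-aggregation schemes for LoRA-enabled FL aim to eliminate.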

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11568/1352251
