Mitigating Catastrophic Forgetting in Personalized Federated Learning for Edge Devices using State-Space Models

Weidong Zhang, Dongshang Deng, Xuangou Wu, Tao Zhang, Xiao Zheng, Dusit Niyato, Dong In Kim

Research output: Contribution to journal › Article › peer-review

Abstract

Edge computing involves distributed devices operating in dynamic environments with diverse resource constraints and highly heterogeneous data distributions. In these settings, personalized federated learning (PFL) provides a collaborative learning framework that preserves the unique data characteristics of each device. However, PFL models often encounter bidirectional catastrophic forgetting: during consecutive training rounds, the personalized characteristics learned by local models are readily overwritten by updates from the global model, while the global model's shared representations are degraded by distribution shifts arising from heterogeneous local data. This challenge is further amplified in edge computing settings, where devices must adapt to fast-changing heterogeneous data. To address this issue, we propose FedSSM, a novel framework that leverages state-space models to mitigate forgetting in PFL. By capturing the temporal evolution of local model parameters through hidden states, the framework enhances the retention of critical knowledge throughout training rounds. Extensive experiments on multiple benchmark datasets demonstrate that FedSSM outperforms various state-of-the-art PFL algorithms, particularly under high data heterogeneity.
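The core idea described in the abstract, maintaining a hidden state that tracks the temporal evolution of local parameters across rounds, can be illustrated with a minimal sketch. Note this is an assumption-laden toy, not the paper's actual formulation: the matrices `A`, `B`, the blending coefficient `C`, and the functions `ssm_update` and `personalize` are all hypothetical stand-ins for whatever the authors actually use.

```python
import numpy as np

# Hypothetical sketch of a state-space view of PFL rounds (not the paper's
# exact method): each round's local parameter vector is treated as an
# observation of a linear state-space model, and a hidden state accumulates
# historical knowledge so that incoming global updates do not fully
# overwrite the personalized characteristics learned locally.

def ssm_update(hidden, observation, A=0.9, B=0.1):
    """One state-space step, element-wise: h_t = A * h_{t-1} + B * x_t."""
    return A * hidden + B * observation

def personalize(global_params, hidden, C=0.5):
    """Blend the retained hidden state into the incoming global parameters,
    so personalization survives the global aggregation step."""
    return C * hidden + (1.0 - C) * global_params

rng = np.random.default_rng(0)
hidden = np.zeros(4)                    # hidden state over the parameter space
for round_idx in range(5):              # simulated federated training rounds
    local_params = rng.normal(size=4)   # stand-in for local model weights
    hidden = ssm_update(hidden, local_params)

global_params = rng.normal(size=4)      # stand-in for the aggregated model
personalized = personalize(global_params, hidden)
```

With `A` close to 1 the hidden state decays slowly, which is one way to read the abstract's claim that the framework "enhances the retention of critical knowledge throughout training rounds"; tuning `A`, `B`, and `C` would trade off retention against adaptation to new data.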

Original language: English
Journal: IEEE Transactions on Mobile Computing
DOIs
State: Accepted/In press - 2025
Externally published: Yes

Keywords

  • Catastrophic Forgetting
  • Data Heterogeneity
  • Edge Devices
  • Federated Learning
  • State-space Models
