Abstract
Edge computing involves distributed devices operating in dynamic environments with diverse resource constraints and highly heterogeneous data distributions. In such settings, personalized federated learning (PFL) provides a collaborative learning framework that preserves the unique data characteristics of each device. However, PFL models often suffer from bidirectional catastrophic forgetting: across consecutive training rounds, the personalized characteristics learned by local models are readily overwritten by updates from the global model, while the global model's shared representations are degraded by distribution shifts arising from heterogeneous local data. This challenge is further amplified in edge computing, where devices must adapt to fast-changing heterogeneous data. To address this issue, we propose FedSSM, a novel framework that leverages state-space models to mitigate forgetting in PFL. By capturing the temporal evolution of local model parameters through hidden states, the framework improves the retention of critical knowledge across training rounds. Extensive experiments on multiple benchmark datasets demonstrate that FedSSM outperforms various state-of-the-art PFL algorithms, particularly under high data heterogeneity.
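The abstract describes a hidden state that tracks the temporal evolution of local model parameters across rounds. The following minimal sketch illustrates the general idea of a linear state-space recurrence applied to a flattened parameter vector; FedSSM's actual architecture is not specified in the abstract, so the matrices `A`, `B`, `C`, their shapes, and the round loop here are hypothetical.

```python
import numpy as np

def ssm_step(h, theta, A, B, C):
    """One training round: fold the current parameter vector theta into
    the hidden state h, then read out a retained-knowledge estimate.
    (Illustrative only -- not the paper's actual update rule.)"""
    h_next = A @ h + B @ theta  # hidden state accumulates round history
    y = C @ h_next              # readout from the retained state
    return h_next, y

rng = np.random.default_rng(0)
d_state, d_param = 8, 4        # hypothetical dimensions
A = 0.9 * np.eye(d_state)      # contractive state matrix: old rounds decay slowly
B = 0.1 * rng.normal(size=(d_state, d_param))
C = 0.1 * rng.normal(size=(d_param, d_state))

h = np.zeros(d_state)
for _ in range(5):                     # five simulated training rounds
    theta = rng.normal(size=d_param)   # stand-in for local model parameters
    h, y = ssm_step(h, theta, A, B, C)

print(h.shape, y.shape)  # (8,) (4,)
```

Because `A` is contractive, contributions from earlier rounds decay gradually rather than being overwritten outright, which is the intuition behind using a state-space recurrence to retain knowledge across rounds.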
| Original language | English |
|---|---|
| Journal | IEEE Transactions on Mobile Computing |
| State | Accepted/In press - 2025 |
| Externally published | Yes |
Keywords
- Catastrophic Forgetting
- Data Heterogeneity
- Edge Devices
- Federated Learning
- State-space Models