


PP-FCIL: A Dual-Backbone Privacy-Preserving Federated Class-Incremental Learning Framework with Adaptive Feature Fusion


Authors : Darsi Venkata Varalakshmi; Garige Sangeetha; Dammu Nikhitha; Dr. Z. Sunitha Bai

Volume/Issue : Volume 11 - 2026, Issue 4 - April


Google Scholar : https://tinyurl.com/ak7ch9t6

Scribd : https://tinyurl.com/dthvarmb

DOI : https://doi.org/10.38124/ijisrt/26apr1499



Abstract : Federated learning makes it possible to train models collaboratively without sharing data through a central repository. However, adding class-incremental learning to federated learning raises challenges such as catastrophic forgetting, data heterogeneity, and privacy risk. To address these problems, this paper presents Privacy-Preserving Federated Class-Incremental Learning (PP-FCIL), evaluated on the CIFAR-100 benchmark. The proposed framework uses a dual-backbone architecture with a channel-attention fusion mechanism to retain prior knowledge while adapting to the classes introduced at each learning stage. To counter forgetting and class imbalance, it combines herding-based exemplar memory, knowledge distillation, supervised contrastive learning, and a balanced softmax loss in a single training procedure. Unlike conventional federated learning methods, the framework applies Bayesian Differential Privacy on each client, clipping gradients and adding Gaussian noise to protect local data, and uses a strict privacy-accounting protocol to track cumulative privacy loss. Finally, a multifactor aggregation scheme weights each client's contribution under non-IID data distributions.
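
A minimal PyTorch sketch of the client-side "clip gradients, then add Gaussian noise" step described above is given below. The function and parameter names (privatize_gradients, clip_norm, noise_multiplier) are illustrative assumptions rather than names from the paper; the per-example clipping used in DP-SGD-style training is simplified here to a single batch-level clip, and the paper's Bayesian differential-privacy accounting of cumulative loss is not shown.

    import torch

    def privatize_gradients(model, clip_norm=1.0, noise_multiplier=1.1):
        # Collect the gradients of all trainable parameters after backward().
        grads = [p.grad for p in model.parameters() if p.grad is not None]
        # Global L2 norm across all gradient tensors.
        total_norm = torch.sqrt(sum(g.pow(2).sum() for g in grads))
        # Scale factor of at most 1, so the clipped global norm <= clip_norm.
        scale = min(1.0, clip_norm / (float(total_norm) + 1e-6))
        for g in grads:
            g.mul_(scale)
            # Gaussian mechanism: noise standard deviation calibrated to the
            # clipping bound, so the noise masks any single update's influence.
            g.add_(torch.randn_like(g), alpha=noise_multiplier * clip_norm)

In this sketch the noisy gradients stay on the client; only the resulting model update would be sent to the server for aggregation.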

Keywords : Federated Learning; Class-Incremental Learning; Privacy-Preserving Machine Learning; Bayesian Differential Privacy; Continual Learning; Knowledge Distillation; Exemplar Memory; Non-IID Data Distribution; Channel Attention Fusion; CIFAR-100; Distributed Deep Learning; Catastrophic Forgetting.
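
The balanced softmax loss named in the abstract and keywords also has a compact formulation: shift each class logit by the log of that class's training-sample count before applying ordinary cross-entropy, which keeps head classes from dominating rare ones across incremental stages. The sketch below is an illustrative PyTorch rendering, not the paper's code; balanced_softmax_loss and class_counts are assumed names.

    import torch
    import torch.nn.functional as F

    def balanced_softmax_loss(logits, targets, class_counts):
        # Shift each logit by log(n_c), the log of that class's sample count;
        # the softmax then implicitly down-weights frequent classes.
        adjusted = logits + torch.log(class_counts.float().clamp(min=1.0))
        # Standard cross-entropy on the adjusted logits.
        return F.cross_entropy(adjusted, targets)

When every class has the same count, the shift is a constant and the loss reduces to ordinary cross-entropy, so the adjustment only acts under imbalance.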

