Blockfd: blockchain-based federated distillation against poisoning attacks

  • Original Article
  • Published in Neural Computing and Applications

Abstract

Federated learning (FL) is a novel framework that distributes model training to participant devices to realize privacy-preserving machine learning. To achieve this, clients upload the parameters of their local models to a central server for aggregation rather than the raw data. Despite the potential of FL, one of the significant challenges in FL applications is the communication constraint caused by transmitting high-dimensional model parameters. To overcome this, federated distillation (FD) has been widely studied: it reduces the heavy communication overhead by transmitting low-dimensional logits, which assist the training of the local models, instead of the model parameters. However, the traditional FD framework adopts a centralized architecture, which is vulnerable to a single point of failure. Moreover, emerging poisoning attacks also significantly threaten the security of FD. Specifically, attackers can easily launch poisoning attacks by uploading crafted logits, leading to inaccurate global logit aggregation and degrading the accuracy of local models. To address these issues, we propose a blockchain-based federated distillation framework, named BlockFD, which exploits two mechanisms in the blockchain architecture to realize decentralized and secure FD. First, we propose a novel multi-dimension consensus algorithm (BlockFD-PoM) that leverages multiple attributes in the consensus process, mitigating the computational intensity and unfairness of traditional consensus algorithms such as PoW and PoS. Second, we introduce an aggregation-based validation algorithm (BFV) so that the legitimacy of local logits can be verified to guarantee the security of FD aggregation. Extensive evaluation results show that the proposed BlockFD framework can effectively and fairly realize decentralized federated distillation. In addition, the proposed BFV algorithm can efficiently protect federated distillation from poisoning attacks while maintaining the loss within 2.77%.
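
To make the workflow concrete, the following is a minimal, hypothetical Python sketch of one federated-distillation aggregation round in the setting the abstract describes: clients upload low-dimensional per-class logits instead of model parameters, and the aggregator screens the uploads before averaging them into global distillation targets. The client simulators (honest_upload, poisoned_upload), the median-distance filter (validate), and all names and thresholds are illustrative assumptions, not the paper's BlockFD-PoM or BFV algorithms.

```python
import numpy as np

NUM_CLASSES = 10      # e.g. MNIST / CIFAR-10
NUM_CLIENTS = 6       # five honest clients and one attacker in this toy round

rng = np.random.default_rng(0)

def honest_upload():
    """An honest client's per-class average logits: roughly diagonal, i.e.
    highest confidence on the true class (a C x C matrix, far smaller than
    a full parameter vector)."""
    return np.eye(NUM_CLASSES) * 5.0 + rng.normal(0.0, 0.3, (NUM_CLASSES, NUM_CLASSES))

def poisoned_upload():
    """A crafted upload that shifts confidence onto the wrong classes."""
    return np.roll(np.eye(NUM_CLASSES), shift=1, axis=1) * 5.0

def validate(uploads, tolerance=2.0):
    """Illustrative screening rule (not the paper's BFV): drop any upload whose
    distance to the coordinate-wise median aggregate exceeds `tolerance` times
    the median distance across all uploads."""
    stacked = np.stack(uploads)                        # shape (K, C, C)
    reference = np.median(stacked, axis=0)
    dists = np.linalg.norm(stacked - reference, axis=(1, 2))
    cutoff = tolerance * np.median(dists)
    return [u for u, d in zip(uploads, dists) if d <= cutoff]

# One aggregation round: screen the uploads, then average the survivors into
# the global logits that clients would use as distillation targets.
uploads = [honest_upload() for _ in range(NUM_CLIENTS - 1)] + [poisoned_upload()]
accepted = validate(uploads)
global_logits = np.mean(np.stack(accepted), axis=0)
print(f"accepted {len(accepted)}/{len(uploads)} uploads")
```

In this toy run the single crafted upload sits far from the median aggregate and is discarded, while the honest uploads are averaged; the actual validation and consensus logic used by BlockFD is described in the full article.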

Data Availability

The authors declare that the MNIST, FEMNIST, CIFAR-10, and CIFAR-100 datasets supporting the findings of this study are available within the article. The data are not publicly available because they contain information that could compromise research participant privacy.

Notes

  1. Forking: during miner selection, two or more miners are selected as the winning miner and add blocks to the blockchain in parallel; resolving the resulting branches back to a single chain wastes resources.

Acknowledgements

This work is supported by the National Natural Science Foundation of China (No. 62206238), the Natural Science Foundation of Jiangsu Province (Grant No. BK20220562), and the Natural Science Research Project of Universities in Jiangsu Province (No. 22KJB520010).

Author information

Corresponding author

Correspondence to Jiale Zhang.

Ethics declarations

Conflict of interest

All authors certify that they have no affiliations with or involvement in any organization or entity with any financial interest or non-financial interest in the subject matter or materials discussed in this manuscript.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Li, Y., Zhang, J., Zhu, J. et al. Blockfd: blockchain-based federated distillation against poisoning attacks. Neural Comput & Applic 36, 12901–12916 (2024). https://doi.org/10.1007/s00521-024-09715-w
