
FedPrune: Personalized and Communication-Efficient Federated Learning on Non-IID Data

  • Conference paper
  • Neural Information Processing (ICONIP 2021)
  • Part of the book series: Communications in Computer and Information Science (CCIS, volume 1516)

Abstract

Federated learning (FL) has been widely deployed in edge computing scenarios. However, FL-related technologies still face severe challenges even as they evolve rapidly. Among these challenges, statistical heterogeneity (i.e., non-IID data) seriously hinders the wide deployment of FL. In this work, we propose a new framework for communication-efficient and personalized federated learning, namely FedPrune. More specifically, under the proposed framework, each client first trains a converged model locally to identify critical parameters and a substructure, which then guide the pruning of the network that participates in FL. FedPrune achieves high accuracy while greatly reducing communication overhead. Moreover, each client learns a personalized model in FedPrune. Experimental results demonstrate that FedPrune achieves the best accuracy on image recognition tasks with varying degrees of reduced communication cost compared to three baseline methods.
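The core idea in the abstract — each client derives a mask from a locally converged model and then uploads only the surviving parameters — can be sketched as follows. This is an illustrative sketch only: the paper's exact importance criterion and substructure selection are not reproduced here; magnitude-based scoring and the function names (`prune_mask`, `client_upload`) are assumptions for illustration.

```python
# Sketch of mask-based pruning for communication-efficient FL uploads.
# NOT the authors' exact method: importance here is approximated by
# weight magnitude, a common but hypothetical stand-in.
import numpy as np

def prune_mask(weights: np.ndarray, keep_ratio: float) -> np.ndarray:
    """Binary mask keeping the largest-magnitude fraction of weights."""
    k = max(1, int(round(keep_ratio * weights.size)))
    threshold = np.sort(np.abs(weights).ravel())[-k]  # k-th largest magnitude
    return (np.abs(weights) >= threshold).astype(np.float32)

def client_upload(weights: np.ndarray, keep_ratio: float):
    """Return (mask, surviving values) -- the only payload sent to the server,
    which is how communication cost shrinks relative to sending all weights."""
    mask = prune_mask(weights, keep_ratio)
    return mask, weights[mask.astype(bool)]

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
mask, payload = client_upload(w, keep_ratio=0.25)
print(payload.size, w.size)  # only ~25% of the parameters are uploaded
```

Because each client keeps its own mask, the retained substructure differs per client, which is one simple way the framework can yield personalized models.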



Author information

Corresponding author: Yi Zhao.


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Liu, Y., Zhao, Y., Zhou, G., Xu, K. (2021). FedPrune: Personalized and Communication-Efficient Federated Learning on Non-IID Data. In: Mantoro, T., Lee, M., Ayu, M.A., Wong, K.W., Hidayanto, A.N. (eds) Neural Information Processing. ICONIP 2021. Communications in Computer and Information Science, vol 1516. Springer, Cham. https://doi.org/10.1007/978-3-030-92307-5_50

Download citation

  • DOI: https://doi.org/10.1007/978-3-030-92307-5_50

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-92306-8

  • Online ISBN: 978-3-030-92307-5

  • eBook Packages: Computer Science (R0)
