ABSTRACT
Federated Learning (FL) is a privacy-preserving distributed Machine Learning (ML) paradigm in which a global model is learned by aggregating the local models trained on local data at each edge user. Realizing the benefits of FL is challenging in communication-constrained environments, such as traditional wireless uplinks characterized by low-bandwidth links. A well-known FL protocol over such resource-constrained channels is FL over multiple access channels (FL-MAC), where edge users transmit their model parameters simultaneously, avoiding the need for orthogonal resources. Modern neural networks are over-parameterized, so in FL a large number of parameters must be communicated over the resource-constrained uplink between each edge user and the server. However, if such networks are trained in the lazy training regime (i.e., training follows NTK dynamics), the model weights change very slowly across gradient descent epochs while still achieving exponential convergence. This motivates communicating incremental model-weight updates: such updates are highly sparse, making them amenable to compressive sensing and hence to compressed communication. Accordingly, we propose Compressed FL-MAC (CFL-MAC), a framework in which local training at the clients is carried out on over-parameterized neural networks, while the model updates sent to the server are compressed before transmission. We empirically demonstrate that the proposed framework achieves good performance with a large saving in the number of communicated parameters.
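To make the pipeline concrete, below is a minimal sketch in Python/NumPy of the compressed-update idea the abstract describes. The dimensions, the shared Gaussian measurement matrix `A`, the orthogonal matching pursuit (`omp`) reconstruction routine, and the noiseless channel model are all illustrative assumptions for this sketch, not the paper's exact design.

```python
import numpy as np

# Illustrative sketch of the CFL-MAC idea, under these assumptions:
#  - d parameters, m << d compressed measurements per round;
#  - each client's incremental update delta_k is sparse because training
#    follows lazy/NTK dynamics (weights change little per epoch);
#  - the multiple access channel superimposes the clients' analog signals,
#    so the server receives sum_k A @ delta_k = A @ (sum_k delta_k);
#  - channel noise is ignored for simplicity.

rng = np.random.default_rng(0)
d, m, num_clients, sparsity = 1024, 128, 5, 4

# Shared random Gaussian measurement matrix, known to clients and server.
A = rng.standard_normal((m, d)) / np.sqrt(m)

# Synthetic sparse incremental updates, one per client.
deltas = []
for _ in range(num_clients):
    delta = np.zeros(d)
    support = rng.choice(d, size=sparsity, replace=False)
    delta[support] = rng.standard_normal(sparsity) * 1e-2  # small lazy-regime steps
    deltas.append(delta)

# Each client transmits y_k = A @ delta_k; over-the-air aggregation on the
# MAC means the server observes only the sum of the compressed updates.
y = sum(A @ delta for delta in deltas)

def omp(A, y, n_nonzero):
    """Orthogonal Matching Pursuit: greedy sparse recovery of x from y = A x."""
    residual, support = y.copy(), []
    for _ in range(n_nonzero):
        # Pick the column most correlated with the current residual.
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        # Re-fit coefficients on the selected support by least squares.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

# The aggregate update has at most num_clients * sparsity nonzeros.
agg_hat = omp(A, y, n_nonzero=num_clients * sparsity)
agg_true = sum(deltas)
print("relative error:", np.linalg.norm(agg_hat - agg_true) / np.linalg.norm(agg_true))
```

Note the design point this sketch illustrates: because the channel itself sums the transmissions, the server never needs the individual client updates, only the aggregate, whose sparsity is bounded by the union of the clients' supports; this is what lets m be far smaller than d.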