Michail Theologitis, "Communication-Efficient Federated Deep Learning via Dynamic Averaging", Master Thesis, School of Electrical and Computer Engineering, Technical University of Crete, Chania, Greece, 2025
https://doi.org/10.26233/heallink.tuc.103444
The ever-growing volume and decentralized nature of data have led to the extensive use of distributed deep learning (DDL) and Federated Learning (FL), both of which struggle with the high cost of transmitting large models. State-of-the-art techniques typically prescribe rigid communication intervals chosen in arbitrary, non-principled ways. To make matters worse, modern language and vision models are rapidly increasing in size. These limitations call for a more principled, adaptive approach to synchronization. To address this, we propose Federated Dynamic Averaging (FDA), a communication-efficient strategy that monitors the model variance and uses this real-time view of the training dynamics to trigger synchronization only when it is needed. Our experiments with well-established vision models and tasks show that FDA significantly reduces communication costs while maintaining robust performance across diverse heterogeneity settings. Building on these insights, we also introduce the FDA-Opt family of algorithms, a unified generalization of both FDA and the widely used FedOpt, designed to work out of the box without any calibration. Our experiments focus on fine-tuning pre-trained Language Models (LMs) on downstream NLP tasks and demonstrate that FDA-Opt consistently outperforms FedOpt, even when running with hyper-parameters optimized for the latter. These results establish FDA-Opt as a practical, drop-in replacement for FedOpt in modern FL libraries and systems.
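To make the variance-triggered synchronization idea concrete, below is a minimal, purely illustrative Python/PyTorch sketch of one such round. It assumes a simple proxy for "model variance" (the mean squared L2 distance of the clients' models from the last global model), a hypothetical user-supplied `local_step` callable for the clients' local updates, and a fixed `threshold`; the actual FDA algorithm, including how this condition is estimated without communicating full models, is defined in the thesis.

```python
import torch


def model_variance(local_models, global_model):
    """Mean squared L2 distance of local models from the global model.

    This is only one possible proxy for the 'model variance' that FDA is
    described as monitoring; the precise quantity is defined in the thesis.
    """
    flat_global = torch.cat([p.detach().flatten() for p in global_model.parameters()])
    dists = []
    for m in local_models:
        flat_local = torch.cat([p.detach().flatten() for p in m.parameters()])
        dists.append(torch.sum((flat_local - flat_global) ** 2))
    return torch.stack(dists).mean()


def average_models(local_models, global_model):
    """Overwrite the global model with the parameter-wise average of the local models."""
    with torch.no_grad():
        for i, p_glob in enumerate(global_model.parameters()):
            stacked = torch.stack([list(m.parameters())[i].detach() for m in local_models])
            p_glob.copy_(stacked.mean(dim=0))


def fda_style_round(local_models, global_model, local_step, threshold):
    """One illustrative round: every client takes a local step, and
    synchronization (averaging) is triggered only if the monitored
    variance exceeds the threshold."""
    for m in local_models:
        local_step(m)  # hypothetical callable: one or more local SGD steps on client data
    if model_variance(local_models, global_model) > threshold:
        average_models(local_models, global_model)
        # Restart local training from the freshly averaged global model.
        for m in local_models:
            m.load_state_dict(global_model.state_dict())
        return True   # communication happened this round
    return False      # no synchronization was needed
```

Note that this centralized toy version computes the variance by reading all client models directly, so it illustrates only the adaptive trigger rule, not the communication savings that motivate FDA.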