Institutional Repository
Technical University of Crete

Machine learning model hyperparameter fine-tuning in a Federated Learning environment

Valavanis Georgios


URI: http://purl.tuc.gr/dl/dias/6FA02C2C-0C13-469F-B004-8B0B53E50C35
Year: 2025
Type of Item: Diploma Work
Bibliographic Citation: Georgios Valavanis, "Machine learning model hyperparameter fine-tuning in a Federated Learning environment", Diploma Work, School of Electrical and Computer Engineering, Technical University of Crete, Chania, Greece, 2025. https://doi.org/10.26233/heallink.tuc.103573

Summary

Federated Learning (FL) has emerged as a paradigm for training machine learning models on decentralized data while preserving privacy. Despite its advantages, the process of hyperparameter fine-tuning remains a critical challenge in FL settings, primarily due to data heterogeneity, communication constraints, and the need for secure collaboration. This diploma thesis addresses the problem of efficient and privacy-preserving hyperparameter fine-tuning in FL environments by providing a framework for federated hyperparameter fine-tuning. Clients collaboratively explore hyperparameter configurations using their local data; once the best hyperparameters have been selected from the predefined hyperparameter space, a series of secure aggregation rounds takes place at the server. Our system leverages stratified k-fold cross-validation on clients to evaluate hyperparameter combinations locally, encrypted communication to protect model updates, and weighted aggregation to harmonize global model performance. Various classifiers are supported, such as Stochastic Gradient Descent and Gaussian Naive Bayes, extending the framework's applicability. Additionally, to ensure data privacy, our framework provides both symmetric and asymmetric encryption for client-server communication. Experimental results demonstrate the efficacy of the approach, achieving F1 scores similar to those of the implemented non-federated baseline while maintaining scalability and security. This work contributes a practical methodology for hyperparameter fine-tuning in FL that balances performance and privacy.
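
As a rough illustration of the client-side search and weighted aggregation described above, the following Python sketch uses scikit-learn's StratifiedKFold to score hyperparameter combinations on local data and a simple sample-count-weighted average of linear-model parameters. It is not the thesis implementation: the function names (local_grid_search, weighted_average), the choice of SGDClassifier, the weighted F1 averaging, and the omission of encryption are all assumptions made for illustration.

import numpy as np
from itertools import product
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import StratifiedKFold
from sklearn.metrics import f1_score

def local_grid_search(X, y, param_grid, n_splits=5):
    """Score every hyperparameter combination on this client's local data with
    stratified k-fold cross-validation; return the best combination and its F1."""
    X, y = np.asarray(X), np.asarray(y)
    best_params, best_f1 = None, -1.0
    keys = list(param_grid)
    for values in product(*(param_grid[k] for k in keys)):
        params = dict(zip(keys, values))
        scores = []
        skf = StratifiedKFold(n_splits=n_splits, shuffle=True, random_state=0)
        for train_idx, val_idx in skf.split(X, y):
            clf = SGDClassifier(**params)  # illustrative classifier choice
            clf.fit(X[train_idx], y[train_idx])
            scores.append(f1_score(y[val_idx], clf.predict(X[val_idx]),
                                   average="weighted"))
        mean_f1 = float(np.mean(scores))
        if mean_f1 > best_f1:
            best_params, best_f1 = params, mean_f1
    return best_params, best_f1

def weighted_average(client_models, client_sizes):
    """Server-side aggregation: average linear-model parameters, weighting each
    client by the number of samples it trained on (encryption omitted here)."""
    weights = np.asarray(client_sizes, dtype=float)
    weights /= weights.sum()
    coef = sum(w * m.coef_ for w, m in zip(weights, client_models))
    intercept = sum(w * m.intercept_ for w, m in zip(weights, client_models))
    return coef, intercept

For example, local_grid_search(X, y, {"alpha": [1e-4, 1e-3], "penalty": ["l2", "l1"]}) would evaluate four configurations on a client and return the best one with its cross-validated F1 score; each client reports its choice, and the server combines the fitted parameters with weighted_average.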
