Mitigating Issues Caused by Device Dropout in Knowledge Distillation Based Federated Learning

dc.contributor.advisorFitsum Assamnew
dc.contributor.authorEyerusalem Bante
dc.date.accessioned2026-04-04T11:36:28Z
dc.date.available2026-04-04T11:36:28Z
dc.date.issued2026-03
dc.description.abstractFederated learning (FL) is a technique that enables clients to collaboratively train a machine learning model on distributed data residing on heterogeneous client devices while preserving data privacy. Recent studies suggest that applying knowledge distillation (KD) in federated learning has the potential to address challenges faced by FL, such as diverse device behavior, communication issues, accuracy bottlenecks on resource-constrained devices, and slow convergence. However, in KD-based federated learning, collaborative training on edge devices is adversely impacted by device dropout and intermittent offline behavior resulting from limited resources and unstable connectivity, reducing the robustness and convergence of the global model. This research mitigates these issues using a dropout-resilient KD-based FL system that couples data similarity-based client clustering with a battery charge level-aware client selection strategy. Clients are grouped according to the similarity of their local data distributions, creating a redundancy pool so that a dropped client can be compensated for by a substitute client from the same cluster. A battery-aware algorithm prioritizes devices with sufficient energy levels, a strategy that reduces the probability of mid-round dropouts, preserves the statistical coverage of the data, and improves global model stability and convergence. Experimental evaluations on the MNIST dataset across a range of dropout rates demonstrate improved global model accuracy in non-IID settings among heterogeneous devices. The findings contribute to advancing robust and efficient KD-based FL frameworks suitable for resource-constrained environments, including mobile and IoT devices.
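The clustering-plus-substitution and battery-aware selection strategy summarized in the abstract can be illustrated with a minimal sketch. This is not the thesis's implementation: the battery threshold, the `dominant_label` similarity proxy (grouping clients by the most common label in their local data), and the client-dictionary fields are all hypothetical simplifications chosen for illustration.

```python
from collections import defaultdict

BATTERY_THRESHOLD = 0.3  # hypothetical minimum charge fraction for eligibility

def dominant_label(label_hist):
    """Cluster key: the label a client holds most of (a crude similarity proxy)."""
    return max(range(len(label_hist)), key=lambda i: label_hist[i])

def cluster_clients(clients):
    """Group clients whose local data distributions look alike."""
    clusters = defaultdict(list)
    for c in clients:
        clusters[dominant_label(c["label_hist"])].append(c)
    return clusters

def select_round_participants(clusters):
    """Pick the highest-battery eligible client per cluster; the remaining
    eligible members form a redundancy pool of substitutes for dropouts."""
    selected, substitutes = [], {}
    for key, members in clusters.items():
        eligible = [c for c in members if c["battery"] >= BATTERY_THRESHOLD]
        if not eligible:
            continue
        eligible.sort(key=lambda c: c["battery"], reverse=True)
        selected.append(eligible[0])
        substitutes[key] = eligible[1:]
    return selected, substitutes

def replace_dropout(dropped, substitutes):
    """On mid-round dropout, substitute a statistically similar client
    from the same cluster, or None if the pool is exhausted."""
    pool = substitutes.get(dominant_label(dropped["label_hist"]), [])
    return pool.pop(0) if pool else None
```

A usage sketch: cluster all registered clients once per round, call `select_round_participants`, and invoke `replace_dropout` whenever a selected client goes offline before uploading its distilled knowledge, so the round's data distribution stays close to the original selection.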
dc.identifier.urihttps://etd.aau.edu.et/handle/123456789/8053
dc.language.isoen
dc.publisherAddis Ababa University
dc.subjectFederated Learning
dc.subjectKnowledge Distillation
dc.subjectClient Dropout
dc.subjectBattery-Aware Client Selection
dc.subjectClient Clustering
dc.subjectNon-IID Data
dc.titleMitigating Issues Caused by Device Dropout in Knowledge Distillation Based Federated Learning
dc.typeThesis

Files

Original bundle
Name: Eyerusalem_Bante_2026_ETD.pdf
Size: 975.1 KB
Format: Adobe Portable Document Format
License bundle
Name: license.txt
Size: 1.71 KB
Format: Item-specific license agreed to upon submission