Title: Mitigating Issues Caused by Device Dropout in Knowledge Distillation Based Federated Learning
Contributors: Fitsum Assamnew; Eyerusalem Bante
Date Issued: 2026-03
Date Available: 2026-04-04
URI: https://etd.aau.edu.et/handle/123456789/8053
Type: Thesis
Language: en
Keywords: Federated Learning; Knowledge Distillation; Client Dropout; Battery-Aware Client Selection; Client Clustering; Non-IID Data

Abstract: Federated learning (FL) is a technique that enables clients to collaboratively train a machine learning model on distributed data residing on heterogeneous client devices while preserving data privacy. Recent studies suggest that applying knowledge distillation (KD) in federated learning has the potential to address challenges faced by FL, such as device behavior diversity, communication issues, accuracy bottlenecks on resource-constrained devices, and slow convergence. However, in KD-based federated learning, collaborative training on edge devices is adversely impacted by device dropout and intermittent offline behavior resulting from limited resources and unstable connectivity, which degrade the robustness and convergence of the global model. This research mitigates these issues using a dropout-resilient KD-based FL system that couples data-similarity-based client clustering with a battery-charge-aware client selection strategy. Clients are grouped according to the similarity of their local data distributions, creating a redundancy pool from which a substitute client in the same cluster can compensate for a dropout. A battery-aware algorithm prioritizes devices with sufficient energy, reducing the probability of mid-round dropouts, preserving the statistical properties of the training data, and improving global model stability and convergence. Experimental evaluations on the MNIST dataset across a range of dropout rates demonstrate improved global model accuracy in non-IID settings with heterogeneous devices. The findings contribute to advancing robust and efficient KD-based FL frameworks suitable for resource-constrained environments, including mobile and IoT devices.
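The clustering-and-substitution idea in the abstract can be illustrated with a minimal sketch. The sketch below assumes normalized label histograms as the similarity feature and k-means as the grouping method; the thesis may use a different metric or algorithm, and the names `label_histogram`, `cluster_clients`, and `pick_substitute` are illustrative, not taken from the work.

```python
# Sketch (assumed formulation): cluster clients by local label distribution,
# then replace a dropped client with an idle client from the same cluster.
import numpy as np
from sklearn.cluster import KMeans

def label_histogram(labels, num_classes=10):
    # Normalized class-count vector summarizing a client's local data.
    hist = np.bincount(labels, minlength=num_classes).astype(float)
    return hist / hist.sum()

def cluster_clients(client_labels, num_clusters=4, seed=0):
    # Group clients whose local label distributions are similar.
    features = np.stack([label_histogram(y) for y in client_labels])
    assignments = KMeans(n_clusters=num_clusters, random_state=seed,
                         n_init=10).fit_predict(features)
    clusters = {}
    for client_id, cluster_id in enumerate(assignments):
        clusters.setdefault(int(cluster_id), []).append(client_id)
    return clusters, assignments

def pick_substitute(dropped, assignments, clusters, busy, rng):
    # Swap in an idle same-cluster client so the round keeps training data
    # statistically similar to what was lost with the dropout.
    candidates = [c for c in clusters[int(assignments[dropped])]
                  if c != dropped and c not in busy]
    return int(rng.choice(candidates)) if candidates else None

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic non-IID clients: each holds labels skewed toward a few classes.
    client_labels = [rng.choice(10, size=200, p=rng.dirichlet(np.full(10, 0.3)))
                     for _ in range(20)]
    clusters, assignments = cluster_clients(client_labels)
    busy = {0, 5, 7}  # clients already training in this round
    print("substitute for dropped client 5:",
          pick_substitute(5, assignments, clusters, busy, rng))
```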
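Similarly, the battery-aware selection strategy can be sketched as a filtered, charge-weighted sample. The threshold `min_battery` and the charge-proportional weighting below are illustrative assumptions, not the thesis's exact policy.

```python
# Sketch (assumed policy): select round participants only from clients above
# a battery threshold, preferring higher-charge devices to reduce the chance
# of a mid-round dropout. Battery levels are fractions of full charge.
import numpy as np

def battery_aware_select(battery_levels, num_select, min_battery=0.3, seed=0):
    rng = np.random.default_rng(seed)
    levels = np.asarray(battery_levels, dtype=float)
    eligible = np.flatnonzero(levels >= min_battery)
    if eligible.size == 0:
        return []  # no client is safe to schedule this round
    # Weight eligible clients by remaining charge.
    weights = levels[eligible] / levels[eligible].sum()
    chosen = rng.choice(eligible, size=min(num_select, eligible.size),
                        replace=False, p=weights)
    return sorted(int(c) for c in chosen)

if __name__ == "__main__":
    levels = [0.9, 0.15, 0.6, 0.05, 0.8, 0.4, 0.25, 0.7]
    print(battery_aware_select(levels, num_select=3))
```

Weighting by charge rather than hard-thresholding alone keeps moderately charged devices eligible, which preserves client diversity while still biasing participation toward devices least likely to drop mid-round.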