Detecting Cell Outage by Applying Density Based Anomaly Detection Algorithm Using Machine Learning Technique: The Case of Ethio-Telecom UMTS Network

dc.contributor.advisor: Dereje, Hailemariam (PhD)
dc.contributor.author: Aklilu, Bisrat
dc.date.accessioned: 2020-03-09T04:43:02Z
dc.date.accessioned: 2023-11-04T15:13:09Z
dc.date.available: 2020-03-09T04:43:02Z
dc.date.available: 2023-11-04T15:13:09Z
dc.date.issued: 2020-02-01
dc.description.abstract: This thesis develops a model to detect cell outages in real ethio telecom network data by applying density-based anomaly detection algorithms through machine learning techniques. A cell outage is the total or partial loss of radio service in a given area, and the process of identifying it is called cell outage detection. The Cross-Industry Standard Process for Data Mining (CRISP-DM), a six-stage open standard process model that describes common approaches for data mining and machine learning, was used as the methodology. The data, covering both normal and problematic network conditions, was obtained from the ethio telecom UMTS network in Addis Ababa. The proposed detection framework uses network performance data of the neighbor cells (incoming handover (inHO), the process of transferring an ongoing call from one cell to another, and traffic data, originating from the base station and terminating at the mobile device) to capture the normal network state and to detect an outage of the target cell automatically within a pre-set time interval in the UMTS network environment. To profile normal network operation, the study applied two density-based anomaly detection algorithms, namely K-Nearest Neighbor (K-NN) and Local Outlier Factor (LOF), of which one was selected based on its performance during training. K-fold cross validation was used to validate the models, and for optimal-model selection the parameter K was varied over K = 1, 2, 3, ..., 30. Receiver Operating Characteristic (ROC) curves were used to compare the two algorithms. Based on the results, the K-NN anomaly detector (K-NNAD) performed better than the LOF anomaly detector (LOFAD) and was therefore selected as the detector in the profiling stage. The system model was then tested on real problematic network state data, and classic data mining metrics were computed.
Based on the test results, the K-NNAD method was found to perform better at detecting outage cells in the proposed framework.
dc.identifier.uri: http://etd.aau.edu.et/handle/123456789/21032
dc.language.iso: en_US
dc.publisher: Addis Ababa University
dc.subject: Cell outage
dc.subject: Machine learning
dc.subject: Anomaly detection
dc.subject: Self-healing
dc.subject: UMTS network
dc.subject: CRISP-DM
dc.title: Detecting Cell Outage by Applying Density Based Anomaly Detection Algorithm Using Machine Learning Technique: The Case of Ethio-Telecom UMTS Network
dc.type: Thesis

Files

Original bundle
Name: Aklilu Bisrat.pdf
Size: 2.01 MB
Format: Adobe Portable Document Format

License bundle
Name: license.txt
Size: 1.71 KB
Format: Plain Text