Telecommunication Engineering
Browsing Telecommunication Engineering by Title
Now showing 1 - 20 of 170
Item Alarm Prediction for Fault Management using Deep Learning Approach: The Case of Ethio Telecom (Addis Ababa University, 2021-09) Betelhem, Berhanu; Rosa, Tsegaye (PhD)
Telecommunication networks play a critical role in our society. They make it possible to share a massive amount of data across the globe. Because networks are complex systems in terms of size and technological diversity, a single failure can trigger numerous alarms across a chain of devices. This makes monitoring and maintenance challenging, given the growing complexity of alarm management systems and the need to deploy highly trained experts. Today's large and complicated telecom networks generate vast quantities of alarms every day. The alarm sequence offers significant information about the network's activity, but most of it is fragmented and hidden in the massive amount of data. Alarm regularities can be utilized in fault management systems, for example, to filter redundant alarms, locate network problems, and even anticipate catastrophic faults. In the presence of alarm floods, inadequately configured and maintained alarms, and a large number of nuisance alarms, operators are expected to make vital judgments. If incoming alarms can be correctly predicted before they occur, operators may be able to address and possibly avoid anomalous behavior by taking timely corrective action. This paper presents an alarm prediction method that uses data mining to generate patterns from historical alarm data and uses those patterns to train three deep learning approaches: long short-term memory (LSTM), bidirectional LSTM (Bi-LSTM), and gated recurrent unit (GRU). The prediction performance of the three approaches is compared. Domain-trained word embeddings and pretrained word embeddings (Word2vec) are used to feed embeddings to the neural networks.
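The supervised framing this abstract describes can be illustrated with a minimal sketch: an ordered alarm log is cut into fixed-length windows, each paired with the alarm that followed it. The alarm names and window size below are hypothetical, not from the thesis data; the windows would then be embedded (e.g. with Word2vec) and fed to an RNN.

```python
# Sliding-window preparation of alarm sequences for next-alarm prediction.
# A simplified sketch; alarm names and window size are made up.
def make_training_pairs(alarm_log, window=3):
    """Return (input_window, next_alarm) pairs from an ordered alarm log."""
    pairs = []
    for i in range(len(alarm_log) - window):
        pairs.append((alarm_log[i:i + window], alarm_log[i + window]))
    return pairs

log = ["LINK_DOWN", "HIGH_TEMP", "POWER_FAIL", "CELL_OUTAGE", "LINK_DOWN"]
pairs = make_training_pairs(log, window=3)
# Each pair maps a window of recent alarms to the alarm that followed it.
```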
The relevance of applying different word embeddings is to explore the effect of data preparation on model performance. The frequent pattern growth (FP-Growth) algorithm, implemented in RapidMiner Studio, is used to mine five months' worth of alarm logs. Finally, the best-performing model is selected based on accuracy. The models are tested with a sequence of alarms, and Bi-LSTM with domain-trained word embeddings achieves 93% accuracy in predicting the target alarms. However, the results also show that all three deep learning approaches can be used for predicting telecom alarms.

Item Analysing Impact of Seamless MPLS on QoS (AAU, 2018-11) Habtamu, Kumera; Yalemzewd, Negash (PhD)
The need for timely delivery of real-time and mission-critical applications has led to high demand for end-to-end QoS guarantees. Service providers, including Ethio telecom, have deployed a mobile backhaul in addition to an IP core network, with separate MPLS domains for each network. The internal topology of one domain and its policy implementations are proprietary information, so inter-domain routing must be done without detailed knowledge of the other domains' topology, policies, and performance. Lack of global coordination between the policies used in different domains, BGP's limitations for implementing end-to-end QoS in inter-domain routing, and its slow convergence during network failure are major challenges of the current inter-domain routing architecture. As stated by different authors, such as in RFC 8277 (Using BGP to Bind MPLS Labels to Address Prefixes), Seamless MPLS provides a framework for taking MPLS end-to-end in a scalable fashion, extending the benefits of traffic engineering (TE) and guaranteed service-level agreements (SLAs) with deterministic network resiliency. It offers an alternative for implementing end-to-end MPLS networks by integrating different domains into a single MPLS domain.
The motivation of this thesis is to investigate and analyze the impact of implementing Seamless MPLS on QoS parameters. The impact is analyzed using two scenarios: MPLS with multiple domains, and Seamless MPLS integrating three domains into a single MPLS domain. Simulation tools such as eNSP, Ostinato, NQA, and MATLAB are used to compare the performance of the two scenarios. The analysis results show that with Seamless MPLS, throughput is improved by 36.87%, latency by 15.98%, packet loss by 20%, and jitter by 12.5% compared to MPLS. From these results, one can conclude that any service provider can benefit from deploying Seamless MPLS.

Item Analysis and Prediction of Mobile Application Usage Based on Location: In Case of ethio telecom (Addis Ababa University, 2020-02-21) Bancheamlak, Gebrtsadik; Surafel, Lemma (PhD)
The explosive growth of smart devices, network access points, and new mobile application development drives users to use more and more mobile applications, and this has led to explosive growth in mobile data traffic. This puts high pressure on mobile service providers to manage network data traffic, because application usage differs from one location to another over time. Understanding application-level traffic patterns from a location perspective helps operators and content providers create technical and business plans. In this paper, we identify several typical traffic patterns and predict application-category traffic demand per clustered location in a mobile cellular network. We explore mobile traffic patterns by clustering each application category into five clusters based on traffic volume and location.
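The clustering step described above can be sketched with a tiny hand-rolled k-means on one-dimensional traffic volumes; the per-location figures and initial centers below are hypothetical, and a real study would cluster on both traffic and location features.

```python
# Minimal 1-D k-means to group locations by application traffic volume.
# A simplified sketch of the clustering idea; the data are made up.
def kmeans_1d(values, centers, iters=10):
    groups = {}
    for _ in range(iters):
        groups = {i: [] for i in range(len(centers))}
        for v in values:
            # assign each value to its nearest center
            i = min(range(len(centers)), key=lambda c: abs(v - centers[c]))
            groups[i].append(v)
        # recompute each center as the mean of its group
        centers = [sum(g) / len(g) if g else centers[i]
                   for i, g in groups.items()]
    return centers, groups

traffic_gb = [1.0, 1.2, 0.9, 10.0, 11.0, 10.5]   # per-location volume
centers, groups = kmeans_1d(traffic_gb, centers=[0.0, 5.0])
```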
Then, we implement a random forest model to predict the traffic demand of the three most heavily used applications per cluster location. This outcome could be useful in future applications, achieving an average predictive accuracy of 96% per application category per cluster. Understanding the popular applications at clustered locations and predicting their traffic demand could significantly improve user experience, average latency, energy consumption, spectral efficiency, backhaul traffic, and network capacity. These outcomes are possible by designing and implementing a cache server, or by planning and optimizing network resources based on predicted traffic demand.

Item Analysis of Availability Using Virtual Machine Placement Algorithms for Cloud IaaS (Addis Ababa University, 2020-02) Ammanuel, Shawel; Mesfin, Kifle (PhD)
The cloud computing paradigm is popular for its flexibility in resource provisioning, creating virtual machines of the requested specification on the underlying physical infrastructure. Infrastructure-as-a-Service, which requires a provider to allocate virtual machines (VMs), has a significant impact on the availability of the accommodated applications. Several algorithms have been proposed for the VM placement problem, but with little objective comparison, leaving unclear which works best or what factors influence availability. In this thesis work, four algorithms are compared using availability metrics. The findings show that the choice of VM placement algorithm, along with memory and CPU utilization, plays an important role in maintaining the availability of cloud IaaS.
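To make the VM placement problem concrete, here is a first-fit baseline of the kind such comparisons commonly include; the thesis does not name its four algorithms here, so this is an illustrative sketch with hypothetical host capacities and VM requests.

```python
# First-fit VM placement under CPU/memory capacity constraints.
# An illustrative baseline, not necessarily one of the thesis's algorithms.
def first_fit(hosts, vms):
    """hosts: list of [cpu_free, mem_free]; vms: list of (cpu, mem).
    Returns one host index per VM (None if the VM cannot be placed)."""
    placement = []
    for cpu, mem in vms:
        for i, host in enumerate(hosts):
            if host[0] >= cpu and host[1] >= mem:
                host[0] -= cpu        # consume host resources
                host[1] -= mem
                placement.append(i)
                break
        else:
            placement.append(None)    # no host could fit this VM
    return placement

hosts = [[8, 16], [8, 16]]            # (cores, GB RAM) free per host
vms = [(4, 8), (6, 8), (4, 8)]
plan = first_fit(hosts, vms)
```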
The results also show that algorithms which perform well on one metric may perform poorly on others, underlining the importance of objectively comparing the availability of competing algorithms and of thorough empirical studies, not only for power consumption and other QoS attributes but also for availability as one important aspect of cloud computing.

Item Analysis of Land Mobile Spectrum Occupancy using Spatial Interpolation Algorithms: For the Case of Addis Ababa, Ethiopia (Addis Ababa University, 2021-09) Rufael, Getachew; Dereje, Hailemariam (PhD)
The growth of the information industry and wireless communication services has increased demand for spectrum. Spectrum is a limited resource that must be managed properly to enable new services; without it, none of today's wireless communication would be possible. A spectrum occupancy map can be a powerful tool for developing better knowledge of the occupancy status of this scarce resource. Traditionally, spectrum measurement is based on drive tests, which involve geographically measuring different spectrum bands with vehicles equipped with mobile spectrum measurement capability. Drive tests cannot be conducted in all regions due to constraints such as buildings, road conditions, and inefficient measurement tools. The drive test is therefore insufficient for obtaining complete and reliable spectrum occupancy information. In this thesis, the performance of three spatial interpolation methods, namely Inverse Distance Weighting (IDW), Ordinary Kriging (OK), and Natural Neighbor (NN), is evaluated to select the best method for developing a spectrum occupancy map. The experimental analysis is performed on sample data collected with a TCI 700-series spectrum measurement unit, in the 137 MHz-174 MHz VHF bands and the 400 MHz-470 MHz UHF bands for land mobiles in a selected area of Addis Ababa.
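Of the three interpolators compared, IDW is the simplest to sketch: the occupancy at an unmeasured point is a distance-weighted average of nearby samples. The coordinates and occupancy values below are hypothetical.

```python
# Inverse Distance Weighting (IDW): estimate occupancy at an unmeasured
# point from nearby measured samples. Sample data are made up.
def idw(samples, x, y, power=2):
    num = den = 0.0
    for sx, sy, value in samples:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0:
            return value               # exactly at a measured point
        w = 1.0 / d2 ** (power / 2)    # weight falls off with distance^power
        num += w * value
        den += w
    return num / den

samples = [(0, 0, 10.0), (2, 0, 20.0)]   # (x, y, occupancy %)
est = idw(samples, 1, 0)                 # midpoint between the two samples
```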
The collected sample data is fed to a K-means clustering algorithm to classify the occupancy status at different locations. The three interpolation methods are employed for the UHF bands only. The prediction performance of the three algorithms is evaluated through cross-validation and Root Mean Square Error (RMSE). The results show that OK with a Gaussian semivariogram model achieves a prediction error of 0.807, while IDW and NN yield 2.801 and 2.119, respectively. This shows that OK is more accurate than IDW and NN.

Item Analyzing Impact of Segment Routing MPLS on QoS (Addis Ababa University, 2019-12) Kibrab, Alemayehu; Yalemzewd, Negash (PhD)
Multiprotocol Label Switching (MPLS) Segment Routing (SR), SR-MPLS for short, is an MPLS data-plane-based source routing paradigm in which the sender of a packet may partially or completely specify the route the packet takes through the network by imposing a stack of MPLS labels on the packet. SR-MPLS can be leveraged to realize a unified source routing mechanism across MPLS, IPv4, and IPv6 data planes, using an MPLS label stack as a unified source routing instruction set while preserving backward compatibility with traditional MPLS networks. Segment Routing is a promising Traffic Engineering (TE) model that provides end-to-end communication. SR can noticeably improve network utilization and control the routing path flexibly by encoding route information into a list of segments, i.e., the Segment List (SL). The key feature of SR is its source routing paradigm: the routing path followed by a packet is determined and written into the packet header by the first switch of the SR network (the ingress SR switch). The motivation of this thesis is to investigate and analyze the impact of implementing SR-MPLS on Quality of Service (QoS).
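The segment-list idea can be shown with a toy model: the ingress encodes the chosen path as a stack of node-segment labels, and each hop forwards on the top label and pops it. The node names and segment IDs below are invented for illustration only.

```python
# Toy illustration of SR-MPLS source routing: the ingress imposes one
# label per segment, top label first. Node SIDs here are made up.
sid = {"R2": 16002, "R3": 16003, "R5": 16005}   # hypothetical node segment IDs

def build_label_stack(path):
    """Return the MPLS label stack the ingress imposes for this path."""
    return [sid[node] for node in path]

stack = build_label_stack(["R2", "R3", "R5"])
top, rest = stack[0], stack[1:]   # next hop forwards on `top`, then pops it
```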
The impact on QoS parameters is analyzed using two scenarios: traditional Unified MPLS integrating four domains into a single MPLS domain, and Segment Routing over Unified MPLS integrating four domains into a single MPLS domain. Simulation tools such as EVE-NG, Ostinato, and Cisco IP SLA technology are used to compare the performance of the two scenarios. The analysis results show that with SR-MPLS, throughput is improved on average by 32%, latency by 24%, packet loss by 20.4%, and jitter by 12.6% compared to Label Distribution Protocol (LDP) based Unified MPLS. From these results, one can conclude that any service provider can benefit from deploying Unified SR-MPLS. Segment Routing is a new routing paradigm that provides better end-to-end QoS guarantees, making traditional MPLS networks more efficient and scalable.

Item Anomaly Detection of LTE Cells using KNN Algorithms: The Case of Addis Ababa (Addis Ababa University, 2019-12) Teweldebrhan, Mezgebo; Beneyam, Berehanu (PhD)
For mobile operators, delivering quality service to their customers is very important, as service quality significantly affects their business. To achieve consistent quality of service, they need to continuously monitor and analyze network performance and address any performance drops in a timely manner. Performance drops in mobile networks can be observed in the key performance indicators (KPIs) of spatially distributed cells, with different magnitudes and at different time periods. As it is practically difficult to address all performance drops simultaneously, it is preferable to prioritize corrective actions starting from cells with critical drops, also called anomaly cells. Different automated methodologies have been proposed and analyzed for detecting anomaly cells.
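The common idea behind KNN-family detectors can be sketched simply: a cell whose KPI vector lies far from its k nearest neighbours gets a high anomaly score. This is a simplified distance-based score, not the LOF/COF formulations the thesis evaluates, and the KPI values are hypothetical.

```python
# KNN-distance anomaly scoring for cell KPIs: a cell far from its k
# nearest neighbours in KPI space scores high. Simplified sketch.
def knn_score(points, p, k=2):
    dists = sorted(sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
                   for q in points if q != p)
    return sum(dists[:k]) / k      # mean distance to k nearest neighbours

# hypothetical (success-rate %, drop-rate %) per cell
cells = [(99.0, 1.0), (98.5, 1.2), (99.2, 0.9), (60.0, 20.0)]
scores = {c: knn_score(cells, c) for c in cells}
worst = max(scores, key=scores.get)    # flagged as the anomaly cell
```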
Yet ethio telecom still applies a manual and subjective anomaly detection method in which measured KPI values are compared by hand with fixed thresholds to determine whether they fall within the required ranges. Cells with KPI values outside the required range are analyzed to identify the root causes of the performance drop and to take corrective action. The manual and subjective method is prone to detection errors and is inefficient in maintenance time, manpower, and hence cost. These challenges can be addressed by applying advanced automatic methods based on machine learning algorithms. In this thesis work, KNN-based anomaly detection algorithms, namely KNN classification, local outlier factor (LOF), and connectivity-based outlier factor (COF), are implemented and comparatively evaluated on Addis Ababa LTE cells. The comparison considers the type of output, complexity, and true positive rate (TPR) for time-series and cell-level detection. Unlike KNN classification, LOF and COF do not need extensive training data and can provide anomaly scores instead of two-class labels. Experimental results show that COF performs slightly better than the other models, with a negligible performance difference. For instance, the TPR of COF for RRC setup success rate is 97.91% for cell-level detection and 88% for time-series detection.

Item Application-Aware Data Center Network Bandwidth Utilization: The Case of ethio telecom (AAU, 2018-11) Zerihun, Mamo; Mesfin, Kifle (PhD)
The existing ethio telecom data center network (DCN) follows the traditional model, which provides only Best Effort (BE) traffic delivery on layer-three links, with the same priority for all application traffic.
With a diversity of applications running in the data centers, scarce in-network bandwidth becomes the performance bottleneck for enterprise system integration. Most importantly, the existing network has no dynamic bandwidth management strategy and lacks flexible bandwidth utilization among different types of applications. To guarantee network performance for system integration, efficient in-network bandwidth management must be considered. In this thesis work, a QoS model is designed for aggregated application traffic in the DCN under constrained bandwidth, taking into account the business criticality of the applications. The model nearly guarantees QoS to real-time traffic without reserved bandwidth and assures forwarding of high-priority traffic for business- and mission-critical applications. The design approach is based on the Internet Engineering Task Force (IETF) Differentiated Services (DS) Per-Hop Behavior (PHB) group. Different active queue management (AQM) and packet scheduler algorithms are compared on the proposed design to minimize the packet loss and delay of each aggregated traffic class. In addition, the dynamic benefit-weighted scheduling (DB-WS) algorithm is modified to fit our solution; the algorithm dynamically allocates bandwidth based on the average queue length of each service class.
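Allocating bandwidth in proportion to each class's average queue length, in the spirit of the DB-WS idea just described, can be sketched as follows. This is a simplification of the actual DB-WS algorithm, and the class names and queue lengths are hypothetical.

```python
# Bandwidth shares proportional to average queue length per service class.
# A simplified sketch of queue-driven allocation; values are made up.
def allocate(link_capacity_mbps, avg_queue_len):
    total = sum(avg_queue_len.values())
    return {cls: link_capacity_mbps * q / total
            for cls, q in avg_queue_len.items()}

queues = {"real_time": 30, "business": 50, "best_effort": 20}
share = allocate(100, queues)   # Mbps per class on a 100 Mbps link
```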
Extensive simulation results show that the proposed design improves bandwidth utilization and provides the desired network performance for real-time and business-critical applications on a bottleneck link, while also reducing total packet loss by 1.34%.

Item Assessment and Optimization of Infrastructure and Radio Resource Utilization Efficiency for LTE/LTE-A Cellular Networks, the Case of Ethio Telecom (Addis Ababa University, 2024-12) Elias Hagos; Yelemzewd Negash (PhD)
Customers' growing requests for a variety of primary services (basic and required, such as calls, SMS/MMS, and data) and secondary services (luxury and business-class offerings, such as high-bandwidth video access, video and camera surveillance, and online monitoring) via mobile networks necessitate thoughtful, timely responses from suppliers. The existence of many standards makes it difficult for 3G mobile networks to roam and interoperate with other mobile networks. LTE, however, is a global standard that provides global mobility and service portability without binding customers to companies' proprietary hardware. Moreover, LTE is primarily a synthesis of multiple previous technologies rather than a completely new standard. As the most recent and enhanced service, LTE/LTE-A/LTE-A Pro can satisfy consumer demands for high-speed data, low latency, bandwidth efficiency, multimedia broadcast and unicast, and high-quality cellular broadband services with affordable prices and revenue structures. This is achieved through key technologies including orthogonal frequency-division multiplexing (OFDM), RF analysis and carrier aggregation, adaptive modulation, multiple-input multiple-output (MIMO), HetNets and CoMP, and others. For instance, MIMO technology can significantly increase channel and spectrum efficiency.
The physical layer channel includes modulation, cyclic prefix, frequency, bandwidth, and coding rate, all of which contribute to high throughput. To increase spectral efficiency, the system uses OFDMA as the access mechanism in the downlink and Single Carrier Frequency Division Multiple Access (SC-FDMA) in the uplink. Superior signal detection methods include the Sphere Decoding (SD) detector, QR decomposition with M-algorithm maximum likelihood detection (QRM-MLD), the Maximum Likelihood (ML) detector, the Zero-Forcing (ZF) detector, and the Minimum Mean Square Error (MMSE) detector; their bit-error-rate (BER) performance differs in a correlated MIMO-OFDM scenario. Suppliers continue to test and research possible future R&D optimizations and service enhancements. To upgrade toward LTE-4.5G and LTE-5G and offer premium services with high bandwidth, high QoS, low cost and jitter, and customer-acceptable, error-free operation, operators and service providers are innovating, analyzing, and optimizing their LTE/LTE-A systems. As a result, consumers worldwide are enjoying the newest innovations, and engineers and technologists are researching and presenting their results and breakthroughs in theses, research, and seminars. This thesis discusses LTE/LTE-A in the context of the aforementioned situations, together with evaluations, visualizations, and analyses using various research techniques and algorithms to trace availability, and then draws conclusions from the analysis. Based on the analysis, it is feasible to optimize the structural infrastructure and available resources to provide quick, dependable, and accessible services such as SMS/MMS, calls, and high-bandwidth data at low cost, with very little jitter and delay, and with seamless system integration.
In the framework, market analysis and LTE-Advanced radio network link budget dimensioning are conducted using the COST-231 Hata model, and existing LTE traffic, standards, and demographic studies are conducted for macro and small cells, respectively. The required system bandwidth is also estimated and computed. The thesis then offers critiques, suggestions, and discussion of the outcome, followed by reports and presentations. Based on this study, it is feasible to conduct a detailed survey, observation, and optimization of the efficiency of LTE/LTE-A infrastructure and radio resource consumption in Ethiopian telecom cellular mobile networks, and to analyze the optimization of LTE/LTE-A upgrades toward LTE-A Pro and LTE-5G. In addition, this thesis shows how LTE/LTE-A resource optimization can serve as a technical reference and a means of disseminating engineering research and findings.

Item Availability and Maintainability Analysis for Mobile Networks in Addis Ababa (AAU, 2018-11) Leake, Gebrekidan; Yalemzewd, Negash (PhD)
Modern society depends heavily on communication technology across a wide range of applications, so service users demand an always-available network. Network operators are therefore expected to continuously analyze the performance of their networks and take corrective actions to make them as available as possible. Reliability engineering provides methodologies and techniques to analyze the reliability, availability, and maintainability (RAM) of systems, including telecommunication networks. The main objective of this research is to perform availability, maintainability, and downtime impact analysis as a case study of Ethio telecom (ET) mobile sites deployed in Addis Ababa, Ethiopia.
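Availability analyses of this kind typically rest on the standard RAM relation A = MTBF / (MTBF + MTTR). A minimal sketch, with hypothetical outage figures rather than the thesis data:

```python
# Availability from mean time between failures (MTBF) and mean time to
# repair (MTTR): A = MTBF / (MTBF + MTTR). Figures below are made up.
def availability(mtbf_hours, mttr_hours):
    return mtbf_hours / (mtbf_hours + mttr_hours)

a = availability(mtbf_hours=1320.0, mttr_hours=3.3)
downtime_per_year_h = (1 - a) * 8760   # expected annual downtime in hours
```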
Four months of mobile site outage data, active 3G mobile service user counts, and average revenue per user (ARPU) data are used for the analysis. The mean time between failures (MTBF) and the mean time to repair (MTTR) are the metrics used for availability analysis, while maintainability is analyzed based on the lognormal distribution of time-to-repair (TTR) data. The analysis shows that: (a) 91.7% of the mobile sites had an outage over the four months; (b) 69% of the mobile sites were in outage a total of 114,592 times and 25,676.72 hours; (c) the average availability of the mobile sites is 99.75%, with median, mean, and maximum maintenance times of 1.9, 3.3, and 10.4 hours, respectively; and (d) longer downtimes have a significant impact on revenue, Service Level Agreements (SLAs), and customers. The results can be used as input to network maintenance policy, strategy, and resource planning to ensure high network availability. MATLAB, EasyFit, and MS Excel are used as analysis tools.

Item Bandwidth Optimization of IP Core Network Using MPLS Traffic Engineering and Quality of Service: The Case of Ethio Telecom Backbone Network (Addis Ababa University, 2021-10) Samrawit, Eshetu; Yalemzewd, Negash (PhD)
The rapid growth in the number of telecom service users and the continual introduction of new services and web-based applications have brought service providers enormous subscriber bases. People adopt newly emerging services and applications because they make day-to-day life easier. The traffic forwarded from these subscribers into the provider's network is correspondingly vast, and as a result the service provider's network can become congested.
Providers must consider their backbone network capacity while offering different types of services. Growth in the number of customers without timely network expansion exposes the network to congestion, yet network expansion and new deployments are costly and time-consuming. Before implementing new projects, network bandwidth optimization should therefore be applied to use the available network resources effectively and efficiently. The motivation of this paper is to investigate and analyze the performance and link bandwidth utilization of the core network when implementing MPLS TE + QoS. The impact on QoS parameters is analyzed using two scenarios: MPLS LDP + BE and MPLS TE + QoS. Simulation tools such as GNS3, Ostinato, and PRTG are used to compare the performance of the two scenarios. The results show that in the MPLS TE + QoS scenario, packet loss is improved by 29%, throughput by 17.4%, and latency by 15.3%. Based on these results, service providers are strongly recommended to deploy MPLS TE + QoS on their core (backbone) networks.

Item Building Univariate Time Series Forecasting Models for Network Traffic by Evaluating Statistical and Machine Learning Techniques: The Case of Ethio Telecom Broadband VSAT Hub Networks (Addis Ababa University, 2022-01) Ermias Nigussie; Ephrem, Teshale (PhD)
Network traffic congestion is a major challenge for telecom service providers, which typically deliver services to their customers with limited resources. This leads to degradation of network performance and Quality of Service (QoS), failing to meet customer satisfaction. The Very Small Aperture Terminal (VSAT) network in Ethio Telecom delivers broadband services through satellite with limited capacity.
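Forecasting comparisons of this kind score each candidate model with standard error metrics (RMSE, MAE, MAPE). A minimal sketch of the three metrics on a tiny hypothetical actual-vs-forecast series:

```python
# Forecast error metrics used to compare time-series models.
# The actual/predicted values below are made up for illustration.
def rmse(actual, pred):
    return (sum((a - p) ** 2 for a, p in zip(actual, pred)) / len(actual)) ** 0.5

def mae(actual, pred):
    return sum(abs(a - p) for a, p in zip(actual, pred)) / len(actual)

def mape(actual, pred):
    return 100 * sum(abs((a - p) / a) for a, p in zip(actual, pred)) / len(actual)

actual = [100.0, 120.0, 110.0]   # e.g. observed traffic (Mbps)
pred = [90.0, 125.0, 110.0]      # a model's forecasts
```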
Therefore, this thesis studies the VSAT network traffic patterns to propose a traffic forecasting model. The model can be used to provision network resources based on predicted future traffic demand and as input for network planning and optimization. In this study, VSAT data traffic recorded over one year, from 01-Mar-2020 to 28-Feb-2021, was collected. The dataset is used for data preprocessing, statistical analysis, and model training and testing, all performed in Python. Existing time series forecasting methods are selected from statistical and machine learning models, including Exponential Smoothing Methods (ESM), Autoregressive Integrated Moving Average (ARIMA), Seasonal ARIMA (SARIMA), and the Artificial Neural Network (ANN) variants Multilayer Perceptron (MLP), Recurrent Neural Network (RNN), and Long Short-Term Memory (LSTM). The forecasting accuracy metrics Root Mean Square Error (RMSE), Mean Absolute Error (MAE), and Mean Absolute Percentage Error (MAPE) are used to evaluate and compare the models and select the one with the minimum forecasting error. As a result, the RNN model is identified as the best, improving forecasting performance by 44.94% over the Triple Exponential Smoothing (TES) model, a variant of ESM. The RNN model is therefore proposed to Ethio Telecom for future VSAT network planning and optimization.

Item Capacity Enhanced-Energy Efficient Base Station Deployment Using Genetic Algorithm (Addis Ababa University, 2020-02) Tabor, Birru; Yihenew, Wondie (PhD)
Unparalleled increases in demand for high capacity and consistent service quality on cellular networks have been challenging for telecom operators, and the currently deployed radio access network lags significantly behind this growth. This pushes operators toward network upgrades, expansion, and base station (BS) densification.
Operators also mix macro and small cells in their radio access networks as a potential solution to meet customer demand, enhancing network capacity and extending coverage. However, adding small cells to an existing network increases energy consumption. This thesis considers energy-efficient BS deployment for enhancing LTE network capacity. Small cells are deployed underlay to the existing macro BS in an outdoor scenario covering a 2x2 square kilometer area in Addis Ababa. Candidate locations for the small cells are first selected, mainly based on traffic distribution in the study area. Radio propagation simulations are then performed using the WinProp radio planning and simulation tool, followed by genetic-algorithm-based optimization to find the optimal number and locations of small cells, maximizing capacity and energy efficiency while minimizing additional power consumption. Results are analyzed in MATLAB, and aggregate capacity and energy efficiency are evaluated. The results show that both aggregate network capacity and energy efficiency increase with the number of small cells: capacity improves by 77.94% and energy efficiency by 24.7% compared to the original macro-only network, requiring 82 and 59 small cells, respectively, transmitting at 0.5 W. Capacity improvement requires at least 18 small cells, while 23 or more are needed for energy efficiency to improve. Beyond 59 small cells, energy efficiency starts declining, indicating that further small-cell deployment is not worthwhile. Aggregate network capacity improves by 66.99% when small-cell selection is constrained by its impact on energy efficiency.
The maximum energy efficiency improvements achieved are 24.7%, 22.32%, 17.88%, and 11.12% for small cells transmitting at 0.5 W, 2 W, 5 W, and 10 W, respectively. These results could possibly be improved further using techniques such as sleep mode and cell zooming.

Item CDR Based Recommender System for Mobile Package Service Users (Addis Ababa University, 2023-09) Saba Mulugeta; Sosina Mengistu (PhD)
Due to increased competition, telecom operators continually introduce new products and services. To make their services easy to use, meet customer requirements, and satisfy customers' payment preferences, operators have launched various telecommunication packages. Operators offer so many packages that customers are unaware of some of them; useful packages may go unnoticed. To overcome this, a recommender system is needed that notifies customers directly based on their interests. Most research on recommendation systems has targeted web service users and is based on user ratings. In this paper, we propose a mobile package service recommendation system. The proposed system has two phases. The first creates a relationship between customer usage and mobile packages by grouping customers based on their usage, using the k-means clustering algorithm; the elbow method determines the number of clusters for each service. The second phase builds a classification model that recommends mobile packages to users. Two months of CDR data were used to build classification models with random forest (RF) and K-nearest neighbor (KNN) classifiers. The evaluation shows KNN outperformed RF for weekly and monthly data usage plans, with F1 scores of 90.4% and 96%, respectively, whereas RF outperformed for daily plans with an F1 score of 86.9%.
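The F1 score used throughout this comparison is the harmonic mean of precision and recall. A minimal sketch, with hypothetical confusion-matrix counts:

```python
# F1 score: harmonic mean of precision and recall.
# The true/false positive/negative counts below are made up.
def f1(tp, fp, fn):
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

score = f1(tp=90, fp=10, fn=10)   # precision = recall = 0.9
```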
On the other hand, RF outperformed KNN with F1 scores of 95.10% and 99.60% for the daily and monthly voice usage plans, respectively, while KNN showed better performance than RF on the weekly voice usage plan with an F1 score of 94.30%. In general, the strengths of each algorithm differ across usage scenarios within the voice and data service domains.

Item Cell Outage Detection Through Density-based Local Outlier Data Mining Approach: In Case of Ethio telecom UMTS Network (AAU, 2018-11) Solomon, Bekele; Dereje, Hailemariam (PhD)
Mobile traffic has grown exponentially over the years. To satisfy the growing traffic, which requires capacity and coverage, network densification is a key solution. As mobile networks become larger, manual management grows difficult and automated network management is required. Self-healing is one of the self-organizing network (SON) functionalities that implements automatic fault management in the radio access network (RAN). In practice, cell outage is a major problem in the radio access network, leading to loss of service. Automated and timely detection of malfunctioning cells is one of the crucial challenges for network operators. In this thesis, a data mining model is introduced to detect cell outages automatically. A density-based Local Outlier Factor (LOF) detection algorithm, the core of the model, is adopted and implemented using incoming handover statistics to detect cell outages and sleeping cells in a self-organizing manner. For this purpose, statistical handover data was collected from a real UMTS network and then preprocessed through filtering, aggregation, normalization, and profiling. Moreover, an improved version of the LOF algorithm, fast anomaly detection with duplication (FADD), is also implemented to improve detection capability.
A receiver operating characteristic (ROC) curve is used to assess the performance of the algorithms. The study shows that both versions of the LOF cell outage detector detect most cells in outage and locate their positions; however, FADD detected 89% of outages, compared to 75% for the original LOF.

Item Circuit Switched UMTS to GSM Handover Neighbor List and Parameter Optimization: The Case of Ethio Telecom Addis Ababa Network(Addis Ababa University, 2022-02) Muluken, Bekele; Sosina, Mengistu (PhD) The rapid growth of mobile service users is forcing telecom operators to invest in new infrastructure. However, techniques for optimum network utilization that improve interoperability among different technologies can reduce the investment cost of deploying infrastructure. Optimizing the circuit-switched inter-radio-access-technology handover success rate between the universal mobile telecommunications system (UMTS) and the global system for mobile communications (GSM) allows the operator to use the ubiquitous 2G network to provide service where 3G coverage is discontinuous, as well as to share load between the networks. In this thesis, based on circuit-switched inter-radio-access-technology (CS IRAT) 3G-to-2G handover success rate reports for the Addis Ababa network, a problematic sample IRAT network was chosen; the handover is optimized for the sample network and the results interpreted for the whole Addis Ababa network. First, with the help of Pareto techniques, the root cause of the handover (HO) failure rate is analyzed using two months of report data, and physical channel failure is found to be the main cause of IRAT handover failure. Second, a sample network (study area) is selected based on the HO success rate and the number of handover attempts (success rate of 70% and more than 10,000 attempts per day).
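The Pareto root-cause step described above can be sketched as follows: rank failure causes by count and walk down the ranking until roughly 80% of failures are covered, yielding the "vital few" causes to attack first. The categories and counts below are hypothetical stand-ins, not the thesis's actual figures.

```python
# Hypothetical IRAT handover failure counts by root cause
failures = {
    "physical channel failure": 5200,
    "target cell congestion": 1400,
    "missing neighbor relation": 900,
    "timer expiry": 300,
    "other": 200,
}

total = sum(failures.values())
ranked = sorted(failures.items(), key=lambda kv: kv[1], reverse=True)

# Walk down the ranked causes until ~80% of all failures are covered:
# those are the "vital few" worth fixing first.
vital_few, cum = [], 0.0
for cause, count in ranked:
    vital_few.append(cause)
    cum += count / total
    if cum >= 0.8:
        break

print(vital_few)  # → ['physical channel failure', 'target cell congestion']
```

With real report data the same ranking would be drawn as a Pareto chart (bars for counts, a line for cumulative share).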
Then, methods to address the root cause were proposed (adjusting the neighbor relations and tuning the target 2G network handover threshold), based on the literature reviewed and the effects of these parameters on IRAT handover. Using the Atoll planning tool, the identified sample network was modeled with engineering parameters obtained from Ethio Telecom. Finally, using the sample network as input, the neighbor relations were redefined and the handover coverage of the network optimized. It is found that the handover coverage area (the possible locations where a handover could occur) shrinks as the target 2G network handover thresholds increase, which leads, first, to a decrease in the total number of handover attempts and, second, to a higher proportion of attempted handovers succeeding, since the target 2G cell is stronger. Together, these two effects improve the handover success rate, which is the ratio of successful handovers to total handover attempts.

Item Classification of Top Call Reasons using Machine Learning in Call Center Service(Addis Ababa University, 2021-12) Atsede, Abebe; Dereje, Hailemariam (PhD) A call center (CC) connects customers to a service provider, such as a telecom operator. The CC receives customer requests, feedback, and complaints. These inputs provide an opportunity to understand customers' needs, the problems they face when using services, and the performance of the service provider. Meeting customer expectations by responding to complaints increases customer satisfaction, which translates to revenue maximization. Hence, the CC is critical to the success of the service provider. Ethio-telecom, a telecom service provider in Ethiopia, operates a large CC that provides telecom-related services throughout the country, accepting over two million calls per day via a free service line.
The center records and maintains a huge amount of customer-related data, which can be analyzed with state-of-the-art machine learning algorithms to proactively estimate call types and reasons. This thesis proposes mapping features from customer profile information onto top call reasons, so as to better understand customer call requests and assign future calls to specific top call reasons. Data were extracted from Ethio-telecom's IP contact center, customer relationship management, and customer billing system servers. To construct the classification models, the J48, random forest (RF), and naive Bayes (NB) algorithms are used, and their performance is compared on accuracy, time to build a model, and model interpretability. Results show that the RF and J48 algorithms outperform NB, with accuracies of 97.46% and 97.4%, respectively; the NB model is the least accurate at 83.6%, although it takes less time to build than J48. In terms of interpretability, J48 is more interpretable than NB and RF, making it the best overall choice.

Item Clear-Air Radioclimatic Modeling of Terrestrial Line-of-Sight Radio-Link in Ethiopia(Addis Ababa University, 2021-09) Feromsa, Girma; Ephrem, Teshale (PhD); Feyisa, Debo (PhD) Co-Adviser Terrestrial radio links are challenged by clear-air environmental factors such as multipath fading, diffraction fading, and atmospheric gaseous losses. These impairments need to be characterized using the right meteorological parameters and appropriate models, so that the results can inform the design of terrestrial line-of-sight radio links. Ethiopia, like other countries in tropical regions, relies on the ITU-R database for designing terrestrial line-of-sight links, which can lead to over- or under-prediction of impairments.
The ITU-R data source was originally prepared from the climates of countries in the temperate region, which may not fit a tropical region. Finding clear-air radioclimatic parameters for terrestrial radio links requires intensive data collection from local meteorological sources; design input parameters derived from local data are expected to be more reliable than those taken from the ITU-R database. Most researchers have used local data to investigate the secondary radioclimatic parameters that serve as inputs to terrestrial radio-link design, but such meteorological data are limited to local weather conditions and may not apply at other geographic locations. This thesis uses five to ten years of surface temperature and humidity data at 15-minute intervals, together with one year of daily radiosonde data, collected from the National Meteorological Agency of Ethiopia. Monthly satellite pressure data from the "FLDAS Noah Land Surface Model L4 Global" product were integrated, via the haversine formula, with the surface data from the meteorological agency to obtain complete temperature, pressure, and humidity information for 58 selected locations in Ethiopia. The radiosonde data were used to develop a new vertical atmospheric profile model for Addis Ababa. The secondary parameters of the modeled atmospheric profile were compared with the results of the ITU-R low-latitude model and the ITU-R database, using the mean absolute error (MAE) and root mean square error (RMSE) as error measures; the proposed models perform better than the others. ArcMap, ExceLab, MS Excel, and Python are used to analyze and present the results in graphs, charts, and maps.
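The haversine step above, pairing each surface station with a nearby satellite grid point by great-circle distance, might look like the sketch below. The grid nodes and station coordinates are illustrative only.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two (lat, lon) points given in degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# Hypothetical: attach a surface station to its nearest satellite grid node
grid = [(9.0, 38.7), (9.0, 38.8), (9.1, 38.7)]   # illustrative grid nodes
station = (9.03, 38.74)                          # e.g. a station near Addis Ababa
nearest = min(grid, key=lambda p: haversine_km(*station, *p))
```

As a sanity check, one degree of longitude at the equator is about 111.2 km under this Earth radius, which the function reproduces.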
This thesis reports the refractivity, point refractivity gradient, geoclimatic factor K, and k-factor at the selected locations in Ethiopia. The distribution of each parameter is presented on monthly, seasonal, and average-year timescales. The k-factor distribution is approximated by a Gaussian (normal) curve fit, with the integral square error approach used for error checking. The maps and databases are generated using ordinary kriging in ArcGIS. The resulting database can be used by radio-link designers.

Item Cloud Adoption Framework for Small and Medium-Size Enterprises (SMEs) in Ethiopia(AAU, 2018-11-16) Birhane, Woldegebreal; Fekade, Getahun (PhD) Cloud computing (CC) is accepted as one of the emerging technologies that create new opportunities for enterprises with limited resources. The cloud promises to let firms reduce costs and operate more flexibly than with traditional information technology (IT) infrastructure. However, these promising opportunities go hand in hand with several challenges that can pose risks for a firm planning to adopt CC. A number of frameworks and models have been developed by prior studies to support the cloud adoption process, but they were formulated for the context of developed countries; although worldwide research may apply to some extent, findings on cloud adoption are not fully generalizable across corporate environments. In this study, we propose a conceptual cloud adoption model grounded in the Technology-Organization-Environment (TOE) and Diffusion of Innovation (DOI) frameworks. According to this model, six variables influence small and medium enterprises' (SMEs') decision to adopt CC. To test the proposed model, we used a dataset collected from 136 SMEs and analyzed it using multiple linear regression (MLR).
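An MLR fit of this shape, six predictors over 136 observations, can be sketched with ordinary least squares and the overall model F statistic, whose degrees of freedom F(6, 129) match the test reported in the abstract. The predictors and coefficients here are synthetic stand-ins for the TOE/DOI variables, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(42)
n, p = 136, 6                     # 136 SMEs, six candidate determinants
X = rng.normal(size=(n, p))
true_beta = np.array([0.8, 0.5, 0.0, 0.6, 0.0, 0.4])  # two null predictors
y = X @ true_beta + rng.normal(scale=0.5, size=n)

# Ordinary least squares with an intercept column
Xd = np.column_stack([np.ones(n), X])
beta_hat, *_ = np.linalg.lstsq(Xd, y, rcond=None)

# Overall significance: F statistic with (p, n - p - 1) = (6, 129) df
resid = y - Xd @ beta_hat
ss_res = resid @ resid
ss_tot = ((y - y.mean()) ** 2).sum()
f_stat = ((ss_tot - ss_res) / p) / (ss_res / (n - p - 1))
```

In the study itself, the per-coefficient t-tests on such a fit are what separate the four significant determinants from the two non-significant ones.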
The analysis revealed that the model is statistically significant [F(6, 129) = 73.631, p < 0.001]. Four of the six determinants were found to have a significant influence on cloud adoption. Moreover, a Cloud Deployment Selection Model (CDSM) is also proposed to support SMEs in predicting an appropriate cloud model for their business needs; it is generated by refining the MLR output and applying the established principles of the Analytic Hierarchy Process (AHP).

Item A Cloud Adoption Readiness Model for the Case of the National Data Center of Ethiopia with Mixed Approaches(Addis Ababa University, 2021-12) Habtamu, Bekele; Mesfin, Kifle (PhD) Deploying the right innovative technologies in government organizations can provide competitive advantages to citizens. Cloud computing (CC) is an emerging and transformative technology that offers IT services as an online utility. Among its vital features, total cost reduction, increased productivity, and scalability are the top motivators for adopting CC. Prior research shows that before rushing to adopt, organizations must assess their readiness at least once; yet studies show that many, including Ethiopia, adopt technologies without this key readiness study. A readiness study is essential to reduce or avoid the risk of waste and failure. Moreover, most research focuses on individual- or firm-level theoretical models using either a quantitative or a qualitative approach, and work on cloud readiness assessment models for e-government (e-gov) services is insufficient, particularly for developing countries; in Ethiopia, no cloud readiness study on e-gov services exists. The purpose of this study is to design a cloud adoption readiness assessment model (CARAM) that fits the context of the National Data Center of Ethiopia (NDCE), based on an empirical study, with mixed methods, of the factors affecting the adoption of CC for e-gov services.
The mixed approach increases the reliability of the study's results. A total of 78 respondents from seven government organizations answered the questionnaires, and the collected data were analyzed with SPSS. Furthermore, data from interviews with twelve senior IT experts of the Ministry of Innovation and Technology (MInT) and from document analysis were examined using thematic analysis. In addition, the Internet performance and IT infrastructure of the NDCE were observed. The NDCE's readiness to adopt CC for e-gov services is found to be low; poor connectivity, inconsistent power supply, and users' lack of CC knowledge are identified as barriers, although some enabling factors exist. Accordingly, CARAM is proposed to help assess the NDCE's readiness to adopt CC for its e-gov services, as no such model existed before. The proposed model was evaluated twice: first by senior IT experts of the NDCE for refinement, after which it was improved according to their comments, and then for confirmation, where it was highly accepted (90%). Keywords: cloud adoption, e-gov services, mixed approaches, national data center, readiness assessment model