
DETECTION AND CLASSIFICATION OF BONE FRACTURE BASED ON MACHINE LEARNING TECHNIQUES

  • Kosrat Dlshad Ahmed
  • [email protected]
  • +9647506307788
  • Bone Fracture - Thesis_compressed
  • Computers have demonstrated their significance in all areas of human existence, including financial services, e-commerce, communication, education, research, innovation, and healthcare. The use of computers to make medical diagnoses has expanded rapidly in recent years. An efficient approach uses modern technology while requiring fewer man-hours and materials to provide accurate diagnoses. When diagnosing an injury to the human body, doctors often obtain X-ray images of the affected area.

    Numerous advanced technological tools have been created to aid doctors and medical facilities in enhancing patient care. Conventional X-ray scanners, however, are known for producing unclear images of bone structures and therefore carry a risk of inaccurate bone-fracture diagnoses when relied upon by surgeons.

    The approach detailed in this research outlines a thorough procedure for improving the precision of image examination when diagnosing medical conditions from X-ray images. Real-life medical data often suffer from noise, inconsistency, and incompleteness, necessitating robust preprocessing. The first step applies a Gaussian filter to the X-ray images to eliminate noise; adaptive histogram equalization is then used to improve contrast, and edge detection is performed on the contrast-enhanced images using the Canny technique (a minimal preprocessing sketch follows this entry).

    Feature extraction uses the Gray-Level Co-occurrence Matrix (GLCM) to derive textural features, yielding a set of 100 features per image. To classify bone fractures, a range of machine learning algorithms (Decision Tree, Naïve Bayes, k-Nearest Neighbors, Random Forest, and Support Vector Machine) is employed alongside deep learning approaches, namely a Convolutional Neural Network (CNN) and transfer learning via VGG-16, to improve overall performance (a feature-and-classifier sketch also follows this entry).

    Additionally, the outlined approaches were applied to bone fracture detection using both national and international datasets. In each case, 80% of the dataset was used for training and the remaining 20% for testing. The research assessed the effectiveness of the Naïve Bayes, Decision Tree, k-Nearest Neighbors, Random Forest, SVM, and CNN algorithms, producing accuracy scores ranging from 0.64 to 0.96; the highest accuracies in this thesis were obtained by SVM, Random Forest, and CNN.

  • Erbil Technical Engineering College
  • Information Systems Engineering
  • Machine Learning
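
A minimal sketch of the preprocessing pipeline described in this entry (Gaussian filtering, adaptive histogram equalization, and Canny edge detection), assuming OpenCV. The file name, kernel size, CLAHE settings, and Canny thresholds are illustrative assumptions, not values taken from the thesis.

```python
# Illustrative X-ray preprocessing sketch (file name and parameters are assumed).
import cv2

# Load the X-ray as a single-channel grayscale image.
img = cv2.imread("xray.png", cv2.IMREAD_GRAYSCALE)

# 1) Gaussian filter to suppress noise (5x5 kernel is an assumed setting).
denoised = cv2.GaussianBlur(img, (5, 5), 0)

# 2) Adaptive histogram equalization (CLAHE) to improve local contrast.
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
enhanced = clahe.apply(denoised)

# 3) Canny edge detection on the contrast-enhanced image
#    (thresholds 50/150 are assumed, not taken from the thesis).
edges = cv2.Canny(enhanced, 50, 150)

cv2.imwrite("xray_edges.png", edges)
```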
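
A hedged sketch of the feature-extraction and classification stage: GLCM texture descriptors feeding an 80/20 train/test evaluation of two of the listed classifiers. The distances, angles, properties, and model settings are assumptions for illustration and do not reproduce the thesis's exact 100-feature set.

```python
# GLCM texture features plus classical classifiers (sketch; parameters assumed).
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

def glcm_features(gray_img):
    """Texture descriptors from a grey-level co-occurrence matrix (uint8 input)."""
    glcm = graycomatrix(gray_img,
                        distances=[1, 2, 4],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "dissimilarity", "homogeneity", "energy", "correlation", "ASM"]
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

def train_and_evaluate(images, labels):
    """images: preprocessed grayscale arrays; labels: fracture / no-fracture classes."""
    X = np.array([glcm_features(img) for img in images])
    # 80% training / 20% testing, as described in the abstract.
    X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.2,
                                              random_state=42, stratify=labels)
    for name, model in [("RandomForest", RandomForestClassifier(n_estimators=200)),
                        ("SVM", SVC(kernel="rbf"))]:
        model.fit(X_tr, y_tr)
        print(name, accuracy_score(y_te, model.predict(X_te)))
```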

Metaheuristic Optimization Algorithms in Applied Science and Engineering Applications

  • Azad Abdullah Ameen
  • [email protected]
  • +9647721068874
  • Metaheuristic Optimization Algorithms in Applied Science and Engineering Applications
  • The objective of this study is to delineate the challenges associated with addressing complex optimization problems, with a specific focus on metaheuristic algorithms. A comprehensive investigation was undertaken into the principles and categories of these algorithms to gain a deeper understanding of the issues they present and to develop effective strategies for overcoming them. Metaheuristic algorithms are known for their effectiveness in solving such problems; however, they often get stuck in local optima, struggle to maintain a balance between exploration and exploitation, and exhibit poor searchability and exploitation performance.

    To address these challenges, this research introduces three algorithms: a modified version of child drawing development optimization (MCDDO), a hybrid algorithm combining child drawing development optimization with harmony search (CDDO-HS), and a novel metaheuristic, the social psychology interaction behavior algorithm (SPIBA), inspired by human social psychology interactions.

    The performance of these algorithms is evaluated on various benchmark test functions, including the classical and CEC-C06 2019 benchmark functions. Statistical methods, such as ranking and the Wilcoxon rank-sum test, are used to compare their results with those of the original algorithms, CDDO and HS, and other popular algorithms (a minimal example of this test follows this entry).

    In the first stage, two approaches were proposed, MCDDO and CDDO-HS. The main objective of both techniques is to overcome the issues faced by CDDO, a human-based metaheuristic that can become trapped in local optima, perform suboptimally in the exploration phase, and stagnate around near-optimal solutions.

    The first proposed method, MCDDO, incorporates four key mechanisms: iterative pattern memory (PM) updating during the exploitation phase, in which new experiences are compared with the child's current drawings; a change to the primary rule employed during the exploitation phase; parameter tuning to strike a balance between the exploration and exploitation phases; and preservation of the best solution obtained in each iteration, with new solutions compared against it during the exploration phase (a generic elitism sketch follows this entry). Following the evaluation, the statistical findings indicate a consistent superiority of the proposed approach over the standard algorithms in both average and p-value results. Specifically, out of nineteen classical test functions and ten CEC-2019 benchmark test functions, the proposed approach performed better in thirteen and nine instances, respectively. These results were then compared with those obtained from the JAYA, SCA, ChOA, DA, GPSO, and BOA algorithms; the comparative analysis confirmed that the proposed approach outperformed all other metaheuristic algorithms in four of the ten CEC-2019 benchmark test functions.

    The second proposed method, CDDO-HS, is a hybridization of CDDO and HS that integrates two crucial mechanisms. First, it relocates the PM to the algorithm's core, updating it in each iteration using the HS algorithm. Second, it sets the PM size to 80% of the overall population, aiming for optimal exploration. The statistical results reveal that the hybrid approach consistently outperforms the standard algorithms in both average and p-value outcomes. Compared with CDDO, it achieves better results in eleven of nineteen classical test functions and in all functions from the CEC-2019 benchmark; compared with HS, it excels in sixteen of nineteen classical test functions and seven of ten CEC-2019 benchmark test functions. These results were then compared against the ChOA, BOA, FOX, GWO-WOA, WOA-BAT, and DCSO algorithms, and the analysis confirmed that the proposed method outperforms all other metaheuristic algorithms in six of the ten CEC-2019 test functions.

    In the subsequent phase, SPIBA, an innovative metaheuristic optimization algorithm inspired by social psychology interaction behavior and social interaction, processes involving the stimulus or response of two or more individuals, was developed. These fundamental ideas are incorporated into SPIBA's core, which operates as a single-objective, population-based algorithm. SPIBA's performance was compared with that of the ChOA, BOA, FOX, GWO-WOA, WOA-BAT, and DCSO algorithms, and exploration and convergence measures were used to assess its success. The analysis indicated that the proposed approach outperformed all other metaheuristic algorithms in six of ten CEC-2019 benchmark test functions.

    Additionally, SPIBA was applied to real-world engineering and applied science challenges, specifically pressure vessel design and the analysis of the pathological IgG fraction in the nervous system. In the pressure vessel design problem, compared with eight other algorithms (WOA, GWO, FDO, CFDO, WOAGWO, KMGWO, RFSO, and MFDO), SPIBA emerged as the top-performing algorithm, with an average solution quality of 6.01E-05 and the lowest standard deviation of 2.00E-04, securing the leading position. In the "Nervous System's Pathological IgG Fraction" application problem, a comparison between SPIBA and Leo revealed a significant improvement in the proposed algorithm's performance.

  • Erbil Technical Engineering College
  • Information Systems Engineering
  • AI-Optimization
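
A minimal example of the Wilcoxon rank-sum comparison mentioned in this entry, applied to the best fitness values two algorithms return over repeated runs on a single benchmark function. The numbers are placeholders, not results from the thesis.

```python
# Wilcoxon rank-sum test comparing two algorithms on one benchmark function.
from scipy.stats import ranksums

mcddo_runs = [3.2e-4, 1.1e-4, 2.8e-4, 9.7e-5, 2.0e-4]   # hypothetical best-fitness values
cddo_runs  = [7.5e-3, 4.1e-3, 6.2e-3, 5.9e-3, 8.0e-3]   # hypothetical best-fitness values

stat, p_value = ranksums(mcddo_runs, cddo_runs)
# A p-value below 0.05 indicates a statistically significant difference
# between the two algorithms on this function.
print(f"statistic={stat:.3f}, p-value={p_value:.4f}")
```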
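
The best-solution preservation mechanism described for MCDDO is a form of elitism. The generic sketch below illustrates only that idea, with a placeholder perturbation step and the classical sphere function; it is not the thesis's MCDDO update rule.

```python
# Generic, hypothetical elitism sketch: keep the best-so-far solution and compare
# every newly generated candidate against it. The perturbation step is a placeholder.
import numpy as np

def optimize(objective, dim=10, pop_size=30, iterations=500, lower=-100.0, upper=100.0):
    rng = np.random.default_rng(0)
    population = rng.uniform(lower, upper, size=(pop_size, dim))
    best = min(population, key=objective).copy()
    best_fitness = objective(best)

    for _ in range(iterations):
        for i in range(pop_size):
            # Placeholder exploration step: random perturbation around the current agent.
            candidate = np.clip(population[i] + rng.normal(0, 1, dim), lower, upper)
            if objective(candidate) < objective(population[i]):
                population[i] = candidate
            # Elitism: compare the (possibly updated) agent with the best solution so far.
            f = objective(population[i])
            if f < best_fitness:
                best, best_fitness = population[i].copy(), f
    return best, best_fitness

# Example: the sphere function, a classical benchmark.
best, fit = optimize(lambda x: float(np.sum(x ** 2)))
```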

Lagrange Elementary Optimization Algorithm Based on New Crossover Operator

  • Aso Mohammed Aladdin
  • [email protected]
  • +9647725426616
  • phd-dissertation-last-draft-Aso
  • Evolutionary methods solve optimization problems; however, their effectiveness and scalability can be challenged as problem complexity increases. Population-based evolutionary metaheuristic algorithms rely heavily on operators that determine their overall performance; these operators drive the exploration and exploitation that are crucial for effective search and optimization. This research introduces a crossover operator, the Lagrangian Problem Crossover (LPX), to boost the performance of evolutionary algorithms on new optimization problems. Additionally, it presents Lagrange Elementary Optimization (LEO), a single-objective algorithm in which LPX plays a significant role.

    The crossover operator in population-based algorithms is crucial for producing suitable solutions during optimization; an efficient operator saves time, minimizes errors, and reduces costs in engineering applications. The initial phase of the study presents an overview of the current crossover methods used in engineering operations and problem representation. It then presents LPX, a novel hybrid technique inspired by the principles of the Lagrangian Dual Function (LDF). Experimental evaluations compare LPX with other real-coded crossover standards such as Simulated Binary Crossover (SBX), Blended Crossover (BX), and Qubit-Crossover (Qubit-X) (a reference SBX sketch follows this entry). The results indicate that LPX generally outperforms the other methods and shows comparable performance in the remaining cases. Specifically, on TF7, LPX demonstrates superior performance and shorter computation time across all three random values compared with Mean (α=0.2) at 0.0048, Standard Deviation (α=0.2) at 0.0031, and computation time (α=0.2) at 143.005 units. Statistical analysis validates the significance and reliability of LPX compared with the other crossover standards.

    In the second phase of the research, a novel evolutionary method named Leo is introduced. Leo is inspired by the vaccination process, which utilizes the human blood albumin quotient. It uses a self-adaptive approach, evolving intelligent agents through gene crossover based on fitness function values. The algorithm's accuracy and precision are extensively validated through rigorous testing on diverse benchmark functions, including both traditional and CEC-C06 2019 benchmarks. Leo's performance is benchmarked against well-known algorithms such as the Dragonfly Algorithm, Genetic Algorithm, Particle Swarm Optimization, and others across multiple functions, and a comprehensive comparison evaluates its effectiveness and efficiency in solving optimization problems. In optimizing the multimodal test functions (TF8-TF13), particularly TF11, the proposed approach outperformed the other algorithms, with an average TF11 value of 2.7393E-08. Notably, across the composite test functions (TF14-TF19), the proposed method exhibited consistently high performance compared with the base algorithms. Statistical analysis supports the research conclusions, real-world applications of Leo are showcased, and its stability is confirmed using standard metrics for exploration and exploitation.

  • Erbil Technical Engineering College
  • Information Systems Engineering
  • Information Systems Engineering
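
LPX itself is the thesis's contribution and its formula is not reproduced here. As a reference point, the sketch below implements Simulated Binary Crossover (SBX), one of the real-coded baselines LPX is compared against; the distribution index is an assumed setting.

```python
# Simulated Binary Crossover (SBX), a baseline real-coded crossover operator.
# eta is the distribution index; larger values keep offspring closer to their parents.
import numpy as np

def sbx_crossover(parent1, parent2, eta=2.0, rng=None):
    rng = rng or np.random.default_rng()
    u = rng.random(parent1.shape)
    beta = np.where(u <= 0.5,
                    (2.0 * u) ** (1.0 / (eta + 1.0)),
                    (1.0 / (2.0 * (1.0 - u))) ** (1.0 / (eta + 1.0)))
    child1 = 0.5 * ((1 + beta) * parent1 + (1 - beta) * parent2)
    child2 = 0.5 * ((1 - beta) * parent1 + (1 + beta) * parent2)
    return child1, child2

p1 = np.array([1.0, 2.0, 3.0])
p2 = np.array([4.0, 5.0, 6.0])
c1, c2 = sbx_crossover(p1, p2)
```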

Tracking Mobile Devices Using Crowdsourcing Technique in IoT Infrastructures for Outdoors and Indoors Positioning

  • Safar Maghdid Asaad
  • [email protected]
  • +9647501206882
  • In the Internet of Things (IoT) era, tracking humans’ daily life activities has undergone a remarkable transformation, especially in terms of indoor positioning. Wireless positioning technologies, including Wi-Fi and LoRa, have been employed as alternatives to Global Navigation Satellite System (GNSS) technologies for indoor tracking. Wi-Fi and LoRa positioning frequently employs the received signal strength indicator (RSSI) of the Wi-Fi and LoRa signals. However, RSSI-based approaches suffer from multipath, Non-Line-Of-Sight (NLOS) propagation, and fluctuating RSSI measurements from Wi-Fi and LoRa chipsets, and these issues directly affect the accuracy and reliability of the positioning techniques. In addition, fingerprinting is one of the most widely known positioning methods for RSSI-based techniques, yet in the absence of a stable matching algorithm, the fingerprinting-based method faces an additional issue.

    There are a number of matching algorithms, for example, weighted k-nearest neighbour (WkNN), k-means clustering, decision trees, and deep learning algorithms such as Long Short-Term Memory (LSTM) (a minimal WkNN sketch follows this entry). Two algorithms are proposed in this study to provide adequate positioning services.

    The first algorithm is a novel integrated matching algorithm for the Wi-Fi fingerprint-positioning technique, known as Norm_MSATE_LSTM, which mitigates the drawbacks of the RSSI-based fingerprinting method. It is based on Wi-Fi fingerprinting and the proposed augmentation technique, with LSTM as the matching technique. To address the problem of the large number of RPs/classes in the LSTM, the augmentation process is first conducted to boost the RSSI data records using the Mean Standard deviation Augmentation TEchnique (MSATE). The RSSI data are then normalised (Norm), and the long short-term memory (LSTM) method is used to estimate accurate positions. Finally, the recommended matching algorithm is compared with stand-alone matching algorithms, including weighted k-nearest neighbours (WkNN) and LSTM.

    The second algorithm is a hybrid positioning technique using existing Wi-Fi and LoRa technologies, known as Wi-Lo, which aims to improve the outcomes of the first proposed algorithm. It combines Wi-Fi and LoRa technologies and uses MM and trilateration techniques (a trilateration sketch follows this entry) to provide seamless positioning from outdoors to indoors via building identification. The approach is divided into two phases: in the first, the LoRa RSSI is used to identify buildings; the second, known as Wi-Lo, combines LoRa and Wi-Fi technologies to improve the positioning-accuracy outcomes of Norm_MSATE_LSTM.

    Experiments and simulated investigations indicate that the proposed matching algorithm, Norm_MSATE_LSTM, can increase the positioning accuracy of the LSTM by 45.83% when augmentation alone is applied and by 72% when augmentation is combined with normalisation. In addition, the proposed Wi-Lo can improve the first algorithm’s accuracy by 39.74% in terms of positioning.

  • Erbil Technical Engineering College
  • Information Systems Engineering
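
A minimal sketch of the WkNN matching baseline mentioned in this entry: the position estimate is the distance-weighted average of the k reference points closest to the query in RSSI space. The radio map, RSSI values, and reference-point coordinates are invented for illustration.

```python
# Weighted k-nearest neighbours (WkNN) fingerprint matching (illustrative data).
import numpy as np

def wknn_position(rssi_query, fingerprints, positions, k=3, eps=1e-6):
    # Euclidean distance in RSSI space between the query and every reference point.
    d = np.linalg.norm(fingerprints - rssi_query, axis=1)
    nearest = np.argsort(d)[:k]
    weights = 1.0 / (d[nearest] + eps)           # closer reference points weigh more
    weights /= weights.sum()
    return weights @ positions[nearest]          # weighted average of (x, y) coordinates

fingerprints = np.array([[-45, -60, -72],        # RSSI from 3 access points at RP1
                         [-50, -55, -70],        # RP2
                         [-62, -48, -66],        # RP3
                         [-70, -52, -58]])       # RP4
positions = np.array([[0.0, 0.0], [2.0, 0.0], [2.0, 3.0], [0.0, 3.0]])  # RP coordinates (m)

estimate = wknn_position(np.array([-52, -54, -69]), fingerprints, positions, k=3)
```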
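
A sketch of the trilateration step mentioned for Wi-Lo, posed as a linear least-squares problem by subtracting one range equation from the others. Anchor positions and RSSI-derived ranges are placeholders, not measurements from the thesis.

```python
# Linear least-squares trilateration: anchors (e.g. gateways / access points)
# and distances below are placeholder values for illustration.
import numpy as np

def trilaterate(anchors, distances):
    """Estimate (x, y) from anchor positions and measured ranges."""
    x0, y0 = anchors[0]
    d0 = distances[0]
    # Subtract the first circle equation from the rest to obtain a linear system A p = b.
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (d0 ** 2 - distances[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1) - (x0 ** 2 + y0 ** 2))
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position

anchors = np.array([[0.0, 0.0], [30.0, 0.0], [0.0, 40.0]])   # known transmitter positions (m)
distances = np.array([25.0, 18.0, 33.0])                     # ranges derived from RSSI (m)
print(trilaterate(anchors, distances))
```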