4,142 results for "Metaheuristics"
A Modified Emperor Penguin Algorithm for Solving Stagnation in Multi-Model Functions
Metaheuristic algorithms have gained attention in recent years for their ability to solve complex problems that cannot be solved using classical mathematical techniques. This paper proposes an improvement to the Emperor Penguin Optimizer algorithm, a population-based metaheuristic. The original algorithm often gets stuck in local optima for multi-modal functions. To address this issue, this paper presents a modification in the relocating procedures that allows the algorithm to utilize information gained from the previous positions of each penguin. To demonstrate the effectiveness of the modified algorithm, 20 test optimization functions from well-known benchmarks were selected. The results show that the proposed algorithm is highly efficient, especially in multi-modal functions.
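The relocation idea described above, letting each agent's next move use information from the positions it has already visited, can be sketched as a toy population-based optimizer. The update rule, step-size schedule, and coefficients below are illustrative assumptions, not the paper's actual Emperor Penguin Optimizer modification:

```python
import random

def sphere(x):
    """Benchmark objective: f(x) = sum(x_i^2), minimum 0 at the origin."""
    return sum(v * v for v in x)

def optimize(f, dim=2, pop=20, iters=200, seed=0):
    """Toy population-based search where each agent's move is biased by the
    mean of its previously visited positions (a memory term), loosely in the
    spirit of the modification described above. Hypothetical update rule."""
    rng = random.Random(seed)
    agents = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop)]
    history = [[list(a)] for a in agents]          # per-agent visited positions
    best = min(agents, key=f)
    for t in range(iters):
        w = 1.0 - t / iters                        # shrinking step size
        for i, a in enumerate(agents):
            mean_prev = [sum(h[d] for h in history[i]) / len(history[i])
                         for d in range(dim)]
            cand = [a[d]
                    + w * rng.random() * (best[d] - a[d])             # pull toward best
                    + 0.1 * w * rng.random() * (a[d] - mean_prev[d])  # push off visited mean
                    for d in range(dim)]
            if f(cand) < f(a):                     # greedy acceptance
                agents[i] = cand
            history[i].append(list(agents[i]))
        best = min(agents + [best], key=f)
    return best

best = optimize(sphere)
print(sphere(best))   # a small value near 0
```

The memory term diversifies moves away from regions an agent has already sampled, which is one plausible way to escape local optima in multi-modal landscapes.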
Handbook of AI-based metaheuristics
"At the heart of the optimization domain are mathematical modelling of the problem and the solution methodologies. In recent times, the problems are becoming larger, with growing complexity. Such problems are becoming cumbersome when handled by traditional optimization methods. This has motivated researchers to resort to Artificial Intelligence (AI) based nature-inspired solution methodologies or algorithms. The Handbook of AI-based Metaheuristics provides a wide-ranging reference to the theoretical and mathematical formulations of metaheuristics, including bio-inspired, swarm-based, socio-cultural and physics-based methods or algorithms; their testing and validation, along with detailed illustrative solutions and applications, as well as newly devised metaheuristic algorithms. The book will be a valuable reference to researchers from industry and academia, as well as Masters and PhD students around the globe working in the metaheuristics and applications domain" -- Provided by publisher.
Initialisation Approaches for Population-Based Metaheuristic Algorithms: A Comprehensive Review
A situation where the set of initial solutions lies near the position of the true optimality (most favourable or desirable solution) by chance can increase the probability of finding the true optimality and significantly reduce the search efforts. In optimisation problems, the location of the global optimum solution is unknown a priori, and initialisation is a stochastic process. In addition, the population size is equally important; for problems with high dimensions, a small population size may lie sparsely in unpromising regions and may return suboptimal solutions with bias. Moreover, the different distributions used as position vectors for the initial population may have different sampling emphasis and hence different degrees of diversity. The initialisation control parameters of population-based metaheuristic algorithms play a significant role in improving the performance of the algorithms. Researchers have identified this significance and have put much effort into finding various distribution schemes that will enhance the diversity of the initial populations of the algorithms, and into obtaining the correct balance of population size and number of iterations that will guarantee optimal solutions for a given problem set. Despite the affirmation of the role initialisation plays, to our knowledge few studies or surveys have been conducted on this subject area. Therefore, this paper presents a comprehensive survey of different initialisation schemes to improve the quality of solutions obtained by most metaheuristic optimisers for a given problem set. Popular schemes used to improve the diversity of the population can be categorised into random numbers, quasirandom sequences, chaos theory, probability distributions, hybrids of other heuristic or metaheuristic algorithms, Lévy, and others. We discuss the different levels of success of these schemes and identify their limitations. Similarly, we identify gaps and present useful insights for future research directions. Finally, we present a comparison of the effect of population size, the maximum number of iterations, and ten (10) different initialisation methods on the performance of three (3) population-based metaheuristic optimisers: bat algorithm (BA), Grey Wolf Optimizer (GWO), and butterfly optimization algorithm (BOA).
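Three of the scheme categories surveyed above (random numbers, chaos theory, quasirandom sequences) can each be illustrated with a minimal initialiser. These are generic textbook versions, not the specific implementations compared in the paper:

```python
import random

def uniform_init(n, dim, lo=-1.0, hi=1.0, seed=0):
    """Plain pseudo-random initial population ("random numbers" category)."""
    rng = random.Random(seed)
    return [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]

def chaotic_init(n, dim, lo=-1.0, hi=1.0, x0=0.7):
    """Logistic-map chaotic sequence (r = 4) mapped onto [lo, hi]."""
    pop, x = [], x0
    for _ in range(n):
        ind = []
        for _ in range(dim):
            x = 4.0 * x * (1.0 - x)          # logistic map stays in [0, 1]
            ind.append(lo + (hi - lo) * x)
        pop.append(ind)
    return pop

def vdc_init(n, dim, lo=-1.0, hi=1.0):
    """Quasirandom low-discrepancy points from the van der Corput sequence;
    each dimension uses a different prime base (Halton-style)."""
    primes = [2, 3, 5, 7, 11, 13][:dim]
    def vdc(k, base):
        v, denom = 0.0, 1.0
        while k:
            k, rem = divmod(k, base)
            denom *= base
            v += rem / denom
        return v
    return [[lo + (hi - lo) * vdc(i + 1, b) for b in primes] for i in range(n)]

print(vdc_init(4, 1))   # [[0.0], [-0.5], [0.5], [-0.75]]
```

The low-discrepancy points cover the domain more evenly than pseudo-random draws of the same size, which is the diversity argument made by the quasirandom schemes in the survey.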
A conceptual comparison of several metaheuristic algorithms on continuous optimisation problems
The field of continuous optimisation has witnessed an explosion of so-called new or novel metaheuristic algorithms. Though not all of these algorithms are as efficient as proclaimed by their inventors, a few of them have proved to be very efficient and thus have become popular tools for solving complex optimisation problems. Therefore, there is a need for a systematic analysis approach to fairly evaluate and compare the results of some of these optimisation algorithms. In this paper, a set of well-known mathematical benchmark functions is compiled to provide an easily accessible collection of standard benchmark test problems for continuous global optimisation. This set of test problems is used to investigate the computational capabilities and the microscopic behaviour of twelve different metaheuristic algorithms. The required number of function evaluations for reaching the best solution and the run-time complexity of the algorithms are compared. Furthermore, statistical tests are conducted to validate the concluding remarks.
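One of the comparison metrics above, the required number of function evaluations, is easy to instrument with a counting wrapper around the objective. A minimal harness, here driving a plain random search rather than any of the twelve algorithms studied:

```python
import random

class CountedFunction:
    """Wraps an objective to count evaluations, the budget metric used when
    comparing metaheuristics (a generic harness, not the paper's code)."""
    def __init__(self, f):
        self.f, self.evals = f, 0
    def __call__(self, x):
        self.evals += 1
        return self.f(x)

def random_search(f, dim, budget, seed=0):
    """Baseline optimizer: sample uniformly until the evaluation budget is spent."""
    rng = random.Random(seed)
    best_x = [rng.uniform(-5, 5) for _ in range(dim)]
    best_y = f(best_x)
    while f.evals < budget:
        x = [rng.uniform(-5, 5) for _ in range(dim)]
        y = f(x)
        if y < best_y:
            best_x, best_y = x, y
    return best_x, best_y

sphere = CountedFunction(lambda x: sum(v * v for v in x))
x, y = random_search(sphere, dim=2, budget=1000)
print(sphere.evals, y)   # exactly 1000 evaluations used
```

Because every algorithm under test draws from the same counted budget, comparisons of best-so-far values become fair regardless of how each algorithm structures its iterations.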
Sand Cat swarm optimization: a nature-inspired algorithm to solve global optimization problems
This study proposes a new metaheuristic algorithm called sand cat swarm optimization (SCSO) which mimics the behaviour of the sand cat as it tries to survive in nature. These cats are able to detect low frequencies below 2 kHz and also have an incredible ability to dig for prey. The proposed algorithm, inspired by these two features, consists of two main phases (search and attack). The algorithm controls the transitions between the exploration and exploitation phases in a balanced manner and performs well in finding good solutions with fewer parameters and operations. This is achieved by determining the direction and speed of the appropriate movements with a defined adaptive strategy. The SCSO algorithm is tested on 20 well-known test functions along with 10 modern complex test functions from the CEC2019 benchmark, and the obtained results are compared with famous metaheuristic algorithms. According to the results, SCSO found the best solution in 63.3% of the test functions. Moreover, the SCSO algorithm is applied to seven challenging engineering design problems: welded beam design, tension/compression spring design, pressure vessel design, piston lever, speed reducer design, three-bar truss design, and cantilever beam design. The obtained results show that SCSO performs well on convergence rate and in locating all or most of the local/global optima, and outperforms the other compared methods.
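The balanced transition between search (exploration) and attack (exploitation) can be sketched as a phase schedule: a sensitivity value decays linearly over the run, and a random control value derived from it selects the phase each iteration. The names and ranges below follow common descriptions of SCSO but are assumptions, not the full algorithm:

```python
import random

def scso_phase_schedule(max_iter, s_m=2.0, seed=0):
    """Schematic of SCSO's phase switching: the general sensitivity r_g
    decreases linearly from s_m to 0; a control value R drawn from
    [-r_g, r_g] triggers attack (exploitation) when |R| <= 1, otherwise
    search (exploration). Illustrative only, position updates omitted."""
    rng = random.Random(seed)
    phases = []
    for t in range(max_iter):
        r_g = s_m - s_m * t / max_iter          # sensitivity: s_m -> 0
        R = 2.0 * r_g * rng.random() - r_g      # R uniform in [-r_g, r_g]
        phases.append("attack" if abs(R) <= 1.0 else "search")
    return phases

phases = scso_phase_schedule(1000)
early = phases[:200].count("search")
late = phases[-200:].count("search")
print(early, late)   # exploration dominates early; late iterations always attack
```

Once r_g drops below 1, |R| can no longer exceed 1, so the final portion of the run is pure exploitation, which is the balancing behaviour the abstract describes.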
Quantum-inspired metaheuristic algorithms: comprehensive survey and classification
Metaheuristic algorithms are widely known as efficient solutions for optimization problems. These algorithms supply powerful instruments with significant applications in engineering, industry, and science. Quantum-inspired metaheuristic algorithms were developed by integrating Quantum Computing (QC) concepts into metaheuristic algorithms. QC-inspired metaheuristic algorithms solve combinatorial and numerical optimization problems and achieve higher-performing results than conventional metaheuristic algorithms. QC is used more than any other strategy for accelerating convergence and enhancing exploration and exploitation, significantly influencing the performance of metaheuristic algorithms. QC is a new field of research that combines elements from mathematics, physics, and computing, and it has attracted increasing attention among scientists, technologists, and industrialists, providing a research platform for scientific, technical, and industrial areas during the current decade. In QC, metaheuristic algorithms can be improved by the parallel processing feature, which helps to find the best solutions for optimization problems. This paper presents a review of the different uses of QC in metaheuristics, proposes a classification of Quantum-inspired metaheuristic algorithms for optimization problems, and discusses their applications in science and engineering.
Modified Individual Experience Mayfly Algorithm
An algorithm that modifies the individual experience of mayflies in the mayfly algorithm (MA) to enhance its performance is proposed. The proposed algorithm, called the Modified Individual Experience Mayfly Algorithm (MIE-MA), calculates the experience of a mayfly as the average of the positions the mayfly has visited instead of just its best position. A chaotic decreasing gravity coefficient is also employed to enhance the balance between the exploitation and exploration of the algorithm. The proposed algorithm was compared to the original MA and two recent variants, PGB-IMA and ModMA, on eight benchmark functions. The metrics used for comparison were Mean Absolute Error, Standard Deviation, and convergence rate. The results validate the superior performance of the MIE-MA over the other three algorithms; the MIE-MA yields better optimal values with fewer iterations.
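The core modification, using the average of all visited positions rather than only the best one as a mayfly's experience, reduces to a simple averaging step. A minimal illustration (the surrounding MA velocity update is omitted):

```python
def modified_experience(visited):
    """Per the description above, an agent's 'experience' is the mean of all
    positions it has visited rather than only its single best position.
    `visited` is a list of position vectors (oldest first)."""
    dim = len(visited[0])
    return [sum(p[d] for p in visited) / len(visited) for d in range(dim)]

# A mayfly's trail of three visited 2-D positions:
trail = [[0.0, 4.0], [2.0, 2.0], [4.0, 0.0]]
print(modified_experience(trail))   # [2.0, 2.0]
```

Averaging smooths out single lucky samples, so the experience term pulls the mayfly toward the centre of the region it has actually explored instead of one extreme point.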
An Efficient Metaheuristic-Based Clustering with Routing Protocol for Underwater Wireless Sensor Networks
In recent years, the underwater wireless sensor network (UWSN) has received significant interest among research communities for several applications, such as disaster management, water quality prediction, environmental observation, underwater navigation, etc. The UWSN comprises a massive number of sensors placed in rivers and oceans for observing the underwater environment. However, underwater sensors have limited energy and it is tedious to recharge or replace their batteries, making energy efficiency a major challenge. Clustering and multi-hop routing protocols are considered energy-efficient solutions for UWSN. However, cluster-based routing protocols designed for traditional wireless networks are not feasible for UWSN owing to the underwater current, low bandwidth, high water pressure, propagation delay, and error probability. To resolve these issues and achieve energy efficiency in UWSN, this study focuses on designing a metaheuristics-based clustering with routing protocol for UWSN, named MCR-UWSN. The goal of the MCR-UWSN technique is to elect an efficient set of cluster heads (CHs) and routes to the destination. The MCR-UWSN technique involves the design of a cultural emperor penguin optimizer-based clustering (CEPOC) technique to construct clusters. Besides, a multi-hop routing technique based on the grasshopper optimization algorithm (MHR-GOA) is derived using multiple input parameters. The performance of the MCR-UWSN technique was validated, and the results were inspected in terms of different measures. The experimental results highlighted the enhanced performance of the MCR-UWSN technique over recent state-of-the-art techniques.
Development and evaluation of hybrid Harris Hawks Optimization algorithms for advanced engineering applications
Harris Hawk Optimizer (HHO) is a recent revolutionary algorithm developed in the literature that simulates the cooperative hunting behaviour of Parabuteo unicinctus. Despite its simplicity, the standard HHO often suffers from slow convergence, limited exploitation capacity, and performance degradation on high-dimensional and constrained problems. This study aims to develop seven novel HHO variants, HHO-ADAP, HHO-CHAOS, HHO-Elite, HHO-GA, HHO-Inertia, HHO-PSO, and HHO-ULTRA, that integrate adaptive mechanisms, chaotic dynamics, elite preservation, and cross-algorithmic hybridization to improve the balance between exploration and exploitation. The proposed methods were rigorously tested on the CEC 2014 benchmark suite for dimensions 10, 30, 50, and 100, as well as ten constrained engineering design problems, and the results were compared against the state-of-the-art optimizers CMA-ES, L-SHADE, LSHADE-cnEpSin, SPS-L-SHADE-EIG, EBOwithCMAR, WMA, and OWMA. Quantitative results demonstrate that the hybrids consistently outperform the baseline HHO and classical optimizers. HHO-PSO and HHO-Elite achieved up to 35% faster convergence and reached solution values as small as 10⁻²¹⁶, compared with much weaker values (10⁻⁴²–10⁻⁴⁷) for classical baselines. On multimodal and fixed-dimension functions, HHO-Elite, HHO-CHAOS, and HHO-ADAP effectively delayed stagnation and preserved diversity, avoiding premature convergence. For engineering problems, the hybrids produced near-optimal designs: pressure vessel (≈5885.2), spring (≈0.01267), welded beam (≈1.7257), gear train (= 0), and Belleville spring (≈1.9795). Variance was as low as 10⁻¹⁶ (multiple disk clutch, gear train), while average runtimes remained below 0.01 s for most hybrids, markedly faster than champion algorithms such as SPS-L-SHADE-EIG (> 1.4 s) and WMA (> 1.8 s). The results highlight that hybridization significantly enhances HHO's robustness, solution accuracy, and adaptability for solving large-scale, nonlinear, and constrained optimization problems in engineering and scientific domains.
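Of the mechanisms listed above, elite preservation is the simplest to sketch: the top fraction of the population survives unchanged while the rest are regenerated around randomly chosen elites. The operator and its parameters below are illustrative assumptions in the spirit of the HHO-Elite variant, not the paper's implementation:

```python
import random

def sphere(x):
    """Benchmark objective: f(x) = sum(x_i^2), minimized at the origin."""
    return sum(v * v for v in x)

def elite_preserving_step(population, f, elite_frac=0.2, seed=0):
    """One generation of a generic elite-preservation operator: keep the best
    elite_frac of the population unchanged, replace the rest with Gaussian
    perturbations of random elites. Hypothetical, standalone illustration."""
    rng = random.Random(seed)
    ranked = sorted(population, key=f)
    n_elite = max(1, int(elite_frac * len(ranked)))
    elites = ranked[:n_elite]
    children = []
    for _ in range(len(population) - n_elite):
        e = rng.choice(elites)
        children.append([v + rng.gauss(0.0, 0.1) for v in e])
    return elites + children

rng = random.Random(1)
pop = [[rng.uniform(-3, 3) for _ in range(2)] for _ in range(10)]
new_pop = elite_preserving_step(pop, sphere)
print(min(map(sphere, new_pop)) <= min(map(sphere, pop)))   # True: elites survive intact
```

Because the elites pass through unchanged, the best-so-far value can never regress between generations, which is exactly the stagnation-delaying property credited to elite preservation above.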