Catalogue Search | MBRL
Explore the vast range of titles available.
96,592 results for "Algorithm design"
Probability-boosting technique for combinatorial optimization
In many combinatorial optimization problems we want to find a particular set of k out of n items with certain properties (or constraints), and these properties may involve all k items. In the worst case, a deterministic algorithm must scan n−k items in the set to verify the k items. If we instead pick a set of k items at random and verify the properties, it will take about (n/k)^k verifications, which can be a very large number for some values of k and n. In this article we introduce a significantly faster randomized strategy that, with very high probability, picks such a set of k items by amplifying the probability of obtaining a target set of k items, and we show how this probability-boosting technique can be applied to solve three different combinatorial optimization problems efficiently. In all three applications, algorithms that use the probability-boosting technique outperform their deterministic counterparts.
Journal Article
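The central idea in the abstract above, boosting success probability by independent random repetition, can be sketched briefly. This is not the authors' algorithm; it is a generic illustration in which the item values, the threshold property, and the trial budget are all hypothetical. If one trial succeeds with probability p, then t independent trials all fail with probability (1 − p)^t, which is the amplification the abstract relies on.

```python
import random

def sample_k_subset_with_property(items, k, has_property, max_trials=10_000):
    """Randomized search for a k-subset satisfying has_property.

    Each trial samples k items uniformly at random and verifies them in O(k).
    If one trial succeeds with probability p, the chance that all max_trials
    independent trials fail is (1 - p) ** max_trials, so the success
    probability is amplified simply by repetition.
    """
    for _ in range(max_trials):
        candidate = random.sample(items, k)
        if has_property(candidate):
            return candidate
    return None  # no qualifying subset found within the trial budget

if __name__ == "__main__":
    random.seed(0)
    n, k = 1_000, 5
    items = [random.random() for _ in range(n)]
    # Hypothetical property: every sampled item exceeds a threshold.
    found = sample_k_subset_with_property(items, k, lambda s: all(x > 0.5 for x in s))
    print("found subset:", found)
```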
Predicting resistance mutations using protein design algorithms
2010
Drug resistance resulting from mutations to the target is an unfortunately common phenomenon that limits the lifetime of many of the most successful drugs. In contrast to investigating mutations after clinical exposure, it would be powerful to incorporate strategies early in the development process to predict and overcome the effects of possible resistance mutations. Here we present a unique prospective application of an ensemble-based protein design algorithm, K*, to predict potential resistance mutations in dihydrofolate reductase from Staphylococcus aureus, using positive design to maintain catalytic function and negative design to interfere with binding of a lead inhibitor. Enzyme inhibition assays show that three of the four highly ranked predicted mutants are active yet display lower affinity (18-, 9-, and 13-fold) for the inhibitor. A crystal structure of the top-ranked mutant enzyme validates the predicted conformations of the mutated residues and the structural basis of the loss of potency. The use of protein design algorithms to predict resistance mutations could be incorporated into a lead design strategy against any target that is susceptible to mutational resistance.
Journal Article
Fast and Accurate Approximation Methods for Trigonometric and Arctangent Calculations for Low-Performance Computers
2022
In modern computers, complicated signal processing is highly optimized through the use of compilers and high-speed processing with floating-point units (FPUs); therefore, programmers have little opportunity to care about each individual operation. However, a highly accurate approximation that can be computed in a small number of cycles is useful when embedded in a field-programmable gate array (FPGA) or microcontroller unit (MCU), or when performing many large-scale operations on a graphics processing unit (GPU). It is necessary to devise algorithms that obtain the desired calculated values without an accelerator or compiler assistance. The residual correction method (RCM) developed here can produce simple and accurate approximations of certain nonlinear functions with minimal multiply–add operations. In this study, we designed an algorithm for the approximate computation of trigonometric and inverse trigonometric functions, which are nonlinear elementary functions, to achieve their fast and accurate computation. A fast first approximation and a more accurate second approximation of each function were created using the RCM, with an error of less than 0.001, using multiply–add operations only. This is particularly useful for MCUs, which have low power consumption but limited computational power, and the proposed approximations are candidate algorithms for stabilizing the attitude control of robots and drones, which require real-time processing.
Journal Article
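As a rough illustration of what a multiply–add-only trigonometric approximation with sub-0.001 error looks like, here is a minimal Python sketch. It uses a plain truncated Taylor polynomial in Horner form and is not the residual correction method (RCM) described in the abstract; the interval and error bound are the sketch's own assumptions.

```python
import math

def sin_approx(x: float) -> float:
    """Approximate sin(x) for x in [0, pi/2] using only multiply-add operations.

    Truncated Taylor series evaluated in Horner form:
        sin(x) ~= x * (1 + x^2 * (-1/6 + x^2 * (1/120 - x^2/5040)))
    The first omitted term, x^9/9!, is below 1.7e-4 on [0, pi/2], so the
    absolute error stays well under 0.001 on that interval.
    """
    x2 = x * x
    return x * (1.0 + x2 * (-1.0 / 6.0 + x2 * (1.0 / 120.0 - x2 / 5040.0)))

if __name__ == "__main__":
    # Empirically check the worst-case error on a grid over [0, pi/2].
    worst = max(abs(sin_approx(t * math.pi / 2000) - math.sin(t * math.pi / 2000))
                for t in range(1001))
    print(f"max |error| on [0, pi/2]: {worst:.2e}")
```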
A Distributed Framework for Indoor Product Design Using VR and Intelligent Algorithms
2024
This paper presents an innovative approach to the digital design of indoor home products by integrating virtual reality (VR) technology with intelligent algorithms to enhance design accuracy and efficiency. A model combining the Red Deer Optimization Algorithm (RDA) with a Simple Recurrent Unit (SRU) network is proposed to evaluate and optimize the design process. The study develops a digital design framework that incorporates key evaluation factors, optimizing the SRU network through the RDA to achieve higher precision in design applications. The model’s performance is validated through extensive experiments using metrics such as Mean Absolute Error (MAE), Root Mean Square Error (RMSE), and Mean Absolute Percentage Error (MAPE). Results show that the RDA-SRU model outperforms other methods, with the smallest MAE of 0.133, RMSE of 0.02, and MAPE of 0.015. Additionally, the model achieved an R² value of 0.968 and the shortest evaluation time of 0.028 seconds, demonstrating its superior performance in predicting and evaluating digital design applications for home products. These findings indicate that the integration of VR with intelligent algorithms significantly improves user experience, customizability, and the overall accuracy of digital design processes. This approach offers a robust solution for designers to create more efficient and user-centric home product designs, meeting growing customer demands for immersive and interactive design experiences.
Journal Article
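The evaluation metrics quoted in the abstract (MAE, RMSE, MAPE, R²) are standard and easy to reproduce; a minimal Python sketch with made-up numbers is shown below. The RDA-SRU model itself is not reproduced here.

```python
import math

def regression_metrics(y_true, y_pred):
    """Compute the error metrics quoted in the abstract: MAE, RMSE, MAPE, and R^2."""
    n = len(y_true)
    errors = [yt - yp for yt, yp in zip(y_true, y_pred)]
    mae = sum(abs(e) for e in errors) / n
    rmse = math.sqrt(sum(e * e for e in errors) / n)
    mape = sum(abs(e / yt) for e, yt in zip(errors, y_true)) / n  # assumes y_true has no zeros
    mean_y = sum(y_true) / n
    ss_res = sum(e * e for e in errors)
    ss_tot = sum((yt - mean_y) ** 2 for yt in y_true)
    r2 = 1.0 - ss_res / ss_tot
    return {"MAE": mae, "RMSE": rmse, "MAPE": mape, "R2": r2}

if __name__ == "__main__":
    # Hypothetical targets and predictions, purely for illustration.
    y_true = [3.2, 2.8, 4.1, 3.9, 5.0]
    y_pred = [3.0, 2.9, 4.3, 3.7, 5.2]
    print(regression_metrics(y_true, y_pred))
```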
DECOR: A Method for the Specification and Detection of Code and Design Smells
by
Duchien, L.
,
Gueheneuc, Y.-G.
,
Le Meur, A.-F.
in
Algorithm design and analysis
,
Algorithms
,
Antipatterns
2010
Code and design smells are poor solutions to recurring implementation and design problems. They may hinder the evolution of a system by making it hard for software engineers to carry out changes. We propose three contributions to the research field related to code and design smells: (1) DECOR, a method that embodies and defines all the steps necessary for the specification and detection of code and design smells, (2) DETEX, a detection technique that instantiates this method, and (3) an empirical validation in terms of precision and recall of DETEX. The originality of DETEX stems from the ability for software engineers to specify smells at a high level of abstraction using a consistent vocabulary and domain-specific language for automatically generating detection algorithms. Using DETEX, we specify four well-known design smells: the antipatterns Blob, Functional Decomposition, Spaghetti Code, and Swiss Army Knife, and their 15 underlying code smells, and we automatically generate their detection algorithms. We apply and validate the detection algorithms in terms of precision and recall on XERCES v2.7.0, and discuss the precision of these algorithms on 11 open-source systems.
Journal Article
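As a toy illustration of rule-based smell detection, the sketch below flags "Blob-like" Python classes with unusually many methods or class-level attributes. It is not DECOR or DETEX, and the thresholds are hypothetical; DETEX generates its detection algorithms from high-level smell specifications rather than hard-coded rules.

```python
import ast

# Hypothetical thresholds; a DECOR/DETEX-style tool would derive its rules
# from a domain-specific smell specification instead of fixed numbers.
MAX_METHODS = 20
MAX_ATTRIBUTES = 15

def detect_blob_like_classes(source: str):
    """Crude rule-based check: flag classes with many methods or class-level attributes."""
    smells = []
    tree = ast.parse(source)
    for node in ast.walk(tree):
        if isinstance(node, ast.ClassDef):
            methods = [n for n in node.body
                       if isinstance(n, (ast.FunctionDef, ast.AsyncFunctionDef))]
            attributes = {t.id for stmt in node.body if isinstance(stmt, ast.Assign)
                          for t in stmt.targets if isinstance(t, ast.Name)}
            if len(methods) > MAX_METHODS or len(attributes) > MAX_ATTRIBUTES:
                smells.append((node.name, len(methods), len(attributes)))
    return smells

if __name__ == "__main__":
    # Generate a synthetic class with 25 methods so the rule fires.
    code = "class Tiny:\n" + "".join(f"    def m{i}(self): pass\n" for i in range(25))
    print(detect_blob_like_classes(code))
```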
Do the Ends Justify the Means? Variation in the Distributive and Procedural Fairness of Machine Learning Algorithms
by
Awwad, Yazeed
,
Morse, Lily
,
Teodorescu, Mike Horia M
in
Algorithms
,
Attitudes
,
Business ethics
2022
Recent advances in machine learning methods have created opportunities to eliminate unfairness from algorithmic decision making. Multiple computational techniques (i.e., algorithmic fairness criteria) have arisen out of this work. Yet, urgent questions remain about the perceived fairness of these criteria and in which situations organizations should use them. In this paper, we seek to gain insight into these questions by exploring fairness perceptions of five algorithmic criteria. We focus on two key dimensions of fairness evaluations: distributive fairness and procedural fairness. We shed light on variation in the potential for different algorithmic criteria to facilitate distributive fairness. Subsequently, we discuss procedural fairness and provide a framework for understanding how algorithmic criteria relate to essential aspects of this construct, which helps to identify when a specific criterion is suitable. From a practical standpoint, we encourage organizations to recognize that managing fairness in machine learning systems is complex, and that adopting a blind or one-size-fits-all mentality toward algorithmic criteria will surely damage people’s attitudes and trust in automated technology. Instead, firms should carefully consider the subtle yet significant differences between these technical solutions.
Journal Article
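To make "algorithmic fairness criteria" concrete, the sketch below computes one standard distributive criterion, demographic parity (equal selection rates across groups), on made-up decisions. The paper discusses five criteria and how their fairness is perceived; this example only shows the kind of quantity such criteria measure.

```python
from collections import defaultdict

def selection_rates(decisions, groups):
    """Positive-decision rate per group (a simple distributive-fairness view)."""
    pos = defaultdict(int)
    tot = defaultdict(int)
    for d, g in zip(decisions, groups):
        tot[g] += 1
        pos[g] += int(d)
    return {g: pos[g] / tot[g] for g in tot}

def demographic_parity_gap(decisions, groups):
    """Maximum difference in selection rates across groups; 0 means parity."""
    rates = selection_rates(decisions, groups)
    return max(rates.values()) - min(rates.values())

if __name__ == "__main__":
    decisions = [1, 0, 1, 1, 0, 0, 1, 0]        # hypothetical binary decisions
    groups    = ["A", "A", "A", "A", "B", "B", "B", "B"]
    print(selection_rates(decisions, groups))
    print("demographic parity gap:", demographic_parity_gap(decisions, groups))
```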
Are generative design algorithms truly generative? Comparing two genetic algorithms by the degrees of freedom they offer designers
by
Weil, Benoît
,
Thomas, Maxime
,
Le Masson, Pascal
in
Computer-Aided Engineering (CAD, CAE) and Design
,
Degrees of freedom
2026
This paper explores the generativity of generative design algorithms (GDAs). Generative design (GD) is a process in which designers assign some of their tasks to a computational tool to generate a set of design solutions. While GDAs have been heavily studied, few studies have focused on assessing their generativity; that is, their capacity to help designers create novel proposals that go beyond their initial knowledge. To address this gap, this research compares two GDAs, namely, NSGA-II and MAP-Elites, in terms of their capacity to generate Pareto fronts composed of highly varied design solutions (Pareto fronts with this property are called “splitting Pareto fronts” in this paper). Both algorithms are applied to the industrial design problem of constructing a battery layout for an electric vehicle. A statistical and empirical analysis of the design solutions generated is conducted. The results show that the Pareto fronts generated by MAP-Elites offer designers more degrees of freedom than those generated by NSGA-II do. Thus, the study highlights that the degrees of freedom afforded by GDAs depend on the working principles of the algorithms. From a practical point of view, the results of this study indicate that a GDA can artificially reduce the degrees of freedom of designers. This pressing issue is discussed to help designers make the best use of GDAs.
Journal Article
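One simple way to quantify the "degrees of freedom" a Pareto front leaves designers is to look at how much each design variable varies across the front. The sketch below uses per-variable standard deviation as a rough proxy; it is not the statistical analysis used in the paper, and neither NSGA-II nor MAP-Elites is implemented here. The example fronts are invented.

```python
import statistics

def per_variable_spread(front):
    """Standard deviation of each design variable across a Pareto front.

    A rough proxy for how much freedom the front leaves the designer:
    a variable with near-zero spread is effectively fixed by the algorithm,
    while a large spread leaves that design choice open.
    """
    n_vars = len(front[0])
    return [statistics.pstdev(sol[i] for sol in front) for i in range(n_vars)]

if __name__ == "__main__":
    # Hypothetical 3-variable design solutions from two different runs.
    front_a = [(0.10, 4.0, 7.0), (0.11, 5.0, 2.0), (0.12, 6.0, 9.0)]
    front_b = [(0.10, 4.0, 5.0), (0.90, 5.0, 5.1), (0.50, 6.0, 4.9)]
    print("front A spread per variable:", per_variable_spread(front_a))
    print("front B spread per variable:", per_variable_spread(front_b))
```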
A novel artificial hummingbird algorithm improved by natural survivor method
2024
The artificial hummingbird algorithm (AHA) has been applied in various fields of science and has provided promising solutions. Although the algorithm has demonstrated merits in the optimization area, it suffers from local optimum stagnation and poor exploration of the search space. To overcome these drawbacks, this study redesigns the update mechanism of the original AHA with the natural survivor method (NSM) and proposes a novel metaheuristic called NSM-AHA. The strength of the developed algorithm is that it performs population management according not only to the fitness value but also to the NSM score. The adopted strategy gives NSM-AHA powerful local optimum avoidance and unique exploration ability. The optimization ability of the proposed NSM-AHA algorithm was compared with that of 21 state-of-the-art algorithms on the CEC 2017 and CEC 2020 benchmark functions with dimensions of 30, 50, and 100. Based on the Friedman test results, NSM-AHA ranked 1st out of 22 competitive algorithms, while the original AHA ranked 8th. This result highlights that the NSM update mechanism provides a remarkable improvement in the convergence performance of the original AHA. Furthermore, two constrained engineering problems, the optimization of single-diode solar cell model (SDSCM) parameters and the design of a power system stabilizer (PSS), are solved with the proposed algorithm. NSM-AHA provided better results than the other algorithms, with a root mean square error of 9.86E−04 for the SDSCM and an integral time square error of 1.43E−03 for the PSS. The experimental results show that the proposed NSM-AHA is a competitive optimizer for solving global and engineering optimization problems.
Journal Article
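The abstract states that NSM-AHA manages its population using both the fitness value and an NSM score, but the NSM score itself is not defined there. The sketch below therefore only illustrates the general idea of survivor selection driven by a weighted combination of fitness and a secondary score; the diversity measure, weight, and toy objective are all assumptions, not the published method.

```python
import random

def survivor_selection(population, fitness, secondary_score, keep, w=0.5):
    """Keep the best `keep` individuals ranked by a weighted combination of
    fitness rank and secondary-score rank (ranks avoid scale mismatches).

    `secondary_score` stands in for a quantity like the NSM score; here we
    treat higher secondary scores as better (e.g., more diverse individuals).
    """
    by_fitness = sorted(population, key=fitness)                      # minimization
    by_secondary = sorted(population, key=secondary_score, reverse=True)
    f_rank = {id(ind): r for r, ind in enumerate(by_fitness)}
    s_rank = {id(ind): r for r, ind in enumerate(by_secondary)}
    return sorted(population,
                  key=lambda ind: w * f_rank[id(ind)] + (1 - w) * s_rank[id(ind)])[:keep]

if __name__ == "__main__":
    random.seed(1)
    pop = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(20)]

    def sphere(x):                       # toy objective to minimize
        return sum(v * v for v in x)

    def nearest_neighbour_distance(x):   # hypothetical stand-in for the NSM score
        return min(sum((a - b) ** 2 for a, b in zip(x, y)) for y in pop if y is not x)

    survivors = survivor_selection(pop, sphere, nearest_neighbour_distance, keep=10)
    print(len(survivors), "survivors; best fitness:", min(sphere(s) for s in survivors))
```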
A Systematic Review of the Application and Empirical Investigation of Search-Based Test Case Generation
by
Panesar-Walawege, Rajwinder Kaur
,
Ali, Shaukat
,
Briand, Lionel C
in
Algorithm design and analysis
,
Algorithms
,
Automatic testing
2010
Metaheuristic search techniques have been extensively used to automate the process of generating test cases, and thus providing solutions for a more cost-effective testing process. This approach to test automation, often coined "Search-based Software Testing" (SBST), has been used for a wide variety of test case generation purposes. Since SBST techniques are heuristic by nature, they must be empirically investigated in terms of how costly and effective they are at reaching their test objectives and whether they scale up to realistic development artifacts. However, approaches to empirically study SBST techniques have shown wide variation in the literature. This paper presents the results of a systematic, comprehensive review that aims at characterizing how empirical studies have been designed to investigate SBST cost-effectiveness and what empirical evidence is available in the literature regarding SBST cost-effectiveness and scalability. We also provide a framework that drives the data collection process of this systematic review and can be the starting point of guidelines on how SBST techniques can be empirically assessed. The intent is to aid future researchers doing empirical studies in SBST by providing an unbiased view of the body of empirical evidence and by guiding them in performing well-designed and executed empirical studies.
Journal Article
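A minimal example of the search-based idea behind SBST: encode test inputs, define a branch-distance fitness, and let a metaheuristic minimize it. The toy program, the target branch, and the (1+1) hill climber below are all illustrative assumptions; real SBST tools typically evolve whole test suites with genetic algorithms.

```python
import random

def branch_distance(x: int, y: int) -> int:
    """SBST-style fitness for the toy target branch `x*x - y == 1000`:
    how far the predicate is from being true (0 means the branch is covered)."""
    return abs(x * x - y - 1000)

def generate_test_input(budget: int = 50_000):
    """(1+1) hill climber over integer inputs, minimizing branch distance."""
    x, y = random.randint(-20, 20), random.randint(-20, 20)
    best = branch_distance(x, y)
    for _ in range(budget):
        if best == 0:
            break
        nx = x + random.choice((-1, 0, 1))
        ny = y + random.choice((-1, 0, 1))
        fit = branch_distance(nx, ny)
        if fit < best:                    # accept only strict improvements
            x, y, best = nx, ny, fit
    return (x, y), best

if __name__ == "__main__":
    random.seed(7)
    (x, y), dist = generate_test_input()
    print(f"inputs x={x}, y={y}, branch distance={dist}")
```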
Anomalous Traffic Filtering Algorithm for Power Wireless Sensor Networks Based on Feature Clustering
2025
Conventional anomalous-traffic filtering algorithms for power wireless sensor networks generally use a unidirectional measurement structure, and their filtering efficiency is low, which increases the absolute error of the filtering measurement; this motivates the design and analysis of a feature-clustering-based anomalous-traffic filtering algorithm for power wireless sensor networks. Based on the current measurement requirements, the algorithm first extracts anomalous-traffic features and adopts a multi-order approach to improve filtering efficiency, yielding a multi-order filtering measurement structure for anomalous traffic. On this basis, a feature-clustering model for filtering anomalous network traffic is constructed, and adaptive checking is used to realize the filtering measurement. Test results show that the absolute error of the final filtering algorithm is kept below 0.7, indicating that the designed feature-clustering-based anomalous-traffic filtering algorithm for power wireless sensor networks is more flexible, versatile, and targeted, and has practical application value.
Journal Article
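A generic sketch of feature-clustering-based anomaly filtering: learn cluster centroids from normal traffic features, then flag samples that are far from every centroid. The multi-order measurement structure and adaptive checking described in the abstract are not reproduced; the features, the number of clusters, and the distance threshold are hypothetical.

```python
import math
import random

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def kmeans(points, k, iters=20):
    """Plain k-means over traffic feature vectors (no library dependencies)."""
    centroids = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda i: euclidean(p, centroids[i]))].append(p)
        centroids = [
            [sum(col) / len(c) for col in zip(*c)] if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids

def is_anomalous(sample, centroids, threshold):
    """Flag a traffic sample whose features lie far from every learned cluster."""
    return min(euclidean(sample, c) for c in centroids) > threshold

if __name__ == "__main__":
    random.seed(3)
    # Hypothetical 2-D features (e.g., packet rate, mean payload size) for normal traffic.
    normal = ([[random.gauss(10, 1), random.gauss(200, 10)] for _ in range(100)]
              + [[random.gauss(30, 1), random.gauss(500, 10)] for _ in range(100)])
    centroids = kmeans(normal, k=2)
    print(is_anomalous([10.5, 205.0], centroids, threshold=50.0))   # near a cluster: False
    print(is_anomalous([90.0, 1500.0], centroids, threshold=50.0))  # far from both: True
```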