Catalogue Search | MBRL
Search Results Heading
Explore the vast range of titles available.
401,428 result(s) for "analysis of algorithms"
Trading the measured move : a path to trading success in a world of algos and high-frequency trading
"A timely guide to profiting in markets dominated by high-frequency trading and other computer-driven strategies. Strategies employing complex computer algorithms, often utilizing high-frequency trading tactics, have placed individual traders at a significant disadvantage in today's financial markets. It has been estimated that high-frequency traders, one form of computerized trading, account for more than half of each day's total equity market trades. In this environment, individual traders need to learn new techniques that can help them navigate modern markets and avoid being whipsawed by larger, institutional players. Trading the Measured Move offers a blueprint for profiting from the price waves created by computer-driven algorithmic and high-frequency trading strategies. The core of author David Halsey's approach is a novel application of Fibonacci retracements, which he uses to set price targets and low-risk entry points. When properly applied, it allows traders to gauge market sentiment, recognize institutional participation at specific support and resistance levels, and differentiate between short-term and long-term trades at various price points in the market. The book provides guidance for individual traders who fear they can't compete in today's high-frequency-dominated markets; outlines specific trade setups, including opening gap strategies, breakout and failed-breakout strategies, range trading strategies, and pivot trading strategies; and reveals how to escape institutional strategies designed to profit from slower-moving market participants. Engaging and informative, Trading the Measured Move will provide you with a new perspective, and new strategies, to successfully navigate today's computer-driven financial markets." -- Provided by publisher.
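The Fibonacci retracement levels at the core of Halsey's approach can be sketched numerically. The level set and the two-decimal rounding below are common charting conventions, assumed for illustration, not necessarily the book's exact method.

```python
# Sketch: Fibonacci retracement levels between a swing low and swing high,
# as used to set price targets and entry points. Level set is an assumption.

def fib_retracements(swing_low, swing_high, levels=(0.236, 0.382, 0.5, 0.618, 0.786)):
    """Return {ratio: price} retracement levels for an up-move."""
    span = swing_high - swing_low
    return {r: round(swing_high - r * span, 2) for r in levels}

levels = fib_retracements(100.0, 150.0)
# e.g. the 0.618 retracement of a 100 -> 150 move sits at 150 - 0.618 * 50
```

A trader would watch these prices as candidate support levels where institutional participation may appear.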
Optimizing type 2 diabetes management: AI-enhanced time series analysis of continuous glucose monitoring data for personalized dietary intervention
by
Saher, Raazia
,
Anjum, Madiha
,
Saeed, Muhammad Noman
in
Algorithms and Analysis of Algorithms
,
Artificial Intelligence
,
Blood sugar
2024
Despite advanced health facilities in many developed countries, diabetic patients face multiple health challenges. Type 2 diabetes mellitus (T2DM) is accompanied by conspicuous symptoms due to frequent glucose excursions: hypoglycemia (<=70 mg/dL while fasting) or hyperglycemia (>=180 mg/dL two hours postprandial), according to the American Diabetes Association (ADA). The worst effects of T2DM are closely associated with a poor lifestyle, so a healthy diet and nutritious food are key to success for such patients. This study was done to help T2DM patients improve their health by developing a favorable lifestyle under an AI-assisted continuous glucose monitoring (CGM) digital system. It aims to reduce patients' blood glucose fluctuations by rectifying their daily diet and maintaining records of exertion versus food consumption. Precise predictions are obtained by training an ML model on a dataset recorded from CGM sensor devices attached to T2DM patients under observation. Because the data from the CGM sensor form a time series, blood glucose levels are predicted through time series analysis and forecasting with XGBoost, SARIMA, and Prophet, and the models are then compared on performance metrics. This helped in monitoring various trends, specifically irregular patterns in the patients' glucose data collected by the CGM sensor. Keeping track of these trends and seasonality, the diet is then adjusted by adding or removing particular foods and tracking their nutrients with the intervention of a commercially available all-in-one AI solution for food recognition.
This creates an interactive assistive system in which the predicted results are compared with food contents to bring blood glucose levels within the normal range for a healthy lifestyle, and to warn of impending blood glucose fluctuations before they occur. This study will help T2DM patients in managing diabetes and ultimately bring HbA1c within the normal range (<= 5.7%) for diabetic and pre-diabetic patients three months after the intervention.
Journal Article
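The forecasting setup in the abstract above can be sketched with lag features built from a CGM series. The window size and the naive mean-of-lags predictor are illustrative stand-ins for the XGBoost/SARIMA/Prophet models the study actually compares.

```python
# Sketch: one-step-ahead glucose forecasting from CGM readings via lag
# features. The "mean of recent lags" predictor is a placeholder model.

def make_lag_features(series, window=3):
    """Turn a time series into (lag_window, next_value) training pairs."""
    pairs = []
    for i in range(window, len(series)):
        pairs.append((series[i - window:i], series[i]))
    return pairs

def forecast_next(series, window=3):
    """Predict the next reading as the mean of the last `window` readings."""
    recent = series[-window:]
    return sum(recent) / len(recent)

cgm = [110, 115, 130, 160, 180, 170, 150]   # example mg/dL readings
pairs = make_lag_features(cgm)               # supervised pairs for any regressor
nxt = forecast_next(cgm)                     # naive one-step forecast
```

A real model would be fit on `pairs`; an alert could fire when `nxt` leaves the 70-180 mg/dL band mentioned in the abstract.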
Advanced clustering and transfer learning based approach for rice leaf disease segmentation and classification
by
Yousafzai, Samia Nawaz
,
Alsolai, Hadeel
,
Ebad, Shouki A.
in
Algorithms
,
Algorithms and Analysis of Algorithms
,
Artificial Intelligence
2025
Rice, the world’s most important food crop, requires early and accurate identification of the diseases that infect rice panicles and leaves to increase production and reduce losses. Most conventional methods of diagnosing diseases rely on manual inspection, which is ineffective, imprecise, and time-consuming. In light of these drawbacks, this article introduces an improved deep learning and transfer learning method for diagnosing and categorizing rice leaf diseases. First, all input images are preprocessed: they are resized to a fixed size before a contrast-enhanced adaptive histogram equalization procedure is applied. Diseased regions are then segmented with the proposed gravity-weighted kernelised density clustering algorithm. For feature extraction, EfficientNetB0 is fine-tuned by removing its last fully connected layers, and classification is conducted with new fully connected layers. In addition, the tent chaotic particle snow ablation optimizer is incorporated into training to improve learning and shorten convergence time. The performance of the proposed framework was tested on two benchmark datasets, achieving accuracies of 98.87% and 97.54%, respectively. Comparisons with six fine-tuned models show the performance advantage and validity of the proposed method.
Journal Article
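The histogram-equalization preprocessing step named in the abstract above can be sketched in its simplest, global form. The adaptive, contrast-limited variant the paper uses works on local tiles with clipping; this simplified version only shows the core CDF-remapping idea.

```python
# Sketch: global histogram equalization on a flat list of gray levels
# (0..255). A simplification of the adaptive variant in the abstract.

def equalize(pixels, nlevels=256):
    """Remap gray levels through the cumulative histogram to spread contrast."""
    hist = [0] * nlevels
    for p in pixels:
        hist[p] += 1
    cdf, total = [], 0              # cumulative distribution of gray levels
    for h in hist:
        total += h
        cdf.append(total)
    n = len(pixels)
    # classic remap: stretch the CDF onto the full intensity range
    return [round((cdf[p] - 1) / (n - 1) * (nlevels - 1)) for p in pixels]
```

On a low-contrast input the output occupies the full 0-255 range, which helps the downstream segmentation stage.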
Improving Gaussian Naive Bayes classification on imbalanced data through coordinate-based minority feature mining
by
Liu, Fen
,
Li, Yanxi
,
Wang, Wei
in
Algorithms
,
Algorithms and Analysis of Algorithms
,
Analysis
2025
As a widely used classification model, the Gaussian Naive Bayes (GNB) classifier experiences a significant decline in performance when handling imbalanced data. Most traditional approaches rely on sampling techniques; however, these methods alter the quantity and distribution of the original data and are prone to issues such as class overlap and overfitting, thus presenting clear limitations. This article proposes a coordinate transformation algorithm based on radial local relative density changes (RLDC). A key feature of this algorithm is that it preserves the original dataset’s quantity and distribution. Instead of modifying the data, it enhances classification performance by generating new features that more prominently represent minority classes. The algorithm transforms the dataset from absolute coordinates to RLDC-relative coordinates, revealing latent local relative density change features. Due to the imbalanced distribution, sparse feature space, and class overlap, minority class samples can exhibit distinct patterns in these transformed features. Based on these new features, the GNB classifier can increase the conditional probability of the minority class, thereby improving its classification performance on imbalanced datasets. To validate the effectiveness of the proposed algorithm, this study conducts comprehensive comparative experiments using the GNB classifier on 20 imbalanced datasets of varying scales, dimensions, and characteristics. The evaluation includes 10 oversampling algorithms, two undersampling algorithms, and two hybrid sampling algorithms. Experimental results show that the RLDC-based coordinate transformation algorithm ranks first in the average performance across three classification evaluation metrics. Compared to the average values of the comparison algorithms, it achieves improvements of 21.84%, 33.45%, and 54.63% across the three metrics, respectively. 
This algorithm offers a novel approach to addressing the imbalanced data problem in GNB classification and holds significant theoretical and practical value.
Journal Article
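The density-based feature idea in the abstract above can be sketched as follows. The exact RLDC coordinate transformation is not given in the abstract; the fraction-of-same-label-neighbors feature below is an illustrative analogue, not the paper's formula.

```python
# Sketch: a local relative density feature for 2-D labeled points, as an
# illustrative analogue of the RLDC feature (the radius is an assumption).

def local_relative_density(points, labels, radius=1.5):
    """For each point, the fraction of neighbors within `radius` sharing
    its label; a new feature that can make minority classes stand out."""
    feats = []
    for i, (xi, yi) in enumerate(points):
        same = other = 0
        for j, (xj, yj) in enumerate(points):
            if i == j:
                continue
            if (xi - xj) ** 2 + (yi - yj) ** 2 <= radius ** 2:
                if labels[i] == labels[j]:
                    same += 1
                else:
                    other += 1
        total = same + other
        feats.append(same / total if total else 0.0)
    return feats

pts = [(0, 0), (0, 1), (5, 5), (0, 0.5)]
labs = [0, 0, 1, 1]
feats = local_relative_density(pts, labs)
```

Such a feature would be appended to the original coordinates before fitting a Gaussian Naive Bayes classifier, leaving the data's quantity and distribution untouched, as the abstract emphasizes.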
GAT TransPruning: progressive channel pruning strategy combining graph attention network and transformer
by
Lin, Yu-Cheng
,
Wang, Chia-Hung
,
Lin, Yu-Chen
in
Algorithms and Analysis of Algorithms
,
Analysis
,
Artificial Intelligence
2024
Recently, large-scale artificial intelligence models with billions of parameters have achieved good experimental results, but their practical deployment on edge computing platforms is often subject to many constraints because of their resource requirements. These models require powerful computing platforms with high memory capacity to store and process the numerous parameters and activations, which makes it challenging to deploy them directly. Therefore, model compression techniques play a crucial role in making these models more practical and accessible. In this article, a progressive channel pruning strategy combining a graph attention network and a transformer, namely GAT TransPruning, is proposed, which uses graph attention networks (GAT) and the transformer attention mechanism to determine channel-to-channel relationships in large networks. This approach ensures that the network maintains its critical functional connections and optimizes the trade-off between model size and performance. In this study, VGG-16, VGG-19, ResNet-18, ResNet-34, and ResNet-50 are used as large-scale network models with the CIFAR-10 and CIFAR-100 datasets for verification and quantitative analysis of the proposed progressive channel pruning strategy. The experimental results reveal that accuracy drops by only 6.58% at a channel pruning rate of 89% for VGG-19/CIFAR-100. In addition, the lightweight model's inference speed is 9.10 times faster than that of the original large model. In comparison with traditional channel pruning schemes, the proposed strategy based on the GAT and transformer not only prunes insignificant weight channels and effectively reduces model size, but also ensures that the performance drop of the resulting lightweight model remains the smallest even under a high pruning ratio.
Journal Article
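The channel-pruning step described above can be sketched with a much simpler importance score. L1-norm magnitude ranking below stands in for the GAT/transformer attention scores the paper actually learns; only the keep-the-top-channels mechanics are the same.

```python
# Sketch: channel pruning by L1-norm ranking, a stand-in for the learned
# GAT/transformer importance scores described in the abstract.

def prune_channels(channel_weights, prune_ratio=0.5):
    """Return the (sorted) indices of channels kept after pruning the
    smallest-magnitude fraction given by `prune_ratio`."""
    norms = [(sum(abs(w) for w in ch), i) for i, ch in enumerate(channel_weights)]
    norms.sort(reverse=True)                       # largest L1 norms first
    keep = max(1, round(len(channel_weights) * (1 - prune_ratio)))
    return sorted(i for _, i in norms[:keep])

kept = prune_channels([[0.1, 0.1], [1.0, 2.0], [0.5, 0.5], [3.0, 0.0]])
```

A progressive strategy would repeat this with a rising `prune_ratio`, fine-tuning between rounds.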
The Bounded and Precise Word Problems for Presentations of Groups
by
Ivanov, S. V.
in
Geometric group theory [See also 05C25, 20E08, 57Mxx]
,
Group theory and generalizations
,
Presentations of groups (Mathematics)
2020
We introduce and study the bounded word problem and the precise word problem for groups given by means of generators and defining
relations. For example, for every finitely presented group, the bounded word problem is in
Learning label smoothing for text classification
by
Ren, Han
,
Zhao, Yajie
,
Sun, Wei
in
Algorithms and Analysis of Algorithms
,
Analysis
,
Computational Linguistics
2024
Training with soft labels instead of hard labels can effectively improve the robustness and generalization of deep learning models. Label smoothing often provides uniformly distributed soft labels during the training process, whereas it does not take the semantic difference of labels into account. This article introduces discrimination-aware label smoothing, an adaptive label smoothing approach that learns appropriate distributions of labels for iterative optimization objectives. In this approach, positive and negative samples are employed to provide experience from both sides, and the performances of regularization and model calibration are improved through an iterative learning method. Experiments on five text classification datasets demonstrate the effectiveness of the proposed method.
Journal Article
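The baseline that the abstract above improves on, uniform label smoothing, is easy to sketch. The discrimination-aware variant in the paper would learn a non-uniform distribution per label instead; the epsilon value below is a conventional choice, assumed for illustration.

```python
# Sketch: uniform label smoothing - the baseline the paper's adaptive,
# discrimination-aware method replaces with learned distributions.

def smooth_labels(hard_label, num_classes, eps=0.1):
    """Return a soft label: (1 - eps) on the true class, eps spread
    uniformly over the remaining classes."""
    off = eps / (num_classes - 1)
    return [1.0 - eps if c == hard_label else off for c in range(num_classes)]

soft = smooth_labels(1, 4)   # true class 1 of 4
```

Training against `soft` instead of a one-hot vector penalizes overconfident predictions, which is the regularization and calibration effect the abstract refers to.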
Multilevel Monte Carlo Path Simulation
2008
We show that multigrid ideas can be used to reduce the computational complexity of estimating an expected value arising from a stochastic differential equation using Monte Carlo path simulations. In the simplest case of a Lipschitz payoff and an Euler discretisation, the computational cost to achieve an accuracy of O(ε) is reduced from O(ε^-3) to O(ε^-2 (log ε)^2). The analysis is supported by numerical results showing significant computational savings.
Journal Article
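The telescoping-sum idea behind the abstract above can be sketched for a simple case: estimating E[S_T] for geometric Brownian motion under Euler discretisation, with coarse and fine paths coupled through shared Brownian increments. The parameter values are illustrative assumptions; the paper treats general payoffs and gives the full cost analysis.

```python
# Sketch: multilevel Monte Carlo for E[S_T] of geometric Brownian motion.
# Level l uses 2^l Euler steps; the estimate is E[P_0] plus corrections
# E[P_l - P_{l-1}] computed from coupled coarse/fine paths.
import random

def mlmc_estimate(L=4, N=2000, T=1.0, mu=0.05, sigma=0.2, s0=1.0, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for level in range(L + 1):
        nf = 2 ** level                  # fine steps at this level
        dt = T / nf
        acc = 0.0
        for _ in range(N):
            fine = coarse = s0
            dw_sum = 0.0                 # coarse path reuses summed increments
            for step in range(nf):
                dw = rng.gauss(0.0, dt ** 0.5)
                fine += mu * fine * dt + sigma * fine * dw
                dw_sum += dw
                if level > 0 and step % 2 == 1:
                    coarse += mu * coarse * (2 * dt) + sigma * coarse * dw_sum
                    dw_sum = 0.0
            acc += fine if level == 0 else fine - coarse
        total += acc / N
    return total

est = mlmc_estimate()   # true value is s0 * exp(mu * T) ~ 1.051
```

The point of the coupling is that the correction terms have small variance, so fewer samples are needed at the expensive fine levels, which is where the cost reduction in the abstract comes from.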
A new optimization algorithm based on mimicking the voting process for leader selection
by
Trojovský, Pavel
,
Dehghani, Mohammad
in
Algorithms
,
Algorithms and Analysis of Algorithms
,
Applied mathematics
2022
Stochastic-based optimization algorithms are effective approaches to addressing optimization challenges. In this article, a new optimization algorithm called the Election-Based Optimization Algorithm (EBOA) is developed that mimics the voting process to select a leader. The fundamental inspiration for EBOA is the voting process, the selection of the leader, and the impact of the public awareness level on that selection. The EBOA population searches the space under the guidance of the elected leader. EBOA’s process is mathematically modeled in two phases: exploration and exploitation. The efficiency of EBOA has been investigated in solving thirty-three objective functions of unimodal, high-dimensional multimodal, fixed-dimensional multimodal, and CEC 2019 types. The results on these objective functions show EBOA's strong exploration ability in global search, its exploitation ability in local search, and its ability to strike a proper balance between the two, which accounts for the effectiveness of the proposed approach in optimizing and providing appropriate solutions. Our analysis shows that EBOA provides an appropriate balance between exploration and exploitation and therefore performs better and more competitively than the ten other algorithms to which it was compared.
Journal Article
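The leader-guided search described in the abstract above can be sketched as follows. The update rule below (random drift toward the elected leader plus a Gaussian "awareness" kick) is an illustrative guess at the mechanism, not EBOA's actual update equations.

```python
# Sketch: an election-style population optimizer. Each iteration "elects"
# the best candidate as leader; voters drift toward it (exploitation) with
# random perturbations (exploration). Update rule is an assumption.
import random

def election_optimize(f, dim=2, pop=20, iters=100, bound=5.0, seed=1):
    rng = random.Random(seed)
    voters = [[rng.uniform(-bound, bound) for _ in range(dim)] for _ in range(pop)]
    best = min(voters, key=f)[:]
    for _ in range(iters):
        leader = min(voters, key=f)[:]       # elect the current best candidate
        for v in voters:
            for d in range(dim):
                v[d] += rng.random() * (leader[d] - v[d])  # drift toward leader
                v[d] += rng.gauss(0.0, 0.1)                # exploration kick
                v[d] = max(-bound, min(bound, v[d]))       # stay in bounds
        cand = min(voters, key=f)
        if f(cand) < f(best):
            best = cand[:]
    return best

def sphere(x):
    return sum(xi * xi for xi in x)

best = election_optimize(sphere)
```

On a simple unimodal function like the sphere, the population collapses onto successively better leaders, illustrating the exploration/exploitation balance the abstract discusses.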
Benchmarking a fast, satisficing vehicle routing algorithm for public health emergency planning and response: “Good Enough for Jazz”
by
McDaniel, Emma L.
,
Mikler, Armin R.
,
Akwafuo, Sampson
in
Algorithms
,
Algorithms and Analysis of Algorithms
,
Analysis
2023
Due to the situational fluidity and intrinsic uncertainty of emergency response, a fast vehicle routing algorithm is needed that meets the constraints of the situation; to this end, the receiving-staging-storing-distributing (RSSD) algorithm was developed. Benchmarking the quality of this satisficing algorithm is important to understand the consequences of not engaging with the NP-hard vehicle routing problem. This benchmarking will indicate whether the RSSD algorithm produces acceptable and consistent solutions for use in decision support systems for emergency response planning. We devise metrics in the domain of emergency planning, response, and medical countermeasure dispensing to assess the quality of RSSD solutions. We conduct experiments and perform statistical analyses comparing the RSSD algorithm's solutions with the best known solutions for selected capacitated vehicle routing problem (CVRP) benchmark instances. The results indicate that even though the RSSD algorithm does not attempt to find optimal routes, it behaves consistently relative to the best known solutions across a range of instances and attributes.
Journal Article
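The satisficing spirit of the abstract above can be sketched with a nearest-neighbour CVRP heuristic: fast, feasible, and "good enough" rather than optimal. This is not the RSSD algorithm itself (whose staging structure is specific to the paper), only a minimal example of the trade-off being benchmarked.

```python
# Sketch: a fast, satisficing route builder for a capacitated VRP -
# nearest feasible customer first, new vehicle when capacity runs out.

def greedy_cvrp(depot, customers, demand, capacity):
    """Return a list of routes (lists of customer coordinates), each
    respecting the vehicle capacity; not optimal, just quick and feasible."""
    unserved = set(customers)
    routes = []
    while unserved:
        load, pos, route = 0, depot, []
        while True:
            feasible = [c for c in unserved if load + demand[c] <= capacity]
            if not feasible:
                break
            # nearest feasible customer to the current position
            nxt = min(feasible, key=lambda c: (c[0] - pos[0]) ** 2 + (c[1] - pos[1]) ** 2)
            route.append(nxt)
            load += demand[nxt]
            pos = nxt
            unserved.discard(nxt)
        routes.append(route)
    return routes

routes = greedy_cvrp((0, 0), [(1, 0), (2, 0), (0, 3)],
                     {(1, 0): 1, (2, 0): 1, (0, 3): 2}, capacity=2)
```

Benchmarking, as in the paper, would compare such heuristic routes against best known CVRP solutions on standard instances.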