Catalogue Search | MBRL
Explore the vast range of titles available.
2,293 result(s) for "Elm"
Web applications with Elm : functional programming for the Web
Learn the basics of the Elm platform for web applications. This book covers the language as of version 0.18 and the most important libraries. After reading this book you will understand what Elm can do for you, and you will be able to build on the example in the book to develop advanced web applications with Elm. What You'll Learn: Work with Elm and its development environment; learn the language and libraries through examples; use the Elm architecture to create applications with the Elm platform; put it all together with a sample application and an explanation that covers the implementation details. Who This Book Is For: Web developers new to Elm, with some experience in JavaScript recommended. This book is also for others curious about Elm and its potential beyond web development.
Extreme learning machines: a new approach for modeling dissolved oxygen (DO) concentration with and without water quality variables as predictors
2017
In this paper, several extreme learning machine (ELM) models, including the standard extreme learning machine with sigmoid activation function (S-ELM), the extreme learning machine with radial basis activation function (R-ELM), the online sequential extreme learning machine (OS-ELM), and the optimally pruned extreme learning machine (OP-ELM), are newly applied for predicting dissolved oxygen concentration with and without water quality variables as predictors. Firstly, using data from eight United States Geological Survey (USGS) stations located in different river basins in the USA, the S-ELM, R-ELM, OS-ELM, and OP-ELM were compared against the measured dissolved oxygen (DO) using four water quality variables, water temperature, specific conductance, turbidity, and pH, as predictors. For each station, we used data measured at an hourly time step over a period of 4 years. The dataset was divided into a training set (70%) and a validation set (30%). We selected several combinations of the water quality variables as inputs for each ELM model, and six different scenarios were compared. Secondly, an attempt was made to predict DO concentration without water quality variables. To achieve this goal, we used the year numbers (2008, 2009, etc.), month numbers (1 to 12), day numbers (1 to 31), and hour numbers (00:00 to 24:00) as predictors. Thirdly, the best ELM models were trained using the validation dataset and tested with the training dataset. The performances of the four ELM models were evaluated using four statistical indices: the coefficient of correlation (R), the Nash-Sutcliffe efficiency (NSE), the root mean squared error (RMSE), and the mean absolute error (MAE). Results obtained from the eight stations indicated that: (i) the best results were obtained by the S-ELM, R-ELM, OS-ELM, and OP-ELM models having four water quality variables as predictors; (ii) out of eight stations, the OP-ELM performed better than the other three ELM models at seven stations, while the R-ELM performed best at one station; the OS-ELM models performed the worst and provided the lowest accuracy; (iii) for predicting DO without water quality variables, the R-ELM performed best at seven stations, followed by the S-ELM in second place, and the OP-ELM performed the worst with low accuracy; (iv) for the final application, where ELM models were trained with the validation dataset and tested with the training dataset, the OP-ELM provided the best accuracy using water quality variables and the R-ELM performed best at all eight stations without water quality variables. Fourthly, and finally, we compared the results obtained from the different ELM models with those obtained using multiple linear regression (MLR) and a multilayer perceptron neural network (MLPNN). Results obtained using the MLPNN and MLR models reveal that: (i) using water quality variables as predictors, the MLR performed the worst and provided the lowest accuracy at all stations; (ii) the MLPNN ranked in second place at two stations, in third place at four stations, and in fourth place at two stations; (iii) for predicting DO without water quality variables, the MLPNN ranked in second place at five stations, and in third, fourth, and fifth places at the remaining three stations, while the MLR ranked last with very low accuracy at all stations. Overall, the results suggest that the ELM is more effective than the MLPNN and MLR for modelling DO concentration in river ecosystems.
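For reference, here is a minimal sketch of the four statistical indices named in this abstract (R, NSE, RMSE, and MAE), using their standard textbook definitions; this is illustrative Python, not code from the paper, and the function name and sample values are assumptions.

```python
# Standard definitions of the four indices listed in the abstract (R, NSE, RMSE, MAE).
# Assumption: obs and sim are 1-D sequences of observed and simulated DO values.
import numpy as np

def evaluate(obs, sim):
    obs, sim = np.asarray(obs, dtype=float), np.asarray(sim, dtype=float)
    r = np.corrcoef(obs, sim)[0, 1]                                          # coefficient of correlation (R)
    nse = 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)   # Nash-Sutcliffe efficiency
    rmse = np.sqrt(np.mean((obs - sim) ** 2))                                # root mean squared error
    mae = np.mean(np.abs(obs - sim))                                         # mean absolute error
    return {"R": r, "NSE": nse, "RMSE": rmse, "MAE": mae}

# Example with arbitrary DO values (mg/L)
print(evaluate([8.1, 7.9, 9.2, 8.5], [8.0, 8.1, 9.0, 8.6]))
```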
Journal Article
Two novel ELM-based stacking deep models focused on image recognition
by Dai, Qun; Song, Gang; Han, Xiaomeng
in Algorithms, Artificial neural networks, Computational geometry
2020
Extreme learning machine (ELM) and its variants have been widely used in the field of object recognition and other complex classification tasks. Traditional deep learning architectures like the Convolutional Neural Network (CNN) are capable of extracting high-level features, which are key for the models to make correct decisions. However, traditional deep architectures are confronted with solving a tough, non-convex optimization problem, which is a time-consuming process. In this paper, we propose two hierarchical models, i.e., Random Recursive Constrained ELM (R2CELM) and Random Recursive Local-Receptive-Fields-Based ELM (R2ELM-LRF), which are constructed by stacking CELM or ELM-LRF, respectively. In addition, inspired by the stacked generalization philosophy, random projection and kernelization are incorporated as their constitutive elements. R2CELM and R2ELM-LRF not only fully inherit the merits of ELM, but also take advantage of the superiority of CELM and ELM-LRF, respectively, in the field of image recognition. The essence of CELM is to constrain the weight vectors from the input layer to the hidden layer to be consistent with the directions from one class to another, while ELM-LRF is adept at exploiting the local structures in images through many local receptive fields. In the empirical results, R2CELM and R2ELM-LRF demonstrate better testing accuracy on six benchmark image recognition datasets than their base learners and other state-of-the-art algorithms. Moreover, the two proposed deep ELM models require less training time than traditional Deep Neural Network (DNN) based models.
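A minimal sketch of the CELM idea described in this abstract, under the simple reading that each hidden-layer weight vector is a normalized difference between two samples drawn from different classes; the function name and details are illustrative assumptions, not the authors' implementation.

```python
# Illustrative CELM-style input weights: each column of W points "from one class
# to another", as the abstract describes. Assumption: X is (n_samples, n_features)
# and y holds at least two class labels.
import numpy as np

def celm_input_weights(X, y, n_hidden, seed=0):
    rng = np.random.default_rng(seed)
    classes = np.unique(y)
    W = np.empty((X.shape[1], n_hidden))
    for j in range(n_hidden):
        c1, c2 = rng.choice(classes, size=2, replace=False)   # two distinct classes
        x1 = X[rng.choice(np.flatnonzero(y == c1))]           # random sample from class c1
        x2 = X[rng.choice(np.flatnonzero(y == c2))]           # random sample from class c2
        d = x2 - x1                                           # class-to-class direction
        W[:, j] = d / (np.linalg.norm(d) + 1e-12)             # normalized constraint vector
    return W
```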
Journal Article
Dutch elm disease and elm bark beetles: a century of association
2015
Bark beetles of the genus Scolytus Geoffroy are the main vectors of the fungus Ophiostoma ulmi s.l., which causes Dutch elm disease. The large and small elm bark beetles - S. scolytus (F.) and S. multistriatus (Marsham), respectively - are the most common and important species spreading the pathogen worldwide. The success of the pathogen-insect interaction is mainly due to the characteristic reproductive behavior of the elm bark beetles, which, however, largely depends on the occurrence of infected trees. During feeding activity on elm twigs, callow adults carrying pathogen conidia on their bodies contaminate healthy trees and facilitate pathogen development and movement within the wood vessels. Infected trees then become suitable for insect breeding in the stem bark. This well-known mutualistic association has devastating consequences for elm survival. Although much is known about insect-pathogen interactions and transmission mechanisms, many topics still deserve additional attention, for example: beetle systematics based on new molecular tools and morphological characters; selection of European elm clones based on disease avoidance; consequences of global warming on the life history of the three organisms (fungus, insect, tree) involved in the pathosystem; and new problems resulting from the rapid increase of international trade among continents, leading to the accidental introduction of new vector species, new pathogen species or races, or highly susceptible elm species in gardens and public parks. A holistic approach to tackling the problem is highly recommended, taking into account how these organisms interact with each other and the environment, and how their interactions could be modified in order to face one of the most destructive diseases ever known in plant pathology.
Journal Article
Plenodomus tracheiphilus, but not Dothiorella ulmi, causes wilt disease on elm trees in Alberta, Canada
by Harding, Michael W; Zahr, Kher; Feng, Jie
in Actin, Citrus trees, DNA-directed RNA polymerase
2024
Annual monitoring of wilt pathogens on elm trees in Alberta is part of a provincially regulated prevention and control program for Dutch elm disease. Over the past eight years (2016–2023), twig samples with wilt symptoms from 200 elm trees across Alberta were tested for the presence of wilt pathogens. Plenodomus tracheiphilus, the causal agent of Mal secco disease of citrus trees, was isolated from 116 of the 200 elm trees. The identity of this fungus was confirmed morphologically by comparison with the type culture, and by sequencing the internal transcribed spacer region, the β-tubulin gene, the DNA-directed RNA polymerase II second largest subunit gene, the actin gene, and eight protein-coding genes exclusively present in the P. tracheiphilus genome. The pathogenicity of P. tracheiphilus isolated from Alberta was tested by artificial inoculation of elm trees to fulfill Koch's postulates, based on symptom observation and fungal re-isolation. Our data indicate that P. tracheiphilus is commonly present in Alberta’s elm trees, especially in the Edmonton area, and that the previously described Dothiorella elm wilt is actually caused by P. tracheiphilus and not by Dothiorella ulmi.
Journal Article
A novel online sequential extreme learning machine with L2,1-norm regularization for prediction problems
by Preeti; Bala, Rajni; Singh, Ram Pal
in Algorithms, Artificial neural networks, Batch processing
2021
In today’s world, data is produced at a very high speed and used in a large number of prediction problems. Learning algorithms that can process such data sequentially are therefore in demand as an alternative to batch learning algorithms. This paper presents a novel online sequential algorithm for the extreme learning machine with l2,1-norm regularization (LR21OS-ELM) to handle real-time sequential data. Wang et al. proposed an ELM with l2,1-norm-based regularization, namely LR21-ELM. That method is a batch-processing model which takes the data in a single chunk, so whenever a new chunk of data arrives the model has to be retrained, which costs considerable time and memory. The proposed sequential algorithm does not require building a new model each time data arrives; instead, it updates the previous model with the new data, saving time and memory. The l2,1-norm regularization is a structural sparsity-inducing norm that is integrated with an online sequential learning algorithm to diminish the complexity of the learning model by eliminating the redundant neurons of the OS-ELM model. This paper proposes an iterative bi-objective optimization algorithm to solve the l2,1-norm-based minimization problem and to handle real-time sequential data. The proposed model can learn sequentially arriving data in the form of chunks, where the chunk size can be fixed or varying. An experimental study has been conducted on several benchmark datasets collected from different research domains to prove the generalization ability of the proposed algorithm. The obtained results show that LR21OS-ELM combines the advantages of l2,1-norm regularization and online sequential learning of data and improves the prediction performance of the system.
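A minimal sketch of the plain OS-ELM recursive update that this kind of online sequential model builds on, assuming a sigmoid hidden layer; the paper's l2,1-norm regularization and neuron pruning are not shown, and all function names here are illustrative assumptions.

```python
# Plain OS-ELM-style sequential learning: keep (beta, P) as state and fold in
# each new chunk with a recursive least-squares step instead of retraining.
import numpy as np

def hidden(X, W, b):
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))         # sigmoid hidden-layer outputs

def oselm_init(X0, T0, W, b):
    H0 = hidden(X0, W, b)
    P = np.linalg.inv(H0.T @ H0)                      # assumes enough rows in the initial chunk
    beta = P @ H0.T @ T0
    return beta, P

def oselm_update(beta, P, Xk, Tk, W, b):
    H = hidden(Xk, W, b)
    K = np.linalg.inv(np.eye(H.shape[0]) + H @ P @ H.T)
    P = P - P @ H.T @ K @ H @ P                       # update the inverse correlation matrix
    beta = beta + P @ H.T @ (Tk - H @ beta)           # correct beta using the new chunk only
    return beta, P
```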
Journal Article
Multilayer extreme learning machine: a systematic review
by Batra, Shalini; Kaur, Ravneet; Roul, Rajendra Kumar
in Algorithms, Artificial neural networks, Back propagation networks
2023
The majority of the learning algorithms used for training feedforward neural networks (FNNs), such as backpropagation (BP) and the conjugate gradient method, rely on the traditional gradient method. Such algorithms have a few drawbacks, including slow convergence, sensitivity to noisy data, and the local minimum problem. One of the alternatives proposed to overcome these issues is the Extreme Learning Machine (ELM), which requires less training time and ensures a global optimum and enhanced generalization in neural networks. ELM has a single hidden layer, which poses memory constraints in some problem domains. An extension to ELM, the Multilayer ELM (ML-ELM), performs unsupervised learning by utilizing ELM autoencoders and eliminates the need for parameter tuning, enabling better representation learning as it consists of multiple layers. This paper provides a thorough review of the development of the ML-ELM architecture, its variants, and its applications. A state-of-the-art comparative analysis between ML-ELM and other machine and deep learning classifiers demonstrates the efficacy of ML-ELM in niche domains of Computer Science, which further justifies its competency and effectiveness.
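A minimal sketch of the ELM-autoencoder stacking this abstract refers to, assuming sigmoid activations and a simple ridge-regularized solve; actual ML-ELM variants differ in details such as orthogonal random weights, so the function names and parameters below are illustrative assumptions only.

```python
# One ELM-autoencoder layer: reconstruct the input from random hidden features,
# then reuse the transposed output weights to map data to the next layer.
import numpy as np

def sigmoid(Z):
    return 1.0 / (1.0 + np.exp(-Z))

def elm_ae_layer(X, n_hidden, reg=1e-3, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))       # random, untrained input weights
    b = rng.normal(size=n_hidden)
    H = sigmoid(X @ W + b)
    # Output weights beta that reconstruct X from H (ridge-regularized least squares)
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ X)
    return sigmoid(X @ beta.T)                        # representation passed to the next layer

# Stacking two unsupervised layers, as in ML-ELM
X = np.random.rand(100, 20)
X1 = elm_ae_layer(X, 64)
X2 = elm_ae_layer(X1, 32, seed=1)
```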
Journal Article
Cryopreservation and Micropropagation Methods for Conservation of Genetic Resources of Ulmus laevis and Ulmus glabra
2021
Elms are threatened by Dutch elm disease, and conservation methods are needed to protect their genetic diversity. Cryopreservation of dormant buds allows large numbers of genotypes to be conserved with small space requirements and minimal upkeep. Cryopreservation through slow controlled cooling was tested for both elm species native to Finland, Ulmus glabra and Ulmus laevis. Regeneration of the thawed buds by micropropagation was studied on different basal media and using different growth regulators. Multiple surface sterilisation methods were tested for bud explants. The multiplication of U. glabra was investigated on Driver and Kuniyuki walnut medium with either 0.5 mg/L meta-topolin or 0.5 mg/L 6-benzylaminopurine. Rooting with a short indole-3-butyric acid induction in liquid medium, and direct transplantation of the shoots to peat ex vitro after induction, were tested. For initiation, either Murashige and Skoog or Driver and Kuniyuki walnut medium with 0.02 mg/L gibberellic acid 4+7 and 0.5 mg/L 6-benzylaminopurine was found to best promote shoot formation. Surface sterilisation remains the most challenging step. No significant differences were found between the multiplication media in either shoot production or rooting success. Rooting by direct transplanting was achieved in both species, but further development is required before application on a larger scale. With further improvements to sterilisation success, especially in U. glabra, the method can be applied to the conservation of genetic resources of both U. laevis and U. glabra, and knowledge of regeneration success can be used to design the cryoconservation plan and optimise sampling.
Journal Article
Extreme learning machines: a survey
by Huang, Guang-Bin; Lan, Yuan; Wang, Dian Hui
in Algorithms, Approximation, Artificial Intelligence
2011
Computational intelligence techniques have been used in a wide range of applications. Among the numerous computational intelligence techniques, neural networks and support vector machines (SVMs) have been playing the dominant roles. However, it is known that both neural networks and SVMs face some challenging issues, such as: (1) slow learning speed, (2) tedious human intervention, and/or (3) poor computational scalability. The extreme learning machine (ELM), an emergent technology which overcomes some of the challenges faced by other techniques, has recently attracted attention from more and more researchers. ELM works for generalized single-hidden-layer feedforward networks (SLFNs). The essence of ELM is that the hidden layer of SLFNs need not be tuned. Compared with traditional computational intelligence techniques, ELM provides better generalization performance at a much faster learning speed and with the least human intervention. This paper gives a survey of ELM and its variants, focusing especially on (1) the batch learning mode of ELM, (2) fully complex ELM, (3) online sequential ELM, (4) incremental ELM, and (5) ensembles of ELM.
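A minimal sketch of the basic ELM training step summarized in this abstract, assuming a sigmoid hidden layer: the input weights and biases are random and never tuned, and only the output weights are solved in closed form. This is illustrative Python, not the survey authors' code, and the function names and toy data are assumptions.

```python
# Basic single-hidden-layer ELM: random hidden layer, closed-form output weights.
import numpy as np

def elm_fit(X, Y, n_hidden=100, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))    # random input weights (not tuned)
    b = rng.normal(size=n_hidden)                  # random hidden biases (not tuned)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))         # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ Y                   # Moore-Penrose solution for output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

# Toy regression example
X = np.random.rand(200, 4)
Y = np.sin(X.sum(axis=1, keepdims=True))
W, b, beta = elm_fit(X, Y, n_hidden=50)
print(elm_predict(X[:3], W, b, beta))
```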
Journal Article