21,890 result(s) for "Classifiers"
Two-Stage Hybrid Data Classifiers Based on SVM and kNN Algorithms
This paper addresses the development of two-stage hybrid SVM-kNN classifiers that aim to improve data classification quality by refining classification decisions near the class boundary defined by the SVM classifier. In the first stage, an SVM classifier with default parameter values is trained on a training dataset derived from the initial dataset, using either a binary SVM or a one-class SVM algorithm. Based on the trained SVM, two variants of a training dataset are formed for the kNN classifier: one containing all objects from the original training dataset that lie inside the strip dividing the classes, and one containing only those objects from the strip that lie inside the region covering all misclassified objects. In the second stage, the kNN classifier is trained on this new dataset, with its parameter values chosen during training to maximize classification quality. The classification quality of the two-stage hybrid SVM-kNN classifier was assessed with various indicators on a test dataset. When the kNN classifier improves classification quality near the SVM-defined class boundary, the two-stage hybrid classifier is recommended for further use. Experimental results obtained on various datasets confirm the feasibility of using two-stage hybrid SVM-kNN classifiers for the data classification problem.
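A minimal sketch of the two-stage idea described above, using scikit-learn; the margin cutoff of 1.0 and the dataset are illustrative choices, not details from the paper:

```python
# Hypothetical sketch of a two-stage hybrid SVM-kNN classifier: the SVM
# defines the class boundary, and a kNN classifier (trained only on points
# inside the SVM margin strip) refines decisions near that boundary.
import numpy as np
from sklearn.datasets import make_moons
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier

X, y = make_moons(n_samples=400, noise=0.3, random_state=0)
X_train, y_train, X_test, y_test = X[:300], y[:300], X[300:], y[300:]

# Stage 1: SVM with default parameter values.
svm = SVC(kernel="rbf").fit(X_train, y_train)

# Stage 2: kNN trained on training points inside the margin strip
# (|decision_function| < 1), where the SVM is least reliable.
in_strip = np.abs(svm.decision_function(X_train)) < 1.0
knn = KNeighborsClassifier(n_neighbors=5).fit(X_train[in_strip], y_train[in_strip])

def predict_hybrid(X_new):
    pred = svm.predict(X_new)
    near = np.abs(svm.decision_function(X_new)) < 1.0
    if near.any():
        pred[near] = knn.predict(X_new[near])  # refine near-boundary decisions
    return pred
```

The refinement only overrides the SVM inside the strip, so far-from-boundary predictions are untouched.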
Nearest neighbors distance ratio open-set classifier
In this paper, we propose a novel multiclass classifier for the open-set recognition scenario, in which no a priori training samples are available for some classes that might appear during testing. Many applications are inherently open set, so successful closed-set solutions in the literature are not always suitable for real-world recognition problems. The proposed open-set classifier extends the Nearest-Neighbor (NN) classifier, which is simple, parameter-independent, multiclass, and widely used for closed-set problems. The proposed Open-Set NN (OSNN) method incorporates the ability to recognize samples belonging to classes unknown at training time, making it suitable for open-set recognition. In addition, we explore evaluation measures for open-set problems that properly measure the resilience of methods to unknown classes during testing. For validation, we consider large freely available benchmarks with different open-set recognition regimes and demonstrate that the proposed OSNN significantly outperforms its counterparts in the literature.
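The nearest-neighbors distance ratio idea from the title can be sketched as follows; this is an illustration, not the authors' code, and the Euclidean distance, the 0.7 threshold, and the toy data are all assumptions:

```python
# Illustrative sketch of an open-set distance-ratio rule: compare the distance
# to the overall nearest training sample with the distance to the nearest
# sample of a *different* class. A ratio near 1 means the point sits between
# classes and is rejected as "unknown".
import numpy as np

def osnn_predict(X_train, y_train, x, threshold=0.7, unknown=-1):
    d = np.linalg.norm(X_train - x, axis=1)
    i = int(np.argmin(d))                 # overall nearest neighbor
    c = y_train[i]
    d_other = d[y_train != c].min()       # nearest sample of any other class
    if d[i] / d_other <= threshold:       # clearly inside one class: accept
        return int(c)
    return unknown                        # ambiguous: reject as unknown

# Toy data: two tight, well-separated clusters.
X_train = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
y_train = np.array([0, 0, 1, 1])
```

A query close to one cluster is labeled with that class; a query midway between the clusters gets the `unknown` label.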
Distance classifier ensemble based on intra-class and inter-class scatter
A distance classifier ensemble method based on intra-class and inter-class scatter is proposed in this paper. Using bootstrap resampling, the training samples are sampled repeatedly to generate several subsample sets; intra-class and inter-class scatter matrices are defined for each subsample set, and each subsample set is trained with its scatter matrices to generate an individual classifier. In the ensemble, the individual results are combined by relative majority voting. Experiments on the UCI standard database show that the proposed scatter-based ensemble method for distance classifiers is effective and superior to other methods in classification performance.
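The bootstrap-and-vote scheme can be sketched as below; this simplified version uses a nearest-centroid distance classifier as the base learner and omits the paper's scatter-matrix weighting, so it shows only the resampling and voting skeleton:

```python
# Simplified sketch: bootstrap-resample the training set, fit one distance
# (nearest-centroid) classifier per resample, and combine predictions by
# relative majority voting.
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.neighbors import NearestCentroid

rng = np.random.default_rng(0)
X, y = make_blobs(n_samples=300, centers=3, random_state=0)

ensemble = []
for _ in range(11):                          # odd count reduces vote ties
    idx = rng.integers(0, len(X), len(X))    # bootstrap resample with replacement
    ensemble.append(NearestCentroid().fit(X[idx], y[idx]))

def vote(X_new):
    preds = np.stack([clf.predict(X_new) for clf in ensemble])
    # relative majority vote: most frequent label per sample
    return np.apply_along_axis(lambda p: np.bincount(p).argmax(), 0, preds)
```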
A Comprehensive Integration Method Based on Unbalanced Data Classification Problems
This paper first introduces the characteristics of imbalanced data and the main problems it poses for classification. On this basis, it proposes a comprehensive ensemble method for imbalanced data classification problems, together with the background and strategy behind the method's improvements. The construction of the base classifiers and the selective ensemble of base classifiers are described in detail. Finally, the proposed method is verified by experiments, and the experimental results show that it is effective.
An effective combining classifier approach using tree algorithms for network intrusion detection
In this paper, we developed a combining classifier model based on tree-based algorithms for network intrusion detection. The NSL-KDD dataset, a much improved version of the original KDDCUP'99 dataset, was used to evaluate the performance of our detection algorithm. The task of the detection algorithm was to classify whether the incoming network traffic is normal or an attack, based on 41 features describing every pattern of network traffic. A detection accuracy of 89.24% was achieved using the combination of the random tree and NBTree algorithms under the sum rule scheme, outperforming the individual random tree algorithm. This result represents the highest accuracy achieved so far using the complete NSL-KDD dataset. Therefore, a combining classifier approach based on the sum rule scheme can yield better results than individual classifiers, raising hopes for better anomaly-based intrusion detection systems in the future.
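The sum rule scheme mentioned above can be sketched briefly: add (or equivalently average) the class-probability outputs of the member classifiers and take the argmax. scikit-learn has no NBTree, so a random forest and an extra-trees model stand in here as the two tree-based members; the iris data is purely illustrative:

```python
# Sketch of sum-rule classifier combination: sum the predict_proba outputs
# of two tree-based classifiers and pick the class with the largest total.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier, ExtraTreesClassifier

X, y = load_iris(return_X_y=True)
clf_a = RandomForestClassifier(random_state=0).fit(X, y)
clf_b = ExtraTreesClassifier(random_state=0).fit(X, y)

def sum_rule_predict(X_new):
    p = clf_a.predict_proba(X_new) + clf_b.predict_proba(X_new)  # sum rule
    return p.argmax(axis=1)
```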
Multiple classifier system for remote sensing image classification: a review
Over the last two decades, the multiple classifier system (MCS), or classifier ensemble, has shown great potential to improve the accuracy and reliability of remote sensing image classification. Although a large body of literature covers MCS approaches, a comprehensive review presenting the overall architecture of the basic principles and trends behind the design of remote sensing classifier ensembles is lacking. Therefore, in order to provide a reference point for MCS approaches, this paper explicitly reviews the remote sensing implementations of MCS and proposes some modified approaches. The effectiveness of the existing and improved algorithms is analyzed and evaluated on multi-source remotely sensed images, including a high spatial resolution image (QuickBird), a hyperspectral image (OMISII), and a multi-spectral image (Landsat ETM+). Experimental results demonstrate that MCS can effectively improve the accuracy and stability of remote sensing image classification, and that diversity measures play an active role in the combination of multiple classifiers. Furthermore, this survey provides a roadmap to guide future research and algorithm enhancement, and to facilitate knowledge accumulation on MCS in the remote sensing community.
The great multivariate time series classification bake off: a review and experimental evaluation of recent algorithmic advances
Time Series Classification (TSC) involves building predictive models for a discrete target variable from ordered, real-valued attributes. Over recent years, a new set of TSC algorithms has been developed that has made significant improvements over the previous state of the art. The main focus has been on univariate TSC, i.e. the problem where each case has a single series and a class label. In reality, it is more common to encounter multivariate TSC (MTSC) problems, where the time series for a single case has multiple dimensions. Despite this, much less consideration has been given to MTSC than to the univariate case. The UCR archive has provided a valuable resource for univariate TSC, and the lack of a standard set of test problems may explain why there has been less focus on MTSC. The UEA archive of 30 MTSC problems released in 2018 has made comparison of algorithms easier. We review recently proposed bespoke MTSC algorithms based on deep learning, shapelets, and bag-of-words approaches. If an algorithm cannot naturally handle multivariate data, the simplest way to adapt a univariate classifier to MTSC is to ensemble it over the multivariate dimensions. We compare the bespoke algorithms to these dimension-independent approaches on the 26 of the 30 MTSC archive problems where the data are all of equal length. We demonstrate that four classifiers are significantly more accurate than the benchmark dynamic time warping algorithm, and that one of these recently proposed classifiers, ROCKET, achieves significant improvement on the archive datasets in at least an order of magnitude less time than the other three.
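The dimension-independent adaptation mentioned above can be sketched simply: fit one univariate classifier per channel and combine the channel-wise class probabilities. Here a 1-NN on raw values stands in for a proper univariate TSC algorithm, and the array layout `(n_cases, n_dims, series_length)` is an assumed convention:

```python
# Sketch of ensembling a univariate classifier over multivariate dimensions:
# one model per channel, predictions combined by summing class probabilities.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def fit_per_dimension(X, y):
    # X has shape (n_cases, n_dims, series_length); fit one model per channel.
    return [KNeighborsClassifier(n_neighbors=1).fit(X[:, d, :], y)
            for d in range(X.shape[1])]

def predict_ensemble(models, X):
    probs = sum(m.predict_proba(X[:, d, :]) for d, m in enumerate(models))
    return probs.argmax(axis=1)
```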
Fabric Defect Detection Using a One-class Classification Based on Depthwise Separable Convolution Autoencoder
Fabric defect detection is an anomaly detection task that is widely studied in the textile industry. Like most anomaly detection tasks, it faces several problems that hinder detection results, such as class imbalance, defective-sample scarcity, and feature selection. This paper proposes a method that applies a depthwise separable convolution autoencoder for dimensionality reduction and the one-class classifier support vector data description (SVDD) to detect fabric defects. A depthwise separable convolution autoencoder can effectively extract sample features with less computation and fewer parameters than regular convolution, making it easy to use in industrial production. SVDD can be trained using only non-defective samples, avoiding the difficulty and heavy cost of collecting negative (defective) samples. In this paper, we demonstrate the effectiveness of the method on polyester fibers, using accuracy and AUC as evaluation criteria.
Simple and Effective Complementary Label Learning Based on Mean Square Error Loss
In this paper, we propose a simple and effective complementary label learning approach to address the label noise problem for deep learning models. Various surrogate losses have been proposed for complementary label learning; however, they are often sophisticated in design, because the losses are required to satisfy the classifier consistency property. We propose an effective square loss for complementary label learning under unbiased and biased assumptions. We also show theoretically that our method guarantees that the optimal classifier under complementary labels is also the optimal classifier under ordinary labels. Finally, we test our method on three different benchmark datasets under biased and unbiased assumptions to verify its effectiveness.
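To make the setting concrete, a complementary label tells the learner which class a sample is *not*. A toy numpy sketch of a square loss in this spirit is shown below; this is an illustration of the general idea, not the paper's exact formulation:

```python
# Illustrative sketch of a mean square error loss for complementary labels:
# the probability mass the model assigns to the complementary class (the
# class the sample is known NOT to belong to) is driven toward zero.
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def complementary_mse_loss(logits, comp_labels):
    p = softmax(logits)
    # squared probability of the complementary class; zero iff the model
    # assigns that class no probability mass
    return np.mean(p[np.arange(len(p)), comp_labels] ** 2)
```

A model that concentrates probability on the complementary class incurs a large loss, while one that avoids it incurs almost none.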