9,764 result(s) for "Kernel functions"
Kernel methods and machine learning
"Offering a fundamental basis in kernel-based learning theory, this book covers both statistical and algebraic principles. It provides over 30 major theorems for kernel-based supervised and unsupervised learning models. The first of the theorems establishes a condition, arguably necessary and sufficient, for the kernelization of learning models. In addition, several other theorems are devoted to proving mathematical equivalence between seemingly unrelated models. With over 25 closed-form and iterative algorithms, the book provides a step-by-step guide to algorithmic procedures and analysing which factors to consider in tackling a given problem, enabling readers to improve specifically designed learning algorithms, build models for new applications and develop efficient techniques suitable for green machine learning technologies. Numerous real-world examples and over 200 problems, several of which are Matlab-based simulation exercises, make this an essential resource for graduate students and professionals in computer science, electrical and biomedical engineering. Solutions to problems are provided online for instructors" -- Provided by publisher.
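The kernelization the blurb refers to rests on the kernel trick: a kernel k(x, y) implicitly computes an inner product in a feature space, so any algorithm written in terms of inner products can be "kernelized." A minimal numerical sketch (not from the book; the feature map shown is the standard one for the homogeneous degree-2 polynomial kernel in two dimensions):

```python
import numpy as np

def poly2_kernel(x, y):
    """Degree-2 homogeneous polynomial kernel: k(x, y) = (x . y)^2."""
    return (x @ y) ** 2

def poly2_features(x):
    """Explicit feature map for 2-D inputs:
    phi(x) = (x1^2, sqrt(2)*x1*x2, x2^2), so phi(x) . phi(y) = (x . y)^2."""
    return np.array([x[0] ** 2, np.sqrt(2) * x[0] * x[1], x[1] ** 2])

x = np.array([1.0, 2.0])
y = np.array([3.0, 0.5])
# The kernel evaluates the feature-space inner product without forming phi
assert np.isclose(poly2_kernel(x, y), poly2_features(x) @ poly2_features(y))
```

The same identity is what lets SVMs and kernel PCA operate in high- (even infinite-) dimensional feature spaces at the cost of a Gram matrix.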
Algebras of Singular Integral Operators with Kernels Controlled by Multiple Norms
The authors study algebras of singular integral operators on ℝ^n and on nilpotent Lie groups that arise when considering the composition of Calderón-Zygmund operators with different homogeneities, such as operators occurring in sub-elliptic problems and those arising in elliptic problems. These algebras are characterized in a number of different but equivalent ways: in terms of kernel estimates and cancellation conditions, in terms of estimates of the symbol, and in terms of decompositions into dyadic sums of dilates of bump functions. The resulting operators are pseudo-local and bounded on L^p for 1 < p < ∞. While the usual class of Calderón-Zygmund operators is invariant under a one-parameter family of dilations, the operators studied here fall outside this class and reflect a multi-parameter structure.
Stability of heat kernel estimates for symmetric non-local Dirichlet forms
In this paper, we consider symmetric jump processes of mixed-type on metric measure spaces under general volume doubling condition, and establish stability of two-sided heat kernel estimates and heat kernel upper bounds. We obtain their stable equivalent characterizations in terms of the jumping kernels, variants of cut-off Sobolev inequalities, and the Faber-Krahn inequalities. In particular, we establish stability of heat kernel estimates for
Szegő kernel asymptotics for high power of CR line bundles and Kodaira embedding theorems on CR manifolds
Let X be an abstract, not necessarily compact, orientable CR manifold of dimension 2n−1, n ≥ 2, and let L^k be the k-th tensor power of a CR complex line bundle L over X. Given q ∈ {0, 1, …, n−1}, let □^{(q)}_{b,k} be the Gaffney extension of the Kohn Laplacian for (0,q)-forms with values in L^k. For λ ≥ 0, let Π^{(q)}_{k,≤λ} := E((−∞, λ]), where E denotes the spectral measure of □^{(q)}_{b,k}. In this work, the author proves that Π^{(q)}_{k,≤k^{−N₀}} F_k^*, F_k Π^{(q)}_{k,≤k^{−N₀}} F_k^*, N₀ ≥ 1, admit asymptotic expansions with respect to k on the non-degenerate part of the characteristic manifold of □^{(q)}_{b,k}, where F_k is a kind of microlocal cut-off function. Moreover, it is shown that F_k Π^{(q)}_{k,≤0} F_k^* admits a full asymptotic expansion with respect to k if □^{(q)}_{b,k} has the small spectral gap property with respect to F_k and Π^{(q)}_{k,≤0} is k-negligible away from the diagonal with respect to F_k. Using these asymptotics, the author establishes almost-Kodaira embedding theorems on CR manifolds and Kodaira embedding theorems on CR manifolds with a transversal CR S¹ action.
Generalized Mercer Kernels and Reproducing Kernel Banach Spaces
This article studies constructions of reproducing kernel Banach spaces (RKBSs), which may be viewed as a generalization of reproducing kernel Hilbert spaces (RKHSs). A key point is to endow Banach spaces with reproducing kernels such that machine learning in RKBSs is well-posed and easy to implement. First, we verify many advanced properties of general RKBSs such as density, continuity, separability, implicit representation, imbedding, compactness, the representer theorem for learning methods, oracle inequality, and universal approximation. Then, we develop a new concept of generalized Mercer kernels to construct
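The Mercer property behind both RKHSs and the generalized kernels above requires every Gram matrix to be symmetric positive semi-definite. A quick numerical sanity check for the Gaussian (RBF) kernel, a classical Mercer kernel, is sketched below (illustrative only; checking finitely many Gram matrices does not prove the Mercer property):

```python
import numpy as np

def rbf_gram(X, gamma=1.0):
    """Gram matrix of the Gaussian (RBF) kernel k(x, z) = exp(-gamma*|x-z|^2)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))   # 20 arbitrary points in R^3
K = rbf_gram(X)
# Mercer sanity check: symmetric and PSD up to floating-point noise
assert np.allclose(K, K.T)
assert np.linalg.eigvalsh(K).min() > -1e-8
```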
A Comparison Study of Kernel Functions in the Support Vector Machine and Its Application for Termite Detection
Termites are the most destructive pests and their attacks significantly impact the quality of wooden buildings. Due to their cryptic behavior, it is rarely apparent from visual observation that a termite infestation is active and that wood damage is occurring. Based on the phenomenon of acoustic signals generated by termites when attacking wood, we proposed a practical framework to detect termites nondestructively, i.e., by acoustic signal extraction. This method has the advantage of maintaining the quality of wood products and preventing more severe termite attacks. In this work, we inserted 220 subterranean termites into a pine-wood specimen for feeding activity and monitored the resulting acoustic signal. The two acoustic features (i.e., energy and entropy) derived from the time domain were used for this study's analysis. Furthermore, the support vector machine (SVM) algorithm with different kernel functions (i.e., linear, radial basis function, sigmoid and polynomial) was employed to recognize the termites' acoustic signal. In addition, the area under a receiver operating characteristic curve (AUC) was also adopted to analyze and improve the performance results. Based on the numerical analysis, the SVM with polynomial kernel function achieves the best classification accuracy of 0.9188.
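The effect of kernel choice that the study measures can be reproduced on toy data. The sketch below uses a kernel perceptron rather than a full SVM, on a synthetic "two rings" dataset (a simplification; the dataset, gamma value, and epoch count are arbitrary choices, not from the paper): a linear kernel cannot separate concentric classes, while an RBF kernel can.

```python
import numpy as np

def linear_kernel(X, Z):
    return X @ Z.T

def rbf_kernel(X, Z, gamma=2.0):
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Z**2, 1)[None, :] - 2.0 * X @ Z.T
    return np.exp(-gamma * d2)

def kernel_perceptron(K, y, epochs=50):
    """Train a kernel perceptron; returns the dual coefficients alpha."""
    alpha = np.zeros(len(y))
    for _ in range(epochs):
        for i in range(len(y)):
            if np.sign((alpha * y) @ K[:, i]) != y[i]:
                alpha[i] += 1.0
    return alpha

# Toy data: inner cluster vs. outer ring -- not linearly separable
rng = np.random.default_rng(1)
theta = rng.uniform(0.0, 2.0 * np.pi, 100)
radius = np.where(np.arange(100) < 50, 0.5, 1.5) + rng.normal(0.0, 0.05, 100)
X = np.c_[radius * np.cos(theta), radius * np.sin(theta)]
y = np.where(np.arange(100) < 50, 1.0, -1.0)

acc = {}
for name, kf in [("linear", linear_kernel), ("rbf", rbf_kernel)]:
    K = kf(X, X)
    alpha = kernel_perceptron(K, y)
    acc[name] = np.mean(np.sign((alpha * y) @ K) == y)
```

On this data the RBF kernel reaches near-perfect training accuracy while the linear kernel cannot, mirroring the paper's point that kernel choice drives classifier performance.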
On kernel functions for bi-fidelity Gaussian process regressions
This paper investigates the impact of kernel functions on the accuracy of bi-fidelity Gaussian process regressions (GPR) for engineering applications. The potential of composite kernel learning (CKL) and model selection is also studied, aiming to ease the process of manual kernel selection. Using the autoregressive Gaussian process as the base model, this paper studies four kernel functions and their combinations: Gaussian, Matern-3/2, Matern-5/2, and Cubic. Experiments on four engineering test problems show that the best kernel is problem dependent and sometimes might be counter-intuitive, even when a large amount of low-fidelity data already aids the model. In this regard, using CKL or automatic kernel selection via cross validation and maximum likelihood can reduce the tendency to select a poor-performing kernel. In addition, the CKL technique can create a slightly more accurate model than the best-performing individual kernel. The main drawback of CKL is its significantly expensive computational cost. The results also show that, given a sufficient amount of samples, tuning the regression term is important to improve the accuracy and robustness of bi-fidelity GPR, while decreasing the importance of the proper kernel selection.
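Two of the kernels the paper compares can be dropped into a minimal GP regression to see that both fit a smooth function, with accuracy depending on the kernel. The sketch below is single-fidelity and one-dimensional (a simplification of the paper's bi-fidelity setting; the target function, length scale, and grid sizes are illustrative choices):

```python
import numpy as np

def rbf(a, b, ls=1.0):
    """Gaussian (squared-exponential) kernel on 1-D inputs."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

def matern32(a, b, ls=1.0):
    """Matern-3/2 kernel on 1-D inputs."""
    d = np.sqrt(3.0) * np.abs(a[:, None] - b[None, :]) / ls
    return (1.0 + d) * np.exp(-d)

def gp_mean(x_train, y_train, x_test, kern, jitter=1e-8):
    """Posterior mean of a zero-mean GP with (nearly) noise-free observations."""
    K = kern(x_train, x_train) + jitter * np.eye(len(x_train))
    return kern(x_test, x_train) @ np.linalg.solve(K, y_train)

x_tr = np.linspace(0.0, 2.0 * np.pi, 15)
x_te = np.linspace(0.0, 2.0 * np.pi, 101)
errs = {}
for name, kern in [("rbf", rbf), ("matern32", matern32)]:
    pred = gp_mean(x_tr, np.sin(x_tr), x_te, kern)
    errs[name] = np.max(np.abs(pred - np.sin(x_te)))
```

Comparing `errs` across kernels is the single-fidelity analogue of the kernel selection the paper automates with cross-validation, maximum likelihood, or CKL.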
Fast feature selection for interval-valued data through kernel density estimation entropy
Kernel density estimation, a non-parametric method for estimating the probability density of random variables, has been used in feature selection. However, existing feature selection methods based on kernel density estimation seldom consider interval-valued data, even though such data arise widely in practice. In this paper, a feature selection method based on kernel density estimation for interval-valued data is proposed. Firstly, the kernel function in kernel density estimation is defined for interval-valued data. Secondly, the interval-valued kernel density estimation probability structure is constructed from the defined kernel function, including kernel density estimation conditional probability, joint probability, and posterior probability. Thirdly, kernel density estimation entropies for interval-valued data are derived from the constructed probability structure, including the information entropy, conditional entropy, and joint entropy of kernel density estimation. Fourthly, we propose a feature selection approach based on kernel density estimation entropy. Moreover, we improve the proposed feature selection algorithm and propose a fast feature selection algorithm based on kernel density estimation entropy. Finally, comparative experiments are conducted from three perspectives, computing time, intuitive identifiability, and classification performance, to show the feasibility and effectiveness of the proposed method.
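The core quantity the method builds on, an entropy computed from kernel density estimates, can be sketched for ordinary real-valued data (the paper's interval-valued kernels are more involved; the bandwidth and the resubstitution entropy estimator below are illustrative choices, not the paper's definitions):

```python
import numpy as np

def kde_entropy(x, h=0.3):
    """Resubstitution estimate of differential entropy:
    H ~ -mean_i log f_hat(x_i), with f_hat a Gaussian KDE of bandwidth h."""
    d = x[:, None] - x[None, :]
    f_hat = np.exp(-0.5 * (d / h) ** 2).sum(axis=1) / (len(x) * h * np.sqrt(2.0 * np.pi))
    return -np.mean(np.log(f_hat))

rng = np.random.default_rng(0)
concentrated = rng.normal(0.0, 0.3, 300)   # tightly clustered feature
spread = rng.uniform(-3.0, 3.0, 300)       # widely spread feature
# A concentrated feature has lower KDE entropy than a spread-out one
assert kde_entropy(concentrated) < kde_entropy(spread)
```

Ranking features by entropies of this kind (and their conditional/joint variants) is what drives the selection procedure.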
Spectral-Similarity-Based Kernel of SVM for Hyperspectral Image Classification
Spectral similarity measures can be regarded as potential metrics for kernel functions, and can be used to generate spectral-similarity-based kernels. However, spectral-similarity-based kernels have not received significant attention from researchers. In this paper, we propose two novel spectral-similarity-based kernels based on the spectral angle mapper (SAM) and spectral information divergence (SID) combined with the radial basis function (RBF) kernel: the power spectral angle mapper RBF (Power-SAM-RBF) and normalized spectral information divergence-based RBF (Normalized-SID-RBF) kernels. First, we prove these spectral-similarity-based kernels to be Mercer's kernels. Second, we analyze their efficiency in terms of local and global kernels. Finally, we consider three hyperspectral datasets to analyze the effectiveness of the proposed spectral-similarity-based kernels. Experimental results demonstrate that the Power-SAM-RBF and SAM-RBF kernels can obtain an impressive performance, particularly the Power-SAM-RBF kernel. For example, when the ratio of the training set is 20%, the kappa coefficient of the Power-SAM-RBF kernel (0.8561) is 1.61%, 1.32%, and 1.23% higher than that of the RBF kernel on the Indian Pines, University of Pavia, and Salinas Valley datasets, respectively. We present three conclusions. First, the superiority of the Power-SAM-RBF kernel compared to other kernels is evident. Second, the Power-SAM-RBF kernel can provide an outstanding performance when the similarity between spectral signatures in the same hyperspectral dataset is either extremely high or extremely low. Third, the Power-SAM-RBF kernel provides even greater benefits compared to other commonly used kernels as the sizes of the training sets increase. In future work, multiple kernels combined with the spectral-similarity-based kernels are expected to provide better hyperspectral classification.
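The construction substitutes a spectral similarity for Euclidean distance inside an RBF form. A plain SAM-RBF sketch (not the paper's Power-SAM-RBF variant, and without its Mercer proof; the toy "spectra" and gamma are illustrative) shows the property that motivates it, invariance to per-pixel illumination scaling:

```python
import numpy as np

def sam(X, Z):
    """Spectral angle mapper: angle between spectra, invariant to scaling."""
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    Zn = Z / np.linalg.norm(Z, axis=1, keepdims=True)
    return np.arccos(np.clip(Xn @ Zn.T, -1.0, 1.0))

def sam_rbf(X, Z, gamma=1.0):
    """RBF form with the spectral angle in place of Euclidean distance."""
    return np.exp(-gamma * sam(X, Z) ** 2)

rng = np.random.default_rng(0)
spectra = np.abs(rng.normal(size=(30, 10)))  # toy nonnegative "spectra"
K = sam_rbf(spectra, spectra)
assert np.allclose(K, K.T) and np.allclose(np.diag(K), 1.0)
# Illumination invariance: rescaling every spectrum leaves the kernel unchanged
assert np.allclose(sam_rbf(5.0 * spectra, 5.0 * spectra), K)
```

A Euclidean RBF kernel has no such invariance, which is why angle-based kernels suit hyperspectral data where brightness varies independently of material.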
Functionally Graded Piezoelectric Medium Exposed to a Movable Heat Flow Based on a Heat Equation with a Memory-Dependent Derivative
The current work deals with a modified thermo-piezoelectric model in the context of generalized heat conduction with a memory-dependent derivative. A functionally graded piezoelectric (FGPM) rod of finite length is investigated on the basis of the presented model. For simplicity, the specific heat and density are assumed constant, while the other physical properties of the FGPM rod are assumed to vary exponentially along the length. The FGPM rod is subjected to a moving heat source along the axial direction and is held at zero voltage at both ends. Using the Laplace transform, the governing partial differential equations are reduced to the space domain and then solved analytically to obtain the distributions of the field quantities. Numerical results are shown graphically to illustrate the effects of memory, graded material properties, time delay, the kernel function, and the thermo-piezoelectric response on the physical fields.
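The memory-dependent derivative underlying models of this kind is a kernel-weighted average of f' over a sliding window of length ω (the time delay): D_ω f(t) = (1/ω) ∫_{t−ω}^{t} K(t−ξ) f'(ξ) dξ. A numerical sketch of that definition (the test function, delay, and kernel below are illustrative choices, not the paper's): with K ≡ 1 it collapses to a backward difference quotient, which gives an exact check.

```python
import numpy as np

def mdd(df, t, omega, kernel, n=4001):
    """Memory-dependent derivative
    D_omega f(t) = (1/omega) * integral_{t-omega}^{t} K(t - xi) f'(xi) dxi,
    approximated with the composite trapezoidal rule on n nodes."""
    xi = np.linspace(t - omega, t, n)
    g = kernel(t - xi) * df(xi)
    h = xi[1] - xi[0]
    integral = h * (g.sum() - 0.5 * (g[0] + g[-1]))
    return integral / omega

t, omega = 1.0, 0.25
# For f = exp and K == 1 the definition reduces to (f(t) - f(t-omega)) / omega
val = mdd(np.exp, t, omega, lambda s: np.ones_like(s))
expected = (np.exp(t) - np.exp(t - omega)) / omega
assert abs(val - expected) < 1e-6
```

Choosing a non-constant kernel K reweights the recent history of f', which is exactly the kernel-function effect the numerical results in the paper explore.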