Asset Details
Fast Learning Rates for Plug-In Classifiers
by Audibert, Jean-Yves; Tsybakov, Alexandre B.
in 62G07 / 62G08 / 62H05 / 68T10 / Acceleration of convergence / Bayesian analysis / Classification / Convergence / Decision theory / Density / Estimators / Exact sciences and technology / excess risk / fast rates of convergence / General topics / Integers / Learning rate / Lebesgue measures / Markov processes / Mathematical functions / Mathematical models / Mathematics / Minimax / minimax lower bounds / Numerical analysis / Numerical analysis. Scientific computation / Perceptron convergence procedure / plug-in classifiers / Polynomials / Probability / Probability and statistics / Probability distributions / Probability theory and stochastic processes / Sciences and techniques of general use / statistical learning / Statistical Learning Theory / Statistics / Studies
2007
Journal Article
Overview
It has recently been shown that, under the margin (or low noise) assumption, there exist classifiers attaining fast rates of convergence of the excess Bayes risk, that is, rates faster than $n^{-1/2}$. The work on this subject has suggested two conjectures: (i) the best achievable fast rate is of order $n^{-1}$, and (ii) plug-in classifiers generally converge more slowly than classifiers based on empirical risk minimization. We show that both conjectures are false. In particular, we construct plug-in classifiers that achieve not only fast but also super-fast rates, that is, rates faster than $n^{-1}$. We establish minimax lower bounds showing that the obtained rates cannot be improved.
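For readers new to the terminology, the display below is a brief sketch, in standard notation rather than notation taken from this record, of the objects the overview refers to: the plug-in classifier built from an estimate of the regression function, its excess Bayes risk, and the margin (low noise) assumption. The symbols $\hat\eta_n$, $C_0$ and $\alpha$ are generic placeholders, not values stated in the abstract.
\[
\eta(x) = \mathbb{P}(Y = 1 \mid X = x), \qquad
\hat f_n(x) = \mathbf{1}\{\hat\eta_n(x) \ge 1/2\},
\]
\[
\mathcal{E}(\hat f_n) = \mathbb{P}\bigl(Y \neq \hat f_n(X)\bigr) - \mathbb{P}\bigl(Y \neq f^*(X)\bigr),
\qquad f^*(x) = \mathbf{1}\{\eta(x) \ge 1/2\},
\]
\[
\mathbb{P}\bigl(0 < |\eta(X) - 1/2| \le t\bigr) \le C_0\, t^{\alpha}
\quad \text{for all } t > 0
\qquad \text{(margin assumption, } \alpha > 0,\ C_0 > 0\text{)}.
\]
Here $\hat\eta_n$ is any estimator of the regression function $\eta$ built from an i.i.d. sample of size $n$, $f^*$ is the Bayes classifier, and a "fast rate" means that the expected excess risk $\mathcal{E}(\hat f_n)$ decreases faster than $n^{-1/2}$.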
Publisher
Institute of Mathematical Statistics