Catalogue Search | MBRL
8 result(s) for "Machine learning-Congresses"
Advances in analytics and applications
by International Conference on Advanced Data Analysis, Business Analytics and Intelligence (5th : 2017 : Ahmedabad, India); Laha, Arnab Kumar, editor; Indian Institute of Management, Ahmedabad, sponsoring body
in Business -- Data processing -- Congresses; Mathematical statistics -- Congresses; Machine learning -- Congresses
Nearest-Neighbor Methods in Learning and Vision
by Darrell, Trevor; Shakhnarovich, Gregory; Indyk, Piotr
in Algorithms; Algorithms -- Congresses; Computer Science
2006, 2005
Regression and classification methods based on similarity of the input to stored examples have not been widely used in applications involving very large sets of high-dimensional data. Recent advances in computational geometry and machine learning, however, may alleviate the problems in using these methods on large data sets. This volume presents theoretical and practical discussions of nearest-neighbor (NN) methods in machine learning and examines computer vision as an application domain in which the benefit of these advanced methods is often dramatic. It brings together contributions from researchers in theory of computation, machine learning, and computer vision with the goals of bridging the gaps between disciplines and presenting state-of-the-art methods for emerging applications.

The contributors focus on the importance of designing algorithms for NN search, and for the related classification, regression, and retrieval tasks, that remain efficient even as the number of points or the dimensionality of the data grows very large. The book begins with two theoretical chapters on computational geometry and then explores ways to make the NN approach practicable in machine learning applications where the dimensionality of the data and the size of the data sets make the naïve methods for NN search prohibitively expensive. The final chapters describe successful applications of an NN algorithm, locality-sensitive hashing (LSH), to vision tasks.
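The locality-sensitive hashing idea named in this abstract can be sketched briefly. The following is not from the book; it is a minimal random-hyperplane LSH index for cosine similarity, with invented data sizes and hash parameters, meant only to show the bucketing trick that keeps candidate lists small as the data set grows.

```python
# Illustrative sketch only (not from the book): random-hyperplane
# locality-sensitive hashing for approximate nearest-neighbor search under
# cosine similarity. Data sizes and the number of hash bits are invented.
import numpy as np

def signatures(points, planes):
    # Bit signature per point: which side of each random hyperplane it lies on.
    return (points @ planes.T) >= 0

def build_index(points, planes):
    # Bucket point indices by their bit signature; nearby points tend to collide.
    index = {}
    for i, sig in enumerate(signatures(points, planes)):
        index.setdefault(sig.tobytes(), []).append(i)
    return index

def query(index, points, planes, q):
    # Only points sharing q's bucket are examined, then ranked by true distance.
    candidates = index.get(signatures(q[None, :], planes)[0].tobytes(), [])
    return sorted(candidates, key=lambda i: np.linalg.norm(points[i] - q))

# Toy usage with random data; a real system would use several hash tables
# to trade memory for recall.
rng = np.random.default_rng(0)
points = rng.normal(size=(10_000, 64))
planes = rng.normal(size=(12, 64))
index = build_index(points, planes)
q = rng.normal(size=64)
print(query(index, points, planes, q)[:5])
```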
Advances in learning theory : methods, models and applications
by NATO Advanced Study Institute on Learning Theory and Practice; Suykens, Johan A. K.
in Computational learning theory; Congresses; Machine learning
2003
This text details advances in learning theory that relate to problems studied in neural networks, machine learning, mathematics and statistics.
Optimization for Machine Learning
by Wright, Stephen J.; Nowozin, Sebastian; Sra, Suvrit
in Artificial Intelligence; Computer Science; Machine learning
2012
The interplay between optimization and machine learning is one of the most important developments in modern computational science. Optimization formulations and methods are proving to be vital in designing algorithms to extract essential knowledge from huge volumes of data. Machine learning, however, is not simply a consumer of optimization technology but a rapidly evolving field that is itself generating new optimization ideas. This book captures the state of the art of the interaction between optimization and machine learning in a way that is accessible to researchers in both fields.

Optimization approaches have enjoyed prominence in machine learning because of their wide applicability and attractive theoretical properties. The increasing complexity, size, and variety of today's machine learning models call for the reassessment of existing assumptions. This book starts the process of reassessment. It describes the resurgence in novel contexts of established frameworks such as first-order methods, stochastic approximations, convex relaxations, interior-point methods, and proximal methods. It also devotes attention to newer themes such as regularized optimization, robust optimization, gradient and subgradient methods, splitting techniques, and second-order methods. Many of these techniques draw inspiration from other fields, including operations research, theoretical computer science, and subfields of optimization. The book will enrich the ongoing cross-fertilization between the machine learning community and these other fields, and within the broader optimization community.
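As a small illustration of the proximal-method and regularized-optimization themes this abstract lists, here is a minimal sketch, not drawn from the book, of proximal gradient descent (ISTA) applied to the lasso; the data, step-size rule, and problem sizes are invented for demonstration.

```python
# Illustrative sketch only (not from the book): proximal gradient descent
# (ISTA) for the lasso, a small instance of regularized optimization solved
# with a proximal method. Data and parameters are invented.
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (elementwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, n_iters=500):
    # Minimize 0.5 * ||A x - b||^2 + lam * ||x||_1 by proximal gradient steps.
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1 / Lipschitz constant of the smooth part
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)             # gradient of the least-squares term
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Toy usage: recover a sparse vector from noisy linear measurements.
rng = np.random.default_rng(0)
A = rng.normal(size=(100, 200))
x_true = np.zeros(200)
x_true[:5] = 3.0
b = A @ x_true + 0.01 * rng.normal(size=100)
print(np.nonzero(ista(A, b, lam=1.0))[0][:10])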
Combinatorial and global optimization (Series on Applied Mathematics, vol. 14)
by Burkard, Rainer E.; Pardalos, Panos M.; Migdalas, Athanasios
in Artificial Intelligence (Machine Learning, Neural Networks, Fuzzy Logic); Combinatorial optimization; Combinatorial optimization -- Congresses
2002
Combinatorial and global optimization problems appear in a wide range of applications in operations research, engineering, biological science, and computer science. In combinatorial optimization and graph theory, many approaches have been developed that link the discrete universe to the continuous universe through geometric, analytic, and algebraic techniques. Such techniques include global optimization formulations, semidefinite programming, and spectral theory. Recent major successes based on these approaches include interior point algorithms for linear and discrete problems, the celebrated Goemans-Williamson relaxation of the maximum cut problem, and the Du-Hwang solution of the Gilbert-Pollak conjecture. Since integer constraints are equivalent to nonconvex constraints, the fundamental difference between classes of optimization problems is not between discrete and continuous problems but between convex and nonconvex optimization problems. This volume is a selection of refereed papers based on talks presented at a conference on "Combinatorial and Global Optimization" held in Crete, Greece.
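The Goemans-Williamson relaxation named in this abstract can be sketched as follows. This is an illustrative implementation, not taken from the volume: it poses the max-cut semidefinite relaxation with cvxpy (assuming cvxpy and an SDP-capable solver such as SCS are installed) and then applies random-hyperplane rounding; the toy graph is invented.

```python
# Illustrative sketch only (not from the volume): the Goemans-Williamson
# relaxation of maximum cut, written with cvxpy (assumes an SDP-capable
# solver such as SCS is installed). The toy graph is invented.
import cvxpy as cp
import numpy as np

def gw_max_cut(W, rng):
    # SDP relaxation: maximize sum_{i<j} w_ij (1 - X_ij) / 2  s.t.  X PSD, diag(X) = 1.
    n = W.shape[0]
    X = cp.Variable((n, n), PSD=True)
    problem = cp.Problem(cp.Maximize(cp.sum(cp.multiply(W, 1 - X)) / 4),
                         [cp.diag(X) == 1])
    problem.solve()
    # Factor X = V V^T and round the vectors with a random hyperplane.
    vals, vecs = np.linalg.eigh(X.value)
    V = vecs * np.sqrt(np.clip(vals, 0.0, None))
    return np.sign(V @ rng.normal(size=n))

# Toy usage: a 5-cycle, whose maximum cut has weight 4.
n = 5
W = np.zeros((n, n))
for i in range(n):
    W[i, (i + 1) % n] = W[(i + 1) % n, i] = 1.0
cut = gw_max_cut(W, np.random.default_rng(0))
print(cut, 0.25 * np.sum(W * (1 - np.outer(cut, cut))))
```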