A probabilistic theory of pattern recognition
Pattern recognition presents one of the most significant challenges for scientists and engineers, and many different approaches have been proposed. The aim of this book is to provide a self-contained account of the probabilistic analysis of these approaches. The book includes a discussion of distance measures, nonparametric methods based on kernels or nearest neighbors, Vapnik-Chervonenkis theory, epsilon entropy, parametric classification, error estimation, tree classifiers, and neural networks.
xv, 636 pages : illustrations ; 24 cm
9780387946184, 0387946187
33276839
Introduction
The Bayes error
Inequalities and alternate distance measures
Linear discrimination
Nearest neighbor rules
Consistency
Slow rates of convergence
Error estimation
The regular histogram rule
Kernel rules
Consistency of the k-nearest neighbor rule
Vapnik-Chervonenkis theory
Combinatorial aspects of Vapnik-Chervonenkis theory
Lower bounds for empirical classifier selection
The maximum likelihood principle
Parametric classification
Generalized linear discrimination
Complexity regularization
Condensed and edited nearest neighbor rules
Tree classifiers
Data-dependent partitioning
Splitting the data
The resubstitution estimate
Deleted estimates of the error probability
Automatic kernel rules
Automatic nearest neighbor rules
Hypercubes and discrete spaces
Epsilon entropy and totally bounded sets
Uniform laws of large numbers
Neural networks
Other error estimates
Feature extraction