A Probabilistic Theory of Pattern Recognition

by Luc Devroye; László Györfi; Gábor Lugosi
Format: Hardcover
Pub. Date: 1996-01-01
Publisher(s): Springer Verlag
List Price: $119.99

Buy Used

Usually Ships in 24-48 Hours
$91.64

Rent Textbook

Select for Price

New Textbook

We're Sorry
Sold Out

eTextbook

We're Sorry
Not Available

How Marketplace Works:

  • This item is offered by an independent seller and not shipped from our warehouse.
  • Item details like edition and cover design may differ from our description; see seller's comments before ordering.
  • Sellers must confirm and ship within two business days; otherwise, the order will be cancelled and refunded.
  • Marketplace purchases cannot be returned to eCampus.com. Contact the seller directly for inquiries; if no response within two days, contact customer service.
  • Additional shipping costs apply to Marketplace purchases. Review shipping costs at checkout.

Summary

Pattern recognition presents one of the most significant challenges for scientists and engineers, and many different approaches have been proposed. The aim of this book is to provide a self-contained account of probabilistic analysis of these approaches. The book includes a discussion of distance measures, nonparametric methods based on kernels or nearest neighbors, Vapnik-Chervonenkis theory, epsilon entropy, parametric classification, error estimation, tree classifiers, and neural networks. Wherever possible, distribution-free properties and inequalities are derived. A substantial portion of the results or the analysis is new. Over 430 problems and exercises complement the material.
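The simplest of the nonparametric methods mentioned above is the nearest neighbor rule, which the book analyzes in depth (Chapters 5 and 11). The following is a minimal sketch of the 1-nearest-neighbor classifier for illustration only; it is not code from the book, and the sample data is invented:

```python
import math

def nn_classify(train, x):
    """Classify x by the label of its nearest training point (1-NN rule).

    train: list of (features_tuple, label) pairs; x: tuple of features.
    """
    best_label, best_dist = None, math.inf
    for xi, yi in train:
        # Squared Euclidean distance (monotone in distance, so argmin is the same).
        d = sum((a - b) ** 2 for a, b in zip(xi, x))
        if d < best_dist:
            best_dist, best_label = d, yi
    return best_label

# Hypothetical two-class training sample in the plane.
train = [((0.0, 0.0), 0), ((0.2, 0.1), 0), ((1.0, 1.0), 1), ((0.9, 1.1), 1)]
print(nn_classify(train, (0.1, 0.0)))
print(nn_classify(train, (0.95, 1.0)))
```

A classical distribution-free result of the kind the book derives: the asymptotic error probability of the 1-NN rule is at most twice the Bayes error, for any distribution of the data.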

Table of Contents

Preface
Introduction, p. 1
The Bayes Error, p. 9
Inequalities and Alternate Distance Measures, p. 21
Linear Discrimination, p. 39
Nearest Neighbor Rules, p. 61
Consistency, p. 91
Slow Rates of Convergence, p. 111
Error Estimation, p. 121
The Regular Histogram Rule, p. 133
Kernel Rules, p. 147
Consistency of the k-Nearest Neighbor Rule, p. 169
Vapnik-Chervonenkis Theory, p. 187
Combinatorial Aspects of Vapnik-Chervonenkis Theory, p. 215
Lower Bounds for Empirical Classifier Selection, p. 233
The Maximum Likelihood Principle, p. 249
Parametric Classification, p. 263
Generalized Linear Discrimination, p. 279
Complexity Regularization, p. 289
Condensed and Edited Nearest Neighbor Rules, p. 303
Tree Classifiers, p. 315
Data-Dependent Partitioning, p. 363
Splitting the Data, p. 387
The Resubstitution Estimate, p. 397
Deleted Estimates of the Error Probability, p. 407
Automatic Kernel Rules, p. 423
Automatic Nearest Neighbor Rules, p. 451
Hypercubes and Discrete Spaces, p. 461
Epsilon Entropy and Totally Bounded Sets, p. 479
Uniform Laws of Large Numbers, p. 489
Neural Networks, p. 507
Other Error Estimates, p. 549
Feature Extraction, p. 561
Appendix, p. 575
Notation, p. 591
References, p. 593
Author Index, p. 619
Subject Index, p. 627
Table of Contents provided by Blackwell. All Rights Reserved.

An electronic version of this book is available through VitalSource.

This book is viewable on PC, Mac, iPhone, iPad, iPod Touch, and most smartphones.

By purchasing, you will be able to view this book online and download it for the chosen number of days.

Digital License

You are licensing a digital product for a set duration. Durations are set forth in the product description, with "Lifetime" typically meaning five (5) years of online access and permanent download to a supported device. All licenses are non-transferable.


A downloadable version of this book is available through the eCampus Reader or compatible Adobe readers.

Applications are available on iOS, Android, PC, Mac, and Windows Mobile platforms.

Please view the compatibility matrix prior to purchase.