Algorithmic learning in a random world

by Vladimir Vovk, Alex Gammerman, and Glenn Shafer

Springer, 2005 (first edition), 2022 (second edition)


The main topic of this book is conformal prediction, a method of prediction recently developed in machine learning. Conformal predictors are among the most accurate methods of machine learning, and unlike other state-of-the-art methods, they provide information about their own accuracy and reliability.

The book integrates mathematical theory and revealing experimental work. It demonstrates mathematically the validity of conformal predictors, i.e., the reliability they claim when applied to independent and identically distributed data, and it confirms experimentally that their accuracy is sufficient for many practical problems. Later chapters generalize these results to models called repetitive structures, which originate in the algorithmic theory of randomness and statistical physics. The approach is flexible enough to incorporate most existing methods of machine learning, including newer methods such as boosting and support vector machines, and older methods such as nearest neighbours and the bootstrap.
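As an illustration of how conformal prediction can be wrapped around an existing machine-learning method, here is a minimal sketch of a split (inductive) conformal regressor built on top of a k-nearest neighbours predictor. This is a simplified variant rather than the book's full transductive conformal predictor; all function names, parameters, and the toy data are illustrative assumptions, not taken from the book or from any accompanying software.

import numpy as np

def knn_predict(X_train, y_train, X_new, k=5):
    # Plain k-nearest neighbours regression: average the labels of the k closest training points.
    predictions = []
    for x in X_new:
        distances = np.linalg.norm(X_train - x, axis=1)
        nearest = np.argsort(distances)[:k]
        predictions.append(y_train[nearest].mean())
    return np.array(predictions)

def split_conformal_intervals(X_proper, y_proper, X_calib, y_calib, X_test, epsilon=0.1, k=5):
    # Prediction intervals that cover the true label with probability at least 1 - epsilon
    # whenever the examples are exchangeable (in particular, i.i.d.).
    calib_pred = knn_predict(X_proper, y_proper, X_calib, k)
    scores = np.abs(y_calib - calib_pred)          # nonconformity scores: absolute residuals
    n = len(scores)
    rank = int(np.ceil((1 - epsilon) * (n + 1)))   # conformal quantile rank
    if rank > n:
        q = np.inf                                 # too few calibration examples: trivial interval
    else:
        q = np.sort(scores)[rank - 1]
    test_pred = knn_predict(X_proper, y_proper, X_test, k)
    return test_pred - q, test_pred + q

# Toy usage on synthetic data, split into proper training, calibration, and test parts.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(300, 1))
y = 2 * X[:, 0] + rng.normal(scale=0.1, size=300)
lower, upper = split_conformal_intervals(X[:100], y[:100], X[100:200], y[100:200], X[200:], y[200:])
print("empirical coverage:", np.mean((y[200:] >= lower) & (y[200:] <= upper)))  # close to 0.9

The point of the construction is that the underlying k-nearest neighbours predictor could be replaced by boosting, a support vector machine, or any other regression method without affecting the coverage guarantee.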

Researchers in computer science, statistics, and artificial intelligence will find the book an authoritative and rigorous treatment of some of the most promising new developments in machine learning. Practitioners and students in all areas of research that use quantitative prediction or machine learning will learn about important new methods.

The book may be purchased directly from Springer and from many on-line booksellers, including amazon.com.

Vladimir Vovk and Alex Gammerman are Professors of Computer Science at Royal Holloway, University of London. Glenn Shafer is Professor in the Rutgers Business School - Newark and New Brunswick. All three authors are affiliated with the Centre for Reliable Machine Learning at Royal Holloway, University of London.

Errata etc.

On-line Compression Modelling Project (New Series)

This section links to the authors' papers that develop and review the theory presented in the book. They are often also published on arXiv, in conference proceedings, or in journals, but the version given in this section is usually the most up-to-date. The name of this project derives from the title of Part IV of the second edition and refers to a new kind of modelling of uncertainty going back to Kolmogorov's complexity-based modelling. The most standard on-line compression model is the exchangeability model, and many of the papers in this project assume exchangeability.

The following working papers develop the ideas presented in the first edition:

The second edition of the book (December 2022) incorporated many of the results in these working papers (major exceptions are Working Papers 26 and 29, about conformal e-prediction and conformal e-testing, respectively). The following working papers have appeared since the publication of the second edition:

The working papers in the OCM project by topic

Conformal set prediction: 1, 2 (review), 3 (tutorial), 5, 6, 8, 9, 10, 11, 26 (conformal e-prediction), 28, 40
Venn prediction: 7, 13
Conformal predictive distributions: 17, 18, 19, 20, 22, 23, 30
Conformal testing (apart from change detection): 4, 24, 29 (conformal e-testing; also includes change detection), 31, 33, 36, 37, 39
Change detection: 32, 41
Miscellanea: 12, 14, 15, 16, 21, 25, 27, 34, 35, 38

Other papers etc.

For the old working papers, mainly superseded by the first edition of the book, follow this link.

For other research on conformal prediction and testing, see the on-line prediction wiki. See also the events page maintained by Khuong Nguyen; for a list of events related to conformal prediction that happened before December 2022 (the publication of the second edition), click here.

This page is maintained by Vladimir Vovk.   Last modified on 10 December 2024