Books on Probability Theory and Applications
Introductory

Data Analysis: A Bayesian Tutorial, by
D. S. Sivia.
A book on maximum entropy and Bayesian methods aimed at
senior undergraduates in science and engineering. Shows
how a few fundamental rules can be used to tackle a wide
variety of problems in data analysis. After explaining the basic principles of
Bayesian probability theory, the book illustrates their use with a variety of
examples ranging from elementary parameter estimation to image
processing. Other topics covered include reliability analysis, multivariate
optimization, least squares and maximum likelihood, error propagation,
hypothesis testing, maximum entropy, and experimental design.
Jaynes said of this book that it should be considered an adjunct to his work on
probability theory.
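The elementary parameter estimation that books like this begin with can be sketched in a few lines. This is an invented toy illustration, not code from the book; all names are my own:

```python
# Elementary Bayesian parameter estimation (a toy sketch, not from the book):
# infer a coin's bias theta from n flips with k heads, under a uniform prior.
# The posterior is the Beta(k + 1, n - k + 1) distribution.

def posterior_mean(k, n):
    """Posterior mean of theta under a uniform prior (Laplace's rule of succession)."""
    return (k + 1) / (n + 2)

def posterior_density(theta, k, n):
    """Unnormalized posterior density: binomial likelihood times a flat prior."""
    return theta ** k * (1 - theta) ** (n - k)

# 7 heads in 10 flips gives a posterior mean of 8/12, pulled slightly
# toward 1/2 relative to the raw frequency 0.7.
print(posterior_mean(7, 10))
```

The pull toward 1/2 is the prior at work; as n grows, the estimate converges to the observed frequency.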


Bayesian Statistics: An Introduction, Third Edition,
Peter M. Lee.
This new edition of Peter Lee's well-established introduction to Bayesian
statistics maintains the clarity of exposition and use of examples for which
this text is known and praised. Aimed at third-year statistics students,
this edition includes coverage of hierarchical models, Markov Chain Monte
Carlo methods, Bernardo's theory of reference distributions, and a brief
treatment of generalized linear models. There is also an introduction to the
use of BUGS (Bayesian Inference Using Gibbs Sampling), now the standard
computational tool for numerical evaluation of complex Bayesian models, and
a variety of examples and functions written in R, a free implementation of
the S language for statistical computation.


Statistics: A Bayesian Perspective,
by Donald A. Berry.
An elementary introduction that assumes only familiarity with college algebra. Presents statistics as a means of integrating data into the scientific process, and introduces ideas of data analysis and experimental design early on. Includes a data disk with Minitab macros. Though the Bayesian approach is used throughout, in many parts the text still has a definite frequentist feel.

Foundations

Probability Theory: The Logic of Science,
by E. T. Jaynes.
Perhaps the clearest exposition of the fundamentals of probability theory to
be found. The standard rules of probability can be interpreted as uniquely
valid principles in logic, and Jaynes shows that the range of application for these principles is far greater than was supposed in frequentist probability theory. The imaginary distinction between 'probability theory' and 'statistical inference' is removed, leaving a logical unity and simplicity. The material is aimed at readers who are already familiar with applied mathematics at an advanced undergraduate level or higher.
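The rules in question are just the product and sum rules, from which Bayes' theorem follows. A toy numerical illustration (the diagnostic-test numbers are invented for the example, not taken from the book):

```python
# Product rule: p(A, B) = p(A|B) p(B) = p(B|A) p(A), hence Bayes' theorem:
#   p(A|B) = p(B|A) p(A) / p(B).
# Toy diagnostic-test numbers, invented for illustration.
p_disease = 0.01                  # prior p(A)
p_pos_given_disease = 0.95        # likelihood p(B|A)
p_pos_given_healthy = 0.05        # false-positive rate

# Sum rule gives the evidence p(B), marginalizing over disease status:
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Posterior: a positive test still leaves the disease improbable,
# because the prior is so small (roughly 0.16 here).
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(p_disease_given_pos)
```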


Bayesian Theory,
by Jose M. Bernardo and Adrian F. M. Smith.
This highly acclaimed text provides a thorough account of key concepts and theoretical results, with particular emphasis on viewing statistical inference as a special case of decision theory. Includes a detailed discussion of the problem of specification of so-called prior ignorance. The level of mathematics used is such that most material is accessible to readers with knowledge of advanced calculus. In particular, no knowledge of abstract measure theory is assumed, and the emphasis throughout is on statistical concepts rather than rigorous mathematics. Intended for students and researchers in statistics, mathematics, decision analysis, economic and business studies, and all branches of science and engineering.


Algebra of Probable Inference,
by Richard T. Cox.
"[This book] is, in my opinion, one of the most important ever written on the foundations of probability theory, and the greatest advance in the conceptual, as opposed to the purely mathematical, formulation of the theory since Laplace." — E. T. Jaynes, American Journal of Physics. Demonstrates that probability theory is the only theory of inductive inference that abides by logical consistency. Cox does so through a functional derivation of probability theory as the unique extension of Boolean algebra for handling degrees of plausibility given an incomplete state of knowledge. Cox also begins to formulate a theory of logical questions through his consideration of systems of assertions — a theory that he more fully developed some years later.

Computational methods

Markov Chain Monte Carlo in Practice,
by W. R. Gilks, S. Richardson, and D. J. Spiegelhalter (Editors).
Markov Chain Monte Carlo methods for computing posterior expectations have
made it practical to apply Bayesian methods to a much wider variety of
problems than was previously possible. This widely recommended book
introduces MCMC methodology at a level suitable for applied
statisticians. It explains the methodology and its theoretical background,
summarizes application areas, and presents illustrative applications in many
areas including archaeology, astronomy, biostatistics, genetics, and
epidemiology. It should serve as an introductory text for applied statisticians, epidemiologists, biostatisticians, and computer scientists.
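The basic flavor of MCMC can be conveyed by a minimal Metropolis sampler. This is my own sketch, not code from the book; it draws correlated samples from a simple target density and averages over them to approximate a posterior expectation:

```python
import math
import random

# Minimal Metropolis sampler (an illustrative sketch, not from the book):
# sample from a standard normal target and average x^2 over the chain to
# approximate the expectation E[x^2] = 1.

def log_target(x):
    return -0.5 * x * x  # log density of N(0, 1), up to an additive constant

def metropolis(n_samples, step=1.0, seed=0):
    rng = random.Random(seed)
    x = 0.0
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        # Accept with probability min(1, target(proposal) / target(x)).
        if math.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal
        samples.append(x)  # a rejection repeats the current state
    return samples

samples = metropolis(20000)
second_moment = sum(s * s for s in samples) / len(samples)  # close to 1
```

In a real application the target would be an unnormalized posterior; the normalizing constant cancels in the acceptance ratio, which is what makes the method practical.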


Monte Carlo Methods in Bayesian Computation,
by Ming-Hui Chen, Qi-Man Shao, and Joseph George Ibrahim.
Discusses how to compute posterior quantities of interest using Markov chain
Monte Carlo. Topics addressed include marginal posterior
density estimation, estimation of normalizing constants, constrained
parameter problems, highest posterior density interval
calculations, model comparisons, marginal likelihood methods, ratios of
normalizing constants, Bayes factors,
stochastic search variable selection, Bayesian model averaging, the reversible
jump algorithm, and model adequacy. Presents an equal mixture of theory and applications, and is
intended as a reference book or graduate textbook at the advanced masters or
PhD level.


Advanced Mean Field Methods: Theory and Practice,
by Manfred Opper and David Saad (Editors).
This book discusses some approximate methods for Bayesian inference that are an
important alternative to MCMC when the number of random variables is large.
One of the simplest approximations is based on the mean field method; other approaches include the variational approach; the TAP approach, which incorporates correlations by including effective reaction terms in the mean field theory; and the more general methods of graphical models. This book covers the theoretical foundations of advanced mean field methods, explores the relation between the different approaches, examines the quality of the approximation obtained, and demonstrates their application to various areas of probabilistic modeling.

Applied

Bayesian Data Analysis, Second Edition,
by Andrew Gelman, John B. Carlin, Hal S. Stern, and Donald B. Rubin.
A bestselling graduate-level text in Bayesian data analysis. Emphasizes practice over theory, clearly describing how to conceptualize, perform, and critique statistical analyses from a Bayesian perspective. Provides guidance on all aspects of Bayesian data analysis and includes examples of real statistical analyses that demonstrate how to solve complicated problems. There is considerable coverage of Monte Carlo methods, including some very good practical advice on how to avoid errors when using Markov Chain Monte Carlo methods. Advocates of the Jaynesian perspective (probability theory as extended logic) will find some conceptual issues to argue with, but for practical advice on carrying out Bayesian analyses this book is hard to beat.


Bayesian Statistical Modelling,
by Peter Congdon.
A popular, practical guide to constructing Bayesian models, with nearly 200 worked examples for a variety of applications. Covers recent innovations in Bayesian modelling, including Markov Chain Monte Carlo methods. Data and WinBUGS code for the examples are available on the Internet. Provides a general overview of Bayesian modelling, emphasizing the principles of prior selection, model identification, and interpretation of findings across a range of modelling innovations, with a focus on implementation with real data and advice on appropriate computing choices and strategies. Researchers and graduate students in applied statistics, medical science, public health, and the social sciences will benefit greatly from the examples and applications featured.


Applied Bayesian Modelling,
by Peter Congdon.
A follow-up to Bayesian Statistical Modelling, focusing on applications of Bayesian techniques to a wide range of important topics in the social and health sciences. The applications are illustrated through many real-life examples and software implementation in WinBUGS — a popular software package that offers a simplified and flexible approach to analyzing Bayesian models. The book provides a good introduction to Bayesian modelling and data analysis for a wide range of people involved in applied statistical analysis, including researchers and students from statistics and the health and social sciences. The wealth of examples makes this book an ideal reference for anyone involved in statistical modelling and analysis.


Uncertainty: A Guide to Dealing with Uncertainty in Quantitative Risk and Policy Analysis,
by Millett Granger Morgan and Max Henrion.
The authors explain the ways in which uncertainty is an important factor in the problems of risk and policy analysis. This book outlines the source and nature of uncertainty, discusses techniques for obtaining and using expert judgment, and reviews a variety of simple and advanced methods for analyzing uncertainty. Powerful computer environments and good graphical techniques for displaying uncertainty are just two of the more advanced topics addressed in later chapters.


Statistical Decision Theory and Bayesian Analysis, Second Edition,
by James O. Berger.
A classic text on its subject, written for graduate students and researchers. Although it lacks material on the recent advances in Bayesian computation, it provides clear exposition of Bayesian theory from a decision theoretic viewpoint, as a means of making optimal decisions under conditions of uncertainty.

Miscellaneous

Bayesian Learning for Neural Networks (Lecture Notes in Statistics 118),
by Radford M. Neal.
This book demonstrates how Bayesian methods allow complex neural network models to be used without fear of the "overfitting" that can occur with traditional training methods. Insight into the nature of these complex Bayesian models is provided by a theoretical investigation of the priors over functions that underlie them. A practical implementation of Bayesian neural network learning using Markov chain Monte Carlo methods is also described, and software for it is freely available over the Internet. Presupposing only basic knowledge of probability and statistics, this book should be of interest to researchers in statistics, engineering, and artificial intelligence.


Probability, Random Variables, and Stochastic Processes, Fourth Edition,
by Athanasios Papoulis and S. Unnikrishna Pillai.
A classic in probability, statistics, and estimation and in the application of
these fields to modern engineering problems. The book assumes a strong
college mathematics background; it is intended for a senior/graduate level
course in probability and is aimed at students in electrical engineering,
math, and physics departments. The first half of the text develops the basic
machinery of probability and statistics from first principles while the
second half develops applications of the basic theory. Topics include
mean square estimation, likelihood tests, maximum entropy methods, Monte Carlo techniques, spectral representations and estimation, sampling theory, bispectra and system identification, cyclostationary processes, deterministic signals in noise, and the Wiener and Kalman filters.


Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference,
by Judea Pearl.
Discusses the theoretical foundations and computational methods that underlie plausible reasoning under uncertainty, including network-propagation techniques for belief networks. These techniques combine the theoretical coherence of probability theory with the modern demands of reasoning-systems technology: modular declarative inputs, conceptually meaningful inferences, and parallel distributed computation. Application areas include diagnosis, forecasting, image interpretation, multisensor fusion, decision support systems, plan recognition, planning, and speech recognition; in short, almost every task requiring that conclusions be drawn from uncertain clues and incomplete information.


Causality: Models, Reasoning, and Inference,
by Judea Pearl.
This book provides a comprehensive exposition of modern analysis of causation. Pearl presents a unified account of the probabilistic, manipulative, counterfactual and structural approaches to causation, and devises simple mathematical tools for analyzing the relationships between causal connections, statistical associations, actions and observations. The book will open the way for including causal analysis in the standard curriculum of statistics, artificial intelligence, business, epidemiology, social science and economics. Anyone who wishes to elucidate meaningful relationships from data, predict effects of actions and policies, assess explanations of reported events, or form theories of causal understanding and causal speech will find this book stimulating and invaluable.

