Maximum Entropy: The Universal Method for Inference

Authors
Giffin, Adom
Abstract
In this thesis we start by providing some detail regarding how we arrived at our present understanding of probabilities and how we manipulate them: the product and addition rules derived by Cox. We also discuss the modern view of entropy and how it relates to known entropies such as the thermodynamic entropy and the information entropy. Next, we show that Skilling's method of induction leads us to a unique general theory of inductive inference, the ME method, and precisely how other entropies, such as those of Renyi or Tsallis, are ruled out for problems of inference. We then explore the compatibility of Bayes and ME updating. We show that ME is capable of reproducing every aspect of orthodox Bayesian inference, which proves the complete compatibility of Bayesian and entropy methods. The realization that the ME method incorporates Bayes' rule as a special case allows us to go beyond Bayes' rule and to process both data and expected value constraints simultaneously. We discuss the general problem of non-commuting constraints, and when they should be processed sequentially versus simultaneously. The generic "canonical" form of the posterior distribution for the problem of simultaneous updating with data and moments is obtained. This is a major achievement since it shows that ME is capable not only of processing information in the form of constraints, as in MaxEnt, and information in the form of data, as in Bayes' Theorem, but also of processing both forms simultaneously, which neither Bayes nor MaxEnt can do alone. Finally, we illustrate potential applications of this new method by applying ME to several problems of interest.
Comment: Doctoral Thesis, 111 Pages, 2 figures
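As an illustration of the kind of updating described in the abstract above, the following is a minimal Python sketch (not taken from the thesis) of entropic updating with a single expected value constraint on a discrete space: minimizing the relative entropy to the prior subject to the constraint yields the canonical exponential form, with the Lagrange multiplier fixed numerically. The function name maxent_update and the die example are hypothetical illustrations, not the author's code.

    import numpy as np
    from scipy.optimize import brentq

    def maxent_update(prior, f, F):
        """Minimize relative entropy to `prior` subject to sum_i p_i * f_i = F.
        The solution has the canonical form p_i proportional to prior_i * exp(beta * f_i)."""
        def moment_gap(beta):
            w = prior * np.exp(beta * f)
            p = w / w.sum()
            return p @ f - F

        # Solve for the Lagrange multiplier beta; the bracket suits this toy example.
        beta = brentq(moment_gap, -50.0, 50.0)
        w = prior * np.exp(beta * f)
        return w / w.sum()

    # Toy example: update a uniform prior over die faces to have expected value 4.5.
    faces = np.arange(1, 7, dtype=float)
    prior = np.full(6, 1.0 / 6.0)
    posterior = maxent_update(prior, faces, 4.5)
    print(posterior)          # weights tilted toward the larger faces
    print(posterior @ faces)  # 4.5, the imposed expected value

When the constraint instead fixes the observed value of a variable exactly, the same entropic updating reduces to conditioning on that value, which is the sense in which Bayes' rule appears as a special case in the abstract.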
Keywords
Physics - Data Analysis, Statistics and Probability