Bayesian probability
From Wikipedia, the free encyclopedia
Bayesian probability interprets the concept of probability as "a measure of a state of knowledge".[1] Broadly speaking, there are two views on Bayesian probability that interpret the "state of knowledge" concept in different ways. For the objectivist school, the rules of Bayesian statistics can be justified by desiderata of rationality and consistency, and interpreted as an extension of Aristotelian logic.[1][2] For the subjectivist school, the state of knowledge corresponds to a "personal belief".[3] Many modern machine learning methods are based on objectivist Bayesian principles.[4]
History
The term "Bayesian" refers to Thomas Bayes (1702–1761), who proved a special case of what is now called Bayes's theorem. Laplace proved a more general version of the theorem and used it to approach problems in celestial mechanics, medical statistics, and jurisprudence.
Although Bayes's theorem has been in use for more than two hundred years (Bayesian inference), the Bayesian interpretation of probability is more recent. The idea that "probability" should be interpreted as "subjective degree of belief in a proposition" was proposed independently by Bruno de Finetti in Italy, in Fondamenti Logici del Ragionamento Probabilistico (1930), and by Frank Ramsey in Cambridge, in The Foundations of Mathematics (1931).[5] It was devised to solve problems with the classical definition of probability.
The word "Bayesian" appeared in the 1950s, but by the 1960s, it became the term preferred by people who sought to escape the strictures of the narrower "frequentist" approach to probability theory. [6] [7]
Bayesian analysis has been explored further by Harold Jeffreys, Richard T. Cox, I. J. Good, L. J. Savage and Edwin Jaynes. Other well-known proponents of Bayesian probability theory have included John Maynard Keynes, B.O. Koopman and Dennis Lindley, and many 20th-century philosophers.
In Bayesian theory, the assessment of probability can be approached in several ways. One is based on betting: the degree of belief in a proposition is reflected in the odds at which the assessor is willing to bet on its truth. Richard T. Cox showed that Bayesian inference is the only form of inductive inference that is logically consistent.[2]
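The betting interpretation above can be made concrete with a small sketch; the 0.75 degree of belief is an arbitrary illustrative number:

```python
# Betting interpretation of degree of belief: a belief of strength p in a
# proposition corresponds to odds of p : (1 - p) in its favour.

def belief_to_odds(p):
    """Odds (for : against) implied by a degree of belief p, 0 < p < 1."""
    return p / (1 - p)

def odds_to_belief(odds):
    """Degree of belief implied by odds quoted as for : against."""
    return odds / (1 + odds)

print(belief_to_odds(0.75))  # 3.0  (odds of 3 : 1 on)
print(odds_to_belief(3.0))   # 0.75
```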
Varieties
Subjective Bayesian probability interprets "probability" as "the degree of belief (or strength of belief) an individual has in the truth of a proposition", and is in that respect subjective. Some people who call themselves Bayesians do not accept this subjectivity, and would therefore regard this article's definition of Bayesian probability as mistaken. The chief exponents of this objectivist school were Edwin Thompson Jaynes and Harold Jeffreys. Perhaps the main objectivist Bayesian now living is James Berger of Duke University. Jose Bernardo and others accept some degree of subjectivity but believe a need exists for "reference priors" in many practical situations.
Applications
Since the 1950s, Bayesian theory and Bayesian probability have been widely applied, with justifications drawn from Cox's theorem, Jaynes's principle of maximum entropy, and the Dutch book argument. In many applications, Bayesian methods are more general and appear to give better results than frequency probability.[1] Bayes factors have also been applied together with Occam's razor. See Bayesian inference and Bayes' theorem for mathematical applications.
Some regard the scientific method as an application of Bayesian probabilistic inference.[1] In this view, Bayes's theorem is explicitly or implicitly used to update the strength of prior scientific beliefs in the truth of hypotheses in the light of new information from observation or experiment.
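The updating described above is just Bayes's theorem applied to a hypothesis H and evidence E. A minimal sketch, with all probabilities chosen purely for illustration:

```python
# Bayes's theorem:  P(H|E) = P(E|H) * P(H) / P(E)
# where P(E) = P(E|H) * P(H) + P(E|not H) * P(not H).

def bayes_update(prior, likelihood, likelihood_if_false):
    """Return the posterior P(H|E) given:
    prior               = P(H)
    likelihood          = P(E|H)
    likelihood_if_false = P(E|not H)
    """
    evidence = likelihood * prior + likelihood_if_false * (1 - prior)
    return likelihood * prior / evidence

# A hypothesis held with 30% prior credence; the observed result would occur
# 80% of the time if H is true, but only 10% of the time otherwise.
posterior = bayes_update(prior=0.3, likelihood=0.8, likelihood_if_false=0.1)
print(round(posterior, 3))  # 0.774
```

The observation raises the belief from 0.3 to about 0.77; the posterior can then serve as the prior for the next observation.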
Bayesian techniques have recently been applied to filter spam e-mail. A Bayesian spam filter uses a reference set of e-mails to define what is initially believed to be spam. After the reference has been defined, the filter uses the characteristics of the reference messages to classify new messages as either spam or legitimate e-mail. New e-mail messages act as new information: when the user corrects a misclassification, that correction updates the reference set, with the aim of making future classifications more accurate. See Bayesian inference and Bayesian filtering.
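A toy sketch of such a filter, assuming a tiny made-up reference set and the naive independence assumption commonly used in practice:

```python
# Minimal Bayesian spam filter sketch. Per-word likelihoods are estimated
# from a labelled reference set, then combined with Bayes's theorem under
# the naive assumption that words occur independently.
from collections import Counter

# Hypothetical reference messages, purely for illustration.
spam_refs = ["win cash prize now", "cheap cash offer"]
ham_refs = ["meeting agenda attached", "lunch at noon"]

def word_counts(msgs):
    counts = Counter()
    for m in msgs:
        counts.update(m.split())
    return counts

spam_counts, ham_counts = word_counts(spam_refs), word_counts(ham_refs)

def spam_probability(message, prior=0.5):
    """Posterior probability that `message` is spam, given a prior."""
    p_spam, p_ham = prior, 1 - prior
    for w in message.split():
        # Add-one smoothing so unseen words do not zero out a hypothesis.
        p_spam *= (spam_counts[w] + 1) / (sum(spam_counts.values()) + 2)
        p_ham *= (ham_counts[w] + 1) / (sum(ham_counts.values()) + 2)
    return p_spam / (p_spam + p_ham)

print(spam_probability("cash prize") > 0.5)      # True
print(spam_probability("meeting at noon") > 0.5) # False
```

A user correction would simply append the message to `spam_refs` or `ham_refs` and recount, which is the updating step described above.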
Footnotes
- ^ a b c d E. T. Jaynes, Probability Theory: The Logic of Science, Cambridge University Press, 2003. ISBN 0-521-59271-2
- ^ a b Richard T. Cox, Algebra of Probable Inference, The Johns Hopkins University Press, 2001
- ^ de Finetti, B. (1974) Theory of probability (2 vols.), J. Wiley & Sons, Inc., New York
- ^ Bishop, C. M., Pattern Recognition and Machine Learning, Springer, 2007
- ^ See pp. 50–51, Gillies 2000: "The subjective theory of probability was discovered independently and at about the same time by Frank Ramsey in Cambridge and Bruno de Finetti in Italy." See Gillies's discussion of how the mistaken impression arose that Ramsey proposed it first.
- ^ Jeff Miller, "Earliest Known Uses of Some of the Words of Mathematics (B)"
- ^ Stephen E. Fienberg, "When Did Bayesian Inference Become 'Bayesian'?", Bayesian Analysis (2006).
See also
- Probability interpretations
- Frequency probability
- Uncertainty
- Inference
- Bayesian network
- Doomsday argument - a controversial use of Bayesian inference
- Maximum entropy thermodynamics - Bayesian view of thermodynamics
- Philosophy of mathematics
External links and references
- tutorial on Bayesian probabilities
- On-line textbook: Information Theory, Inference, and Learning Algorithms, by David MacKay, has many chapters on Bayesian methods, including introductory examples; arguments in favour of Bayesian methods (in the style of Edwin Jaynes); state-of-the-art Monte Carlo methods, message-passing methods, and variational methods; and examples illustrating the intimate connections between Bayesian inference and data compression.
- An on-line introductory tutorial to Bayesian probability from Queen Mary University of London
- An Intuitive Explanation of Bayesian Reasoning A very gentle introduction by Eliezer Yudkowsky
- Giffin, A. and Caticha, A. 2007 Updating Probabilities with Data and Moments
- Gillies, D. Philosophical Theories of Probability, Routledge, 2000
- Hacking, I. 1965 The Logic of Statistical Inference CUP
- Hacking, I. 1967 'Slightly More Realistic Personal Probability' Philosophy of Science vol. 34
- Hacking, I. 2006 The Emergence of Probability: A Philosophical Study of Early Ideas about Probability, Induction and Statistical Inference, Cambridge University Press
- Jaynes, E.T. (2003) Probability Theory : The Logic of Science Cambridge University Press.
- Jeffrey, R.C. 1983 The Logic of Decision University of Chicago Press
- Jeffrey, R.C. 2004 Subjective Probability: The Real Thing, Cambridge University Press
- Kyburg, H.E. 1974 The Logical Foundations of Statistical Inference Reidel
- Kyburg, H.E. 1983 Epistemology and Inference University of Minnesota Press
- Kyburg, H.E. 1987 'Bayesian versus non-Bayesian Evidential Updating' Artificial Intelligence 31
- Kyburg & Smokler (eds) 1980 Studies in Subjective Probability Robert E. Krieger
- Lakatos, I. 1968 'Changes in the Problem of Inductive Logic' published as Chapter 8 of Philosophical Papers Volume 2 Cambridge University Press 1978
- Bretthorst, G. Larry, 1988, Bayesian Spectrum Analysis and Parameter Estimation in Lecture Notes in Statistics, 48, Springer-Verlag, New York, New York;
- http://www-groups.dcs.st-andrews.ac.uk/history/Mathematicians/Ramsey.html
- David Howie: Interpreting Probability, Controversies and Developments in the Early Twentieth Century, Cambridge University Press, 2002, ISBN 0-521-81251-8
- Colin Howson and Peter Urbach: Scientific Reasoning: The Bayesian Approach, Open Court Publishing, 2nd edition, 1993, ISBN 0-8126-9235-7, focuses on the philosophical underpinnings of Bayesian and frequentist statistics. Argues for the subjective interpretation of probability.
- Luc Bovens and Stephan Hartmann: Bayesian Epistemology. Oxford: Oxford University Press 2003. Extends the Bayesian program to more complex decision scenarios (e.g. dependent and partially reliable witnesses and measurement instruments) using Bayesian network models. The book also proves an impossibility theorem for coherence orderings over information sets and offers a measure that induces a partial coherence ordering.
- Jeff Miller "Earliest Known Uses of Some of the Words of Mathematics (B)"
- James Franklin The Science of Conjecture: Evidence and Probability Before Pascal, history from a Bayesian point of view.
- Paul Graham "Bayesian spam filtering"
- Howard Raiffa Decision Analysis: Introductory Lectures on Choices under Uncertainty. McGraw Hill, College Custom Series. (1997) ISBN 0-07-052579-X
- Devender Sivia, Data Analysis: A Bayesian Tutorial. Oxford: Clarendon Press (1996), pp. 7-8. ISBN 0-19-851889-7
- Skyrms, B. 1987 'Dynamic Coherence and Probability Kinematics' Philosophy of Science vol. 54
- Henk Tijms: Understanding Probability, Cambridge University Press, 2004
- Is the portrait of Thomas Bayes authentic? Who is this gentleman? When and where was he born? The IMS Bulletin, Vol. 17 (1988), No. 3, pp. 276–278
- Ask the experts on Bayes's Theorem, from Scientific American