Law of large numbers
The law of large numbers (LLN) is a theorem from statistics. It says that the average of repeated observations of a random variable is stable in the long run: as more observations are made, the observed average gets ever closer to the expected value.
When rolling a die, the numbers 1, 2, 3, 4, 5 and 6 are the possible outcomes. They are all equally likely. The population mean (or "expected value") of the outcomes is:
- (1 + 2 + 3 + 4 + 5 + 6) / 6 = 3.5.
The following graph shows the results of an experiment of die rolls. The average of the rolls varies wildly at first but, as predicted by the LLN, it stabilizes around the expected value of 3.5 as the number of observations becomes large.
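The experiment described above can be sketched in a few lines of Python. This is an illustrative simulation, not the original experiment: the function name, seed, and number of rolls are choices made here for the example.

```python
import random

def running_averages(num_rolls, seed=0):
    """Simulate fair die rolls and return the running average after each roll."""
    rng = random.Random(seed)  # fixed seed so the run is repeatable
    total = 0
    averages = []
    for i in range(1, num_rolls + 1):
        total += rng.randint(1, 6)  # one roll of a fair six-sided die
        averages.append(total / i)  # average of all rolls so far
    return averages

averages = running_averages(100_000)
# Early averages vary wildly; after many rolls the average settles near 3.5.
print(averages[0], averages[9], averages[-1])
```

Plotting `averages` against the roll count reproduces the kind of graph described above: large swings at the start, then a curve flattening toward 3.5.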
History
Jakob Bernoulli first described the LLN. He said it was so simple that even the stupidest man instinctively knows it is true.[1] Despite this, it took him over 20 years to develop a good mathematical proof. Once he had found it, he published the proof in Ars Conjectandi (The Art of Conjecturing) in 1713. He called it his "Golden Theorem". It became generally known as "Bernoulli's Theorem" (not to be confused with the law in physics of the same name). In 1835, S. D. Poisson further described it under the name "la loi des grands nombres" (the law of large numbers).[2] Thereafter, it was known under both names, but "law of large numbers" is used most frequently.
Other mathematicians also contributed to making the law better. Some of them were Chebyshev, Markov, Borel, Cantelli and Kolmogorov. After these studies, there are now two different forms of the law: one is called the "weak" law and the other the "strong" law. These forms do not describe different laws; they describe different ways in which the observed (measured) average converges to the expected value. The strong form of the law implies the weak one.
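The difference between the two forms can be stated precisely. The notation below is the standard one (independent, identically distributed observations $X_1, X_2, \dots$ with mean $\mu$ and sample mean $\bar{X}_n$); it is not taken from this article.

The weak law says the sample mean converges in probability:

$$\text{for every } \varepsilon > 0,\quad \lim_{n \to \infty} P\left(\left|\bar{X}_n - \mu\right| > \varepsilon\right) = 0.$$

The strong law says the sample mean converges almost surely:

$$P\left(\lim_{n \to \infty} \bar{X}_n = \mu\right) = 1.$$

Almost-sure convergence implies convergence in probability, which is why the strong form implies the weak one.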