Carl Friedrich Gauss

Carl Friedrich Gauss was one of the greatest mathematicians of all time. He contributed a great deal to the study of probability and chance by giving rigorous definitions to concepts that had been treated by Abraham de Moivre and Pierre-Simon Laplace.

Gauss proved that, among unimodal, symmetric, differentiable distributions φ(x − x0), the normal distribution is the one for which the maximum likelihood estimator X of the location parameter x0 coincides with the arithmetic mean.

His proof is as follows: Let M1,M2,... be the observations, μ in number, with p being their arithmetic mean.

Then the likelihood equation

φ'(M1 - X) + φ'(M2 - X) + .... = 0

in which φ'(Δ) = dφ(Δ)/[φ(Δ)dΔ] is the logarithmic derivative of φ, will possess a (unique) solution X = p, i.e.

φ'(M1 − p) + φ'(M2 − p) + ... = 0.

Supposing, then, that M2 = M3 = ... = Mμ = M1 − μN, Gauss arrived at

φ'[N(μ − 1)] = (1 − μ)φ'(−N)

for any natural μ, so that φ'(Δ)/Δ = k, a constant, with k < 0. Integrating the logarithmic derivative then gives

φ(Δ) = Ce^(kΔ²/2),

which for k < 0 is the normal distribution.
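Gauss's conclusion is easy to check numerically. The sketch below (an illustration, not part of Gauss's argument) uses the fact that for a density of the form Ce^(kΔ²/2) the logarithmic derivative is φ'(Δ) = kΔ, so the likelihood equation is solved exactly by the arithmetic mean:

```python
# For phi(D) = C*exp(k*D^2/2), the logarithmic derivative is phi'(D) = k*D,
# so the likelihood equation  phi'(M1 - X) + phi'(M2 - X) + ... = 0
# becomes  k * sum(Mi - X) = 0, which the arithmetic mean solves exactly.
M = [2.0, 3.5, 1.25, 4.0, 2.75]   # arbitrary observations (example data)
p = sum(M) / len(M)               # arithmetic mean, Gauss's p
k = -1.0                          # any k < 0 gives a valid (normal) density
residual = sum(k * (m - p) for m in M)
print(abs(residual) < 1e-9)       # True: the mean satisfies the equation
```

For any other choice of φ' the sum would generally not vanish at the mean, which is the content of Gauss's uniqueness claim.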

In 1718 de Moivre had defined probability very simply:

The Probability of an Event is greater or less, according to the number of Chances by which it may happen, compared with the whole number of Chances by which it may either happen or fail.
This brief statement contains the assumption that all chances are equally probable when we have no information that indicates otherwise.

While this describes our information epistemically, making probability a matter of human ignorance, we can say ontologically that the world contains no information that would make any state more probable than the others. Such information simply does not exist. This is sometimes called the principle of insufficient reason or the principle of indifference.

If that information did exist, it could and would be revealed in large numbers of experimental trials, which provide the statistics on the different "states."

Probabilities are theories. Statistics are experiments.

In the philosophical controversies between a priori or epistemic probability and a posteriori or ontological probability, the latter is often said to be the "frequency" interpretation of probability.
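A small simulation can illustrate the distinction (a sketch assuming a fair coin, not anything from the original text): the probability p is the theory; the observed frequency is the experiment, and it converges toward p as trials accumulate.

```python
import random

random.seed(0)                    # fixed seed so the run is reproducible
p = 0.5                           # theoretical (a priori) probability of heads
freqs = {}
for n in (100, 10_000, 1_000_000):
    heads = sum(random.random() < p for _ in range(n))
    freqs[n] = heads / n          # experimental (a posteriori) frequency
    print(n, freqs[n])            # the frequency drifts toward p
```

No finite run ever reveals p exactly, which is why the frequency interpretation ties probability to the limit of ever larger numbers of trials.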

Probability Distributions
de Moivre worked out the mathematics for the binomial distribution by analyzing the tosses of a coin. If p is the probability of a "heads" and q = 1 - p the probability of "tails," then the probability of k heads is

Pr(k) = [n!/(k!(n − k)!)] p^k q^(n−k)
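The formula can be evaluated directly with Python's exact binomial coefficient (a sketch; `binom_pmf` is a hypothetical helper name, not from the text):

```python
from math import comb

def binom_pmf(n: int, k: int, p: float) -> float:
    """Pr(k heads in n tosses): C(n, k) * p^k * q^(n - k), with q = 1 - p."""
    q = 1.0 - p
    return comb(n, k) * p**k * q**(n - k)

# For a fair coin, 5 heads in 10 tosses is the most likely outcome.
print(binom_pmf(10, 5, 0.5))                          # 252/1024 = 0.24609375
print(sum(binom_pmf(10, k, 0.5) for k in range(11)))  # probabilities sum to 1
```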

Pierre-Simon Laplace also derived this result, which is sometimes called the de Moivre-Laplace Theorem.
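The content of the de Moivre-Laplace theorem can be seen numerically (a sketch with assumed values n = 1000, p = 1/2): for large n the binomial probability of k heads is closely approximated by a normal density with mean np and variance npq.

```python
from math import comb, exp, sqrt, pi

n, p = 1000, 0.5
q = 1 - p
k = 520                               # about 1.3 standard deviations above np
exact = comb(n, k) * p**k * q**(n - k)
approx = exp(-(k - n * p) ** 2 / (2 * n * p * q)) / sqrt(2 * pi * n * p * q)
print(exact, approx)                  # the two values agree closely
```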

de Moivre was also the first to approximate the factorial for large n as

n! ≈ (constant) √n n^n e^(−n)

James Stirling determined the constant in de Moivre's approximation to be √(2π), and the result is now commonly called Stirling's approximation.
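With Stirling's constant in place, the approximation is easy to verify (a sketch; the 1/(12n) figure in the comment is the standard first correction term, not from the text):

```python
from math import factorial, sqrt, pi, e

# n! ≈ sqrt(2*pi*n) * (n/e)^n; relative error shrinks roughly like 1/(12n).
for n in (5, 10, 20):
    exact = factorial(n)
    approx = sqrt(2 * pi * n) * (n / e) ** n
    print(n, approx / exact)          # the ratio approaches 1 as n grows
```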
