Am. J. Phys. 26: 388 (1958)

On the Use of the Notion "Probability" in Physics


T. EHRENFEST-AFANASSJEWA

Witterozenstraat 57, Leiden, Holland

(Received April 29, 1957)

An analysis of the meanings of such terms as "at random," "equal chances," "probability calculus," "laws of probability" as used in physics. The author treated this subject for the first time in a lecture at St. Petersburg in 1911, after which it was published in Russian in the Journal of the Physical-Chemical Association of St. Petersburg. Later, Paul Ehrenfest made it the topic of several lectures.

I.

In view of the importance that modern physicists attribute to the idea of probability it seems more necessary than ever to analyze the meaning of the terms employed and the way they are used by the physicists. I shall therefore treat the terms "at random," "equal chances," "probability calculus," and "laws of probability."

With respect to the terms "at random" and "equal chances" I can only say in what circumstances they are used, and I prefer to do this by giving a traditional example: let us assume there are n balls, numbered 1 to n, in the interior of an urn from which they are withdrawn, one by one, and are returned immediately after each draw; let us assume all necessary precautions are taken so that there is no known reason for preferential appearance of one ball over any other. One would say in this case that each individual ball has the "same chance" of being drawn and that the result of the draw is "at random."

The term "probability" has a purely mathematical meaning: it is the number representing the ratio of the size of two assemblies of which one forms a part of the other. The "calculu of probabilities" is concerned with the probabilities associated with other assemblies which can be deduced from the initial assemblies by use of certain assumptions. Finally we arrive at a term which I shall try to show as having no definite meaning at all. This is the term "laws of probabilities. "

II.

In order to avoid generalized formulas which would make our exposition too heavy, we shall choose a very simple example which nevertheless contains all the essential traits of which we wish to speak. Let us assume there is an urn with three balls, of which two are red and one black. In comparing the assembly of red balls and that of the black (in this case, only one) with all the balls in our urn, we obtain as probabilities P_r and P_b the numbers 2/3 and 1/3, respectively. Just as we formed the assemblies of balls we may form the assemblies of the results of draws of different balls from our urn and we may divide them into groups: the draws of red and the draws of black balls. It is obvious that these assemblies consist of the same numbers of elements as do the assemblies of the balls themselves, so that the "probability of drawing a red ball" and the "probability of drawing a black ball" are represented, respectively, by the same numbers 2/3 and 1/3. We shall be concerned with the probabilities of the draws, the "events."
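Written out in symbols (a mere restatement of what has just been said), this amounts to nothing more than

P_r = \frac{2}{3}, \qquad P_b = \frac{1}{3}, \qquad P_r + P_b = 1,

each number being the ratio of a partial assembly to the total assembly of three balls.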

Now let us examine certain derived assemblies and, to facilitate our discussion, let us introduce the terms "order" and "index." We shall say that the draw of a single ball is an "event of the first order." It possesses three possible different realizations since there are in our urn three balls marked, respectively, by the numbers 1, 2, 3.

A succession of n draws (remember: we draw a ball, we note its color, and we throw it back again into the urn, after which we make a second withdrawal, and so on) is an "event of second order of index n." There are in our example 3^n such events of second order, which differ by the numbers of the balls of certain colors and by their places in the series. Let us suppose, to be altogether concrete, that n = 3. The total assembly of events "of second order of index n = 3" contains 3^3 = 27 elements. It divides itself into four groups according to the number of red and black balls which enter into each event, namely, the group of three black containing one element, 3-3-3; the group of two black and one red containing six elements, 3-3-1, 3-3-2, 3-1-3, 3-2-3, 1-3-3, 2-3-3; the group of one black and two red containing twelve elements; and the group of three red containing eight elements.
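These group sizes follow from a simple count (a sketch in binomial notation, which the text itself does not use): an event of second order of index n with exactly k black draws is obtained by choosing the k places of the black ball and filling each of the remaining n - k places with either of the two red balls, so that the group with k black draws contains

\binom{n}{k}\,2^{\,n-k} \quad\text{elements}; \qquad n = 3:\quad \binom{3}{3}2^{0}=1,\ \ \binom{3}{2}2^{1}=6,\ \ \binom{3}{1}2^{2}=12,\ \ \binom{3}{0}2^{3}=8,

and indeed 1 + 6 + 12 + 8 = 27.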

If n were larger, the number of events of second order would be still bigger and would divide itself, according to the proportion of colors in an element of this order, into a far larger number of groups. As one can show, the largest group, and hence the "most probable," will always be the one in which the ratio of the numbers of red and black balls is the same as the ratio of the elementary probabilities, hence 2/1 in our case.
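One may verify this assertion by comparing the sizes of neighboring groups (a sketch of the standard argument, not given in the text): the group with k black and n - k red draws contains, as noted above, \binom{n}{k}\,2^{\,n-k} elements, and the ratio of successive group sizes is

\frac{\binom{n}{k+1}\,2^{\,n-k-1}}{\binom{n}{k}\,2^{\,n-k}} = \frac{n-k}{2(k+1)},

which exceeds 1 precisely as long as k < (n - 2)/3; the sizes therefore increase up to the neighborhood of k = n/3 and decrease beyond it, so that the largest group is the one in which black and red draws stand in the ratio 1 : 2 of the elementary probabilities.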

But it is possible to consider derived assemblies formed in a more complicated fashion: a succession of n2 series of n1 draws each is an event of third order of index (n1, n2). It possesses in our case (3^n1)^n2 = 3^(n1 n2) realizations, which differ by the individual events of second order which they contain and by their places in the sequence.
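For our urn, with n1 = 3 and n2 = 27, this gives

(3^{3})^{27} = 3^{81} = 27^{27}

realizations of the third order, a number which will reappear below.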

If we take n2 = 27, we will have the shortest succession in which every individual element of second order of index 3 can be realized once. The assembly of all elements of third order with index (n1, n2) divides itself into groups according to the different groups of second order which are represented in the third-order events. One can show that the third-order group with maximum probability will be the one in which the numbers of the different groups of second order are proportional to the probabilities of these groups.
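Concretely (a worked restatement of the assertion just made for our index (3, 27)): the most probable third-order group consists of those successions of 27 triple draws in which the four second-order groups occur with the frequencies

1 \times (\text{three black}), \quad 6 \times (\text{two black, one red}), \quad 12 \times (\text{one black, two red}), \quad 8 \times (\text{three red}),

that is, in the proportions 1/27, 6/27, 12/27, 8/27 of their probabilities.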

In this fashion one may extend the argument to an arbitrary order p with an arbitrary index (n1, n2, ..., n(p-1)). The maximum probability will always be associated with the group in which the numbers of groups of the preceding order are proportional to their probabilities. It is important to observe the following circumstances.
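In symbols (a compact restatement, with notation the text does not itself employ): if the groups of order p - 1 have the probabilities q_1, q_2, ..., q_m, then among the events of order p the most probable group is the one in which the i-th of these groups occurs

N_i = n_{p-1}\, q_i \qquad (i = 1, \dots, m)

times, so far as these numbers are integers.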

(a) All these assemblies with their probabilities can be deduced from the initial assemblies and their initial probabilities ("of the first order," "elementary"), and this scheme offers in itself no indications which might make us prefer one order or index over another.

(b) An event which is associated with the most probable group of a certain order (other than the first) will necessarily contain at least one element of the least probable group of the preceding order.

III.

If someone observes the withdrawals from an urn whose contents he knows, his usual attitude is to expect the "most probable" result. In our example, he would expect the appearance of a red ball. With each succeeding draw he will repeat this attitude, but at the moment when a black ball appears, he will say to himself, "but surely, it would undoubtedly be most improbable that black balls should never appear; the laws of probability require that red and black should appear in a definite proportion." (Perhaps he might say this after a fairly long series of red balls, but it would seem that as a rule only the events themselves force one to change one's attitude.) If he makes a long series of sets of three draws, his first reaction would be to expect series each of which contained one black and two red balls. But when this result did not come to pass, he would reconcile himself by extending to the succeeding order the "laws of probabilities."

To summarize, we will say that the first attitude consists in accepting the "hypothesis of the most probable of the first order" or, in short, the "hypothesis of the first order"; the following attitude corresponds to the "hypothesis of second order" of "index n1"; the succeeding one to the "hypothesis of third order" of "index (n1, n2)"; and so on. One speaks of "laws of probability" when one accepts a hypothesis of a definite order; one renounces the preceding one, since the less probable events, excluded by the first, are required by the second.

There are an infinite number of hypotheses of the most probable, and they are in contradiction to one another! (We must concede that if a hypothesis of the third order of index (n1, n2) is realized, the hypothesis of second order of index N = n1 n2 is also realized. But this is not the same as the hypothesis of second order of index n1 made originally.)

What then are the "laws of probability"? What should I say if at the beginning of my experiments with an urn in which there are more red balls than black I should obtain a long series of black ones? Who could tell that such an event was "contrary to the laws of probability"? It is even unjust to say it is "improbable": one could always indicate an order p high enough for which my series was the beginning of even the most probable event! Thus, e.g., a succession of five black balls, drawn from our urn of three balls, could be the beginning of the most probable event of third order of index (3, 27): such an event should contain one element 3-3-3 and one element 3-3-2, and these two might present themselves precisely at the beginning of my experiments. And the most probable event of the fourth order with index (3, 27, 27^27) must contain the element of 3 x 27 = 81 repetitions of a black ball.
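To make the counting explicit (a sketch of the two assertions just used): the most probable event of third order of index (3, 27) contains the four second-order groups with the frequencies 1, 6, 12, 8; it thus contains the element 3-3-3 exactly once and may contain 3-3-2 among its six elements of the group "two black, one red," and if these two stand first the succession opens with five black balls. Likewise, the least probable third-order group of index (3, 27) is the one whose 27 triples are all 3-3-3; its probability, and its frequency in the most probable fourth-order event of index (3, 27, 27^27), are

P = 27^{-27}, \qquad N = 27^{27} \cdot 27^{-27} = 1,

and its single element is precisely the succession of 3 x 27 = 81 black draws.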

I believe, therefore, I have the right to say: when someone seriously attributes the result of a certain experiment to chance, he thereby denies all regularity and all laws. There is a calculus of probabilities, but the "laws of probabilities" is a term devoid of any definite meaning.

IV.

Nevertheless, one often speaks of the realization of the laws of probabilities. One must therefore examine what is meant in such cases. One will easily observe that when one speaks of applying probability theory to the explanation of certain phenomenological regularities and to the deduction of certain algebraic formulas expressing the observed laws, one assumes a definite order: whatever one may say about believing in chance and in the possibility of any event, however improbable, the attitude one observes in practice is one of believing only in the indefinitely prolonged repetition of the most probable events of one definite order; one leaves to chance the succession of the individual elements of the preceding order which form, as elements, the events of the order in question; but the relative frequencies of those elements must always correspond to the hypothesis of the most probable of the order in question. (It is true, one never expects an altogether exact realization of the formula in question, insofar as one admits a certain random element. Just the same, if one began by drawing a black ball from an urn 81 times in succession, one would never suspect that this urn could contain two red balls for only one black and attribute the said result to chance; whether rightly or wrongly is another question.) We shall call this attitude the "belief in a confined chance." Whatever opinion one may have of such an attitude, one must admit it has a definite meaning, which cannot be said of the simultaneous belief in complete chance and the laws of probability.

In most cases it is the hypothesis of second order that one assumes: Boltzmann's hypothesis of collisions (which forms the basis of his "H-theorem" in its initial form) is an example. The calculations of pressure and temperature of a gas are based on assumptions of second order. In all these cases one has to do with very small intervals of time and space. To establish the analogy with draws of balls from an urn, let us assume that these elements of space are divided into a large number n1 of still smaller ("ultramicroscopic") parts. At the beginning of an elementary interval of time a certain number k1 of molecules (of a type determined by their momenta) appear in the elementary space; they are distributed among k1 ultramicroscopic parts, while k2 = n1 - k1 of these parts are empty of this type of molecule. This framework is analogous to a filling of an urn with k1 red and k2 black balls. Boltzmann's hypothesis of collisions appears then as analogous to the hypothesis of the second order, assuming that the number of collisions in an elementary time interval must be proportional to the number of the molecules in question present in the given space interval. It is also a second-order hypothesis which directs the experiments made to determine the proportion of red and black balls in the urn.

One finds the same attitude among the physicists who are concerned with quantum theory. In their statistical observations they assume that the frequencies of manifestation of electrons at different points in space are proportional to the probabilities determined by the corresponding Schrödinger function. They believe their theoretical calculations are confirmed if the statistical result is in accord with this prediction and, undoubtedly, they would reject them in the contrary case, despite the fact that they profess to accept chance without any confinement (and that they might with logical consistency have recourse to a hypothesis of higher order).

A characteristic example of passing from the hypothesis of second order to the hypothesis of third order, urged by circumstances of research (in this case by Zermelo's objection based on the postulates of mechanics), is furnished by Boltzmann's admission that the collisions between the molecules, being events dependent on chance, cannot behave exactly according to his hypothesis of collisions and that therefore there will be times when the entropy of an isolated system will change in a sense contradictory to his calculations. (Note that here the hypothesis of the second order was abandoned just insofar as it was necessary to deduce the increase of entropy and the inclination of every isolated system towards equilibrium.[1]) Such is also the case concerning the emission of particles from radioactive sources, when their appearances are rare; one believes that in time periods sufficiently long the number of particles emitted is about constant, but these periods being very long, one divides the time into shorter equal parts in which the said appearances are not constant, and so one is led to the hypothesis of third order with a smaller n1 and some appropriate n2.

V.

As we have seen, the scheme of draws from an urn containing balls of different colors, to which we may compare the physical events constituting the hypothetical bases of our statistical calculations, furnishes no criterion at all to decide the choice of the order and index of the derived events that should be the object of our calculations. How then do we motivate our choice? We find the answer in the example of the preceding paragraph: the objects of our investigation themselves guide us; we know more about them (or believe we do) than the fact of their being statistical events (which, to be sure, is a hypothesis). It would seem that ordinarily the thinking about the application of statistical formulas does not go from the calculations of probabilities to the problem in question, but in the opposite sense. One considers a phenomenon of which one knows enough to attribute to it a certain stability. One has reason to believe certain values observed to be statistical averages of certain other values. This determines the order and the index of our hypothesis. If the sensitivity of our observation were more acute, if we could distinguish differences of pressure of a gas on microscopic areas, if we were thinking microbes doing physics, it would not be the hypothesis of second order with index n1 that we should assume, but a hypothesis of third order with index (m1, m2), m1 m2 being equal to our n1. (Indeed, nobody ever defines the exact number n1, but the order of its magnitude is somehow assumed to be in conformity with the observations.)

It is significant that one ordinarily never considers calculating the probability of the uniform distribution of the molecules in space (where no outer forces are present); nevertheless everyone is sure it is the most probable: our imagination presents this idea as the most believable. (Jeans did calculate this in his gas theory, but who knows about this?)

VI.

Without doubt, an analysis of the motives leading to probability formulas would not suffice to explain the success of their application. We must not neglect the question: how does it come about that precisely the most probable events are so often realized?

Here we must answer with another question: is it true that the most probable is so frequently realized? Look at the disastrous effect that believing in some laws of probability exercises upon those who apply them to phenomena closest to the really "random," e.g., roulette! But we should rather choose an example more closely related to physical theories: the present state of the world as it appears from the point of view of any statistical theory, kinetic theory, or quantum theory. These all affirm that the world (or that part of it that is accessible to our investigation) tends to the most probable state, and it is assumed that this tendency lasted a very long time before the appearance of organic life on earth and will last a very long time after this moment. But this implies that all this long time the state of the world is far from being the most probable! So we seem to accept at the same time the most probable change of the world state and a quite improbable state of the world itself. If in truth the ultimate directing cause in nature is chance, one must say nature offers us a beautiful analogy with an indefinitely long series of draws of black (or nearly always black) balls from an urn containing a far greater number of red than black balls! We have two sides to the situation, one of which satisfies the idea of the most probable whereas the other contradicts it, but one usually ignores the latter and insists on believing in the "laws of probability."

By the preceding I am far from rejecting the application of statistical formulas which coincide with the "most probable from a certain point of view." I only want to claim the impossibility of assuming "chance" as the ultimate rationale of any physical law.

[1] T. Ehrenfest-Afanassjewa, Grundlagen der Thermodynamik (E. J. Brill, Leiden, 1956), p. 125.