SECTION D
(1) The Gradual Development of the Idea that Entropy Depends on Probability
Entropy is difficult to conceive, in that, as it does not directly affect the senses, there is nothing physical to represent it; it cannot be felt like temperature. It has no analogue in the whole of Physics; Zeuner's heat weight will perhaps serve as such for reversible states, but is inadequate for irreversible ones. This is not surprising when we consider the outcome, namely, that it depends on probability considerations.
CLAUSIUS coined the term Entropy from the Greek, from a word meaning transformation; with him the transformation value was equal to the difference between the entropy of the final and initial states. As there is a general expression for entropy, we can readily write the equivalent of any transformation between two particular states.
Strictly speaking, however, entropy by itself depends only on the state in question, not on any change it may experience, nor on its past history before reaching the state contemplated. Of course, this was appreciated by such a master mind as CLAUSIUS, and, indeed, he defined the entropy as the algebraic sum of the transformations necessary to bring a body into its existing state. Moreover, as the formula for it was in terms of other more or less sensible thermodynamic quantities, its relation to these was at first more readily grasped, could be represented diagrammatically, and had to do duty for the true, but still unknown, physical idea of entropy itself. It was early understood, too, that growth of entropy was closely connected with the degradation or waste of energy; that it was identical with the Second Law. The frequently given, but not always valid, relation[17] led to entropy being called a factor of energy. But all these were change relations and did not go to the root of the difficulty, as to what constituted the physical nature of unchanged entropy.
Quite early, too, there was a realization of the fact that entropy had somehow a statistical character, that it had to do with mean values only. This was well brought out by the long known, and much quoted, "demon" experiment suggested by Maxwell, in which a being of superhuman power separated, without doing any work, the colder and hotter particles of a gas, thus effecting an apparent violation of the Second Law. This, to be sure, was getting close to the crux of the whole matter, but still lacked much to give entropy a precise physical meaning. Nevertheless, we see here a notable approach to the fundamental requirement that entropy must be tied down to the condition of "elementary chaos" (elementare Unordnung).
We have already dwelt somewhat fully on this hypothesis of "elementary chaos."
"It follows from this presentation that the concepts of entropy and temperature in their essence are tied to the condition of "elementare Unordnung." Thus a purely periodic, absolutely plane wave possesses neither entropy nor temperature because it contains nothing whatever in the way of uncheckable, non-measurable magnitudes, and therefore cannot be "elementar-ungeordnet," just as little as can be the case with the motion of a single rigid atom. When there is [an irregular co-operation of many partial oscillations of different periods, which independently of each other propagate themselves in the different directions of space, or] an irregular, confused, whirring intermingling of many atoms, then (and not till then) is there furnished the preliminary condition for the validity of the hypothesis of "elementare Unordnung," and consequently for the existence of entropy and of temperature."
"Now what mechanical or electro-dynamic magnitude represents the entropy of a state? Evidently this magnitude depends in some way on the "Probability" of the state. For, because "elementare Unordnung" and the lack of every individual check (or measurement) are of the essence of entropy, it follows that only combination or probability considerations can furnish the necessary foothold for the computation of this magnitude. Even the hypothesis of "elementare Unordnung" by itself is essentially a proposition in Probability, for, out of a vast number of equally possible cases, it selects a definite number and declares they do not exist in Nature."
Now since the idea of entropy, and likewise the content of the Second Law, is a universal one, and since, moreover, the theorems of probability possess no less universal significance, we may conjecture (surmise) that the connection between Entropy and Probability will be a very close one. We therefore place at the head (forefront) of our further presentation the following proposition: "The Entropy of a physical system in a definite condition depends solely on the probability of this state." The permissibility and fruitfulness of this proposition will become manifest later in different cases. A general and rigorous proof of this proposition will not be attempted at this place. Indeed, such an attempt would have no sense here because, without a numerical statement of the probability of a state, it could not be tested numerically.
[17]This relation is not a valid one, unless the external work performed by a gas during its change is equal to.
(2) Planck's Formula for the Relation between Entropy and the Number of Complexions
Now we have already seen, from the permutation considerations presented on p. 27, that the Theory of Probabilities leads very directly to the theorem, "The number of complexions included in a given state constitutes the probability W of that state." The next step (omitted here) is to identify the thermodynamically found expression for entropy of any state with the logarithm of its number of complexions.
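The counting of complexions can be illustrated with a short sketch. The resonator model and the particular numbers below are illustrative assumptions of this sketch, not the text's own example from p. 27; the count used is the standard one for distributing indistinguishable energy elements among distinguishable resonators.

```python
from math import comb

def complexions(n_resonators: int, p_elements: int) -> int:
    """Number of ways ("complexions") to distribute p indistinguishable
    energy elements among n distinguishable resonators: the combinatorial
    count (p + n - 1)! / (p! (n - 1)!)."""
    return comb(p_elements + n_resonators - 1, p_elements)

# Two resonators sharing three energy elements: the distributions
# (0,3), (1,2), (2,1), (3,0) give W = 4 complexions.
print(complexions(2, 3))  # -> 4

# A state embracing more complexions is, on this view, the more probable.
print(complexions(10, 100))
```

The number of complexions W grows enormously with the size of the system, which is why its logarithm, rather than W itself, turns out to be the manageable measure.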
PLANCK'S formula for entropy is S = k log W + const.; here const. is an arbitrary constant without physical significance and can be omitted at pleasure; the numerical value of k in the first term of the second member is the quotient of energy (expressed in ergs) divided by temperature (in degrees). This certainly gives a physical definiteness and precision to entropy which leaves nothing to be desired.
PLANCK, in reproducing from probability considerations the dependence of entropy S on probability W, finds the relation S = k log W, whence the dimensions of S evidently depend on those of the constant k.
Here S is BOLTZMANN'S value, which always changes in one direction only; k is the universal integration constant, which is the same for a terrestrial as for a cosmical system, and when it is known for one, it is known for the other; when k is known for radiant phenomena it is also known, and is the same, for molecular motions.
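A brief numerical sketch of the relation S = k log W, using the modern value of Boltzmann's constant in erg per degree (the text's own numerical value is not legible here). It also shows why the logarithm is the natural choice: probabilities of independent systems multiply, so their entropies add.

```python
from math import log

K_ERG = 1.380649e-16  # Boltzmann's constant in erg per kelvin (modern value)

def entropy(W: float) -> float:
    """Planck's relation S = k log W (natural logarithm), in erg/deg."""
    return K_ERG * log(W)

# Probabilities of independent systems multiply, so entropies add:
W1, W2 = 1e6, 1e9
combined = entropy(W1 * W2)
separate = entropy(W1) + entropy(W2)
print(abs(combined - separate) < 1e-28)  # -> True: S = k log W is additive
```

This additivity is what singles out the logarithm: any other function of W would not make the entropy of a compound system the sum of the entropies of its parts.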
There are some general statements which indicate more or less rigorously some of the properties or features of the entropy of a state.
(a) Entropy is a universal measure of the "disorder" in the mass points of a system.
(b) Entropy is a universal measure of the irreversibility of a state and is its criterion as well.
(c) Entropy is a universal measure of nature's preference for the state.
(d) Entropy is a universal measure of the spontaneity with which a state acts when it is free to change.
(e) Entropy of a system can only grow.
(f) Entropy asserts the essential one-sidedness of Nature.
(g) There exists in Nature a magnitude which always changes in the same sense.
(e), (f), and (g) imply change and therefore, strictly speaking, should not be mentioned here but postponed to a later section.