Thinking more about this, it occurred to me that the source of confusion in all of this might be that we have not clarified the notion of "maximum possible entropy". The unspoken assumption seems to be that the quark-gluon plasma at thermal equilibrium is in a state of maximum entropy and that, as the universe expands, the entropy increases, so the maximum entropy has increased.

I think this is mistaken. Entropy can be taken to be (the logarithm of) the number of micro-states consistent with a given macro-state. The macro-state here has been taken to be a plasma of a certain volume, temperature, pressure, density, and so on. Given that state, its entropy is what it is, and if the volume expands, the macro parameters change -- the temperature and density drop, and so on. In this new state the entropy is again what it is -- and it is a larger value, because the system has undergone an irreversible change which, by the second law, will increase the entropy.

But my argument has been that we must take gravity into account in this. So the maximum entropy for the plasma at a given volume and temperature is the maximum entropy that this mass-energy in this volume can have. That maximum, of course, is attained when the mass-energy is in the form of black holes, and the entropy is greatest when everything is in one black hole (BH). No possible configuration of this matter could have a larger entropy, so that is the sensible definition of the entropy maximum in this case.

As the universe expands, the temperature and density drop, but the maximum entropy is still that which would be obtained if everything were in the form of a black hole. The total energy is now lower, so the maximum entropy is slightly lower, but this is what one might expect, since energy is not conserved in the expansion of the universe. Thinking in terms of phase space, the constancy of the maximum entropy is determined by Liouville's theorem, which says that the volume occupied in phase space is a constant of the motion. In the Hamiltonian formulation, this is just an expression of energy conservation. Since energy conservation does not obtain in an expanding universe, this phase-space argument has to be modified.
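To illustrate the energy non-conservation I have in mind, here is a toy sketch (my own illustration, in arbitrary units -- this is just the standard FRW scaling, not anything specific to the argument above): radiation energy density falls as a^-4 with scale factor a (dilution plus redshift), while a comoving volume grows as a^3, so the total radiation energy in that volume falls as 1/a.

```python
# Sketch (arbitrary units): energy of radiation in a comoving volume
# during expansion. Density falls as a**-4 (dilution plus redshift),
# the comoving volume grows as a**3, so the total energy E falls as 1/a.
def radiation_energy(a):
    rho = a**-4          # radiation energy density
    volume = a**3        # comoving volume
    return rho * volume  # total energy in the volume, E proportional to 1/a

for a in (1.0, 2.0, 4.0):
    print(f"scale factor a = {a}:  E = {radiation_energy(a):.3f}")
# a = 1.0 -> E = 1.000;  a = 2.0 -> E = 0.500;  a = 4.0 -> E = 0.250
```

Doubling the scale factor halves the radiation energy in the comoving volume; the "lost" energy does not go anywhere -- there is simply no global energy conservation law for the expanding spacetime.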

Whatever the case, however, it is clear that if you define 'maximum entropy' as the maximum possible for the amount of mass-energy present, then the BH limit is many, many orders of magnitude above the entropy of a simple classical plasma or gas, and it does not change much with the expansion once one enters the matter-dominated phase of the universe.
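To put rough numbers on "many, many orders of magnitude", here is a back-of-envelope estimate (my own illustrative figures, not taken from anything earlier in this thread): the Bekenstein-Hawking entropy of a black hole, S/k_B = 4*pi*G*M^2/(hbar*c), evaluated for one solar mass, compared with the thermal entropy of the same mass as ordinary gas, which is of order one k_B per baryon.

```python
# Back-of-envelope: Bekenstein-Hawking entropy of a solar-mass black hole
# versus ~k_B per baryon for the same mass as ordinary gas (order of
# magnitude only; SI values).
import math

G    = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.055e-34   # reduced Planck constant, J s
c    = 2.998e8     # speed of light, m/s
m_p  = 1.673e-27   # proton mass, kg
M    = 1.989e30    # solar mass, kg

# Bekenstein-Hawking entropy in units of k_B: S/k_B = 4*pi*G*M**2 / (hbar*c)
S_bh = 4 * math.pi * G * M**2 / (hbar * c)

# Ordinary thermal entropy of the same mass: of order k_B per baryon
S_gas = M / m_p

print(f"S_BH / k_B  ~ {S_bh:.1e}")           # ~1e77
print(f"S_gas / k_B ~ {S_gas:.1e}")          # ~1e57
print(f"ratio       ~ {S_bh / S_gas:.0e}")   # ~20 orders of magnitude
```

So for a single solar mass the black-hole bound already exceeds the ordinary thermal entropy by roughly twenty orders of magnitude, and since S_BH grows as M^2, the gap only widens for larger aggregations of matter.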

Bruce




Bruce Kellett wrote:
Russell Standish wrote:
On Fri, Nov 07, 2014 at 12:59:28PM +1100, Bruce Kellett wrote:

I agree that the past hypothesis, while it explains the
thermodynamic AoT, itself stands in need of explanation. This is the
great unsolved problem of cosmology -- at least according to many
cosmologists. The initial big bang might be assumed to be in
thermodynamic equilibrium, but that is essentially the same
assumption as the assumption of low entropy. The question remains as

Thermodynamic equilibrium is at maximum entropy.

This leads me into commenting on your post slightly earlier in this
thread - the expansion of the universe is coupled to the second law in
that it allows a universe initially at maximum entropy (thermodynamic
equilibrium) to evolve into a universe not at maximum entropy, but
never have entropy decrease, thus satisfying the second law.

I think you are making the mistake that Liz made -- you are ignoring the gravitational degrees of freedom. The quark-gluon plasma of the hot big bang might have been at thermodynamic equilibrium for the quark and gluon degrees of freedom, but the gravitational degrees of freedom were not thermalized. Consequently, the entropy of the plasma was many orders of magnitude below the maximum possible. There is no need for the maximum entropy, whatever that might be, to increase with the expansion, because there is still an enormous potential for entropy to increase as gravitation comes into play. The time scale for this is much longer than the time scale of quark processes, so it is not evident at early times.

The main role that the expansion plays is in cooling the early universe. As space expands, relativistic matter cools and non-relativistic matter simply becomes less dense. Energy is not conserved in these processes. Neither of these processes affects any entropy bound, but they are essential for the formation of order in the form of bound states, then clusters of gas, galaxies, stars, and so on. All these processes proceed according to the standard laws of physics -- all obey the second law and lead to increases in entropy. But the entropy remains many, many orders of magnitude below any possible bound throughout all of this.

--
You received this message because you are subscribed to the Google Groups 
"Everything List" group.