On 3/12/2011 12:41 PM, Evgenii Rudnyi wrote:
on 12.03.2011 18:59 Brent Meeker said the following:
On 3/12/2011 1:40 AM, Evgenii Rudnyi wrote:
on 12.03.2011 04:43 Brent Meeker said the following:
On 3/11/2011 7:24 PM, Stephen Paul King wrote:
Hi Andrew, The answer to the simple question that you see that
all of this detail leads to is that at its core, Existence is
Change itself. Becoming is the fundamental ontological
primitive, just as Bergson argued. This is the result that
Hitoshi discovered and discussed in his Inconsistent Universe
Paper in terms of the truth value of the total Universe being
in an infinite oscillation between True and False. Bart Kosko
also obtained a similar result in his research on Fuzzy sets.
What Barbour really found is that there does not exist a
universal *global* standard of measure of this change.
I think Einstein found that long before Barbour. There's no
time-like Killing vector field in an FRW universe, so there's no
global time.
If there is no standard then there is not a determination of
definiteness for the Total Change of existence and thus there
is no global measure of change. Since time can be defined in
generic terms as a measure of change, Barbour is correct in
claiming that time as a global quantity cannot exist. What
Barbour missed, as have countless others, is that *local
measures of change can be defined*. The fact that there is
more than one measure of entropy is a huge clue of this.
Thermodynamic entropy has always been relative to whatever is
taken to be the constraint (constant energy, constant
pressure,...) In the Everett interpretation evolution is always
unitary and the Boltzmann entropy is constant.
I would suggest looking at the CODATA Tables.
What is the meaning that the entropy is relative in this case?
The temperature, which is fixed at 298.15 K.
Here I actually wanted to understand what you mean by "Thermodynamic
entropy has always been relative". So let me try once more and
consider entropy as well as energy to understand better what you mean.
First, thermodynamic tables do tabulate molar quantities, but
there should not be a problem multiplying them by the number of moles.
So the entropy as well as the energy are functions of temperature and
pressure (and composition in the case of a solution). If you take,
for example, the JANAF Tables, there are columns of energy and
entropy values for different temperatures.
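A minimal sketch of what such a table lookup and scaling looks like. The numbers below are illustrative placeholders, not actual JANAF entries, and the function names are my own:

```python
# Sketch of a JANAF-style table lookup: molar entropy values
# (J/(K*mol)) tabulated against temperature (K) for one species.
# These numbers are placeholders, not real JANAF data.
molar_entropy_table = {  # T (K) -> S_m, J/(K*mol)
    298.15: 42.55,
    300.00: 42.81,
    400.00: 46.32,
}

def total_entropy(temperature_K, n_moles):
    """Scale the tabulated molar entropy by the amount of substance."""
    return molar_entropy_table[temperature_K] * n_moles

# Two moles at 298.15 K: simply twice the tabulated molar value.
S_total = total_entropy(298.15, 2.0)  # 85.10 J/K
```

This is just the "multiply by the number of moles" step made explicit; a real table would interpolate between tabulated temperatures.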
The difference between energy and entropy is that according to the
Third Law, the change of entropy at zero K is zero. This allows us to
obtain the absolute entropies, while energies are relative, that is,
one can determine only changes in energy. Well, one could probably
introduce absolute energies with E = mc^2, but this is not very practical.
What particular feature of the entropy did you have in mind, then,
that distinguishes it from other physical properties?
Also for example
S(Ag, cr, 298.15 K) = 42.55 ± 0.20 J/(K mol)
Do you mean that the Everett interpretation changes this value?
First, that's not entropy, it's the entropy per mole. Second, the
Everett interpretation is that evolution is deterministic so the
entropy never changes. Of course you can still measure different
entropy values and it *seems* to change because your measurement
can't project out the whole ray in Hilbert space (you've "split"
into different branches).
I still do not understand the implications. Chemists use entropies and
energies from thermodynamic tables, for example, to compute equilibria.
Engineers use thermodynamics to develop heat engines. What, then, does
the Everett interpretation imply for them?
Right. The entropy is just a function of state variables that is
maximum at equilibrium. In chemistry and engineering the choice is
usually temperature (a measure of the average molecular energy) and maybe something else like
pressure or chemical composition. The Boltzmann entropy, which is just
the number of micro-states consistent with state variable values, isn't
usually used directly in engineering. But it shows how the values in
tables are related to the microscopic structure. Under Everett's
interpretation, all time evolution is deterministic so there is always
only one future state and the entropy never changes. That's only of
interest when considering fundamental questions, like the entropy of the
universe, because in all practical questions we don't know the
micro-state of complex systems, so we assign an entropy that represents
our ignorance of the micro-state.
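The Boltzmann counting described above can be sketched directly. This is a toy illustration (equal-weight micro-states, hypothetical count W), not a calculation from the thread:

```python
import math

# Boltzmann entropy as a count of micro-states: S = k_B * ln(W),
# where W is the number of micro-states consistent with the
# macroscopic state variables.
K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def boltzmann_entropy(n_microstates):
    """Entropy of a macro-state with n equally weighted micro-states."""
    return K_B * math.log(n_microstates)

# A perfectly known micro-state (W = 1) has zero entropy; any
# ignorance about which micro-state obtains (W > 1) gives S > 0.
# Under deterministic (unitary) evolution W does not change, which is
# the sense in which the Boltzmann entropy stays constant.
S = boltzmann_entropy(10**20)
```

This makes explicit why the entropy assigned in practice tracks our ignorance: it grows with the number of micro-states we cannot distinguish.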
You received this message because you are subscribed to the Google Groups
"Everything List" group.