Of all the words being bandied about (quality, property, composition, domain, continuity, intensity, general, special, iteration, 
etc.) EricC's "contextless" stands out and reflects EricS' initial target of dimensional analysis. The conversation seems 
to be about essentialism. Maybe that's a nice reflection that we're sticking to the OG topic "analytic idealism". But 
maybe it's Yet-Another example of our pareidolia to see patterns in noise and then to *reify* those patterns. [Re]Abstracting and 
[re]concretizing heuristics across contexts may well be what separates us from other life forms. But the attribution of 
"unreasonable effectiveness" to any body of heuristics is the most dangerous form of reification. The superhero ability 
to [re]abstract and [re]concretize your pet heuristics convinces you they are "properties" or "qualities" of 
the world, rather than of your anatomy and physiology. Arguing with myself, perhaps Dave's accusation is right. Maybe this is an 
example of swapping the sign for the object, or, reworded, prioritizing the description over the referent: confusing the 
structure of the observer with the structure of the observed.

Those of us with less ability tend to attribute (whatever haphazard heuristics 
we've landed on) to the world *early*. Those of us with more ability continue 
the hunt for Truth, delaying attribution to the world until we get too old to 
play that infinite game any more.

I think Possible Worlds helps here, too: 
https://plato.stanford.edu/entries/possible-worlds/ Patterns are simply 
(non-degenerate) quantifiers over possible worlds.

Regardless, I'd like to ask whether the formulation of intensive properties as 
derivatives of entropy w.r.t. extensive properties is formalized somewhere? If 
so, I'd be grateful for pointers. I'm used to the idea that the intensives 
divide out the extensives. But I haven't seen them formulated as higher-order 
derivatives of entropy.
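
For concreteness, here's a minimal numeric sketch of the one case I can check myself: the Sackur-Tetrode entropy of a monatomic ideal gas. The helium mass and the round state values are just assumed example numbers. A finite difference of S with respect to the extensive U recovers the intensive 1/T:

```python
import math

# Physical constants (SI)
k = 1.380649e-23    # Boltzmann constant, J/K
h = 6.62607015e-34  # Planck constant, J*s
m = 6.6335e-27      # mass of one helium atom, kg (assumed example gas)

def sackur_tetrode(U, V, N):
    """Equilibrium entropy S(U, V, N) of a monatomic ideal gas."""
    return N * k * (math.log((V / N) * (4 * math.pi * m * U / (3 * N * h**2))**1.5) + 2.5)

# A macrostate: one mole of helium in 1 m^3 at 300 K
N = 6.02214076e23
V = 1.0
T = 300.0
U = 1.5 * N * k * T  # U = (3/2) N k T for a monatomic ideal gas

# Intensive variable as a derivative of entropy w.r.t. an extensive one:
# 1/T = dS/dU at fixed V, N (central finite difference)
dU = U * 1e-6
dS_dU = (sackur_tetrode(U + dU, V, N) - sackur_tetrode(U - dU, V, N)) / (2 * dU)
print(1.0 / dS_dU)  # recovers T = 300 K, up to finite-difference error
```

The same trick with V in place of U gives P/T, which is the "dividing out" intuition recast as a gradient.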

Thanks.
-glen

On 3/29/22 14:37, David Eric Smith wrote:
[snip]
1. One first has to have a notion of a macrostate; all these terms only come 
into existence with respect to it. (They are predicates of what are called 
“state variables” — the intensive ones and the extensive ones — and that is 
what the “state” refers to.)

2. One needs some criterion for what is likely, or stable, which in general 
terms is an entropy (extending considerably beyond the Gibbs equilibrium 
entropy, but still to be constructed from specific principles), and on the 
macrostates _only_, the entropy function (which may be defined on many other 
states besides macrostates as well) becomes a _state function_.

3. Then (actually, all along since the beginning of the construction) one needs 
to talk about what kind of aggregation operator we can apply to systems, and 
quantities that do accumulate under aggregation become the arguments of the 
state-function entropy, and the extensive state variables.  (I say “accumulate” 
in favor of the more restrictive word “add”, because what we really require is 
that they are what are termed “scale factors” in large-deviation language, and 
we can admit a somewhat wider class of kinds of accumulation than just 
addition, though addition is the extremely common one.)

4. Once one has that, the derivatives of the entropy with respect to the 
extensive variables are the intensive state variables.  It is precisely the 
duality — that one is the derivative of a function with respect to the other, 
which is the argument of that function — that makes it not bizarre that both 
exist and that they are different.  But as EricC rightly says, if one just uses 
phenomenological descriptions, why any of this should exist, and why it should 
arrange itself into such dual systems, much less dual systems with always the 
same pair-wise relations, seems incomprehensible.  For some of the analogistic 
applications, there may not be any notions of state, or of a function doing 
what the entropy does, or of aggregation, or an associated accumulation 
operation, or gradients, or any of it.  Some of the phenomenology may seem to 
kinda-sorta go through, but whether one wants to pin oneself down to narrow 
terms is less clear.
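
For the Gibbs equilibrium case, the dual pairs described in point 4 are the familiar textbook ones:

```latex
dS \;=\; \frac{1}{T}\,dU \;+\; \frac{P}{T}\,dV \;-\; \frac{\mu}{T}\,dN,
\qquad\text{so}\qquad
\frac{1}{T} = \left(\frac{\partial S}{\partial U}\right)_{V,N},\quad
\frac{P}{T} = \left(\frac{\partial S}{\partial V}\right)_{U,N},\quad
\frac{\mu}{T} = -\left(\frac{\partial S}{\partial N}\right)_{U,V}.
```

Each intensive (1/T, P/T, mu/T) is the gradient of the entropy along one extensive argument (U, V, N), which is the duality the construction above generalizes.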

[snip]

On Mar 30, 2022, at 5:04 AM, Eric Charles <[email protected]> wrote:

That is a bizarre distinction that can only be maintained within some sort of odd, 
contextless discussion. If you tell me the number of atoms of a particular substance that 
you have smushed within a given space, we can, with reasonable accuracy, tell you the 
density, and hence the "state of matter". When we change the quantity of matter 
within that space, we can also calculate the expected change in temperature.
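
A toy sketch of that calculation, assuming an ideal gas held at fixed pressure and volume (the helium mass, the liter of volume, and atmospheric pressure are my assumed example values, not EricC's):

```python
k = 1.380649e-23    # Boltzmann constant, J/K
m_He = 6.6335e-27   # mass of one helium atom, kg (assumed substance)

V = 1.0e-3          # the given space: 1 liter, in m^3
P = 101325.0        # assumed fixed pressure: 1 atm, in Pa

def density(N):
    """Mass density from the number of atoms smushed into V."""
    return N * m_He / V

def temperature(N):
    """T from the ideal gas law P V = N k T, at fixed P and V."""
    return P * V / (N * k)

N1 = 2.4e22  # example atom count
print(density(N1), temperature(N1))

# Doubling the quantity of matter in the same space halves T (at fixed P):
print(temperature(2 * N1) / temperature(N1))  # -> 0.5
```

The density tells you (roughly) the state of matter; the same equation of state then predicts how T must shift when N changes, which is EricC's point about context doing the work.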


--
Mɥǝu ǝlǝdɥɐuʇs ɟᴉƃɥʇ' ʇɥǝ ƃɹɐss snɟɟǝɹs˙

.-- .- -. - / .- -.-. - .. --- -. ..--.. / -.-. --- -. .--- ..- --. .- - .
FRIAM Applied Complexity Group listserv
Zoom Fridays 9:30a-12p Mtn UTC-6  bit.ly/virtualfriam
un/subscribe http://redfish.com/mailman/listinfo/friam_redfish.com
FRIAM-COMIC http://friam-comic.blogspot.com/
archives:
5/2017 thru present https://redfish.com/pipermail/friam_redfish.com/
1/2003 thru 6/2021  http://friam.383.s1.nabble.com/
