The model proposed for the numeric treatment of information answers the
following points raised by Shannon and Logan:
Logan:

>  The inspiration for adopting the word entropy in information theory comes
> from the close resemblance between Shannon's formula and the very similar
> formula from thermodynamics: S = -k ∑ p_i ln p_i. Entropy is related to
> the second law of thermodynamics that states that:
>
> Energy spontaneously tends to flow only from being concentrated in one
> place to becoming diffused or dispersed and spread out. And energy is
> governed by the first law of thermodynamics that states that energy cannot
> be destroyed or created.
>
> Is there an equivalent 1st and  2nd law for information?
>

Yes, there is. Information being the relation between the number of symbols
and their kind (in the numeric sense: extent; in the logical sense: kind,
that is, properties), one can propose that the following observation be
generalised into a rule:
A closed system of symbols can be transformed into a differing closed system
of symbols while maintaining an identical informational content.
This means that the relation between the number of symbols and their kind
cannot be destroyed or created.
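
One concrete, if simplistic, reading of this rule: re-coding a finite
collection of symbols through a one-to-one mapping yields a differing closed
system while leaving the relation between the number of symbols and their
kinds untouched. The sketch below (Python; the alphabet and the mapping are
invented purely for illustration) uses Shannon's H, the formula quoted above,
only as a stand-in measure and shows that such a transformation leaves it
unchanged.

    from collections import Counter
    from math import log2

    def shannon_entropy_bits(symbols):
        """H = -sum(p_i * log2(p_i)), computed from the symbol frequencies."""
        counts = Counter(symbols)
        n = len(symbols)
        return -sum((c / n) * log2(c / n) for c in counts.values())

    # A small "closed system of symbols" (the sequence is hypothetical).
    original = list("AABABBBCAC")

    # Transform it into a differing closed system by a one-to-one re-coding.
    recoding = {"A": "x", "B": "y", "C": "z"}
    transformed = [recoding[s] for s in original]

    # The relation between the number of symbols and their kinds is preserved,
    # so the measured information content is identical (about 1.52 bits per
    # symbol for both sequences).
    print(shannon_entropy_bits(original))
    print(shannon_entropy_bits(transformed))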

> Entropy is used to describe systems that are undergoing dynamic
> interactions like the molecules in a gas. What is the analogy with Shannon
> entropy or information?
>

In both cases, this is a LOCAL phenomenon taking place in a globally closed
system. While one part of the system cools down to a uniform level, a
different part of the system heats up (explodes, fuses, contracts, etc.). In
the numeric model, if the overall information content of an assembly remains
constant (as it definitely does, assuming a finite *n*), there may well be
subsegments of the logical space which are more uniform than other
subsegments.
(Example: let the closed universe be the set of all true logical sentences
that detail the relation between parts and the whole, with the whole being
< 137. This set has a given, constant, overall information content. It may,
however, very well be the case that one specific subset has a deviating
extent - locally - of its own - local - information content.
Numeric explanation:
It may be that 66 falls with a probability of 90% into 10 .. 18 parts, but it
may just as well be that one of the cases describes a freakish assembly of
far too many 1s as opposed to bigger summands. Any of the summands can be
outside its most probable range, and it is a certainty that one will observe
a local phenomenon of summands dissolving into elementary units; see the
counting sketch below.)
This process happens in the actual Nature surrounding us sufficiently often,
and is sufficiently unusual, that humans notice it, remember it and give it a
name. It appears that this process carries the name of "entropy".
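
Read as statements about unordered integer partitions (an assumption on my
part; the text does not spell out the counting procedure), the numeric
example can be tabulated directly. The sketch below counts the partitions of
66 by their number of parts (summands) and reports which fraction falls into
the 10 .. 18 band; the 90% figure is the text's, the sketch merely makes it
checkable under this reading.

    # Count the unordered integer partitions of N = 66 by their number of
    # parts, using the standard recurrence
    #   p(n, k) = p(n - 1, k - 1) + p(n - k, k)
    # where p(n, k) is the number of partitions of n into exactly k parts.
    N = 66

    p = [[0] * (N + 1) for _ in range(N + 1)]
    p[0][0] = 1
    for n in range(1, N + 1):
        for k in range(1, n + 1):
            p[n][k] = p[n - 1][k - 1] + p[n - k][k]

    total = sum(p[N])                               # all partitions of 66
    in_band = sum(p[N][k] for k in range(10, 19))   # 10 .. 18 parts
    print("partitions of 66:", total)
    print("with 10..18 parts:", in_band)
    print("fraction in band:", in_band / total)

Whether the band really covers the stated 90% depends on the counting
convention; under a different notion of "parts" the figure would change.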

> Is Shannon's formula really the basis for a theory of information or is it
> merely a theory of signal transmission?
>

No, Shannon's formula is not the basis of anything in information theory.
Shannon's formula is the roof of an edifice based on the logic of
similarities. Information theory deals with a different basic concept. In
information theory the kind of a symbol is also a logical category of its
own and is not derived from the number of symbols. (In classical,
similarity-based logic, the kind of a symbol derives from its number, that
is, from the number of elementary units that make up this whole.)
Information theory negates the assumption that the parts actually and
absolutely fuse into a whole, an assumption which is embodied in the
procedures currently in exclusive use relating to the operation of addition.

> Thermodynamic entropy involves temperature and energy in the form of
> heat, which is constantly spreading out. Entropy S is defined as ∆Q/T. What
> are the analogies for Shannon entropy?
>
> There is the flow of energy in thermodynamic entropy but energy is
> conserved, i.e. it cannot be destroyed or created.
>
> There is the flow of information in Shannon entropy but is information
> something that cannot be destroyed or created as is the case with energy? Is
> it conserved? I do not think so because when I share my information with you
> I do not lose information but you gain it and hence information is created.
> Are not these thoughts that I am sharing with you, my readers, information
> that I have created?
>
Globally:
In the case that humans DISCOVER the a priori existing laws of Nature, your
communication does not transmit anything new, because the connections have
always been there; we simply did not notice them before.
In the case that humans CREATE mental images depicting a Nature that is, for
axiomatic reasons, not intelligible to humans, your communication says: "Is
it new for you that I can make myself understood?" and is a communication
for grammatical reasons, with no content.
Locally:
Your communication reorders the concepts within the brain of the reader and
presumably changes some of the relations between the number of symbols and
the kinds of symbols that were and are there in the brain of the reader.
The dissipation of some kind-property of symbols into a form of
extent/numerosity/manyness of symbols is a basic human experience which we
learn as we drink Mother's milk. So it should not be big news for the brain.


> Shannon entropy quantifies the information contained in a piece of data: it
> is the minimum average message length, in bits. Shannon information as the
> minimum number of bits needed to represent it is similar to the formulations
> of Chaitin information or Kolmogorov information. Shannon information has
> functionality for engineering purposes but since this is information without
> meaning it is better described as the measure of the amount and variety of
> the signal that is transmitted and not described as information. Shannon
> information theory is really signal transmission theory. Signal transmission
> is a necessary but not a sufficient condition for communications. There is
> no way to formulate the semantics, syntax or pragmatics of language within
> the Shannon framework.
>

This is what Wittgenstein said. In one of Pedro's ten points the same
question was raised. The new approach uses the same bricks and mortar but
builds an intertwined method of counting.
The relation between the amount and variety of signals IS information.
Insofar, of course, Shannon is absolutely right. The results of ONE way of
counting are correct and precise. They appear useless to us, because to be
useful they need the OTHER way of counting to stand and move alongside them.
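
On the point that Shannon's ONE way of counting is correct and precise: the
quoted definition ("the minimum average message length, in bits") can be made
concrete with a minimal sketch. Below, H is computed from the amount and
variety of the signals in an arbitrary sample string and compared with the
average length of a Huffman code built from the same counts; the average
length of any uniquely decodable code is bounded below by H.

    import heapq
    from collections import Counter
    from math import log2

    def entropy_bits_per_symbol(text):
        """Shannon's H from the amount (counts) and variety (kinds) of signals."""
        counts = Counter(text)
        n = len(text)
        return -sum((c / n) * log2(c / n) for c in counts.values())

    def huffman_code_lengths(text):
        """Code length per symbol of a Huffman code built from the symbol counts."""
        counts = Counter(text)
        if len(counts) == 1:
            return {next(iter(counts)): 1}
        # heap items: (weight, tie-breaker, {symbol: depth so far})
        heap = [(c, i, {s: 0}) for i, (s, c) in enumerate(counts.items())]
        heapq.heapify(heap)
        tie = len(heap)
        while len(heap) > 1:
            w1, _, d1 = heapq.heappop(heap)
            w2, _, d2 = heapq.heappop(heap)
            merged = {s: d + 1 for s, d in {**d1, **d2}.items()}
            heapq.heappush(heap, (w1 + w2, tie, merged))
            tie += 1
        return heap[0][2]

    text = "abracadabra"                      # arbitrary sample message
    counts, n = Counter(text), len(text)
    lengths = huffman_code_lengths(text)
    average = sum(counts[s] * lengths[s] for s in counts) / n
    print("entropy        :", entropy_bits_per_symbol(text), "bits/symbol")
    print("Huffman average:", average, "bits/symbol")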
The logical problem - and, for some, a psychological hurdle - lies in having
to give up the EXCLUSIVITY of thinking and counting correctly as we count
the similarities.
Notwithstanding the foreground nature of the foreground, it is lucrative and
practical and useful to count in units of the background, too.
The truth, rightness, correctness, might and power of the calculations based
on an assumed similarity - and fusibility - of the elementary logical unit
remain unchallenged as we learn to use a bit more brain while learning to
count. It is not a sacrilege to count in units of Satan. The complete
opposite of what we consider fundamental exists as well, and it has just as
many arguments for its logical legitimacy as the traditional view, which
says: ah, forget about the differences and the negations; similarity and
continuity are what we count, all the rest is just plain background.

_______________________________________________
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis
