Re: The physical limits of computation

2024-01-22 Thread John Clark
On Sun, Jan 21, 2024 at 9:37 PM Brent Meeker  wrote:


> *> His* [Shannon's] *measure of information is relative to a channel and
> depends on the counterfactual number of messages that could be sent.
> You're presuming that each letter could have been one of 25 other letters.
> But there are only seven different letters in "tamaontietoa" so maybe only
> 3 bits are needed for each one. *
>

OK, that is a valid point: Shannon says information can be thought
of as a measure of surprise, and if there are only 7 possibilities rather
than 26 there is less surprise when you see each new letter. However,
you were wrong when you said Shannon would not realize that "tamaontietoa"
is information.


> *> Incidentally "tama on tietoa" is Finnish for "this is information".*
>

I knew nothing about Finnish except that it is one of the very few
languages spoken in Europe, along with Hungarian and Basque, that are not
Indo-European; so I used Google to confirm that you are right: "tama on
tietoa" does indeed mean "this is information". I was surprised when you
told me that; I didn't know it before, so Shannon would say you gave me one
bit of new information. Thanks to that extra bit of information I was able
to conclude that Finnish requires not 7 but at least 8 characters, the 7
letters plus the space character, because neither Google nor Bard had any
idea what "tamaontietoa" meant. I can also conclude that "tamaontietoa" is
NOT Finnish, although I can't rule out the possibility that it is Martian.

 John K Clark    See what's on my new list at Extropolis


-- 
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to everything-list+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/everything-list/CAJPayv1nVfYkPhEG7oyNYvYA7i_3KDZB1_xOW3a959OGv6jpsA%40mail.gmail.com.


Re: The physical limits of computation

2024-01-21 Thread Brent Meeker



On 1/21/2024 5:40 PM, John Clark wrote:
On Sun, Jan 21, 2024 at 7:03 PM Brent Meeker  
wrote:



>>> If I write "tamaontietoa" is it information or gibberish?
Is it about something?


>> There's no reason it couldn't be both, Shannon would say it's
definitely information,

> No he wouldn't.


Of course Shannon would say "tamaontietoa" contains information, and
he can even tell you how much. There are 26 letters in the alphabet, so
5 bits (2^5 = 32) is more than enough to specify a letter; there are 12
letters in your example, so "tamaontietoa" contains 60 bits of information.
No.  Since 5 bits is more than enough, you've only shown that it
contains less than 60 bits.


Personally I don't think the information that "tamaontietoa" contains is
very interesting but that's just me, Shannon makes no value judgments.


You have apparently never read Shannon.  His measure of information
is relative to a channel and depends on the counterfactual number of
messages that could be sent.  You're presuming that each letter could
have been one of 25 other letters.  But there are only seven different
letters in "tamaontietoa", so maybe only 3 bits are needed for each one.
And maybe in this language "m" is always preceded and followed by "a", so
there are only 10 symbols in the message instead of 12.  And so on.  You
surely know that English is about 50% redundant, an estimate that
Shannon himself made.  So it turns out you have to know the channel to
calculate the information.
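To make the channel-dependence concrete, here is a minimal Python sketch (an editorial illustration, not part of the original exchange; the second estimate uses the message's own letter frequencies as a crude stand-in for knowing the channel's true statistics):

```python
import math
from collections import Counter

msg = "tamaontietoa"
n = len(msg)

# Channel A: each position could be any of 26 equally likely letters.
uniform_bits = n * math.log2(26)          # ~56.4 bits for 12 letters

# Channel B: letters distributed like the frequencies in the message
# itself (only 7 distinct letters occur, so fewer bits per symbol).
freqs = Counter(msg)
per_symbol = -sum((c / n) * math.log2(c / n) for c in freqs.values())
empirical_bits = per_symbol * n           # ~31.5 bits

print(f"26-letter channel:        {uniform_bits:.1f} bits")
print(f"message-frequency channel: {empirical_bits:.1f} bits")
```

Same twelve letters, two different channels, two different information contents: you have to know the channel to calculate the information.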


Incidentally "tama on tietoa" is Finnish for "this is information".

Brent


> Shannon information is relative to the possible messages.


Yes, and there are 26^12 possible 12-letter strings, and
"tamaontietoa" is one of them.


John K Clark    See what's on my new list at Extropolis 












Re: The physical limits of computation

2024-01-21 Thread Jason Resch
On Sat, Jan 20, 2024 at 1:46 AM 'scerir' via Everything List <
everything-list@googlegroups.com> wrote:

> Interesting quote about all that (and information)
> Frank Wilczek: "Information is another dimensionless quantity that plays a
> large and increasing role in our description of the world. Many of the
> terms that arise naturally in discussions of information have a distinctly
> physical character. For example we commonly speak of density of information
> and flow of information. Going deeper, we find far-reaching analogies
> between information and (negative) entropy, as noted already in Shannon's
> original work. Nowadays many discussions of the microphysical origin of
> entropy, and of foundations of statistical mechanics in general, start from
> discussions of information and ignorance. I think it is fair to say that
> there has been a unification fusing the physical quantity (negative)
> entropy and the conceptual quantity information. A strong formal connection
> between entropy and action arises through the Euclidean, imaginary-time
> path integral formulation of partition functions. Indeed, in that framework
> the expectation value of the Euclideanized action essentially is the
> entropy. The identification of entropy with Euclideanized action has been
> used, among other things, to motivate an algebraically simple (but deeply
> mysterious) "derivation" of black hole entropy. If one could motivate the
> imaginary-time path integral directly and insightfully, rather than
> indirectly through the apparatus of energy eigenvalues, Boltzmann factors,
> and so forth, then one would have progressed toward this general prediction
> of unification: Fundamental action principles, and thus the laws of
> physics, will be re-interpreted as statements about information and its
> transformations." http://arxiv.org/pdf/1503.07735v1.pdf
> 
>

Interesting quote and reference, I appreciate them!

I especially like: "the laws of physics, will be reinterpreted as
statements about information and its transformations."

I think I will include that in my write up. :-)

Jason


>
>
> On 20/01/2024 01:10 +01 Jason Resch wrote:
>
>
> I put together a short write up on the relationship between physics,
> information, and computation, drawing heavily from the work of Seth Lloyd
> and others:
>
>
> https://drive.google.com/file/d/124q3ni51E3sf9kMC_sNKgP3ikcl8ou1t/view?usp=sharing
>
> I thought it might be interesting to members of this list who often debate
> whether our reality is fundamentally computational/informational.
>
> Jason
>
>
> --
> You received this message because you are subscribed to the Google Groups
> "Everything List" group.
> To unsubscribe from this group and stop receiving emails from it, send an
> email to everything-list+unsubscr...@googlegroups.com.
> To view this discussion on the web visit
> https://groups.google.com/d/msgid/everything-list/CA%2BBCJUgRo-xNors%2BWZbDVpboT3QwiHC_NS24_uQ9_QkiTd3fyQ%40mail.gmail.com
> .
>
>
> --
> You received this message because you are subscribed to the Google Groups
> "Everything List" group.
> To unsubscribe from this group and stop receiving emails from it, send an
> email to everything-list+unsubscr...@googlegroups.com.
> To view this discussion on the web visit
> https://groups.google.com/d/msgid/everything-list/1890352135.1654730.1705733200088%40mail1.libero.it
> 
> .
>



Re: The physical limits of computation

2024-01-21 Thread John Clark
On Sun, Jan 21, 2024 at 7:03 PM Brent Meeker  wrote:

>>> * If I write "tamaontietoa" is it information or gibberish?  Is it
>> about something? *
>>
>
> >> There's no reason it couldn't be both,  Shannon would say it's
> definitely information,
>
> * >No he wouldn't.*
>

Of course Shannon would say "tamaontietoa" contains information, and he
can even tell you how much. There are 26 letters in the alphabet, so 5 bits
(2^5 = 32) is more than enough to specify a letter; there are 12 letters in
your example, so "tamaontietoa" contains 60 bits of information. Personally I
don't think the information that "tamaontietoa" contains is very interesting
but that's just me, Shannon makes no value judgments.


 > *Shannon information is relative to the possible messages. *


Yes, and there are 26^12 possible 12-letter strings, and "tamaontietoa" is
one of them.
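The counting is easy to verify; a quick sketch (note that singling out one string among 26^12 equally likely possibilities takes log2(26^12) ≈ 56.4 bits, slightly under the 5-bits-per-letter figure of 60):

```python
import math

n_strings = 26 ** 12               # all 12-letter strings over a 26-letter alphabet
bits_exact = math.log2(n_strings)  # bits needed to single out one of them

print(n_strings)                   # 95428956661682176
print(f"{bits_exact:.1f}")         # 56.4, a bit under the 5 * 12 = 60 upper bound
```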

 John K Clark    See what's on my new list at Extropolis








Re: The physical limits of computation

2024-01-21 Thread Brent Meeker



On 1/21/2024 12:00 PM, John Clark wrote:



On Sun, Jan 21, 2024 at 2:31 PM Brent Meeker  
wrote:


> If I write "tamaontietoa" is it information or gibberish?  Is it
about something?


There's no reason it couldn't be both, Shannon would say it's
definitely information,

No he wouldn't.

but he doesn't care if that information contains a great profundity or
is just gibberish, because that is a matter of opinion and there is no
disputing matters of taste. And if you're designing the technology to
send messages through a fiber optic line, mathematically you don't need
to know what the messages will be saying or whether the information in
them is saying anything important; you just need to know how big it is,
and Shannon can tell you that.


No, Shannon information is relative to the possible messages.  You have 
to know what the possible messages are to quantify the Shannon information.



> All the science of information is about encoding and decoding;
it is not only substrate independent, it is content independent.


Yes, and Shannon was the first one to realize that is not a bug, it's a
feature.  Imagine the chaos that would result if Internet routers had
to understand the information and determine whether it was important
or just gibberish before they could transmit it!


John K Clark    See what's on my new list at Extropolis 









Re: The physical limits of computation

2024-01-21 Thread Brent Meeker



On 1/21/2024 5:15 AM, John Clark wrote:
On Sat, Jan 20, 2024 at 7:27 PM Brent Meeker  
wrote:


> The problem with this is that information, like complexity, has
no physically definite operational meaning.  You can't go into the
lab and ask what's the information content of "this".


In 1948 Claude Shannon gave us an operational definition of
information, the amount of uncertainty reduced by a message, and it is
measured in bits.
And Shannon's definition requires that the possible messages be
predefined.


There is also a thermodynamic definition for information, the amount 
of entropy that is reduced in a given system, and it is also measured 
in bits. The two definitions work harmoniously together.
Again, the thermodynamic definition depends on what variables will be 
ignored.


So if you know the encoding algorithm you can always determine how 
much information something has, or at least the maximum amount of 
information a message has the potential to hold. For example, we know 
from experiment that the human genome contains 3 billion base pairs, 
and we know there are 4 bases, so each base can represent 2 bits and 
there are 8 bits per byte; therefore the entire human genome only has 
the capacity to hold 750 MB of information; that's about the amount of 
information you could fit on an old-fashioned CD, not a DVD, just a 
CD. The true number must be considerably less than that because the 
human genome contains a huge amount of redundancy, 750 MB is just the 
upper bound. Incidentally that's why I now think the singularity is 
likely to happen sometime within the next 5 years, one year ago, 
before it became obvious that a computer had passed the Turing Test, I 
would've said 20 to 30 years.
A good example, proving my point.  A lot, maybe even a majority, of the
human genome is junk and doesn't code for anything, and you can only
know this by seeing how it interacts in a cell.  Its "information" is
context dependent, not inherent.





I think we can be as certain as we can be certain of anything that it 
should be possible to build a seed AI that can grow from knowing 
nothing to being super-intelligent, and the recipe for building such a 
thing must be less than 750 MB, a *LOT* less.

It takes a womb and all that is needed to support a womb.

After all Albert Einstein went from understanding precisely nothing in 
1879 to being the first man to understand General Relativity in 1915,
He understood general relativity by absorbing information from
Minkowski, Riemann, Maxwell, Lorentz, and Grossmann... not just from his
genome.


Brent


and the human genome only contains 750 megs of information, and yet
that is enough information to construct an entire human being, not just
a brain. So whatever algorithm Einstein used to extract information
from his environment, it must have been pretty simple, much much
less than 750 megs. That's why I've been saying for years that
super-intelligence could be achieved just by scaling things up; no new
scientific discovery was needed, just better engineering, although I
admit I was surprised how little scaling up turned out to be required.


John K Clark    See what's on my new list at Extropolis 








Re: The physical limits of computation

2024-01-21 Thread John Clark
On Sun, Jan 21, 2024 at 2:31 PM Brent Meeker  wrote:

> * If I write "tamaontietoa" is it information or gibberish?  Is it
> about something? *
>

There's no reason it couldn't be both. Shannon would say it's definitely
information, but he doesn't care if that information contains a great
profundity or is just gibberish, because that is a matter of opinion and
there is no disputing matters of taste. And if you're designing the
technology to send messages through a fiber optic line, mathematically you
don't need to know what the messages will be saying or whether the
information in them is saying anything important; you just need to know
how big it is, and Shannon can tell you that.


>
> * > All the science of information is about encoding and decoding; it is
> not only substrate independent, it is content independent.*
>

Yes, and Shannon was the first one to realize that is not a bug, it's a
feature.  Imagine the chaos that would result if Internet routers had to
understand the information and determine whether it was important or just
gibberish before they could transmit it!

  John K Clark    See what's on my new list at Extropolis




Re: The physical limits of computation

2024-01-21 Thread Brent Meeker
That assumes that there is something that the physical state is about.
If I write "tamaontietoa" is it information or gibberish?  Is it about
something?  All the science of information is about encoding and
decoding; it is not only substrate independent, it is content independent.


Brent

On 1/20/2024 10:23 PM, 'scerir' via Everything List wrote:


"In some respects, information is a qualitatively different sort of 
entity from all others in
terms of which the physical sciences describe the world. It is not, 
for instance, a function
only of tensor fields on spacetime (as general relativity requires all 
physical quantities to

be), nor is it a quantum-mechanical observable.
But in other respects, information does resemble some entities that 
appear in laws of
physics: the theory of computation, and statistical mechanics, seem to 
refer directly to it
without regard to the specific media in which it is instantiated, just 
as conservation laws
do for the electromagnetic four-current or the energy-momentum tensor. 
We call that the
substrate-independence of information. Information can also be moved 
from one type of
medium to another while retaining all its properties qua information. 
We call this its
interoperability property; it is what makes human capabilities such as 
language and science
possible, as well as biological adaptations that use symbolic codes, 
such as the genetic

code.
Also, information is of the essence in preparation and measurement, 
both of which are
necessary for testing scientific theories. The output of a measurement 
is information; the
input of a preparation includes information, specifying an attribute 
with which a physical

system is to be prepared.
All these applications of information involve abstraction, in that one 
entity is represented
symbolically by another. But information is not abstract in the same 
sense as, say, the set
of all prime numbers, for it only exists when it is physically 
instantiated. So the laws
governing it, like those governing computation – but unlike those 
governing prime
numbers – are laws of physics. In this paper we conjecture what these 
laws are.
Also, despite being physical, information has a counter-factual 
character: an object in a
particular physical state cannot be said to carry information unless 
it could have been in a
different state. As Weaver (1949) put it, this word ‘information’ in 
communication theory relates not so much to what you *do* say, as to 
what you *could* say…." D.Deutsch, Constructor Theory, Arxiv



On 21/01/2024 01:28 +01 Brent Meeker wrote:
The problem with this is that information, like complexity, has no 
physically definite operational meaning.  You can't go into the lab 
and ask what's the information content of "this".


Brent

On 1/19/2024 10:46 PM, 'scerir' via Everything List wrote:


Interesting quote about all that (and information)

Frank Wilczek: "Information is another dimensionless quantity that 
plays a large and increasing role in our description of the world. 
Many of the terms that arise naturally in discussions of information 
have a distinctly physical character. For example we commonly speak 
of density of information and flow of information. Going deeper, we 
find far-reaching analogies between information and (negative) 
entropy, as noted already in Shannon's original work. Nowadays many 
discussions of the microphysical origin of entropy, and of 
foundations of statistical mechanics in general, start from 
discussions of information and ignorance. I think it is fair to say 
that there has been a unification fusing the physical quantity 
(negative) entropy and the conceptual quantity information. A strong 
formal connection between entropy and action arises through the 
Euclidean, imaginary-time path integral formulation of partition 
functions. Indeed, in that framework the expectation value of the 
Euclideanized action essentially is the entropy. The identification 
of entropy with Euclideanized action has been used, among other 
things, to motivate an algebraically simple (but deeply mysterious) 
"derivation" of black hole entropy. If one could motivate the 
imaginary-time path integral directly and insightfully, rather than 
indirectly through the apparatus of energy eigenvalues, Boltzmann 
factors, and so forth, then one would have progressed toward this 
general prediction of unification: Fundamental action principles, 
and thus the laws of physics, will be re-interpreted as statements 
about information and its transformations." 
http://arxiv.org/pdf/1503.07735v1.pdf 
 


On 20/01/2024 01:10 +01 Jason Resch wrote:
I put together a short write up on the relationship between 
physics, information, and computation, drawing heavily from the 
work of Seth Lloyd and others:
https://drive.google.com/file/d/124q3ni51E3sf9kMC_sNKgP3ikcl8ou1t/view?usp=sharing 

I thought it might be 

Re: The physical limits of computation

2024-01-21 Thread John Clark
On Sat, Jan 20, 2024 at 7:27 PM Brent Meeker  wrote:

* > The problem with this is that information, like complexity, has no
> physically definite operational meaning.  You can't go into the lab and ask
> what's the information content of "this".*
>

In 1948 Claude Shannon gave us an operational definition of information,
the amount of uncertainty reduced by a message, and it is measured in bits.
There is also a thermodynamic definition for information, the amount of
entropy that is reduced in a given system, and it is also measured in bits.
The two definitions work harmoniously together.

So if you know the encoding algorithm you can always determine how much
information something has, or at least the maximum amount of information a
message has the potential to hold. For example, we know from experiment
that the human genome contains 3 billion base pairs, and we know there are
4 bases, so each base can represent 2 bits and there are 8 bits per byte;
therefore the entire human genome only has the capacity to hold 750 MB of
information; that's about the amount of information you could fit on an
old-fashioned CD, not a DVD, just a CD. The true number must be
considerably less than that because the human genome contains a huge amount
of redundancy, 750 MB is just the upper bound. Incidentally that's why I
now think the singularity is likely to happen sometime within the next 5
years, one year ago, before it became obvious that a computer had passed
the Turing Test, I would've said 20 to 30 years.
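The genome arithmetic in the paragraph above can be checked directly (an editorial sketch using the same round numbers as the post):

```python
base_pairs = 3_000_000_000       # ~3 billion base pairs, per the post
bits_per_base = 2                # 4 bases -> log2(4) = 2 bits each
total_bits = base_pairs * bits_per_base

megabytes = total_bits / 8 / 1_000_000   # 8 bits per byte, 10^6 bytes per MB
print(megabytes)                 # 750.0 -> the upper bound cited above
```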

I think we can be as certain as we can be certain of anything that it
should be possible to build a seed AI that can grow from knowing nothing to
being super-intelligent, and the recipe for building such a thing must be
less than 750 MB, a *LOT* less. After all Albert Einstein went from
understanding precisely nothing in 1879 to being the first man to
understand General Relativity in 1915, and the human genome only contains
750 megs of information, and yet that is enough information to construct an
entire human being, not just a brain. So whatever algorithm Einstein used to
extract information from his environment, it must have been pretty
simple, much much less than 750 megs. That's why I've been saying for years
that super-intelligence could be achieved just by scaling things up, no new
scientific discovery was needed, just better engineering; although I admit
I was surprised how little scaling up turned out to be required.

  John K Clark    See what's on my new list at Extropolis




Re: The physical limits of computation

2024-01-20 Thread 'scerir' via Everything List
"In some respects, information is a qualitatively different sort of entity from 
all others in
terms of which the physical sciences describe the world. It is not, for 
instance, a function
only of tensor fields on spacetime (as general relativity requires all physical 
quantities to
be), nor is it a quantum-mechanical observable.
But in other respects, information does resemble some entities that appear in 
laws of
physics: the theory of computation, and statistical mechanics, seem to refer 
directly to it
without regard to the specific media in which it is instantiated, just as 
conservation laws
do for the electromagnetic four-current or the energy-momentum tensor. We call 
that the
substrate-independence of information. Information can also be moved from one 
type of
medium to another while retaining all its properties qua information. We call 
this its
interoperability property; it is what makes human capabilities such as language 
and science
possible, as well as biological adaptations that use symbolic codes, such as 
the genetic
code.
Also, information is of the essence in preparation and measurement, both of 
which are
necessary for testing scientific theories. The output of a measurement is 
information; the
input of a preparation includes information, specifying an attribute with which 
a physical
system is to be prepared.
All these applications of information involve abstraction, in that one entity 
is represented
symbolically by another. But information is not abstract in the same sense as, 
say, the set
of all prime numbers, for it only exists when it is physically instantiated. So 
the laws
governing it, like those governing computation – but unlike those governing 
prime
numbers – are laws of physics. In this paper we conjecture what these laws are.
Also, despite being physical, information has a counter-factual character: an 
object in a
particular physical state cannot be said to carry information unless it could 
have been in a
different state. As Weaver (1949) put it, this word ‘information’ in 
communication theory relates not so much to what you *do* say, as to what you 
*could* say…." D.Deutsch, Constructor Theory, Arxiv

> On 21/01/2024 01:28 +01 Brent Meeker wrote:
>  
>  
> The problem with this is that information, like complexity, has no physically 
> definite operational meaning.  You can't go into the lab and ask what's the 
> information content of "this".
> 
> Brent
> 
> On 1/19/2024 10:46 PM, 'scerir' via Everything List wrote:
> 
> > 
> > Interesting quote about all that (and information)
> > 
> > Frank Wilczek: "Information is another dimensionless quantity that plays a 
> > large and increasing role in our description of the world. Many of the 
> > terms that arise naturally in discussions of information have a distinctly 
> > physical character. For example we commonly speak of density of information 
> > and flow of information. Going deeper, we find far-reaching analogies 
> > between information and (negative) entropy, as noted already in Shannon's 
> > original work. Nowadays many discussions of the microphysical origin of 
> > entropy, and of foundations of statistical mechanics in general, start from 
> > discussions of information and ignorance. I think it is fair to say that 
> > there has been a unification fusing the physical quantity (negative) 
> > entropy and the conceptual quantity information. A strong formal connection 
> > between entropy and action arises through the Euclidean, imaginary-time 
> > path integral formulation of partition functions. Indeed, in that framework 
> > the expectation value of the Euclideanized action essentially is the 
> > entropy. The identification of entropy with Euclideanized action has been 
> > used, among other things, to motivate an algebraically simple (but deeply 
> > mysterious) "derivation" of black hole entropy. If one could motivate the 
> > imaginary-time path integral directly and insightfully, rather than 
> > indirectly through the apparatus of energy eigenvalues, Boltzmann factors, 
> > and so forth, then one would have progressed toward this general prediction 
> > of unification: Fundamental action principles, and thus the laws of 
> > physics, will be re-interpreted as statements about information and its 
> > transformations." http://arxiv.org/pdf/1503.07735v1.pdf 
> > 
> > > On 20/01/2024 01:10 +01 Jason Resch wrote:
> > >  
> > >  
> > > I put together a short write up on the relationship between physics, 
> > > information, and computation, drawing heavily from the work of Seth Lloyd 
> > > and others:
> > >  
> > > https://drive.google.com/file/d/124q3ni51E3sf9kMC_sNKgP3ikcl8ou1t/view?usp=sharing
> > >  
> > > I thought it might be interesting to members of this list who often 
> > > debate whether our reality is fundamentally computational/informational.
> 

Re: The physical limits of computation

2024-01-20 Thread Brent Meeker
The problem with this is that information, like complexity, has no 
physically definite operational meaning.  You can't go into the lab and 
ask what's the information content of "this".


Brent

On 1/19/2024 10:46 PM, 'scerir' via Everything List wrote:


Interesting quote about all that (and information)

Frank Wilczek: "Information is another dimensionless quantity that 
plays a large and increasing role in our description of the world. 
Many of the terms that arise naturally in discussions of information 
have a distinctly physical character. For example we commonly speak of 
density of information and flow of information. Going deeper, we find 
far-reaching analogies between information and (negative) entropy, as 
noted already in Shannon's original work. Nowadays many discussions of 
the microphysical origin of entropy, and of foundations of statistical 
mechanics in general, start from discussions of information and 
ignorance. I think it is fair to say that there has been a unification 
fusing the physical quantity (negative) entropy and the conceptual 
quantity information. A strong formal connection between entropy and 
action arises through the Euclidean, imaginary-time path integral 
formulation of partition functions. Indeed, in that framework the 
expectation value of the Euclideanized action essentially is the 
entropy. The identification of entropy with Euclideanized action has 
been used, among other things, to motivate an algebraically simple 
(but deeply mysterious) "derivation" of black hole entropy. If one 
could motivate the imaginary-time path integral directly and 
insightfully, rather than indirectly through the apparatus of energy 
eigenvalues, Boltzmann factors, and so forth, then one would have 
progressed toward this general prediction of unification: Fundamental 
action principles, and thus the laws of physics, will be 
re-interpreted as statements about information and its 
transformations." http://arxiv.org/pdf/1503.07735v1.pdf 
 


On 20/01/2024 01:10 +01 Jason Resch wrote:
I put together a short write-up on the relationship between physics, 
information, and computation, drawing heavily from the work of Seth 
Lloyd and others:
https://drive.google.com/file/d/124q3ni51E3sf9kMC_sNKgP3ikcl8ou1t/view?usp=sharing 

I thought it might be interesting to members of this list who often 
debate whether our reality is fundamentally computational/informational.

Jason

--
You received this message because you are subscribed to the Google 
Groups "Everything List" group.
To unsubscribe from this group and stop receiving emails from it, 
send an email to everything-list+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/everything-list/CA%2BBCJUgRo-xNors%2BWZbDVpboT3QwiHC_NS24_uQ9_QkiTd3fyQ%40mail.gmail.com 
. 


--
You received this message because you are subscribed to the Google 
Groups "Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send 
an email to everything-list+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/everything-list/1890352135.1654730.1705733200088%40mail1.libero.it 
.


--
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to everything-list+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/everything-list/bafbd2d4-e18f-4be3-93fd-ddf8145128ca%40gmail.com.


Re: The physical limits of computation

2024-01-19 Thread 'scerir' via Everything List
Interesting quote about all that (and information)

Frank Wilczek: "Information is another dimensionless quantity that plays a 
large and increasing role in our description of the world. Many of the terms 
that arise naturally in discussions of information have a distinctly physical 
character. For example we commonly speak of density of information and flow of 
information. Going deeper, we find far-reaching analogies between information 
and (negative) entropy, as noted already in Shannon's original work. Nowadays 
many discussions of the microphysical origin of entropy, and of foundations of 
statistical mechanics in general, start from discussions of information and 
ignorance. I think it is fair to say that there has been a unification fusing 
the physical quantity (negative) entropy and the conceptual quantity 
information. A strong formal connection between entropy and action arises 
through the Euclidean, imaginary-time path integral formulation of partition 
functions. Indeed, in that framework the expectation value of the Euclideanized 
action essentially is the entropy. The identification of entropy with 
Euclideanized action has been used, among other things, to motivate an 
algebraically simple (but deeply mysterious) "derivation" of black hole entropy. 
If one could motivate the imaginary-time path integral directly and 
insightfully, rather than indirectly through the apparatus of energy 
eigenvalues, Boltzmann factors, and so forth, then one would have progressed 
toward this general prediction of unification: Fundamental action principles, 
and thus the laws of physics, will be re-interpreted as statements about 
information and its transformations." http://arxiv.org/pdf/1503.07735v1.pdf 
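The entropy–action link Wilczek alludes to can be written schematically (a textbook-level sketch in units hbar = k_B = 1; "essentially is" glosses over the ln Z term and the static-configuration approximation):

```latex
% Partition function as a Euclidean (imaginary-time) path integral,
% with fields periodic in imaginary time with period \beta = 1/T:
Z(\beta) \;=\; \operatorname{Tr} e^{-\beta H}
        \;=\; \int_{\phi(0)=\phi(\beta)} \mathcal{D}\phi \; e^{-S_E[\phi]}
% Thermodynamic entropy obtained from Z:
S \;=\; \left(1 - \beta\,\partial_\beta\right)\ln Z
  \;=\; \ln Z + \beta \langle H \rangle
% For configurations approximately static in imaginary time,
% S_E[\phi] \approx \beta H[\phi], hence
S \;\approx\; \ln Z + \langle S_E \rangle
```

This is the sense in which the expectation value of the Euclideanized action tracks the entropy.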

> On 20/01/2024 01:10 +01 Jason Resch wrote:
>  
>  
> I put together a short write-up on the relationship between physics, 
> information, and computation, drawing heavily from the work of Seth Lloyd and 
> others:
>  
> https://drive.google.com/file/d/124q3ni51E3sf9kMC_sNKgP3ikcl8ou1t/view?usp=sharing
>  
> I thought it might be interesting to members of this list who often debate 
> whether our reality is fundamentally computational/informational.
>  
> Jason 
> 
>  
> 
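For readers who want a number behind the thread's title: Seth Lloyd's "ultimate laptop" estimate, from the work Jason's write-up draws on, follows from the Margolus–Levitin bound of at most 2E/(pi*hbar) orthogonal state changes per second for a system of average energy E. A back-of-envelope check (the 1 kg mass is the standard choice from Lloyd's 2000 paper):

```python
# Back-of-envelope version of Seth Lloyd's "ultimate laptop" bound.
# The Margolus-Levitin theorem limits a system of average energy E to at
# most 2E / (pi * hbar) orthogonal state transitions ("ops") per second.
import math

C = 299_792_458.0        # speed of light, m/s
HBAR = 1.054571817e-34   # reduced Planck constant, J*s

mass_kg = 1.0                    # Lloyd's hypothetical 1 kg laptop
energy_j = mass_kg * C ** 2      # all rest mass counted as available energy
ops_per_sec = 2 * energy_j / (math.pi * HBAR)

print(f"{ops_per_sec:.2e}")      # ≈ 5.4e50 ops per second
```

This is the ceiling set by physics alone; nothing about the bound says how, or whether, such a device could actually be built.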
