# Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

On 28.01.2012 23:26 meekerdb said the following:

> On 1/27/2012 11:47 PM, Evgenii Rudnyi wrote:
>
>> ...
>>
>> You disagree that engineers do not use thermodynamic entropy
> Yes. I disagreed that information "has nothing to do with
> thermodynamic entropy", as you wrote above. You keep switching
> formulations. You write X and ask if I agree. I disagree. Then you
> write Y and claim that I disagreed with Y. There's a difference
> between "X is used in place of Y" and "X has nothing to do with Y".
A good suggestion. It may well be that I have expressed my thoughts
unclearly; sorry for that. Still, I think that my examples show that:
1) There is information that engineers employ.

2) There is the thermodynamic entropy.

3) Numerical values in 1) and 2) are not related to each other.

Otherwise, I would appreciate it if you could express, in your own
words, the relationship between the information that engineers use and
the thermodynamic entropy, as this is the question that I would like
to understand.
I understand what you say about the number of microstates. I do not
understand, though, how they are related to the information employed
by engineers. I would be glad to hear your comment on that.
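As an aside, one textbook way to attach a thermodynamic number to a bit count (a sketch of the standard Boltzmann–Shannon correspondence, not necessarily what either correspondent has in mind) is to assign each bit an entropy of k_B ln 2. For the 62.5 MB Millipede chip quoted below this gives:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
n_bits = 5e8        # 62.5 MB = 500 megabits

# Each independent two-state bit contributes k_B * ln(2) of entropy.
S = k_B * math.log(2) * n_bits  # J/K
print(f"S = {S:.2e} J/K")       # ~4.8e-15 J/K, negligible on macroscopic scales
```

The tiny result illustrates why a chip's information capacity never shows up in an engineer's thermodynamic bookkeeping.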
Evgenii

>> but you have not shown yet how information in engineering is
>> related to the thermodynamic entropy. From the Millipede example
>>
>> http://en.wikipedia.org/wiki/Millipede_memory
>> "The earliest generation millipede devices used probes 10
>> nanometers in diameter and 70 nanometers in length, producing pits
>> about 40 nm in diameter on fields 92 µm x 92 µm. Arranged in a 32 x
>> 32 grid, the resulting 3 mm x 3 mm chip stores 500 megabits of data
>> or 62.5 MB, resulting in an areal density, the number of bits per
>> square inch, on the order of 200 Gbit/in²."
>>
>> It would be much easier to understand you if you said to which
>> thermodynamic entropy the value of 62.5 MB in Millipede corresponds.
> The Shannon information capacity is 5e8 bits. The thermodynamic
> entropy depends on the energy used to switch a memory element. I'd
> guess it must correspond to at least a few tens of thousands of
> electrons at 9 V, so
>
> S ~ [5e8 * 9e4 eV]/[8.6e-5 eV/degK * 300 degK] ~ 1.7e15
>
> So the total entropy is about 1.7e15 + 5e8, and the information
> portion is numerically (but not functionally) negligible compared to
> the thermodynamic.
>
> Brent
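Brent's back-of-envelope can be checked in a few lines (the electron count per bit and the 9 V figure are his stated guesses, not Millipede specifications):

```python
bits = 5e8               # Shannon capacity: 500 megabits
electrons_per_bit = 1e4  # "a few tens of thousands" -- Brent's guess
volts = 9.0              # assumed switching voltage, V
k_B = 8.6e-5             # Boltzmann constant, eV/K
T = 300.0                # room temperature, K

energy_eV = bits * electrons_per_bit * volts  # total switching energy, eV
S_over_k = energy_eV / (k_B * T)              # entropy in units of k_B

print(f"S/k_B ~ {S_over_k:.1e}")  # ~1.7e15, dwarfing the 5e8 bits of Shannon capacity
```

Running the numbers confirms the point of the estimate: the dissipative entropy of writing the memory is some six orders of magnitude larger than its information capacity expressed in the same units.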
>> The only example of Thermodynamic Entropy == Information so far
>> from you was the work on a black hole. However, as far as I know,
>> there is no theory yet to describe a black hole, since on the one
>> side you need gravitation and on the other side quantum effects,
>> and a theory uniting the two does not seem to exist yet.
>>
>> Evgenii
>>>> My example would be Millipede:
>>>>
>>>> http://en.wikipedia.org/wiki/Millipede_memory
>>>>
>>>> I am pretty sure that when IBM engineers developed it, they did
>>>> not employ the thermodynamic entropy to estimate its information
>>>> capacity. Also, an increase in temperature would destroy the
>>>> information saved there.
>>>>
>>>> Well, I might indeed be deliberately obtuse, yet with the only
>>>> goal of reaching a clear definition of what information is. Right
>>>> now I would say that there is information in engineering and
>>>> information in physics, and they are different. The first I
>>>> roughly understand; the second I do not.
>>>>
>>>> Evgenii
>>>
>>> Brent
--
You received this message because you are subscribed to the Google Groups
"Everything List" group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to