On Sun, Jan 25, 2015  'Chris de Morsella' via Everything List <
[email protected]> wrote:

> The very simple operation of defining the square root of two generates an
> -- (as far as we know infinitely extending) – number stream that is
> characterized by a high degree of randomness.
>

That would only be pseudorandom. Algorithms are deterministic, and random
means an event without a cause. There exists a short algorithm that can
produce the decimal digits of the square root of 2 to any desired degree of
precision, so it can't be random. Pi also has such an algorithm, and so does
e, and so does any real number you can name, so none of them can be random.
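To make that concrete, here is a minimal sketch of such a short algorithm,
using Python's integer square root (the helper name `sqrt2_digits` is mine,
purely for illustration):

```python
import math

def sqrt2_digits(n):
    """Return the first n decimal digits of sqrt(2) after the leading '1.'.

    math.isqrt(2 * 10**(2*n)) equals floor(sqrt(2) * 10**n), so its
    decimal string is '1' followed by the first n digits we want.
    The whole infinite "random-looking" stream is pinned down by these
    few deterministic lines, which is why it is only pseudorandom.
    """
    s = str(math.isqrt(2 * 10 ** (2 * n)))
    return s[1:]  # drop the leading '1'

print("1." + sqrt2_digits(30))
```

The same trick works for any precision: a few lines of fixed code stand in
for arbitrarily many digits.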

However, Turing proved in 1936 that the vast majority of numbers on the
real number line have no name and no algorithm that can produce them; or
rather, the only "algorithm" that could produce a truly random number would
be just as long as the number itself. For example, the only "algorithm" that
could produce a sequence of truly random digits would just be a list of
those digits. That's why no program can compress random white noise. To
produce true randomness you'd need a physical random number generator;
something involving radioactive decay, or photons of light hitting a
polarizing filter, would do the trick.
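The incompressibility point is easy to check empirically. A sketch, using
os.urandom as a stand-in for a physical entropy source (it is an OS-level
randomness pool, not radioactive decay, so treat this as illustrative):

```python
import os
import zlib

# 100,000 bytes of unpredictable data vs. 100,000 bytes of regular data.
random_bytes = os.urandom(100_000)
structured = b"0123456789" * 10_000

# zlib cannot shrink the unpredictable stream at all (it even adds a
# few bytes of overhead), while the regular stream collapses to a
# tiny fraction of its size.
print(len(zlib.compress(random_bytes)))  # roughly 100,000 or slightly more
print(len(zlib.compress(structured)))    # a few hundred bytes
```

No compressor can do better on the random input on average: a program that
reliably shortened it would itself be the "short algorithm" that true
randomness, by definition, does not have.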

Turing also proved that while the computable numbers are denumerable, that
is countably infinite, the non-computable (random) numbers belong to the
next higher class of infinity. So if you had a dart with an infinitely
sharp point and threw it at the real number line, there would be a 100%
chance of hitting a non-computable number and a 0% chance of hitting a
computable number.
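The dart claim can be made precise with a one-line measure argument (a
standard fact, added here for completeness): list the computable numbers as
x_1, x_2, x_3, ... and, for any e > 0, cover each x_n with an interval of
length e/2^n. The total length of the cover is

    e/2 + e/4 + e/8 + ... = e

which can be made as small as you like, so the computable numbers form a set
of measure zero, and a uniformly random point on the line misses them with
probability 1.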

By the way, I think Alan Turing was one of the giants of 20th-century
science. The current movie "The Imitation Game" is about his non-scientific
but very important work breaking the German Enigma code during the Second
World War. I loved the movie.

> Now say you are an observer from a parallel universe who somehow gets a
> kind of sample set through some absurd imaginary portal that deluges the
> poor fellow with reams upon reams of seemingly random data
>

By "seemingly random" I assume you mean it came from an algorithm.

> each one of them, let’s give it a data dimension say a KB, MB, GB doesn’t
> matter, but constrained to a given chunk or window size. These
> inter-dimensional data packets unfortunately arrive to our observer in a
> scrambled order
>

How is the data stream scrambled: by another algorithm, or by a physical
random process such as radioactive decay?


> The data deluge arrives for eternity… but will the recipient ever be able
> to derive the function from the data.
>

In other words, will the recipient ever be able to predict what the next
digit will be? If you had a large enough sample and true randomness was not
used, then you could, at least in theory, predict what the next digit will
be (assuming you don't run up against the limit on the number of
computations the universe can perform), but if true physical randomness was
involved at any point then it would be hopeless.
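As a toy illustration of the "in theory" case: if the observer happens to
know the stream came from some family of algorithms, a long enough prefix
identifies the generator, after which every later digit is predictable. The
family here (digits of sqrt(k) for small k) and all the helper names are my
own invented example, not anything from the original post:

```python
import math

def digits_of_sqrt(k, n):
    """First n significant decimal digits of sqrt(k), as a string."""
    return str(math.isqrt(k * 10 ** (2 * n)))[:n]

def identify(prefix, max_k=100):
    """Search the candidate family for the generator matching the prefix."""
    n = len(prefix)
    for k in range(2, max_k):
        if digits_of_sqrt(k, n) == prefix:
            return k
    return None  # prefix didn't come from this family

# 20 digits arrive from an "unknown" source (really sqrt(2)).
observed = digits_of_sqrt(2, 20)
k = identify(observed)                   # recovers k = 2
next_digit = digits_of_sqrt(k, 21)[-1]   # predicts digit 21 exactly
```

With a truly random stream no such search can succeed, because (as above)
there is no generator shorter than the stream itself to find.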

  John K Clark
