On Sun, Oct 6, 2019, 2:59 AM Ben Goertzel <[email protected]> wrote:

> Matt,
>
> > It probably takes a few hundred bits to describe the laws of physics.
>
> Hmm, that seems very few, just taking a look at the Standard Model and
> General Relativity right now...
Yudkowsky and Wolfram seem to think so. I don't know their exact reasoning, but roughly 400 bits would suffice to describe the 40 or so free parameters of string theory, from which all of the fundamental physical constants and the properties of the fundamental particles could in principle be derived. That is not to say we know how, or that string theory is the simplest theory that yields both quantum mechanics and general relativity. We do know that simple programs can produce seemingly complex output, and that there is no general procedure for inverting the process. We are still searching.

> What sort of machine are you assuming is interpreting these bits?

Algorithmic complexity is language dependent up to a constant, of course. Practical programming languages are optimized for the things programmers typically want to do; I realize it would take a huge program to output "hello world" on Wolfram's 2-state, 3-color universal Turing machine or in Conway's Game of Life. But suppose you enumerated all possible universes and ran the n'th one for n steps. Ours would take n ~ 10^120 steps, which corresponds to about 400 bits. What we don't know is how far the universe extends beyond the 13.8 billion light year event horizon. If it is much larger than the observable part, then the probability of a planet evolving intelligent life could be much smaller than the roughly 10^-24 you would infer from our being the only known example among the ~10^24 planets within the horizon.

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: https://agi.topicbox.com/groups/agi/T8eabd59f2f06cc50-M560baa419ac25fa52147f2f2
Delivery options: https://agi.topicbox.com/groups/agi/subscription
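A quick sanity check on the step-count-to-bit-count conversion above (my own sketch, not part of the original argument): an index of size n ~ 10^120 in an enumeration of universes takes about log2(10^120) bits to write down, which comes out just under 400.

```python
import math

# Program index / step count attributed to our universe in the
# "enumerate all universes, run the n'th for n steps" thought experiment
n = 10**120

# Bits needed to write down an index of that size: log2(n)
bits = 120 * math.log2(10)   # exact value of log2(10**120)

print(round(bits, 1))        # about 398.6, i.e. "about 400 bits"
```

So 10^120 steps and "about 400 bits" are the same figure expressed in different units: 2^400 ≈ 10^120.4.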
