True, the contestants are given extra information. They know
ahead of time that the words make up the name of a place or a common
saying, for example. That helps decrease the entropy considerably. They
also know the exact number of characters in the final answer and are able
to probe
Leichter, Jerry wrote:
| A couple of questions. How did you come up with the ~2.5 bits per
| word? Would a longer word have more bits?
He misapplied an incorrect estimate! :-) The usual estimate - going
back to Shannon's original papers on information theory, actually - is
that
Allen wrote:
| Now take the phrase "Mary had a lamb, and its fleece was as white as
| snow." Not counting the quotes it is 52 characters and has both upper
| and lower case characters, spaces and two specials, for a total key
| space of 55. How big would the rainbow table be to contain that? How long
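As a rough sanity check on the arithmetic in that question, here is a quick sketch (the symbol count and length are taken straight from the quoted example; note this is the raw brute-force key space, not the much lower entropy of an actual English sentence):

```python
import math

# Figures from the quoted example (assumptions, not measurements):
# 26 lower + 26 upper + space + 2 specials = 55 symbols, 52 characters.
symbols = 55
length = 52

keyspace = symbols ** length          # number of distinct strings of that form
bits = length * math.log2(symbols)    # equivalent key length in bits

print(f"key space ~= 10^{length * math.log10(symbols):.0f}")
print(f"equivalent key length: {bits:.0f} bits")
```

A table with on the order of 10^90 entries is far beyond any conceivable storage, which is why rainbow tables target short, low-entropy inputs rather than full passphrases.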