Hi Jesse and Bruno:
To consolidate my response:
Yes indeed. Most books give different definitions of "axiomatic" and "recursively enumerable", but there is
a theorem by Craig which shows that for (most) theories the notions are equivalent. (See Boolos and Jeffrey for
a proof of Craig's theorem.)
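Craig's trick can be sketched concretely. Given any effective enumeration of a theory's axioms, replace the n-th axiom A by a conjunction of n+1 copies of A; the padded set proves exactly the same theorems, yet membership in it is decidable (count the conjuncts, then run the enumerator that many steps). A minimal Python sketch - the enumeration and the string encoding of formulas are hypothetical stand-ins, not any standard library:

```python
from itertools import islice

def enumerate_axioms():
    # Hypothetical r.e. enumeration of a theory's axioms (a stand-in:
    # any computable generator of formula strings would do here).
    for n in range(10):
        yield f"A{n}"

def padded(axiom, n):
    # Craig's trick: the n-th enumerated axiom becomes a conjunction
    # of n+1 copies of itself, e.g. "A2 & A2 & A2".
    return " & ".join([axiom] * (n + 1))

def craig_axioms():
    # The new, deductively equivalent axiom set.
    for n, a in enumerate(enumerate_axioms()):
        yield padded(a, n)

def is_craig_axiom(formula):
    # Decidable membership test: split the conjunction, check all
    # conjuncts are identical, then run the enumerator exactly n+1
    # steps and verify the conjunct appears at stage n.
    parts = formula.split(" & ")
    if len(set(parts)) != 1:
        return False
    n = len(parts) - 1
    stage = list(islice(enumerate_axioms(), n + 1))
    return len(stage) == n + 1 and stage[n] == parts[0]
```

The point is that `is_craig_axiom` always halts, whereas deciding membership in the original enumerated set might not.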
Also, consistency is a purely syntactic notion, at least for theories having a symbol for "falsity" or having a negation connective. A theory (or a theorem-proving machine) is consistent iff there is no derivation in it of the "falsity" (or of a proposition and its negation). Now, for the important class of first-order logical theories (like Peano Arithmetic, Zermelo-Fraenkel set theory, etc.) the completeness theorem of Gödel (note: the completeness theorem, not the incompleteness one!) gives that being consistent is equivalent to having a model.
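For the propositional fragment this equivalence can be made concrete: a finite propositional theory is consistent exactly when some truth assignment satisfies all of its axioms. A toy Python model-searcher, assuming (purely for illustration) that axioms are written with Python's `and`/`or`/`not` over invented atom names:

```python
import re
from itertools import product

KEYWORDS = {"and", "or", "not", "True", "False"}

def atoms(theory):
    # Collect the propositional atom names occurring in the axioms.
    names = set()
    for f in theory:
        names |= set(re.findall(r"[A-Za-z_]\w*", f)) - KEYWORDS
    return sorted(names)

def holds(formula, valuation):
    # Evaluate one axiom under a truth assignment; axioms are encoded
    # as Python boolean expressions (an illustrative convention only).
    return eval(formula, {"__builtins__": {}}, dict(valuation))

def find_model(theory):
    # Brute-force search over all 2^n truth assignments: the theory is
    # consistent (has a model) iff some assignment satisfies every axiom.
    names = atoms(theory)
    for values in product([False, True], repeat=len(names)):
        valuation = dict(zip(names, values))
        if all(holds(f, valuation) for f in theory):
            return valuation
    return None  # no model: the theory is inconsistent

print(find_model(["p or q", "not q"]))  # a model: {'p': True, 'q': False}
print(find_model(["p", "not p"]))       # None - no model exists
```

Of course, for first-order theories there is no such brute-force decision procedure; that is exactly where Gödel's completeness theorem does real work.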
The All contains all information [is this controversial?] but that must add up to no net information content if my total system is to have no information. The small amount of external information necessary to define the All is balanced to zero net information by the other components of the system.
I do not think that all information adding up to no net information is controversial.
Further, in the majority of positions I am aware of on this list - including my own - there is a dynamic within the All [computer simulations etc.] resulting in evolving universes.
I give a justification for that dynamic based on the incompleteness of one of the components of my system - the Nothing.
Now to maintain a zero net information within the All this dynamic must be devoid of selection and plan.
I used to think that the solution was to say the dynamic was random. I now think that this is not correct. Random, after all, is a selection in its own right and pays attention to past behavior. But to say that the dynamic is inconsistent with its past seems to retire the problem.
To me, to say that the All is inconsistent carries benefits, not disadvantages, when explaining our universe.
I am not a mathematician by formal training but it seems to me that there may be additional justification for my position in what Bruno says below.
But I do think, and perhaps this is related to Hal's intuition (I'm not sure), that any theory which tries to capture things that are too big will be inconsistent. The classical example is the naive idea of a set, which leads to Frege's theory, and this was shown inconsistent by Russell. Church's logical theory based on his lambda calculus was inconsistent, etc. What is a little bit amazing is Hal's insistence that the ALL should be inconsistent. This is not an uninteresting idea, but it is a risky one which needs to be handled with care (as in paraconsistent logic, perhaps?).
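The Russell argument Bruno mentions is short enough to state. Assuming naive (unrestricted) comprehension - for any property there is a set of exactly the things with that property - form the set of all sets that are not members of themselves; asking whether it is a member of itself yields a contradiction either way:

```latex
% Naive comprehension: for any property P(x) there is a set \{x \mid P(x)\}.
R := \{\, x \mid x \notin x \,\}
% Instantiating the membership condition at x = R:
R \in R \;\Longleftrightarrow\; R \notin R \qquad \text{(contradiction)}
```

So the "too big" theory does not merely fail to settle some question; it proves everything, which is why an inconsistent All needs something like paraconsistency to remain usable.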
As to the "Laws of Logic", I do not see that a kernel of information, as I call them, requires the presence of anything of the sort in order to be. The "Laws of Logic" [in my opinion] are rather a way to progressively decompress the information in such a kernel. Turing said that to "prove" is the same as to "compute". So I seem to be in good company. To us "compute" is a process and thus assumes that time exists. This assumption is today suspect. Why should we impose it on other universes?