On Wednesday, October 02, 2019, at 11:24 AM, James Bowery wrote:
> ANY situation can be one where the most viable _decision_ is to stop the
> search for the simplest explanation and _act_ on the simplest explanation
> you have found _thus far_. This is a consequence of the incomputability of
> Solomonoff Induction in the face of limited resources.
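The resource-bounded point above can be illustrated with a toy anytime search: enumerate hypotheses shortest-first, stop when a fixed evaluation budget runs out, and act on the simplest consistent hypothesis found thus far. This is only a sketch of the idea, not Solomonoff induction proper (which is incomputable); the tiny linear-rule hypothesis space and all names here are hypothetical.

```python
# Toy anytime "induction": search a tiny hypothesis space in order of
# increasing description length; when the budget is spent, act on the
# simplest consistent hypothesis found so far. Purely illustrative --
# the real Solomonoff mixture over all programs is incomputable.
from itertools import count

def hypotheses():
    # Enumerate rules x[n+1] = a*x[n] + b in order of a crude
    # "description length" |a| + |b|, shortest first.
    for size in count(0):
        for a in range(-size, size + 1):
            b_mag = size - abs(a)
            for b in {b_mag, -b_mag}:
                yield (a, b)

def consistent(rule, xs):
    a, b = rule
    return all(a * x + b == y for x, y in zip(xs, xs[1:]))

def best_so_far(xs, budget):
    # Spend at most `budget` evaluations; because enumeration is
    # shortest-first, the first consistent rule is also the simplest.
    for rule, _ in zip(hypotheses(), range(budget)):
        if consistent(rule, xs):
            return rule
    return None  # budget exhausted with nothing consistent found

print(best_so_far([1, 3, 7, 15], budget=1000))  # -> (2, 1), i.e. x[n+1] = 2*x[n] + 1
print(best_so_far([1, 3, 7, 15], budget=1))     # -> None: budget too small
```

The second call shows the "act on what you have thus far" failure mode: with too small a budget, the search returns before any consistent explanation is found, and the decision-maker must fall back on whatever prior it had.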
From my amateurish view, doesn't Gödel incompleteness show that there will
be at least one less-simple future explanation that may or may not be found?
So should the decision to expend more resources searching be based on trust
in the environment's computability?

On Wednesday, October 02, 2019, at 11:24 AM, James Bowery wrote:
> There is an explore/exploit tradeoff. See the prior "issue" with
> "computability" and then compound that with the "irrationality" of the
> valuation function applied during sequential decision theory. How do you
> justify that, outside of the "exploration" provided by evolution?

It seems rationality leads to smaller and smaller search spaces, so you often
have to back out while maintaining both a global and a local perspective.
What produces better results, irrationality or randomness?

On Wednesday, October 02, 2019, at 11:24 AM, James Bowery wrote:
> Not in the way that theologians posing as "social scientists" would have us
> believe. For example, choosing a universal Turing machine as the basis for
> Solomonoff Induction can be, and has been, blown into an argument to abandon
> induction entirely by simply defining one's UTM as that which outputs all
> observations up to the present. The benefit of such theology, posing as
> "social science", is that the theologian, serving his political masters, can
> "scientifically justify" anything they want to do to you.

That's some pretty good insight there. There is flip-flopping between
theologians and "social scientists"…

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: https://agi.topicbox.com/groups/agi/T8eabd59f2f06cc50-Me50f6ea163e897223b1a2246
Delivery options: https://agi.topicbox.com/groups/agi/subscription
