On Sun, 04 Feb 2007 11:10:57 -0500, Pei Wang <[EMAIL PROTECTED]> wrote:

I don't think any intelligent system (human or machine) can achieve
any of the three desiderata, except in trivial cases.

I have no doubt you and Ben are correct on this point. Enormous resources would be required for an ideal version of Jaynes' objective bayesian model of the probabilistic robot, which is one reason why I think it might be important to consider which philosophical interpretation to emulate.

Personally I would be inclined to allow exceptions to Jaynes' second and third desiderata. The reason for compromising the second is easy enough to see: it is simply not always feasible to have and consider all the relevant information before making a decision. Any compromise of the third desideratum (that our AGI must, by some supposed force of objective logic, always represent equivalent states of information with equivalent plausibility assignments) is more controversial.

People of Keynesian/logical persuasion might cry heresy, but I would respond that all is not lost; that these apparent sacrifices still leave us with the perfectly reasonable and coherent subjectivist account of De Finetti. The question then would be how to go about implementing it. I'm a bit skeptical that it can be done, but, unlike you and Ben, I am by no means an expert in the field of AI. Is it possible to program AGI without forcing it to abide by the tenets of objective/logical bayesianism?

Subjectivists like De Finetti and Ramsey define probability as degree of belief, but unlike the objective/logical bayesians they measure it according to an agent's *willingness to act* on said degrees of belief (as opposed to some supposed calculable mental barometer of rationally determined belief separate from the will). Even though I might support the subjectivist programme philosophically, I'm not sure if or how a programmer might get a handle on this subjective 'willingness to act', as distinct from the logical constraints that objective bayesians would already seek to impose.
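For what it's worth, the operational reading above can at least be sketched in code. De Finetti identifies an agent's degree of belief in an event with the price at which it is willing to buy or sell a unit-payoff ticket on that event, and the only constraint is coherence: prices that admit a Dutch book (a set of bets the agent accepts but which guarantees it a sure loss) are irrational. The sketch below is illustrative only; the function name and dictionary representation are my own inventions, not anyone's established API, and it checks only one necessary coherence condition:

```python
def is_coherent(prices: dict[str, float], partitions: list[list[str]]) -> bool:
    """Check a necessary coherence condition on an agent's betting prices.

    For each partition of mutually exclusive, exhaustive events, the
    prices must sum to 1; otherwise a bookie can buy and sell tickets
    against the agent and lock in a guaranteed profit (a Dutch book).
    Prices must also lie in [0, 1].
    """
    for partition in partitions:
        if abs(sum(prices[event] for event in partition) - 1.0) > 1e-9:
            return False
    return all(0.0 <= p <= 1.0 for p in prices.values())


# An agent pricing "rain" at 0.7 and "no rain" at 0.4 is incoherent:
# selling it both tickets collects 1.1 units up front but pays out
# exactly 1 unit whatever happens -- a sure profit for the bookie.
print(is_coherent({"rain": 0.7, "no rain": 0.4}, [["rain", "no rain"]]))  # False
print(is_coherent({"rain": 0.7, "no rain": 0.3}, [["rain", "no rain"]]))  # True
```

The point of the sketch is that the subjectivist programme constrains only the *consistency* of the betting prices, not their values; any coherent assignment counts as rational, which is exactly where it parts company with the objective/logical bayesian.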


-gts

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303
