Re: [agi] Self-maintaining Architecture first for AI

2008-05-11 Thread William Pearson
2008/5/11 Russell Wallace [EMAIL PROTECTED]:
 On Sat, May 10, 2008 at 10:10 PM, William Pearson [EMAIL PROTECTED] wrote:
 It depends on the system you are designing on. I think you can easily
 create as many types of sand box as you want in programming language E
 (1) for example. If the principle of least authority (2) is embedded
 in the system, then you shouldn't have any problems.

 Sure, I'm talking about much lower-level concepts though. For example,
 on a system with 8 gigabytes of memory, a candidate program has
 computed a 5 gigabyte string. For its next operation, it appends that
 string to itself, thereby crashing the VM due to running out of
 memory. How _exactly_ do you prevent this from happening (while
 meeting all the other requirements for an AI platform)? It's a
 trickier problem than it sounds like it ought to be.


I'm starting to mod qemu (not a straightforward process) to add
capabilities. The VM will have a set amount of memory, and if a
location outside this memory is referenced, it will throw a page fault
inside the VM rather than crash it directly. The system can then deal
with it however it wants to, hopefully something smarter than "Oh no, I
have made a bad memory reference, I must stop all my work and lose
everything!"
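
Loosely, the fault-and-recover idea looks like this sketch (plain
Python, nothing to do with the actual qemu internals; Sandbox,
PageFault and candidate_program are invented names for illustration):

```python
# Toy illustration only: a sandbox with a fixed memory budget, where an
# over-budget allocation raises a catchable fault *inside* the sandboxed
# program instead of killing the host.

class PageFault(Exception):
    """Raised when the sandboxed program exceeds its memory budget."""

class Sandbox:
    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.used = 0

    def alloc(self, nbytes):
        if self.used + nbytes > self.budget:
            raise PageFault("allocation of %d bytes exceeds budget" % nbytes)
        self.used += nbytes
        return bytearray(nbytes)

    def free(self, nbytes):
        self.used -= nbytes

def candidate_program(box):
    chunks = []
    try:
        while True:                       # greedily grab 1 MiB at a time
            chunks.append(box.alloc(1 << 20))
    except PageFault:
        # Recover instead of "losing everything": release half and go on.
        dropped = len(chunks) // 2
        del chunks[:dropped]
        box.free(dropped * (1 << 20))
    return len(chunks)

box = Sandbox(8 << 20)                    # 8 MiB budget
print(candidate_program(box))             # 4 chunks survive the fault
```

The point is only that the budget violation surfaces as a catchable
event inside the sandboxed program, not as a host-level crash.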
In the greater scheme of things, the model of a computer with
unlimited virtual memory has to go as well; otherwise you might end up
with important things paged out to the hard disk and ephemera in main
memory, with much thrashing as a result. You could still build
high-level abstractions, but unlimited virtual memory is not the one
to expose to the low-level programs.
  Will Pearson

---
agi
Archives: http://www.listbox.com/member/archive/303/=now
RSS Feed: http://www.listbox.com/member/archive/rss/303/
Modify Your Subscription: 
http://www.listbox.com/member/?member_id=8660244id_secret=101455710-f059c4
Powered by Listbox: http://www.listbox.com


Re: Newcomb's Paradox (was Re: [agi] Goal Driven Systems and AI Dangers)

2008-05-11 Thread Vladimir Nesov
On Sun, May 11, 2008 at 4:06 AM, Matt Mahoney [EMAIL PROTECTED] wrote:


 Yes, but in this case the input to P is not (P,y), it is a self reference
 to whatever program P is running plus y.


It's irrelevant, because the description of P (or Q) could have been
contained in the prefix that said "simulate this on yourself: ", and
it could have been handled by the same machinery that in my example was
printing "the output is ". The only problem is bracketing, so if the
description is always [finite prefix with machine specification]+[data
parameters], it will work.
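
A toy version of the [machine specification]+[data] encoding, assuming
we model machines as source strings run by a universal evaluator U
(all names here are illustrative, not from the thread):

```python
# Sketch of a "universal machine" U that takes (program_source, data).
# Programs can themselves call U, so a prefix meaning "simulate this on
# yourself" is just ordinary data, and the pair structure supplies the
# bracketing between machine specification and data parameters.

def U(program_source, data):
    env = {"U": U}                  # make the universal machine available
    exec(program_source, env)       # defines run(data) in env
    return env["run"](data)

doubler = "def run(x):\n    return x * 2"

# A program whose data is another (program, data) pair: it simulates
# that machine on the underlying universal machine.
simulator = "def run(pd):\n    prog, d = pd\n    return U(prog, d)"

print(U(doubler, 21))                             # 42
print(U(simulator, (doubler, 21)))                # 42
print(U(simulator, (simulator, (doubler, 21))))   # 42, nesting works
```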

 I think you would agree that a virtual machine with n bits of memory can
 only be implemented on a machine with more than n bits of memory.


If it needs to, it can reserve a finite number of additional bits just
for this purpose.

-- 
Vladimir Nesov
[EMAIL PROTECTED]



Re: [agi] Self-maintaining Architecture first for AI

2008-05-11 Thread Russell Wallace
On Sun, May 11, 2008 at 7:45 AM, William Pearson [EMAIL PROTECTED] wrote:
 I'm starting to mod qemu (it is not a straightforward process) to add
 capabilities.

So if I understand correctly, you're proposing to sandbox candidate
programs by running them in their own virtual PC, with their own
operating system instance? I assume this works recursively, so a
qemu-sandboxed program can itself run qemu?



Re: Newcomb's Paradox (was Re: [agi] Goal Driven Systems and AI Dangers)

2008-05-11 Thread Matt Mahoney

--- Vladimir Nesov [EMAIL PROTECTED] wrote:

 On Sun, May 11, 2008 at 4:06 AM, Matt Mahoney [EMAIL PROTECTED]
 wrote:
 
 
  Yes, but in this case the input to P is not (P,y), it is a self
 reference
  to whatever program P is running plus y.
 
 
 It's irrelevant, because the description of P (or Q) could have been
 contained in the prefix that said "simulate this on yourself: ", and
 it could have been handled by the same machinery that in my example was
 printing "the output is ". The only problem is bracketing, so if the
 description is always [finite prefix with machine specification]+[data
 parameters], it will work.

If a machine P can simulate two other machines Q and R (each with n bits
of memory), then P needs n+1 bits: n to reproduce all the states of Q or
R, and 1 to remember which machine it is simulating.  You described a
machine P that can simulate only P.



-- Matt Mahoney, [EMAIL PROTECTED]



Re: Newcomb's Paradox (was Re: [agi] Goal Driven Systems and AI Dangers)

2008-05-11 Thread Vladimir Nesov
On Sun, May 11, 2008 at 8:57 PM, Matt Mahoney [EMAIL PROTECTED] wrote:

 If a machine P could simulate two other machines Q and R (each with n bits
 of memory), then P needs n+1 bits, n to reproduce all the states of Q or
 R, and 1 to remember which machine it is simulating.  You described a
 machine P that can simulate only P.


You are being obscure again. The machine I described can simulate
any machine of limited size and number of states. Basically, it is a
universal machine that can run any other machine, one of which is P,
the machine itself, but other machines are allowed too. Each machine
can initiate loading of another machine on the underlying universal
machine.

-- 
Vladimir Nesov
[EMAIL PROTECTED]



Re: Newcomb's Paradox (was Re: [agi] Goal Driven Systems and AI Dangers)

2008-05-11 Thread Matt Mahoney

--- Vladimir Nesov [EMAIL PROTECTED] wrote:

 On Sun, May 11, 2008 at 8:57 PM, Matt Mahoney [EMAIL PROTECTED]
 wrote:
 
  If a machine P could simulate two other machines Q and R (each with n
 bits
  of memory), then P needs n+1 bits, n to reproduce all the states of Q
 or
  R, and 1 to remember which machine it is simulating.  You described a
  machine P that can simulate only P.
 
 
 You are being obscure again. A machine that I described can simulate
 any machine of limited size and number of states. Basically, it is a
 universal machine that can run any other machine, one of which is P,
 the machine itself, but other machines are allowed too. Each machine
 can initiate loading of another machine on the underlying universal
 machine.

You have only shown that P can simulate itself with no additional memory.
If it simulates any other machine, then you need to show that no
additional memory is needed for those machines too.

Also, it doesn't make sense to talk about universal finite state
machines, because there are only a finite number of unique programs any
such machine can simulate.  Instead we can define a machine as general
purpose if it can accept 2 or more program specifications (which can be
as small as 1 bit).

So let me restate my claim.  We say that P simulates Q if for all x,
P((Q,x)) = Q(x).  We say P is general purpose if there exist Q and R such
that P simulates both, and there is an x such that Q(x) != R(x) (i.e. Q
and R are different programs).  Then I claim there is no general purpose
finite state machine P that can simulate itself.
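
One way to make the counting intuition behind this concrete, under a
deliberately crude formalization (a "machine with n bits of memory"
taken as a function from n-bit strings to n-bit strings; this is my
gloss, not the exact definition above):

```python
# Counting sketch: a machine with n bits of memory, modeled as a function
# from n-bit strings to n-bit strings, is one of (2**n) ** (2**n) possible
# machines. Naming an arbitrary such machine in a prefix therefore needs
# far more than n bits, so the pair (Q, x) cannot fit in P's own n-bit
# input space for arbitrary Q -- the pigeonhole behind the claim.
import math

def count_machines(n_bits):
    """Number of distinct functions {0,1}^n -> {0,1}^n."""
    inputs = 2 ** n_bits
    outputs = 2 ** n_bits
    return outputs ** inputs

for n in range(1, 4):
    desc_bits = math.log2(count_machines(n))
    print(n, desc_bits)   # 1 -> 2.0, 2 -> 8.0, 3 -> 24.0
```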


-- Matt Mahoney, [EMAIL PROTECTED]
