Re: [agi] Reverse Turing Test

2015-03-06 Thread Steve Richfield via AGI
Matt, On Fri, Mar 6, 2015 at 7:32 PM, Matt Mahoney via AGI wrote: > On Fri, Mar 6, 2015 at 8:26 PM, Steve Richfield wrote: >> The objective of the competition is to uncover presently hidden challenges that lie ahead, e.g. is it even possible to explain to people that sometimes it is what

Re: [agi] Reverse Turing Test

2015-03-06 Thread Matt Mahoney via AGI
On Fri, Mar 6, 2015 at 8:26 PM, Steve Richfield wrote: > The objective of the competition is to uncover presently hidden challenges that lie ahead, e.g. is it even possible to explain to people that sometimes it is what they value the most that is the very PROBLEM they wish to overcome. I

Re: [agi] SAT and Dynamic Programs of Models

2015-03-06 Thread Matt Mahoney via AGI
On Fri, Mar 6, 2015 at 4:12 PM, Jim Bromer via AGI wrote: > Let me restate that question. Are there any other compression methods that have an average logarithmic compression ratio, which can take exponential time to decompress using a general set of algorithms, that do not rely on any

Re: [agi] Reverse Turing Test

2015-03-06 Thread Steve Richfield via AGI
Matt, On Fri, Mar 6, 2015 at 4:49 PM, Matt Mahoney via AGI wrote: > What I'm asking is the objective of the test. Is it to convince the judge that you are a computer? No, that would be very easy, be very easy, be very easy, be very easy, STACK OVERFLOW. Presuming that you are SOMETHING tha

Re: [agi] Reverse Turing Test

2015-03-06 Thread Mike Archbold via AGI
On 3/6/15, Steve Richfield via AGI wrote: > It seems obvious to me that any envisioned super duper AGI of the future would be easily able to win a reverse Turing competition - demonstrating with advanced logical solutions to difficult problems that it is a machine and NOT merely human.

Re: [agi] Reverse Turing Test

2015-03-06 Thread Matt Mahoney via AGI
What I'm asking is the objective of the test. Is it to convince the judge that you are a computer? On Fri, Mar 6, 2015 at 6:24 PM, Steve Richfield wrote: > Matt, Just because there are no (known) non-biological AGIs doesn't mean that we can't run a competition for the biological variety.

Re: [agi] Reverse Turing Test

2015-03-06 Thread Steve Richfield via AGI
Matt, Just because there are no (known) non-biological AGIs doesn't mean that we can't run a competition for the biological variety. Just set it up so that all participants are welcomed, regardless of their technology. Continuing... On Fri, Mar 6, 2015 at 9:57 AM, Matt Mahoney via AGI wrote: >

Re: [agi] SAT and Dynamic Programs of Models

2015-03-06 Thread Jim Bromer via AGI
Let me restate that question. Are there any other compression methods that have an average logarithmic compression ratio, which can take exponential time to decompress using a general set of algorithms, that do not rely on any non-general special substitutions, which are not reducible to Boolean

Re: [agi] SAT and Dynamic Programs of Models

2015-03-06 Thread Jim Bromer via AGI
One other thing, Matt. We have talked about this before. Your interest in compression should tell you intuitively that P≠NP is unlikely. Are there any other compression schemes - that use systems of algorithms but don't use special non-general individual substitutions - that are exponent
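Jim's intuition here ties compression to the P vs NP question. One way to make that link concrete (a toy sketch of mine, not anything from the thread): fix any polynomial-time decompressor; a candidate short description of a string can then be verified quickly, but the only general way we know to find the shortest such description is to search exponentially many candidates.

import zlib
from itertools import product

def expands_to(description, target):
    # Polynomial-time check: does this candidate description decompress to target?
    try:
        return zlib.decompress(description) == target
    except zlib.error:
        return False

def shortest_description(target, max_len):
    # General search: try every byte string up to max_len bytes
    # (256**length candidates per length), so only tiny max_len
    # values are feasible in practice.
    for length in range(1, max_len + 1):
        for candidate in product(range(256), repeat=length):
            description = bytes(candidate)
            if expands_to(description, target):
                return description
    return None

The gap between the cheap check and the exponential search is the same verify-versus-find asymmetry that defines NP, which is presumably why compression keeps coming up in these P vs NP discussions.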

Re: [agi] Reverse Turing Test

2015-03-06 Thread Benjamin Kapp via AGI
I think you are just stringing words together. If I'm mistaken, perhaps you can express yourself without using obscure terminology from various domains, and metaphorically at that. Why not just say what you mean? The objective of AGI is hard enough as it is. We don't need to make it harder by tal

Re: [agi] SAT and Dynamic Programs of Models

2015-03-06 Thread Jim Bromer via AGI
Matt, I appreciate the conversations we have had about the P=?NP question. My interest in the subject concerns the question of how significant improvements in SAT solutions could affect the development of AGI programs. The reason I have not been interested in joining discussion groups on P=?NP is just b

RE: [agi] Reverse Turing Test

2015-03-06 Thread Nanograte Knowledge Technologies via AGI
In my view, the Turing Test already has such a test inherent in it, as its antithesis. That is the point. Logically, if a human proves that they are a machine, then they prove their own humanity. If they prove that they aren't, they prove the same thing. The test always returns a value of 1.

Re: [agi] Reverse Turing Test

2015-03-06 Thread Matt Mahoney via AGI
Could you explain the rules? On Fri, Mar 6, 2015 at 12:54 PM, Steve Richfield via AGI wrote: > It seems obvious to me that any envisioned super duper AGI of the future would be easily able to win a reverse Turing competition - demonstrating with advanced logical solutions to difficult pro

[agi] Reverse Turing Test

2015-03-06 Thread Steve Richfield via AGI
It seems obvious to me that any envisioned super duper AGI of the future would be easily able to win a reverse Turing competition - demonstrating with advanced logical solutions to difficult problems that it is a machine and NOT merely human. To see how such an AGI might function, and how its re

Re: [agi] SAT and Dynamic Programs of Models

2015-03-06 Thread Matt Mahoney via AGI
On Fri, Mar 6, 2015 at 11:59 AM, John Rose via AGI wrote: > No, there are others that believe P=NP besides Jim. I'm sure some of you have been following Bolotin's argument from last year: https://medium.com/the-physics-arxiv-blog/the-astounding-link-between-the-p-np-problem-and-the-quantum-n

RE: [agi] SAT and Dynamic Programs of Models

2015-03-06 Thread Nanograte Knowledge Technologies via AGI
Just a thought. John, this has been my contention all along. It exists, but it also does not exist, and is everything in between and nothing. It exists in both "known states" of a singular quantum universe, yet is not manifested by either one, or by a combination of both, alone. Something is possibly mi

RE: [agi] SAT and Dynamic Programs of Models

2015-03-06 Thread John Rose via AGI
> -----Original Message----- > From: Matt Mahoney via AGI [mailto:a...@listbox.com] > NP-hard means NP-complete or harder. NP-complete means that a solution would solve any problem in NP. NP is the class of problems whose answers can be verified in time that is a polynomial function of the i
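Matt's definitions can be made concrete with SAT, the canonical NP-complete problem: a proposed satisfying assignment is a certificate that can be checked in time linear in the size of the formula, even though no general polynomial-time method is known for finding one. A minimal sketch (mine, not from the message):

# Clauses in DIMACS style: 7 means variable 7 is true, -7 means it is false.
def verify_sat(clauses, assignment):
    # Polynomial-time (in fact linear) check that every clause
    # contains at least one satisfied literal.
    return all(
        any(assignment[abs(lit)] == (lit > 0) for lit in clause)
        for clause in clauses
    )

# (x1 OR NOT x2) AND (x2 OR x3)
formula = [[1, -2], [2, 3]]
print(verify_sat(formula, {1: True, 2: False, 3: True}))    # True: a valid certificate
print(verify_sat(formula, {1: False, 2: False, 3: False}))  # False: not a certificate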

Re: [agi] SAT and Dynamic Programs of Models

2015-03-06 Thread Matt Mahoney via AGI
Jim, can you describe an algorithm where P = NP would exponentially speed up visual processing? My understanding is that the most advanced vision algorithms use deep neural networks with a structure similar to the visual cortex. In general, neural network size (in synapses) should be proportional t
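The proportionality Matt has in mind is truncated above, so it is not reproduced here. As a hedged illustration of what "network size in synapses" means, a plain fully-connected network's synapse count is just its total of weights and biases; the layer sizes below are hypothetical.

def synapse_count(layer_sizes):
    # Each consecutive layer pair contributes (inputs x outputs) weights
    # plus one bias per output unit.
    return sum(
        n_in * n_out + n_out
        for n_in, n_out in zip(layer_sizes, layer_sizes[1:])
    )

# Hypothetical 784-input classifier with two hidden layers.
print(synapse_count([784, 256, 128, 10]))   # 235146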