Jim, can you describe an algorithm where P = NP would exponentially speed up visual processing? My understanding is that the most advanced vision algorithms use deep neural networks with a structure similar to the visual cortex. In general, neural network size (in synapses) should be proportional to the information content of the training set. Since each training example touches every synapse, training time is then O(n^2) in the size of the training data.
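Matt's quadratic estimate can be sketched as a back-of-envelope calculation. The helper name and the one-synapse-per-bit ratio below are my assumptions for illustration, not figures from the thread:

```python
# Back-of-envelope sketch of the O(n^2) training-time argument:
# if network size scales with the data, and one pass over the data
# touches every synapse once per example, total cost is quadratic.

def training_cost(n_bits_of_training_data, synapses_per_bit=1):
    """Estimate total training work if network size scales with data."""
    synapses = n_bits_of_training_data * synapses_per_bit  # size ~ O(n)
    examples = n_bits_of_training_data                     # one pass over data
    return examples * synapses                             # ~ O(n^2)

# Doubling the data roughly quadruples the estimated cost:
assert training_cost(2000) == 4 * training_cost(1000)
```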
On Thu, Mar 5, 2015 at 10:01 PM, Jim Bromer via AGI <[email protected]> wrote:
> Matt said:
> Vision is a pattern recognition problem. You input a picture of a cat
> and output a label like "cat". It is not NP-complete because (1)
> experimentally, the problem scales polynomially with input size and
> (2) the time to verify that a label like "cat" is correct is about the
> same as the time it takes to label the image. Thus, the problem is in
> P and would not benefit even if P = NP.
> -------------------------------------------------
> This is a truly insipid response. You have taken one narrow situation
> and used it in an over-generalization of a kind of AGI problem. "The
> problem scales polynomially with input size"? The point that I made is
> that the general analysis of imagery is presumably as bad as or worse
> than NP (in the lexicon of the day). What I mean is that there is
> sufficient evidence that AGI is, in the worst case, at least
> exponentially difficult, and that makes it worthwhile to examine why
> that may be. One reason, the reason I gave, is that the easiest
> methods to make a methodical and thorough analysis of the relations
> between associated pixels would be those that are (literally) in NP.
> The implied case of scaling a particular picture and arguing that it
> would scale polynomially with input size is analogous to saying that
> converting an unrestricted Boolean Satisfiability problem to 3-SAT
> scales polynomially (and that somehow proves that unrestricted SAT
> scales polynomially). It is pretty obvious that you have little
> experience with visual data.
>
> This is an example of a blatant overgeneralization being declared as
> if it were a factual statement. I can't casually explain why visual
> analysis is at least exponentially difficult because I am not enough
> of an expert to be familiar with all the problems. However, I am
> confident that there is no overwhelming evidence to suggest that, in
> general, it is less difficult.
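The SAT-to-3-SAT conversion Jim refers to is indeed polynomial: the standard clause-splitting reduction turns a clause of k > 3 literals into k - 2 three-literal clauses using fresh auxiliary variables. A minimal sketch, with literals encoded as signed integers (an encoding I am choosing here, not one from the thread):

```python
# Standard SAT-to-3-SAT clause splitting: a wide clause
# (l1 v l2 v ... v lk) becomes (l1 v l2 v y1) ^ (-y1 v l3 v y2) ^ ...
# Literals are nonzero ints; a negative literal means negation.

def to_3sat(clauses, next_var):
    """Split clauses wider than 3; next_var is the first unused variable."""
    out = []
    for clause in clauses:
        while len(clause) > 3:
            aux = next_var
            next_var += 1
            out.append(clause[:2] + [aux])   # (l1 v l2 v aux)
            clause = [-aux] + clause[2:]     # (-aux v l3 v ... v lk)
        out.append(clause)
    return out

# A 5-literal clause becomes three 3-literal clauses:
print(to_3sat([[1, 2, 3, 4, 5]], next_var=6))
# -> [[1, 2, 6], [-6, 3, 7], [-7, 4, 5]]
```

The output grows linearly in the size of the input formula, which is exactly the point of Jim's analogy: the reduction being polynomial says nothing about whether solving the resulting formula is.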
> Jim Bromer
>
>
> On Thu, Mar 5, 2015 at 1:04 PM, Matt Mahoney via AGI <[email protected]> wrote:
>> On Wed, Mar 4, 2015 at 2:51 AM, Jim Bromer <[email protected]> wrote:
>>> On Tue, Feb 17, 2015 at 11:52 PM, Matt Mahoney via AGI <[email protected]> wrote:
>>>> On Tue, Feb 17, 2015 at 10:26 PM, Jim Bromer via AGI <[email protected]> wrote:
>>>>> I started wondering about how a good Satisfiability model might be
>>>>> used with AGI.
>>>>
>>>> It wouldn't, because the hard problems in AI like vision and language
>>>> are not NP-hard. The more useful application would be breaking nearly
>>>> all forms of cryptography. (The one-time pad would still be secure.)
>>>> -- Matt Mahoney
>>>
>>> I seriously doubt the premise that the hard problems like vision and
>>> language in AI are not NP-hard.
>>
>> NP-hard means NP-complete or harder. NP-complete means that a solution
>> would solve any problem in NP. NP is the class of problems whose
>> answers can be verified in time that is a polynomial function of the
>> input size. P is the class of problems that can be solved in
>> polynomial time. It is widely believed by everyone except Jim Bromer
>> that P != NP. This belief is not because of any proof, but because
>> thousands of other people like Jim Bromer who believed P = NP failed
>> to find polynomial-time solutions to any NP-complete problems after
>> years of effort, until they were convinced they would be better off if
>> they gave up. The time it takes to give up is inversely proportional
>> to the effort the person puts into studying the math and researching
>> the work of others instead of repeating their mistakes.
>>
>>> My (admittedly limited) experience
>>> with visual AI ran up against NP-hard solutions that I thought would
>>> work.
>>
>> Vision is a pattern recognition problem. You input a picture of a cat
>> and output a label like "cat".
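Matt's definitions can be made concrete with CNF-SAT itself: verifying a proposed assignment takes time linear in the formula, while the obvious solver searches all 2^n assignments. A minimal sketch (the signed-integer literal encoding and function names are my choices, not from the thread):

```python
# NP in code terms: checking a proposed answer is cheap even when
# finding one may not be. Literals are nonzero ints; negative = negated.

from itertools import product

def verify(clauses, assignment):
    """Polynomial-time check: every clause has a satisfied literal."""
    return all(any(assignment[abs(l)] == (l > 0) for l in c) for c in clauses)

def brute_force_solve(clauses, n_vars):
    """Exponential-time search over all 2^n assignments."""
    for bits in product([False, True], repeat=n_vars):
        assignment = dict(enumerate(bits, start=1))
        if verify(clauses, assignment):
            return assignment
    return None

# (x1 v x2) ^ (-x1 v x2): satisfied by x2 = True.
clauses = [[1, 2], [-1, 2]]
sol = brute_force_solve(clauses, 2)
print(sol)  # -> {1: False, 2: True}
assert sol is not None and verify(clauses, sol)
```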
>> It is not NP-complete because (1)
>> experimentally, the problem scales polynomially with input size and
>> (2) the time to verify that a label like "cat" is correct is about the
>> same as the time it takes to label the image. Thus, the problem is in
>> P and would not benefit even if P = NP.
>>
>>> And since language could be considered to be a form of
>>> cryptography, your conjunction of cases (not language but
>>> cryptography) does not look really strong.
>>
>> No, language is also a pattern recognition problem.
>>
>>> (Visual processing also
>>> might be considered to be a form of cryptography, and indeed it is used
>>> as such in captchas.)
>>
>> Cryptography depends on the existence of one-way functions: given a
>> function f and an output f(x), you can't find the input x any faster
>> than trying all possible values and comparing the outputs. If P = NP,
>> then one-way functions would not exist. You could build a circuit that
>> computes f and compares the output. Then set the bits of x one at a
>> time and ask your polynomial SAT solver if a solution exists. If not,
>> flip the bit before going to the next bit.
>>
>> You could argue that a captcha is a one-way function. It is easy to
>> convert text to an image, but hard to convert it back. But it is
>> polynomially hard, not exponentially hard. Adding one bit to the image
>> doesn't double the solution time, like adding one bit to an encryption
>> key would.
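The bit-fixing procedure Matt describes can be sketched as follows. The brute-force `sat_oracle` below is a stand-in of my own for the hypothetical polynomial-time SAT solver (if P = NP that oracle would run in polynomial time, which is the whole point); the toy function f is also my assumption, chosen only for illustration:

```python
# Inverting f by fixing one bit of x at a time, given an oracle that
# answers "does some completion of this bit prefix map to the target?"

def invert(f, target, n_bits):
    """Recover some x (as a bit list) with f(x) == target."""
    def fmt(i, width):
        # i as a list of `width` bits (empty list when width is 0)
        return [int(b) for b in format(i, f'0{width}b')] if width else []

    def sat_oracle(prefix):
        # Brute-force stand-in for the polynomial SAT solver.
        rest = n_bits - len(prefix)
        return any(f(prefix + fmt(i, rest)) == target
                   for i in range(2 ** rest))

    x = []
    for _ in range(n_bits):
        x.append(0)             # tentatively set the next bit to 0
        if not sat_oracle(x):   # no completion works -> the bit must be 1
            x[-1] = 1
    return x

# Toy "one-way-ish" function: interpret bits as an int, square mod 251.
f = lambda bits: (int(''.join(map(str, bits)), 2) ** 2) % 251
target = f([1, 0, 1, 1, 0, 1, 0, 1])
x = invert(f, target, 8)
assert f(x) == target  # a valid preimage (not necessarily the same one)
```

With a genuinely polynomial oracle this makes only n oracle calls, one per bit, so inverting f would be polynomial and one-way functions could not exist.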
>>
>> --
>> -- Matt Mahoney, [email protected]
>>
>> -------------------------------------------
>> AGI
>> Archives: https://www.listbox.com/member/archive/303/=now
>> RSS Feed: https://www.listbox.com/member/archive/rss/303/24379807-653794b5
>> Modify Your Subscription: https://www.listbox.com/member/?&
>> Powered by Listbox: http://www.listbox.com

--
-- Matt Mahoney, [email protected]
