Ben Goertzel wrote:

> Understanding quantitative data coming from sense organs is arguably more
> basic to intelligence than understanding words, since dogs and pigs can
> understand the former but not the latter...
That is not literally true; it is a philosophical shorthand, and it has not
even been shown to be a solid theoretical explanation of how understanding
emerges from primitive neural activity. However, the point I was making was
not that the most primitive or elemental components of understanding are
linguistic, but that language is the best way to convey the range of
referential variations of ideas needed to convey some kind of understanding.
(Of course other IO modalities are also important...) We don't run
quantitative analyses of the ideas we are discussing right now, and the
decision to ignore that point guarantees that you will lag behind the
researchers who don't ignore it.

Jim Bromer

On Sat, Feb 22, 2014 at 10:29 AM, Ben Goertzel <[email protected]> wrote:

> On Sat, Feb 22, 2014 at 9:05 PM, Jim Bromer <[email protected]> wrote:
>
>> On Thu, Feb 20, 2014 at 6:46 PM, Ben Goertzel <[email protected]> wrote:
>>
>>> Do you mean: Given two numbers x and y drawn from a specific sample
>>> S of numbers (or a specific probability distribution D over the set of
>>> numbers)? Without this background S or D, the question is meaningless...
>>> Given a distribution D, one can draw a sample S, of course; so the case
>>> where one has a sample S is sufficient to deal with.
>>> One sensible measure would be: what ratio of the numbers in the sample S
>>> are greater than max(x,y) or less than min(x,y), as opposed to lying
>>> between x and y?
>>
>> But even this definition is (relatively) trivial relative to the problem
>> of AGI.
>
> Yes it is. I was not proposing it as a profound, revolutionary insight;
> just as a reasonable answer to the question that was asked ;)
>
>> You don't "encounter" different probability distributions in the "real
>> world", you derive them from "observations" of the real world.
>
> True...
>> If you had answered the question using examples of numerical relations
>> (with simple but powerful examples) and everyone understood those
>> examples, then I'd have to conclude that the underlying principle of
>> numerical relations must have a great deal of relevance to the problem of
>> discovering, representing and conveying meaning. But instead, when you
>> realized that you had a chance to teach something that could be useful to
>> a lot of people, you started out using words.
>
> Understanding quantitative data coming from sense organs is arguably more
> basic to intelligence than understanding words, since dogs and pigs can
> understand the former but not the latter...
>
> Human language emerged to talk about the physical world, not in a universe
> composed entirely of text corpora. And sense organs measure the world in
> ways naturally modeled via quantitative variables.
>
> A specific example of modeling quantitative variables using quantile
> normalization and propositional logic is at the end of:
>
> http://wiki.opencog.org/w/QuantitativePredicate
>
> The example is about GDP, but it could just as well be about decibel
> levels coming out of an ear, etc.
>
> -- Ben

-------------------------------------------
AGI Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-f452e424
Modify Your Subscription: https://www.listbox.com/member/?member_id=21088071&id_secret=21088071-58d57657
Powered by Listbox: http://www.listbox.com
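[Editor's note: Ben's sample-relative measure from the thread — how far apart x and y are, judged by how much of the sample S lies between them — can be sketched in a few lines of Python. This is illustrative code written for this archive, not from the thread; the function name is hypothetical, and the "distance" here is simply the fraction of S strictly between x and y (equivalently, the gap between their empirical quantiles).]

```python
import random

def sample_relative_distance(x, y, sample):
    """Fraction of the sample lying strictly between x and y.

    Two numbers count as 'far apart' relative to a sample S when much
    of S falls between them; with no sample as background, the question
    of their distance is (per the thread) meaningless.
    """
    lo, hi = min(x, y), max(x, y)
    between = sum(1 for s in sample if lo < s < hi)
    return between / len(sample)

# The same pair (10, 20) is near under a wide uniform sample but far
# under a sample tightly clustered between them.
random.seed(0)
uniform = [random.uniform(0, 100) for _ in range(10_000)]
clustered = [random.gauss(15, 2) for _ in range(10_000)]
print(sample_relative_distance(10, 20, uniform))    # roughly 0.1
print(sample_relative_distance(10, 20, clustered))  # close to 1.0
```

The point of the toy example is that the distance is a property of the pair *plus* the distribution, not of the numbers alone, which is exactly why the background S or D matters.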
