On Fri, Aug 24, 2012 at 2:24 PM, Aaron Hosford <[email protected]> wrote:

> So why wouldn't we design a system that attempts to attain a nice simple
> goal like "make people happy" and build in the awareness that in order to
> define that goal in all its complexity, it needs to *ask* us what we want.
Because that's not a shortcut. The goal "make people happy" is not nice and simple. It is 10^17 bits, unless you mean making people happy by giving them drugs or by inserting an electrode into the nucleus accumbens.

> Then the system iteratively refines that goal as new information comes in
> at the measly rate of "1 to 5 bits per second through speech, writing, or
> typing", as time is available and the need arises, making do with a less
> individualized but still highly effective definition of the general goal
> in the meantime. People recognize the value of information vs. the time it
> takes to communicate it, and will point out the most inconvenient
> misunderstandings first, so the system can rely on the users to
> selectively identify and convey the information it needs to know in order
> to meet their needs. In other words, if you want the system to be
> individualized to your preferences, you pay the cost of gathering &
> transmitting a description of your preferences. This is the current model
> for all those apps you mention: you go to the preferences page and check
> the boxes according to what you prefer. In the future, it will be
> communicated via natural language, but it will be the same principle at
> work.

I thought we were already doing that. But yes, the cost of communicating our preferences will be the most expensive part of AGI once Moore's Law makes the hardware cheap enough. (Right now, the hardware would cost $1 quintillion if you could buy it. In 15 years the same computing power should cost $1 quadrillion, low enough to make it cost-effective to replace most human labor.) Natural language is better than filling out an online survey. Observing your behavior is better still. Guessing based on the preferences of other people with similar behavior is better still. We already do all of these things because explicit communication is so expensive.
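To make the arithmetic behind these claims concrete, here is a small sketch that works only from the figures quoted in this thread (10^17 bits, 1-5 bits per second, $1 quintillion falling to $1 quadrillion over 15 years); none of the numbers are independent estimates:

```python
import math

SECONDS_PER_YEAR = 365.25 * 24 * 3600

# "make people happy" taken as ~10^17 bits, transmitted at the quoted
# "1 to 5 bits per second through speech, writing, or typing":
goal_bits = 1e17
for rate in (1.0, 5.0):
    years = goal_bits / rate / SECONDS_PER_YEAR
    print(f"at {rate:.0f} bit/s: {years:.2e} years to transmit the goal")

# A price drop from $1 quintillion (1e18) to $1 quadrillion (1e15) in
# 15 years is a factor of 1000, i.e. roughly ten halvings of cost:
doublings = math.log2(1e18 / 1e15)
print(f"implied cost-halving time: {15 / doublings:.1f} years")
```

Even at the optimistic 5 bits per second, a single person would need on the order of hundreds of millions of years to spell the goal out explicitly, which is the point of falling back on observed behavior and on the preferences of similar people.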
-- 
Matt Mahoney, [email protected]
