Russell,

On Mon, Aug 13, 2012 at 2:39 PM, Russell Wallace
<[email protected]> wrote:

> Okay, suppose for the sake of argument we are only dealing with binary
> variables, so only two possibilities per variable (in practice we often
> deal with more than that, which of course only makes things even more
> complex).
>
> If we have one variable, there are two possibilities, 0 and 1.
>
> With two variables there are four possibilities, 00, 01, 10, 11.
>
> With three variables there are 2^3 = eight possibilities, 000, 001, 010,
> 011, 100, 101, 110, 111.
>
> With N variables there are 2^N possibilities.
>
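The doubling in the quoted example is easy to check by brute-force enumeration; a minimal sketch (Python chosen purely for illustration):

```python
# Sketch: enumerate every assignment of N binary variables and
# confirm the count is 2^N, as in the quoted example.
from itertools import product

for n in range(1, 5):
    states = list(product([0, 1], repeat=n))
    assert len(states) == 2 ** n
    print(n, len(states))
```

For n = 3 this enumerates exactly the eight strings 000 through 111 listed above.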

Those possibilities are called "states", and your computer has 2^bits
possible states, where "bits" is the total number of bits of storage.
However, what is more interesting is how we logically connect things, and
the overhead of honoring those connections.

>
> The point of machine learning is to find, among all of the 2^N possible
> combinations, one that is a good enough solution to the problem at hand, so
> while on each iteration, each test, you are only dealing with N variables,
> for the learning run you are dealing with the exponential search space of
> 2^N possibilities.
>

You are confusing states with functionality. There is no "cost" to having a
large number of states, aside from the cost of RAM and disk. A large number
of states does NOT necessarily demand a larger CPU.
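The state/cost distinction can be made concrete with a small illustrative sketch (the bit counts and the `num_states` helper are arbitrary choices, not anything from the thread):

```python
# Sketch: the number of distinct states a memory can hold grows
# exponentially with its size, while the hardware cost grows linearly.
def num_states(num_bits):
    """States representable by num_bits bits of RAM/disk."""
    return 2 ** num_bits

for bits in (8, 64, 1024):
    # Storage cost is just `bits` bits; the state count is 2**bits.
    print(f"{bits} bits of storage -> {num_states(bits)} possible states")
```

A kilobit of disk already has 2^1024 states, yet it costs no more CPU than any other kilobit.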

Steve
=============

> On Mon, Aug 13, 2012 at 7:13 PM, Steve Richfield <
> [email protected]> wrote:
>
>> Russell,
>>
>> On Mon, Aug 13, 2012 at 11:05 AM, Russell Wallace <
>> [email protected]> wrote:
>>
>>> On Mon, Aug 13, 2012 at 6:47 PM, Steve Richfield <
>>> [email protected]> wrote:
>>>
>>>> Russell,
>>>>
>>>> "What we have here is a failure to communicate."
>>>>
>>>
>>> I think so...
>>>
>>>> Not really. If every one of the ~10^10 neurons has axons and
>>>> dendrites connected to 10% of the other neurons (probably close to the
>>>> worst possible real case), then we have "only" complicated things by ~7
>>>> orders of magnitude (adjusting downward for present NN complexities).
>>>>
>>>
>>> You're missing the central point.
>>>
>>> The computational requirement for one iteration may be linear in the
>>> number of variables, which with future hardware should end up being
>>> tractable even for something as complex as the human brain, as you say.
>>>
>>> But the size of the search space is *exponential* in the number of
>>> variables.
>>>
>>
>> No, it is quadratic: n things have n(n-1)/2 potential pairwise
>> interactions.
>>
>>
>>> If you have ten thousand variables (equivalent to some simple
>>> invertebrate brains), you're looking at computational effort, not of 10^4
>>> operations, but exp(10^4). That's the problem.
>>>
>>
>> Please explain how it is exponential rather than quadratic.
>>
>> Steve
>>
>
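The quadratic-versus-exponential disagreement above comes down to counting two different things, which a short sketch can make concrete (the values of n are arbitrary; the sketch just computes both quantities side by side):

```python
# Sketch: pairwise interactions n(n-1)/2 (quadratic) vs. the
# full assignment space 2^n (exponential) for the same n.
from itertools import combinations

for n in (4, 10, 20, 40):
    pairs = n * (n - 1) // 2
    # Cross-check the formula by enumerating the pairs directly.
    assert pairs == len(list(combinations(range(n), 2)))
    print(f"n={n:>3}  pairs={pairs:>6}  states=2^{n} = {2 ** n}")
```

At n = 40 there are only 780 pairs but about 10^12 assignments, so which count matters depends on whether one is wiring connections or searching over joint states.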



-- 
Full employment can be had with the stroke of a pen. Simply institute a six
hour workday. That will easily create enough new jobs to bring back full
employment.



-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-c97d2393
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=21088071&id_secret=21088071-2484a968
Powered by Listbox: http://www.listbox.com
