On 1/27/15, ARAKAWA Naoya via AGI <[email protected]> wrote:
> From the NTM paper, you can tell the work is rather an extension
> of previous work on RNNs, the idea being to combine an external
> (learnable) memory (playing the role of the TM's 'tape') with an
> RNN (such as an LSTM) playing the role of the TM's 'head'.
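>
> A toy sketch of that interaction (my own invented names and shapes,
> nothing here is taken from the paper): the controller emits a key,
> the key is softly matched against every memory row, and the read is
> a weighted average, so everything stays differentiable and trainable:
>
> import numpy as np
>
> def cosine(a, b):
>     return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8)
>
> N, W = 8, 4                       # 8 memory slots, 4 numbers per slot
> M = np.random.randn(N, W)         # the external "tape"
> key = np.random.randn(W)          # emitted by the controller (e.g. an LSTM)
>
> # content-based addressing: soft attention over all memory slots
> scores = np.array([cosine(M[i], key) for i in range(N)])
> w = np.exp(scores) / np.exp(scores).sum()
>
> read_vector = w @ M               # soft "read", fed back to the controller
> new_content = np.random.randn(W)
> M = M + np.outer(w, new_content)  # a simplified soft "write" using the same weights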
>
> Though RNNs seem to have been mathematically proven to be
> Turing-complete (see the reference in the NTM paper), the name NTM
> apparently comes from the architecture rather than from any such
> mathematical property, and I suspect the NTM's computational power
> as a TM has yet to be proven.
>
> Besides, I doubt the practical value of the Turing equivalence of
> RNNs, as the infinitely many states are realized in a fractal
> encoding of the real-valued state and would be vulnerable to noise...
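>
> A toy illustration of the worry (my own construction, just to make
> the point; the real RNN constructions use a more careful encoding,
> but the issue survives): pack a whole stack of bits into one real
> number; popping is pure arithmetic, yet a tiny perturbation of that
> number scrambles the deeper bits.
>
> def encode(bits):                     # stack of bits -> 0.b1 b2 b3 ... in base 2
>     x = 0.0
>     for b in reversed(bits):
>         x = (x + b) / 2.0
>     return x
>
> def decode(x, n):                     # read back the top n bits
>     bits = []
>     for _ in range(n):
>         x *= 2.0
>         bits.append(int(x))
>         x -= int(x)
>     return bits
>
> stack = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1]
> x = encode(stack)
> print(decode(x, len(stack)))          # recovers the stack exactly
> print(decode(x + 1e-3, len(stack)))   # small "noise" corrupts the deeper bits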
>
> -- Naoya Arakawa
Thanks. I wasn't really clear on what the tape was in that article on
first read.
I've read about all kinds of hybrid approaches down through the years:
neural networks with memory, some with symbols, neuro-fuzzy hybrids,
etc. A lot of the current hype comes from relabelling ("cognitive
computing" instead of AI) and from certain approaches, deep learning
above all, being big in the news. But upon investigation it seems most
of these approaches have been around for many years. So you see an
article with the latest buzzwords and think, "man, I've seen this
before..." I'm not knocking it; I'm glad it is so big in the news now.
Mike A
>
> 2015/01/28 8:56, Mike Archbold via AGI <[email protected]> wrote:
>
>> On 1/27/15, Ted Sanders via AGI <[email protected]> wrote:
>>> The Neural Turing Machine, I think, is something along those lines.
>>> http://arxiv.org/pdf/1410.5401v2.pdf
>>> http://www.technologyreview.com/view/532156/googles-secretive-deepmind-startup-unveils-a-neural-turing-machine/
>>>
>>
>> These sorts of hybrid neural + discrete-structure efforts have been
>> around for a long time, in my recollection. Suddenly now it's
>> "secretive DeepMind" and I'm left wondering how this effort is so
>> vastly different. I'm no expert on neural networks, though, so
>> somebody fill me in.
>>
>>>
>>> On Tue, Jan 27, 2015 at 7:12 AM, YKY (Yan King Yin, 甄景贤) via AGI <
>>> [email protected]> wrote:
>>>
>>>> *The idea:* make programs continuous and then evolve them using
>>>> continuous techniques. Valiant's recent book "Probably Approximately
>>>> Correct" says something about evolving continuous parameters for
>>>> strong AI. (It may be more tractable than evolving programs made of
>>>> discrete elements, the kind of programs we usually deal with.)
>>>>
>>>> *In the context of logic-based AI* (such as OpenCog, NARS, and my
>>>> Genifer) the idea is to make all logical and procedural statements
>>>> continuous. Making the logic continuous is done via algebraization,
>>>> which I have been looking into but will discuss elsewhere.
>>>> The "procedural" aspect can be realized by letting the AGI control
>>>> a Turing machine (TM) with one or more tapes, and by making such a
>>>> Turing machine "continuous".
>>>>
>>>> As a first step towards continuous TMs, we can start with *finite
>>>> state machines (FSMs)*. It seems that a continuous version of an FSM
>>>> corresponds to a continuous dynamical system (a.k.a. a topological
>>>> dynamical system). I have not looked into the details of this
>>>> correspondence, but it looks fairly straightforward.
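>>>>
>>>> A toy version of what I have in mind (an invented example, just as a
>>>> sanity check): relax the discrete state to a point on the probability
>>>> simplex and let each input symbol act as a stochastic matrix; one
>>>> step of the machine is then a continuous map on that simplex, i.e. a
>>>> discrete-time dynamical system on a continuous state space.
>>>>
>>>> import numpy as np
>>>>
>>>> # a 2-state FSM over the alphabet {a, b}, relaxed to "soft" states
>>>> T = {
>>>>     "a": np.array([[0.0, 1.0],    # in state 0, reading 'a' moves to state 1
>>>>                    [1.0, 0.0]]),  # in state 1, reading 'a' moves to state 0
>>>>     "b": np.array([[1.0, 0.0],    # 'b' leaves the state alone
>>>>                    [0.0, 1.0]]),
>>>> }
>>>>
>>>> def step(state, symbol):
>>>>     return state @ T[symbol]      # a continuous (here linear) map on the simplex
>>>>
>>>> s = np.array([1.0, 0.0])          # start sharply in state 0
>>>> for sym in "abaab":
>>>>     s = step(s, sym)
>>>> print(s)                          # still a distribution over the two states
>>>>
>>>> With entries other than 0 and 1, the transition matrices give
>>>> genuinely "blurred" states, which is where the dynamical-systems view
>>>> starts to matter.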
>>>>
>>>> *To make TMs continuous* is somewhat more difficult. One way is to
>>>> turn the tape read/write operations into states of an FSM (in which
>>>> case the number of states may become infinite). But I'm not sure
>>>> whether that is a good way to create continuous TMs.
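>>>>
>>>> Another possible route (just a sketch with made-up names, not a claim
>>>> that it is the right construction): keep the tape as a vector of
>>>> reals and make the head position itself a distribution over cells, so
>>>> that reading, writing and shifting all become smooth operations. If I
>>>> understand it correctly, this is roughly the move the NTM paper makes.
>>>>
>>>> import numpy as np
>>>>
>>>> N = 8
>>>> tape = np.zeros(N)                 # cells hold reals instead of symbols
>>>> head = np.zeros(N); head[3] = 1.0  # head "position" is a distribution over cells
>>>>
>>>> def read(tape, head):
>>>>     return head @ tape             # expected symbol under the head
>>>>
>>>> def write(tape, head, value):
>>>>     return tape + head * (value - tape)   # blend the value in where the head is
>>>>
>>>> def shift(head, p_left, p_stay, p_right):
>>>>     # soft move on a circular tape: convolve the head with [p_left, p_stay, p_right]
>>>>     return (p_left * np.roll(head, -1)
>>>>             + p_stay * head
>>>>             + p_right * np.roll(head, 1))
>>>>
>>>> tape = write(tape, head, 1.0)
>>>> head = shift(head, 0.1, 0.2, 0.7)  # mostly move right, with some smearing
>>>> print(read(tape, head))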
>>>>
>>>> Any other idea for continuous TMs?
>>>>
>>>> Thanks in advance =)
>>>> --
>>>> *YKY*
>>>> *"The ultimate goal of mathematics is to eliminate any need for
>>>> intelligent thought"* -- Alfred North Whitehead
>>>>