[agi] Re: Attention is All you Need

2021-04-20 Thread keghnfeem
Will Transformers Replace CNNs in Computer Vision? + NVIDIA GTC Giveaway: https://www.youtube.com/watch?v=QcCJJOLCeJQ

Re: [agi] using NNs to solve PDEs

2021-04-20 Thread James Bowery
Yeah, I've been saying for a while that system identification of PDEs is likely where it's at, given the need to hook the empirical world's priors up to a formal system that's Turing complete -- at least potentially. Indeed, the second connectionist summer would likely not have happened if not …
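For readers unfamiliar with the term, PDE system identification means recovering the governing equation itself from data. A minimal sketch of the idea in the spirit of SINDy/PDE-FIND (a hypothetical toy problem, not from the thread): fit coefficients over a library of candidate derivative terms and see which ones survive.

import numpy as np

# Toy example: recover u_t = alpha * u_xx from samples of an exact
# two-mode heat-equation solution.
alpha = 0.5
x = np.linspace(0.0, 2.0 * np.pi, 256)
t = np.linspace(0.0, 1.0, 101)
X, T = np.meshgrid(x, t, indexing="ij")
u = np.exp(-alpha * T) * np.sin(X) + np.exp(-4.0 * alpha * T) * np.sin(2.0 * X)

# Finite-difference derivatives along each axis.
u_x = np.gradient(u, x, axis=0)
u_xx = np.gradient(u_x, x, axis=0)
u_t = np.gradient(u, t, axis=1)

# Regression over a small library of candidate right-hand-side terms.
names = ["u", "u_x", "u_xx", "u*u_x"]
library = np.stack([u, u_x, u_xx, u * u_x], axis=-1).reshape(-1, 4)
coeffs, *_ = np.linalg.lstsq(library, u_t.reshape(-1), rcond=None)
print(dict(zip(names, np.round(coeffs, 3))))  # expect u_xx ~ 0.5, rest ~ 0

Real PDE-FIND adds sparsity-promoting regression on top of this least-squares step, but the structure is the same: data in, a symbolic equation out.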

[agi] Re: Attention is All you Need

2021-04-20 Thread Jim Bromer
When I said that ANNs used linear approximations, you knew what I meant because 'you are in the club.' But a newbie might have been confused and thought something like, "So that's how Neural Networks work. They use linear approximations." Seeing this, I will try to find better phrases, like: they …
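One way to unpack the "linear approximations" shorthand more precisely: a ReLU network computes a piecewise-linear function, i.e. it is exactly affine on every region of input space where no hidden unit changes sign. A minimal sketch of that fact (my illustration, not Jim's definition; the tiny network and seed are arbitrary):

import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 1)), rng.normal(size=8)
W2, b2 = rng.normal(size=(1, 8)), rng.normal(size=1)

def net(x):
    h = np.maximum(W1 @ x + b1, 0.0)  # affine map, then ReLU
    return W2 @ h + b2                # affine map again

def slope(x, eps=1e-5):
    # Numerical slope at a point; constant within one linear region.
    return ((net(np.array([x + eps])) - net(np.array([x - eps]))) / (2 * eps)).item()

print(slope(0.3000), slope(0.3002))  # identical unless a ReLU kink lies between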

[agi] Re: Attention is All you Need

2021-04-20 Thread Jim Bromer
Transformer Attention does seem to be more than just those two fundamental points. I do not want to spend a lot of time working with NNs (other than on my TinyML projects), but I do want to get a better understanding of how these things work and then apply some of the ideas to some slightly …
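For anyone trying to pin down what Transformer attention actually computes, here is a minimal sketch of the scaled dot-product attention from "Attention Is All You Need" (single head, no masking and no learned Q/K/V projections, which the full layer adds):

import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)  # shift for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    return softmax(Q @ K.T / np.sqrt(d_k)) @ V

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))  # 4 query positions, d_k = 8
K = rng.normal(size=(6, 8))  # 6 key/value positions
V = rng.normal(size=(6, 8))
print(attention(Q, K, V).shape)  # (4, 8): one weighted mix of values per query

Each output row is a convex combination of the value rows, with weights set by query-key similarity; everything else in a Transformer block is projections, residuals, and feed-forward layers around this core.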

Re: [agi] using NNs to solve PDEs

2021-04-20 Thread Alan Grimes via AGI
Looks kewl. I wouldn't tend to trust them very far, but it sounds like a great way to obtain priors for more conventional methods to squash out the last little bits of epsilon you don't want. Bill Hibbard via AGI wrote: > Interesting article: > https://www.quantamagazine.org/new-neural-networks-solve-hardest-equations-faster-than-ever-20210419/
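A toy sketch of the "priors for conventional methods" idea (my illustration; "1.05 * exact" stands in for a hypothetical network prediction): warm-start a classical iterative solver with an approximate solution and let it grind out the remaining residual.

import numpy as np

n = 129
x = np.linspace(0.0, 1.0, n)
h = x[1] - x[0]
f = np.sin(np.pi * x)                  # solve -u'' = f with u(0) = u(1) = 0
exact = np.sin(np.pi * x) / np.pi**2

def jacobi(u0, steps):
    # Standard Jacobi iteration for the 1D Poisson finite-difference system.
    u = u0.copy()
    for _ in range(steps):
        u_new = u.copy()
        u_new[1:-1] = 0.5 * (u[:-2] + u[2:] + h * h * f[1:-1])
        u = u_new
    return u

cold = jacobi(np.zeros(n), 200)        # start from scratch
warm = jacobi(1.05 * exact, 200)       # start from the approximate prior
print(np.abs(cold - exact).max(), np.abs(warm - exact).max())
# After the same number of sweeps, the warm start's error is far smaller.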

[agi] using NNs to solve PDEs

2021-04-20 Thread Bill Hibbard via AGI
Interesting article: https://www.quantamagazine.org/new-neural-networks-solve-hardest-equations-faster-than-ever-20210419/ Points to a couple of arXiv papers: https://arxiv.org/abs/1910.03193 https://arxiv.org/abs/2010.08895
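For context, arXiv:1910.03193 is DeepONet and arXiv:2010.08895 is the Fourier Neural Operator (FNO). The FNO's core layer is a spectral convolution: FFT the input, multiply a few low-frequency modes by learned complex weights, inverse FFT. A stripped-down 1D sketch of that one idea (illustrative only; the real layer adds channels, a pointwise linear path, and a nonlinearity):

import numpy as np

def spectral_conv_1d(u, weights):
    u_hat = np.fft.rfft(u)                  # to Fourier space
    out_hat = np.zeros_like(u_hat)
    m = len(weights)
    out_hat[:m] = u_hat[:m] * weights       # scale (and truncate to) low modes
    return np.fft.irfft(out_hat, n=len(u))  # back to physical space

rng = np.random.default_rng(0)
u = rng.normal(size=256)
w = rng.normal(size=16) + 1j * rng.normal(size=16)  # per-mode complex weights
print(spectral_conv_1d(u, w).shape)  # (256,)

Because the learned multipliers live on Fourier modes rather than on a grid, the trained operator can be evaluated at resolutions it was never trained on, which is much of the paper's appeal.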

Re: [agi] Re: Thursday, March 25, 2021 Constructing Transformers For Longer Sequences with Sparse Attention Methods

2021-04-20 Thread John Rose
On Tuesday, April 20, 2021, at 1:59 AM, Ben Goertzel wrote: > In general though, I think the academic community has not adapted fast enough to the shift to online publication, which means that the cost of printing on paper is no longer an issue. I never thought of that. I suppose it's this way with …