Re: [agi] Re: I made a multi-file compressor that beats 7zip on real-world data

2020-08-01 Thread stefan.reich.maker.of.eye via AGI
OK, turns out my compression algorithm is basically known; it's called Re-Pair. Ah well. I'm not sure whether a linear-time implementation had been found before mine, see e.g. here. However, I think I am doing new
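(For context: Re-Pair repeatedly replaces the most frequent adjacent symbol pair with a fresh symbol and records the replacement as a grammar rule. Below is a minimal, unoptimized Python sketch of that textbook idea; it is not Stefan's implementation, and it runs in roughly quadratic time rather than the linear time discussed above.)

from collections import Counter

def repair(data):
    """Naive Re-Pair: keep replacing the most frequent adjacent pair
    with a fresh symbol until no pair occurs at least twice."""
    seq = list(data)
    rules = {}          # new symbol -> (left, right) it expands to
    next_sym = 256      # assume byte input; new symbols start above 255
    while True:
        pairs = Counter(zip(seq, seq[1:]))
        if not pairs:
            break
        pair, count = pairs.most_common(1)[0]
        if count < 2:
            break
        rules[next_sym] = pair
        out, i = [], 0
        while i < len(seq):
            if i + 1 < len(seq) and (seq[i], seq[i + 1]) == pair:
                out.append(next_sym)   # replace the pair occurrence
                i += 2
            else:
                out.append(seq[i])
                i += 1
        seq, next_sym = out, next_sym + 1
    return seq, rules

seq, rules = repair(b"abracadabra abracadabra")
print(len(seq), "symbols,", len(rules), "rules")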

[agi] Re: Why do Transformers have layers of Attention Heads?

2020-08-01 Thread stefan.reich.maker.of.eye via AGI
Not enough attention heads activated here...

Re: [agi] Re: GPT3 -- Super-cool but not a path to AGI (

2020-08-01 Thread immortal . discoveries
"If you could predict the next word.." "any information and therefore it wouldn't be said.." And how does that help prediction? It doesn't. No, use common words when you write books and replies. It helps even the author. -- Artificial General

[agi] question just for Ben

2020-08-01 Thread immortal . discoveries
How does your AI update its goal / domain of interest? For example, it wants to invent better storage devices, then changes its mind to want to invent better AI algorithms. A "career choice" change. :) ?

[agi] Re: My main computer

2020-08-01 Thread stefan.reich.maker.of.eye via AGI
So the main computer is slow and the other one is vulnerable? SCNR =)

Re: [agi] Re: I made a multi-file compressor that beats 7zip on real-world data

2020-08-01 Thread Matt Mahoney
I suggest that if you want to profit from your software, you release it as open source under a GPL license. Any company that wants to use it but doesn't want to open-source its own software will then need another license from you. In either case, you are advertising your software skills and you are the

Re: [agi] Re: GPT3 -- Super-cool but not a path to AGI (

2020-08-01 Thread immortal . discoveries
I don't know how you guys have been working on AGI for 30+ years and still can't say anything about how to actually predict the next word clearly using common words and using few words so any audience can quickly learn what you know. Why can't you say your AGI like this below? They all build

Re: [agi] Re: GPT3 -- Super-cool but not a path to AGI (

2020-08-01 Thread Alan Grimes via AGI
immortal.discover...@gmail.com wrote: > I don't know how you guys have been working on AGI for 30+ years and > still can't say anything about how to actually predict the next word > clearly using common words and using few words so any audience can > quickly learn what you know.

Re: [agi] Re: GPT3 -- Super-cool but not a path to AGI (

2020-08-01 Thread Ben Goertzel
Contradictions are an interesting and important topic... PLN logic is paraconsistent, which Curry-Howard-corresponds to a sort of gradual typing. Intuitionistic logic maps into Type Logical Categorial Grammar (TLCG) and such; paraconsistent logic would map into a variant of TLCG in which there

Re: [agi] Re: GPT3 -- Super-cool but not a path to AGI (

2020-08-01 Thread Rob Freeman
On Sat, Aug 1, 2020 at 7:08 PM Matt Mahoney wrote: > > On Fri, Jul 31, 2020, 10:00 PM Ben Goertzel wrote: > >> I think "mechanisms for how to predict the next word" is the wrong >> level at which to think about the problem, if AGI is your interest... >> > > Exactly. The problem is to predict

Re: [agi] Re: GPT3 -- Super-cool but not a path to AGI (

2020-08-01 Thread Matt Mahoney
On Fri, Jul 31, 2020, 10:00 PM Ben Goertzel wrote: > I think "mechanisms for how to predict the next word" is the wrong > level at which to think about the problem, if AGI is your interest... > Exactly. The problem is to predict the next bit. I mean, my interest is in compression, but you still
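(A minimal sketch of what "predict the next bit" means in a compression setting, assuming nothing about Matt's own compressors: an order-k context model estimates p(next bit = 1) from counts, and an ideal arithmetic coder would pay roughly -log2(p) bits for each actual bit.)

from collections import defaultdict
import math

class BitPredictor:
    """Order-k bit predictor: estimate p(next bit = 1) from counts of
    what followed the same k-bit context before (Laplace smoothing)."""
    def __init__(self, k=8):
        self.k = k
        self.counts = defaultdict(lambda: [1, 1])  # context -> [zeros+1, ones+1]
        self.context = 0

    def predict(self):
        zeros, ones = self.counts[self.context]
        return ones / (zeros + ones)

    def update(self, bit):
        self.counts[self.context][bit] += 1
        self.context = ((self.context << 1) | bit) & ((1 << self.k) - 1)

# Average code length an ideal arithmetic coder would pay with this model.
bits = [int(b) for byte in b"hello hello hello" for b in format(byte, "08b")]
model, total = BitPredictor(), 0.0
for bit in bits:
    p = model.predict()
    total += -math.log2(p if bit else 1.0 - p)
    model.update(bit)
print(f"{total / len(bits):.3f} bits per input bit")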

Re: [agi] Re: GPT3 -- Super-cool but not a path to AGI (

2020-08-01 Thread Rob Freeman
How many billion parameters do PLN and TLCG have? Applications of category theory by Coecke, Sadrzadeh, Clark and others in the '00s are probably also formally correct. As were applications of the maths of quantum mechanics. Formally. Does Dominic Widdows still have that conference?

[agi] KERMIT: Logicalization of BERT

2020-08-01 Thread Yan King Yin, 甄景贤
This is my latest presentation of Logic BERT, also named KERMIT: https://github.com/Cybernetic1/2020/raw/master/logic-BERT-en.pdf The theory is based on symmetric neural networks. I think KERMIT will perform very close to human-level AI, despite its reptilian name :) Chinese version:
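(The linked PDF is not reproduced here; purely as an illustration of one common meaning of "symmetric neural network", the sketch below is a permutation-invariant, Deep-Sets-style function f({x1..xn}) = rho(sum_i phi(x_i)), where the same phi is applied to every input element. Whether KERMIT uses this exact construction is an assumption on my part, not something stated above.)

import numpy as np

def phi(x, W1):
    # shared per-element embedding; sharing the weights is what
    # makes the whole function symmetric in its inputs
    return np.maximum(0.0, x @ W1)

def rho(z, W2):
    # readout applied to the pooled (summed) representation
    return z @ W2

def symmetric_net(elements, W1, W2):
    """f({x1..xn}) = rho(sum_i phi(x_i)): invariant to input order."""
    pooled = sum(phi(x, W1) for x in elements)
    return rho(pooled, W2)

rng = np.random.default_rng(0)
W1, W2 = rng.normal(size=(4, 8)), rng.normal(size=(8, 2))
props = [rng.normal(size=4) for _ in range(3)]   # e.g. three proposition vectors
print(np.allclose(symmetric_net(props, W1, W2),
                  symmetric_net(props[::-1], W1, W2)))   # True: order does not matter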

Re: [agi] Re: GPT3 -- Super-cool but not a path to AGI (

2020-08-01 Thread Rob Freeman
On Sun, Aug 2, 2020 at 1:58 AM Ben Goertzel wrote: > ... > ...I also think that the search for concise > abstract models is another part of what's needed... > It depends how you define "concise abstract model". Even maths has an aspect of contradiction. What does Chaitin call his measure of