Re: [agi] Re: Fabrice Bellard has a new score!

2021-04-28 Thread Matt Mahoney
Yeah, forgot to mention that the latest version of nncp improves compression by 2%, is 3x faster (2 days per GB), and uses 6 GB of RAM vs 23 GB. Mostly just optimizing. nncp uses a transformer neural network running on a GPU. Again reconfirming the dominance of CPU-heavy neural networks over rule
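For anyone curious what "2 days per GB" works out to, here is a back-of-envelope sketch in Python (my own arithmetic, not figures from the nncp docs):

gb = 10**9                    # 1 GB in bytes
seconds = 2 * 24 * 3600       # 2 days
print(round(gb / seconds / 1000, 1), "KB/s")   # ~5.8 KB/s effective throughput
# At the old speed (3x slower, ~6 days per GB) that would be roughly 1.9 KB/s.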

[agi] Re: Fabrice Bellard has a new score!

2021-04-28 Thread immortal . discoveries
On Monday, April 26, 2021, at 7:18 PM, immortal.discoveries wrote: > Every time I run my own code it feels like playing the casino lottery lol. > Jackpot lol. By jackpot I don't mean the money, obviously it is small earnings for the Hutter Prize website, and still it is not the cash for me

Re: [agi] Re: Making an AI from a compressor

2021-04-28 Thread immortal . discoveries
Either JR is just joking in every post or he really believes what he says. Neither case is good... My suggestion - build an AI like GPT-2 / PPM, and see for yourself exactly how one works, to get out of this qualia bubble you're seriously stuck in (clearly you are, yes, I'm telling you, I've
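For what it's worth, a toy sketch of the kind of predictor being suggested: an order-2 character model that counts contexts and guesses the next letter. This is only a stand-in; a real PPM blends several context orders and GPT-2 is a transformer.

from collections import Counter, defaultdict

def train(text, order=2):
    model = defaultdict(Counter)
    for i in range(order, len(text)):
        context = text[i - order:i]
        model[context][text[i]] += 1    # count which character follows each context
    return model

def predict(model, context):
    counts = model.get(context)
    if not counts:
        return None   # unseen context; a real PPM would escape to a shorter order
    return counts.most_common(1)[0][0]

model = train("the cat sat on the mat, the cat sat")
print(predict(model, "e "))   # prints 'c', since "e " is most often followed by 'c' here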

Re: [agi] Re: Making an AI from a compressor

2021-04-28 Thread John Rose
On Wednesday, April 28, 2021, at 11:55 AM, immortal.discoveries wrote: > What matters here is brains can solve many problems by predicting solutions > based on the context / problem given. Single brains specialize. Multibrains generalize. That's why they communicate. Multiparty intelligence on a

Re: [agi] Re: Making an AI from a compressor

2021-04-28 Thread John Rose
On Wednesday, April 28, 2021, at 12:01 PM, Jim Bromer wrote: > Malleable compression is an interesting way to put it. Well, we could reframe the concept of compression and redefine it in terms of consciousness and intelligence. Assuming panpsychism, all compressors have non-zero consciousness.

Re: [agi] Re: Making an AI from a compressor

2021-04-28 Thread Jim Bromer
Malleable compression is an interesting way to put it.

Re: [agi] Re: Making an AI from a compressor

2021-04-28 Thread immortal . discoveries
John says you can sometimes compress horribly or not at all, and sometimes ace it, but you can't ace everything all the time. Here's my answer: given 100 letters of context that are random, yes, a smart brain will fail here because it is random, and given 100 letters that are all 'a', e.g. 'aaaa', it will
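The point is easy to check with any off-the-shelf compressor; a quick Python illustration, using zlib as a stand-in:

import os, zlib

random_data = os.urandom(100)     # 100 random bytes: essentially incompressible
repeated_data = b"a" * 100        # 100 copies of 'a': almost pure redundancy
print(len(zlib.compress(random_data)))    # typically ~105-111 bytes (slight expansion)
print(len(zlib.compress(repeated_data)))  # around a dozen bytes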

Re: [agi] Re: Making an AI from a compressor

2021-04-28 Thread John Rose
On Wednesday, April 28, 2021, at 9:24 AM, Jim Bromer wrote: > I do not think that "compression" per se is the basis of making AI (which is > directly related to the topic). However, I do believe that an AGI (or an > advanced AI) program would be like a compressor. I'm with you there, Jim, unlike
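The usual reason for treating an AI as a compressor: with arithmetic coding, a model that assigns probability p to the observed data can encode it in about -log2(p) bits, so better prediction is better compression. A one-line check in Python (0.5 is just a made-up model probability):

import math
p = 0.5                        # hypothetical probability the model assigns to the observed symbol
print(-math.log2(p), "bits")   # 1.0 bit; higher p means fewer bits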

Re: [agi] Re: Making an AI from a compressor

2021-04-28 Thread Jim Bromer
In response to something I said about cross-generalization, John Rose replied: "You can optimally compress some of the data all of the time, you can optimally compress all of the data some of the time, but you can’t optimally compress all of the data all of the time. It is what it is bruh."
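John's quip is basically the pigeonhole argument: there are more strings of length n than strings strictly shorter than n, so no lossless compressor can shrink every input. A small check in Python:

n = 8
inputs = 2 ** n                                   # 256 bit strings of length 8
shorter_outputs = sum(2 ** k for k in range(n))   # 1 + 2 + ... + 128 = 255 shorter strings
print(inputs, shorter_outputs)                    # 256 > 255, so at least one input can't shrink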