Re: [agi] MindForth Programming Journal 2019-10-05

2019-10-09 Thread immortal . discoveries
So basically the current best compression uses: many predictors combined, arithmetic coding, Huffman coding too, and translate grouping, e.g. sister/brother. The predicting of the next bit is a totally separate step from the Huffman coding, if I'm correct here. Is it so that the arithmetic coding is…
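As a rough illustration of that separation (my own sketch, not code from the thread): the model's only job is to output P(next bit = 1), and the arithmetic coder's only job is to narrow an interval in proportion to that probability. All class and function names below are made up for the example.

    class Predictor:
        """Adaptive order-0 model: estimates P(bit = 1) from running counts."""
        def __init__(self):
            self.n0 = self.n1 = 1          # Laplace-smoothed bit counts

        def p1(self):
            return self.n1 / (self.n0 + self.n1)

        def update(self, bit):
            if bit: self.n1 += 1
            else:   self.n0 += 1

    class ArithmeticEncoder:
        """Binary arithmetic coder: shrinks [low, high] by the predicted probability."""
        def __init__(self):
            self.low, self.high = 0, 0xFFFFFFFF
            self.out = bytearray()

        def encode(self, bit, p1):
            mid = self.low + int((self.high - self.low) * p1)
            if bit: self.high = mid           # a 1 takes the lower sub-interval (size ~ p1)
            else:   self.low = mid + 1        # a 0 takes the upper sub-interval
            # Renormalize: emit leading bytes once they can no longer change.
            while (self.low ^ self.high) & 0xFF000000 == 0:
                self.out.append(self.low >> 24)
                self.low  = (self.low  << 8) & 0xFFFFFFFF
                self.high = ((self.high << 8) | 0xFF) & 0xFFFFFFFF

        def flush(self):
            for _ in range(4):                # write out enough of low to disambiguate
                self.out.append(self.low >> 24)
                self.low = (self.low << 8) & 0xFFFFFFFF

    def compress_bits(bits):
        pred, enc = Predictor(), ArithmeticEncoder()
        for b in bits:
            enc.encode(b, pred.p1())   # the coder sees only a probability
            pred.update(b)             # the model learns; a decoder can mirror this
        enc.flush()
        return bytes(enc.out)

A skewed input such as compress_bits([1]*900 + [0]*100) comes out well under 1000 bits, while a fair-coin stream does not compress at all, which is the whole point: all the intelligence lives in the predictor, not in the coder.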

Re: [agi] MindForth Programming Journal 2019-10-05

2019-10-08 Thread Matt Mahoney
I think someone else linked to the transformers paper. The best text compressors in http://mattmahoney.net/dc/text.html use context mixing algorithms with dictionary preprocessing to replace words with symbols. Context mixing uses lots of models to predict one bit at a time and neural networks to c…
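For context, the mixing step in PAQ-style context mixing works roughly like this (a simplified sketch based on the ideas in the dce.html book, not PAQ's actual code): each model's probability is mapped into the logistic domain, combined with weights, and the weights are trained online against the bit that actually arrived.

    import math

    def stretch(p):                     # probability -> logit
        return math.log(p / (1.0 - p))

    def squash(x):                      # logit -> probability
        return 1.0 / (1.0 + math.exp(-x))

    class LogisticMixer:
        """Combine several models' P(bit = 1) with online-trained weights."""
        def __init__(self, n_models, lr=0.02):
            self.w = [0.0] * n_models
            self.lr = lr
            self.x = [0.0] * n_models

        def mix(self, probs):
            self.x = [stretch(min(max(p, 1e-6), 1.0 - 1e-6)) for p in probs]
            return squash(sum(w * x for w, x in zip(self.w, self.x)))

        def update(self, p_mixed, bit):
            err = bit - p_mixed         # gradient of coding cost w.r.t. the mixed logit
            for i, xi in enumerate(self.x):
                self.w[i] += self.lr * err * xi

Models that predict well get their weights pushed up, so the mixer ends up trusting the right model in the right context; real context-mixing compressors select a different weight set per context and feed in many more inputs.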

Re: [agi] MindForth Programming Journal 2019-10-05

2019-10-06 Thread immortal . discoveries
Yes, I read all of it two months ago, Matt; it was thrilling to read. @James, I did mean that both were combined. Above are 3 visualizations, each with 2 sticks of a certain length. My point was that the size of the data you start with is the same whichever stick is the original size... the actual compre…

Re: [agi] MindForth Programming Journal 2019-10-05

2019-10-06 Thread Matt Mahoney
I assume you are talking about the Hutter prize for compressing a 100 MB text file. http://prize.hutter1.net/ I suggest reading http://mattmahoney.net/dc/dce.html to understand why your method won't work. Sure, you can find the file embedded in the digits of pi or some other big number, but you ne…
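The reason the pi trick fails is a counting argument: there are 2^n files of n bits, so any index that singles one of them out of a "universal" digit stream needs about n bits on average; the pointer costs as much as the file. A quick empirical sketch of that argument (random bits standing in for the digits of pi):

    import random

    random.seed(1)
    stream = ''.join(random.choice('01') for _ in range(1 << 22))  # ~4M random bits

    for k in (8, 12, 16):
        positions = []
        for _ in range(20):
            pattern = ''.join(random.choice('01') for _ in range(k))
            i = stream.find(pattern)
            if i >= 0:
                positions.append(i)
        mean = sum(positions) / len(positions)
        print(f"k={k:2d}  mean first match at ~{mean:9.0f}  (2**k = {1 << k})")

The first match for a random k-bit pattern lands near position 2**k, and writing that position down takes about k bits, so nothing is saved.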

Re: [agi] MindForth Programming Journal 2019-10-05

2019-10-06 Thread James Bowery
See the rules for Matt's large text compression benchmark. You *always* add the length of the decompression program to the length of the compressed data in order to approximate the Kolmogorov Complexity of the data. Only by bringing the errors of the model into the same units (bits) as the model…
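The scoring rule as stated is simple to write down; a minimal sketch (the file names are placeholders, not part of the benchmark):

    import os

    def benchmark_score(archive_path, decompressor_path):
        """Size of the compressed data plus the size of the program that
        restores it -- a computable upper bound on Kolmogorov complexity."""
        return os.path.getsize(archive_path) + os.path.getsize(decompressor_path)

    # e.g. benchmark_score("enwik8.cmix", "decompressor.exe")  # placeholder paths

Without the second term, one could "compress" anything to zero bytes by hiding the data inside the decompressor, which is why the rule is always stated as the sum.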

Re: [agi] MindForth Programming Journal 2019-10-05

2019-10-06 Thread immortal . discoveries
I was working on the compression prize a month or so ago, for many reasons. I failed every time, but I learned a lot more about trees, bin files, etc. Every time I'd come up with a unique solution that was innovative but was never verified to do the job well enough. One idea was that 9^9^9…

Re: [agi] MindForth Programming Journal 2019-10-05

2019-10-06 Thread James Bowery
It's the hard problem of intelligence because it is incomputable and because once you've solved it, you have optimal prediction. What's left after compression is sequential decision theory, which, while hard, is at least computable. On Sun, Oct 6, 2019 at 5:33 AM Stefan Reich via AGI wrote: > Wh…

Re: [agi] MindForth Programming Journal 2019-10-05

2019-10-06 Thread immortal . discoveries
I recognize an error; I did every word. Here: The The sat duck The sat sat The sat down The sat.

Re: [agi] MindForth Programming Journal 2019-10-05

2019-10-06 Thread immortal . discoveries
Because unsupervised learning is the hard part and the most important part? SL and RL are the cherry on the cake. I agree, however, that there's more to AGI than just a compressed network. For example, the task of switching between summarization, translation, entailment, and segmentation tasks requir…

Re: [agi] MindForth Programming Journal 2019-10-05

2019-10-06 Thread Stefan Reich via AGI
Why is compression "the hard problem of intelligence"? We have compression. I'd say the hard problem of intelligence is making an AI that builds a boat and sails in it. We do not have that yet. On Sun, 6 Oct 2019 at 06:37, James Bowery wrote: > Chuck chose my question the first to answer on Slas…

Re: [agi] MindForth Programming Journal 2019-10-05

2019-10-05 Thread James Bowery
Chuck chose my question as the first to answer on Slashdot circa 2001. As it pertains to the relationship between programming and compression -- specifically programming _as_ compression -- and as compression is The Hard…

Re: [agi] MindForth Programming Journal 2019-10-05

2019-10-05 Thread Stefan Reich via AGI
Wait - are these two different lists? (topicbox vs listbox) On Sun, 6 Oct 2019 at 01:48, Stefan Reich <stefan.reich.maker.of@googlemail.com> wrote: > prednom @ 0 > IF \ 2018-06-21: positive predicate nominative? > midway @ t @ DO \ 2018-06-21: search KB to infer facts; > I 1…

Re: [agi] MindForth Programming Journal 2019-10-05

2019-10-05 Thread Stefan Reich via AGI
prednom @ 0 > IF                   \ 2018-06-21: positive predicate nominative?
  midway @ t @ DO                  \ 2018-06-21: search KB to infer facts;
    I 1 psy{ @ prednom @ =
    I 8 psy{ @ 2 = AND IF          \ 2018-06-21: plural KB data?
      I 7 psy{ @ 1 = IF            \ 2018-06-21: nominative?
        seqverb @ 0 = IF           \ 201…

[agi] MindForth Programming Journal 2019-10-05

2019-10-05 Thread A.T. Murray
MindForth resets associative tags before each operation of the Indicative module. In the MindForth artificial intelligence (AI) for robots, we will now start to display an apparatus of diagnostic messages at the start of the Indicative module to tell us the values being held in variables which serve t…