Actually, we _know_ that computers have analog _effects_ on their
environment in the form of RF noise. So, given the system has adequate
knowledge of its environment, and of its own RF effects, it might be able
to reach outside Turing's box and thereby escape.
On Thu, Feb 13, 2020 at 10:44 AM
Physical embodiment combined with Occam's Razor's foundation of physics
implies that a self-improving program may succeed in sufficient
self-modeling as to detect anomalous artifacts of its physical grounding
which are "outside Turing's box" so to speak. One can reasonably
conjecture that such
On Wed, Feb 12, 2020, 8:06 PM James Bowery wrote:
On Wed, Feb 12, 2020 at 2:44 PM Matt Mahoney wrote:
>
> Here is version 1.2.0.87b of my recursively self-improving AGI. This
> hopefully fixes a bug in the module that detects when it is about to launch
> an unfriendly singularity. Just to be safe, be sure to run it in a virtual
> sandbox on
On Tue, Feb 11, 2020, 8:26 PM wrote:
> Do not, call it shit. It's leading up to something big. BIG shit.
>
Right. Well, post your code when it's finished or you have a question. It's
not very helpful if you post code without comments explaining what it is
supposed to do. If it's more than a few
Hey MattMahoney stop being such a bastard, he's only a beginner coder and he's
getting excited, so don't spoil his mood.
So are u launching this thing, or is it just notepad code so far?
--
Artificial General Intelligence List: AGI
Permalink:
Props for making something of your own. However, you don't need to spam the
list with every intermediate version. Polish your code until it's *done* and
post the *final* version, with a summary of what it accomplishes and the
significance thereof.
As for the "something big" ... don't count
Do not, call it shit. It's leading up to something big. BIG shit.
Who cares? Why are you posting this shit?
On Tue, Feb 11, 2020, 4:49 PM wrote:
https://www.youtube.com/watch?v=iFalZ7mNoV0
ROTCAFER
e = ['']
g = 4
for count2 in range(2):
    f = 'ababa'[count2:g]
    g = g + 1
    c = 1
    d = 1
    for count in range(4):
        a = f[count - 1]
        b = e[d - 1].find(a) + 1
        if b == 0:
            e[d - 1] = str(e[d - 1]) + str(a)
            if d == len(e):
                e.append([])
                e[d].append(len(e) + 1)
                e.append('')
R E F A C T O R
e = ['']
g = 4
for count2 in range(2):
    f = 'ababa'[count2 : g]
    g = g + 1
    c = 1
    d = 1
    for count in range(4):
        a = f[count - 1]
        b = e[d - 1].find(a) + 1
        if b == 0:
            e[d - 1] = str(e[d - 1]) + str(a)
            if d == len(e):
                e.append([])
RE FACT OR
e = ['']
h = 1
g = 4
for count2 in range(2):
    f = 'ababa'[h - 1 : g]
    h = h + 1
    g = g + 1
    c = 1
    d = 1
    for count in range(4):
        a = f[c - 1]
        c = c + 1
        b = e[d - 1].find(a) + 1
        if b == 0:
            e[d - 1] = str(e[d - 1]) + str(a)
            if d == len(e):
i added a note:
also my code had '- 1 + 1' in it because lists have the first item at '0', and
i want the Next Item, which '+ 1' achieves; but '- 1' works when you want the
actual item, not the next item. so '- 1' = current, none = next
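For reference, the '- 1 + 1' bookkeeping disappears entirely if everything stays zero-based, the way Python indexes natively. A minimal sketch of just the window loop under that convention (not the full tree algorithm; names borrowed from the later refactors):

```python
# Zero-based sliding window: no '- 1' / '+ 1' adjustments anywhere.
text = 'ababa'
tree = ['']
for start in range(2):              # slide a 4-character window along text
    window = text[start:start + 4]
    node = 0                        # zero-based, so it indexes tree directly
    for char in window:
        if char not in tree[node]:  # same test as e[d - 1].find(a) + 1 == 0
            tree[node] += char      # extend the node's label
print(tree)                         # -> ['ab']
```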
REFACTOR
tree = ['']
window_start = 1
window_end = 4
for count2 in
i removed it manually and still works, hmm
tree = ['']
window_start = 1
window_end = 4
for count2 in range(2):
    window = 'ababa'[window_start - 1 : window_end]
    window_start = window_start + 1
    window_end = window_end + 1
    char_location = 1
    node = 1
    for count in range(4):
ITS STILL GOT +1-1 INNIT!! dont make sense!!!
tree = ['']
window_start = 1
window_end = 4
for count2 in range(2):
    window = 'ababa'[window_start - 1 : window_end]
    window_start = window_start + 1
    window_end = window_end + 1
    char_location = 1
    node = 1
    for count in range(4):
        char_in_window = window[char_location - 1]
REFACTORR!!
https://blockly-demo.appspot.com/static/demos/code/index.html#95fn2y
REFACTOR!
https://blockly-demo.appspot.com/static/demos/code/index.html#xhyzufS
refactor!
https://blockly-demo.appspot.com/static/demos/code/index.html#xhyzuf
tree = ['']
window_start = 1
window_end = 4
for count2 in range(800):
    window = text[window_start - 1 : window_end]
    window_start = window_start + 1
    window_end = window_end + 1
    char_location = 1
    node = 1
New refactor; not smaller code but less blocks!
New:
https://blockly-demo.appspot.com/static/demos/code/index.html#axnwxx
Previous:
https://blockly-demo.appspot.com/static/demos/code/index.html#u5tpn3
window was 5 letters, just find the first letter 't', then its location in the
next list item to get the goto pointer, then repeat
my tree returns for the input "the cat and the dog"
['the cand', [3, 10, 17, 24, 31, 38, 60, 67], 'h ', [5, 45], 'e', [7], ' ',
[9], 'cd', 'e', [12], ' ', [14], 'cd', [16, 79], 'a', ' ', [19], 'cd', [21,
80], 'a', [23], 't', 'cat', [26, 50, 74], 'a', [28], 't', [30], ' ', 'a', [33],
't', [35],
well come u guys show me ur smallest trees!!!
oh my bad, the code is from the big purple blocks after the first 3 IF statements
but but...mine is nested lists
https://blockly-demo.appspot.com/static/demos/code/index.html#u5tpn3
see the last 2 lists in each other at bottom??
precedence doesn't make a difference.
8+1-1=8
(8+1)-1=8
8+(1-1)=8
no it has operator precedence, it was originally:
tree[int((node + 1) - 1)].append(len(tree) + 1)
oops. here:
tree = ['']
window_start = 1
window_end = 4
for count2 in range(2):
    window = 'a'[window_start - 1 : window_end]
    window_start = window_start + 1
    window_end = window_end + 1
    char_location = 1
    node = 1
    for count in range(4):
        char_in_window = window[char_location -
tree = ['']
window_start = 1
window_end = 4
for count2 in range(25):
    window = 'abbacbccacbbbababcabccacabcbcb'[int(window_start - 1) : int(window_end)]
    window_start = window_start + 1
    window_end = window_end + 1
    char_location = 1
    node = 1
    for count in range(4):
        char_in_window =
Had to restart. My code wasn't readable. Is this tree below small enough code?
It takes 3 mins on 10MB input at 17-letter tree branches but uses 13GB RAM.
https://blockly-demo.appspot.com/static/demos/code/index.html#f6wbqn
code is below after removing useless code
tree = ['']
window_start = 1
Omg my alg is so complex now, I can't understand what it'll do. Should I divide
it into chunks?
I've learnt 6 Programming Languages overnight and am a master of them all. I've
surpassed Stefan by 5000 karma levels. And Google is my pimp.
that's excellent mate, you'll be a full virtual architect in a matter of some
amount of time.
just don't let no-one tell you you're doing the wrong thing for just brainstorming
and pulling it out of your bum.
It's more original that way, and you get less fear of the unknown factor in ya,
then your
almost done!!
https://blockly-demo.appspot.com/static/demos/code/index.html#mzd28f
And I didn't even refactor yet.
how u like my list-tree?
this is just a data compressor btw :D
Coming along. Soon, soon.
https://blockly-demo.appspot.com/static/demos/code/index.html#5wv49n
lots of gotos and lots of global variables please.
Long live goto and oh god please make them stop using it... like road detours
and fire escapes and electrical jumpers and heart stents and emergency brakes
and
when you hit 3 - 0, you have to make a life decision now to go do something
with your life.
That very day, I had this strange urge that i could get infinite compression +
computing power together just from using subtraction!
if i imagine back then its like a foggy dream.
I'm 24.5 years old :D
how old are u ID. because if you're as old as i think you are, you're going pretty
good for your age.
Don't worry about other guys your age looking better than you; if you ever
choose to go for your big finale, it could come later!
I've never done drugs, yet, :p Just fries Ev daY. i could die... :p
yeh but after you've gone through a pound of pot how clear are you thinkin' then?
However yeah like I said in my PL idea Goto can do a lot of things, though it
may be too low level.
See, you are a messy person, I knew it. Look, searching Google I get:
"No, Python does not support labels and goto, if that is what you're after.
It's a (highly) structured programming language. Python offers you the ability
to do some of the things you could do with a goto using first class
hey lockster, hire a guy to make the c converter for your programming language,
and then you can use it for real.
goto is better than break because u control the exit position.
and i swear, if you ever had "goto x" (a variable jump) it's a complete programming
language in itself. does anything.
https://www.youtube.com/watch?v=Fk6mGa57Vi0
pttt
https://softwareengineering.stackexchange.com/questions/566/is-using-goto-ever-worthwhile
Check the Functions Tab guys heh heh, u can DO something and call it baby CALL
IT!! ~
The higher the goto density in a large block of source code,
the greater the likely Kolmogorov complexity of said code that you
reluctantly agreed to maintain... seems like. So, after a skilled coder gets
frustrated, he or she attempts to reduce the complexity and cleans it up, so
that
mad goto skillz you learn in your formative years. :)
Exactly. It allows you to go deep with confidence because you know you can bail
at any time. And it's in C#.
if you get yourself in trouble in a big scoped nest, you can make a break for
it with a goto statement, STILL! in ansi c!!
What, Blockly has no goto? How could you create an AGI without goto... pffft.
At least not in full.
No it's the opposite, we need sharing and caring. I'm not seeing it from yous
either.
YES ID GET YOUR FINGERS TAPPING!!! then you get to keep all your ideas
secret, telling ppl them is a losing game.
I'm currently constructing my AGI prototype here >
https://blockly-demo.appspot.com/static/demos/code/index.html#ht2ard
Using Blockly I've learned programming instantly. Previously I was hiring
programmers.
You have to tweak some things in real code but it's easy.
Yes, I’ve eked away at pieces of it intermittently like 20 times and it's a
treasure trove. But I find it difficult to believe that it will hold up, I've
actually never seen a proof so elaborate like that... and so... sparse? but
dense.
Will take time to absorb… pretty amazing piece of work.
On Wed, Feb 5, 2020, 11:20 AM John Rose wrote:
> The paper is an attempted proof of MIP*=RE.
>
> The consciousness aspect is just me suggesting an intelligence topology
> optimizes on a communication fabric :)
>
Did you read the paper?
The paper is an attempted proof of MIP*=RE.
The consciousness aspect is just me suggesting an intelligence topology
optimizes on a communication fabric :)
The abstract doesn't say anything about consciousness.
On Wed, Feb 5, 2020, 7:16 AM John Rose wrote:
> See this really reinforces my beliefs; multiple wetware general
> intelligences discussing the compression of a particular chunk of data
> (enwik8), an example in classical communication
See this really reinforces my beliefs; multiple wetware general intelligences
discussing the compression of a particular chunk of data (enwik8), an example
in classical communication complexity. My belief is that conscious agents
compress better than non-conscious agents and are capable of more
On Tue, Feb 4, 2020, 8:26 AM stefan.reich.maker.of.eye via AGI <
agi@agi.topicbox.com> wrote:
> > It means predicting text as well as a human.
>
> So you would basically design blank-space test (a text with missing
> words), check how well humans do on that one and then do the same with
>
No Stefan, we know humans are really good predictors; the actual test is to
make a really good predictor for text, so that your compression is extremely
high and so that it usually assigns a high probability to the next letter or bit
it needs to predict during decompression in a
> It means predicting text as well as a human.
So you would basically design blank-space test (a text with missing words),
check how well humans do on that one and then do the same with computers.
Are any of those tests online?
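Such a blank-space (cloze) test is straightforward to generate mechanically. A sketch, assuming only the standard library (the `____` token, blank rate, and function name are arbitrary choices, not an existing benchmark):

```python
import random

def make_cloze(text, blank_rate=0.2, seed=0):
    """Blank out roughly blank_rate of the words; return test text and answers."""
    rng = random.Random(seed)
    words = text.split()
    answers = {}
    for i, w in enumerate(words):
        if rng.random() < blank_rate:
            answers[i] = w          # remember the hidden word
            words[i] = '____'
    return ' '.join(words), answers

test_text, answers = make_cloze("the cat and the dog sat on the mat")
# Score a human or a program by how many blanks they restore correctly.
```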
Marcus Hutter has added a couple of Questions to the Hutter Prize FAQ which
people interested in AGI (or just plain old ML) should grok in fullness
before they do any more rudderless work:
http://prize.hutter1.net/hfaq.htm#xvalid
http://prize.hutter1.net/hfaq.htm#largenn
On Mon, Feb 3, 2020 at
Doing so lets you generate data related to the input dataset, like similar
songs you'd want.
The compression lets it learn the patterns, so it can extract not just the
original data, but also related data.
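As a toy illustration of "related data": the statistics a compressor learns can also be sampled from. A sketch with an order-1 character model (nothing specific to any real compressor; `train` and `sample` are made-up names):

```python
import random
from collections import defaultdict, Counter

# Train a tiny order-1 character model, then sample from it: the same
# statistics that would drive a predictor/compressor can generate new text.
def train(text):
    model = defaultdict(Counter)
    for a, b in zip(text, text[1:]):
        model[a][b] += 1            # count successor frequencies
    return model

def sample(model, start, n, seed=0):
    rng = random.Random(seed)
    out = [start]
    for _ in range(n):
        counts = model[out[-1]]
        chars, weights = zip(*counts.items())
        out.append(rng.choices(chars, weights=weights)[0])
    return ''.join(out)

model = train("abracadabra abracadabra")
related = sample(model, 'a', 10)    # new string with the same local statistics
```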
On Mon, Feb 3, 2020, 12:33 PM wrote:
> predicting text with the smallest possible model given the output, tho
> right?
>
That's the idea. Occam's Razor, measured precisely.
predicting text with the smallest possible model given the output, tho right?
On Mon, Feb 3, 2020, 8:34 AM stefan.reich.maker.of.eye via AGI <
agi@agi.topicbox.com> wrote:
> > That is why text compression is a test for AI and solving it solves AI.
>
> I challenge that second assumption. What does "solving text compression"
> mean? I don't understand what that is supposed
> That is why text compression is a test for AI and solving it solves AI.
I challenge that second assumption. What does "solving text compression" mean?
I don't understand what that is supposed to be.
Also, I'd still like to hear how you would create any kind of general-purpose
AI system out
here's a free idea -> what if you train the net to avoid gibberish, instead of
telling it what to say.
negative reinforcement, instead of positive reinforcement.
it'll be an equally kinda moronic result, but it may work as well!!
If you had your gig training set, you'd like it if the model always said what
was in the training text; then you let the computer find the compression for
you. But... like i said, it would take ages even to test one set of synapses,
and there could be lots of them.
so it would be
""In our community, the C-word (consciousness) ..." =D"
And the D word? Oh, that's shallow isn't it. Deep Learning training. You just,
let it run over night and pass some epochs.
The P word? Predictive model. The bigger the model, the more unseen moments
that can be solved. Longer context is
If you wanted to evolve a chat bot properly, it would take months and months
if you did it open ai style.
On Sat, Feb 1, 2020 at 2:04 PM Matt Mahoney wrote:
> 10 years of writing code and doing experiments. Big AGI projects like Cyc,
> NARS, and OpenCog took considerably longer.
>
Lenat really should get together with these guys and best the Large Text
Compression Benchmark:
On Fri, Jan 31, 2020, 9:24 PM wrote:
> "10 years or more to develop a good compressor" Do you mean leaving the
> computer generating the network that long? :)
>
10 years of writing code and doing experiments. Big AGI projects like Cyc,
NARS, and OpenCog took considerably longer.
"10 years or more to develop a good compressor" Do you mean leaving the
computer generating the network that long? :)
On Friday, January 31, 2020, at 8:29 PM, Matt Mahoney wrote:
> All of this complexity is in keeping with Legg's proof that powerful
> predictors are necessarily complex. You end up writing lots of code to handle
> special cases and obscure file types to squeeze out just a little more
>
On Fri, Jan 31, 2020, 5:05 PM wrote:
> Gotta hold off on that BWT, it's losing patterns Matt!! I don't feel good
> about it. And the PPM & Cmixing is what I already do ... what are u
> sayinggg?? I mix together many partial matches.
>
You can read about BWT and PPM in my book to understand
Gotta hold off on that BWT, it's losing patterns Matt!! I don't feel good about
it. And the PPM & Cmixing is what I already do ... what are u sayinggg?? I mix
together many partial matches.
hate to say it, but even if you managed to compress it to all hell, what's
driving the computer to speak is more important than it madly rambling the
contradictions that formed the insane useless model.
On Fri, Jan 31, 2020, 3:24 PM wrote:
> On Friday, January 31, 2020, at 2:04 PM, Matt Mahoney wrote:
>
> Compression is a highly experimental process. Most of the stuff I tried
> either didn't work or resulted in tiny improvements.
>
> Last I checked I did 1 thing and shaved off 76MB of the 100MB
On Friday, January 31, 2020, at 2:04 PM, Matt Mahoney wrote:
> Compression is a highly experimental process. Most of the stuff I tried
> either didn't work or resulted in tiny improvements.
Last I checked I did 1 thing and shaved off 76MB of the 100MB wiki8. Yes I
studied it but whoever figured
Have u tried randomizing for it with Occam's razor as the heuristic?
That seems like a good idea to me. Someone's obviously done it, right? How did
it go?
On Fri, Jan 31, 2020 at 1:29 PM Matt Mahoney wrote:
>
>
> On Fri, Jan 31, 2020, 2:11 PM wrote:
>
>> However you do it, If theres no repetition theres no possible
>> compression... its a losing game unless you find where the repetition is.
>> Counting 1's and 0's gets you log over the bits, but
However you do it, If theres no repetition theres no possible compression...
its a losing game unless you find where the repetition is.
Counting 1's and 0's gets you log over the bits, but you lose topological
position, and its only good for say, getting the area of a circle for
computing pi.
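That "log over the bits" from counting 1's and 0's is the order-0 entropy bound, and the loss of positional structure is easy to demonstrate (a minimal sketch):

```python
from math import log2
from collections import Counter

def order0_bits(data):
    """Shannon bound from symbol counts alone; ordering is invisible to it."""
    n = len(data)
    return sum(-c * log2(c / n) for c in Counter(data).values())

# Perfectly repetitive, so truly compressible to almost nothing, but
# order-0 counting sees only 32 ones and 32 zeros: a full 64-bit bound.
print(order0_bits("10" * 32))  # -> 64.0
```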
On Fri, Jan 31, 2020, 3:49 AM wrote:
> that sounds like the cool way to do it :), i do it the easy way and just
> use a binary key store... and i get my compression by sharing sections of
> the keys.
>
Any benchmark results? I would be interested if it improves compression.
Compression is a
that sounds like the cool way to do it :), i do it the easy way and just use
a binary key store... and i get my compression by sharing sections of the keys.
PAQ mixes the probabilities from the context models by stretching: x =
ln(p/(1-p)), then weighted summation using a neural network, then squashing
the output with the inverse function p = 1/(1+e^-x). The weights are
updated as w += Lx(b-p), where b is the actual bit, b-p is the output error,
x is
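That description maps to a few lines of floating point. A sketch with a single weight set (real PAQ uses fixed-point arithmetic and context-selected weight sets, so this is illustrative only):

```python
from math import log, exp

def stretch(p):
    # x = ln(p / (1 - p))
    return log(p / (1 - p))

def squash(x):
    # inverse: p = 1 / (1 + e^-x)
    return 1 / (1 + exp(-x))

def mix_and_update(probs, weights, bit, lr=0.01):
    """Mix model probabilities in the stretched domain, then w += L*x*(b-p)."""
    xs = [stretch(p) for p in probs]
    p = squash(sum(w * x for w, x in zip(weights, xs)))
    err = bit - p                      # b - p, the output error
    weights = [w + lr * x * err for w, x in zip(weights, xs)]
    return p, weights

# Two models both lean toward the next bit being 1, and it is 1:
p, w = mix_and_update([0.8, 0.6], [0.3, 0.3], bit=1)
```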
On Tue, Jan 14, 2020, 7:02 AM wrote:
> I have achieved compression of the wiki8 100MB into 50MB so far. Woho.
> Record is 15MB. Gets hard near the end though basically.
>
That's how I got started in data compression. It took about 4 years for my
code to start beating existing compressors and