i'm going to try to use this new coping strategy to keep a local
schedule, to reduce random postings to this list.
1002
suddenly sent that. different kind of inhibition.
1004 pasting stuff during dyskinesia ;p
(Pdb) p index
{'capture': {'ditem': ['495FZqKXSr9cCPObKGVuNHShJA79enrwHDk-xcMOBVw',
'-m-6k-usTx0RUbRI9EEDRaiA2vIapMObgKz3S1bB2Vs',
'aA2go7KTnc4ArkqcjDN-pg4A-c97_bypml5C01eS5ZU'], 'length': 20480},
0724
lost a bunch of these when my fingers accidentally clicked 'discard draft'.
with Data.lock:
data.extend_needs_lk(self.path, out_chunks)
self.chunker = None
i'm in the middle of handling an inhibition.
0750 logged on_created for the new unincluded
1410
k let's make multithreading work
1431
return self.peer_stream(txid, range=range)
File "/home/ubuntu/src/pyarweave/ar/peer.py", line 1062, in peer_stream
return io.BufferedReader(PeerStream.from_txid(self, txid), 0x4)
File "/home/ubuntu/src/pyarweave/ar/stream.py", line 12,
0602
oogh i'm so weary. looking forward to doing something pleasant today,
like eating breakfast or napping or writing some code that's _easy_ to
write, not that makes my eyes spasm around and stuff.
anyway i'll copy some chunks back and forth.
0603
self[idx:] = (
the goal amnesia is sad to me
if i forget what i am doing, this is not workable for doing it. i need
to be able to keep doing it to succeed.
one thing i'm thinking of is kind of tricking myself into continuing,
even though i'm not aware of it. then the behavior could remind me,
when the
i guess i'll ignore it for now since i value this task.
if i can accomplish this task -- especially if i can get better at
tasks like this -- then maybe i can handle mailbombs better e.g.
automatically filtering or considering them in a way i prefer
i'm having amnesia and distraction around the goal, and worried about
the mailbomb i'm receiving.
apparently it's been going on since yesterday. flurries of
unsubscriptions every few minutes.
0438
bumping into this:
Found a swap file by the name ".mix_indices.py.swp"
owned by: ubuntu dated: Sat Aug 13 04:37:48 2022
file name: ~ubuntu/src/flat_tree/mix_indices.py
modified: YES
user name: ubuntu host name: ubuntu
process ID: 30926
0434
want to think about the reasons
- if ran capture now, would have debugging data
- meanwhile, the tree issue is prior to capture point
ok can work on tree issue
0435
let's check out code from both places.
capture.py:
def append(self, last_indices_id, data, data_start, data_size):
print('Capturing ...')
#capture = Popen("./capture", stdout=PIPE).stdout
capture = Popen(('sh','-c','./capture | tee last_capture.log.bin'),
stdout=PIPE).stdout
04:33 .
that's that :)
rather than just running it, which is easier, I'll instead save it and
try to do more work in a kind of
one thing i'd like to add is to store the capture data before it's
turned into a tree.
then (a) i can check the output without having to upload it to ipfs or
whatever and (b) i can troubleshoot lost chunks between tree
generation and data recording
i'll work on that now maybe
note: the reason to do a dissociated behavior in a separately
dissociated way, is that it reduces the reason for the dissociation
itself
there are 2 thingies ... 2 newer upload data formats, one in
https://github.com/xloem/flat_tree and another in
https://github.com/xloem/log in the wip branch
i really love the idea of having data that doesn't break
wonder where this project is at
1947 thinking of poking at this more. better commit my changes.
re: texts: my friend wasn't influenced as badly as i was, which is
always wonderful to learn and hard to remember and believe later. i
don't know what to say to them.
1948 committed the downloader that doesn't crash, to 'wip'.
more
focus/log email for working on bug[s] in tree iteration
1837 ET i am still happy to be spending time engaging this without
stimulating too many issues.
1837 i am remembering that the issue is happening between the 1st and
2nd leaf, and that i was curious about why i didn't see evidence of
On 8/14/22, zey...@keemail.me wrote:
> On 15 Aug 2022 01:31,
> gmkarl+brainwashingandfuckingupthehackersla...@gmail.com:
>
>> while> also being much more easily censorable by hackers.
>>
> You're already in my spam folder. The problem is you've been ruining the
> archive for the last 3
On 8/14/22, zey...@keemail.me wrote:
> gmkarl+brainwashingandfuckingupthehackersla...@gmail.com:
>> On 8/14/22, zey...@keemail.me wrote:
>>
>>> Stop spamming asshole.
>>> -
>>
>> zeynep, why do you say this to me and not professor rat?
>>
>> i really think my spam is better than his. for
On 8/14/22, zey...@keemail.me wrote:
> Stop spamming asshole.
> --
> Sent with Tutanota, a secure & ad-free mailbox.
zeynep, why do you say this to me and not professor rat?
i really think my spam is better than his. for example, my spam
doesn't insult people as much, nor
spammidy spammidy spam
the spam went up the spam to spam its spam
i'll ground a little by spamming the list some more
here are the tree traversal operations after the 1st leaf:
- output 1st leaf
- descend (presumably after advancing 1 sibling)
- advance to subregion
- crash
that's cool and relatively simple. here's the index it's on during the crash:
[[0, {'capture': {'ditem':
so the conclusion i memorised: the issue happens between the 1st
leaf and the 2nd. after the 1st leaf.
nope lost it!
now i'm having an experience i have sometimes of spreading amnesia,
where thoughts get harder and harder to form inferences from, it seems
no matter what they are but usually i am thinking on a topic
ok i'm a little near the bug conceptually. it's hard to think about.
recent inhibition experience associated.
the issue happens between the 1st leaf and the 2nd.
this means it's probably deterministic! we could celebrate or
something, or keep going
please note although some of the parts seem very familiar this is just
a joke and is not what you actually went through
fictional interlude [actually completely fictional]
ANTICODING CONCENTRATION CAMP
the ex-coders were placed together in the 'computer dump' and forced
to whip each other with rubber hoses, each ex-coder equipped with a
rubber hose of their very own.
obedient workers were given shocking imagery
conclusion -> the issue happens between the 1st leaf and the 2nd.
now i'm working on memorising what i learned about the issue. this is
important information that really helps, but i'm having trouble
thinking it.
0 0 ['4csX1gHqv5OK04kBmXJKIuLU6KWifFn_X9trt1raMho']
assert index_offset_in_stream < stream_output_offset
AssertionError
it only outputs one line before failing. this is so much easier than i
thought it was.
#yield index, header, stream, length
print(index_offset_in_stream,
stream_output_offset, index['capture']['ditem'])
i'm thinking it would be helpful to mutate the downloader so it shows
its stream offset adjacent to its node id, and not much else.
then i can compare its stream offset calculations to what i see in the data.
maybe output both offsets.
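a tiny sketch of what that mutation might look like, with made-up node ids and sizes (not the real downloader):

```python
# sketch: show each node id next to the traversal's running stream
# offset, so the offset calculations can be compared against the data.
# node ids and sizes here are made up, not from the real tree.
def dump_offsets(nodes):
    offset = 0
    rows = []
    for node_id, size in nodes:
        rows.append((offset, node_id))  # offset where this node's data begins
        offset += size
    for off, node_id in rows:
        print(off, node_id)
    return rows

dump_offsets([("node-a", 300_000), ("node-b", 100_000)])
```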
spamming the list
let's spam with ... a snippet!
except ArweaveException as exc:
this ridiculous disgusting line of spam is actually a line of code
did not mean to pick the first name of mr. may. did not realise. i
don't know so much formal list culture really :S
anyway i was thinking about how to rebuild cypherpunks after all
evidence of what they are has been erased from your memory and the
archives, and i figured, what seems most
it's hard for me to stay. i'll focus on spamming the list.
bob: "hey tim what is weaker than rsa-4096?"
tim: "u"
tim: "i dunno some kind of um substitution code? is rsa the public key one?"
bob: "no tim: rsa-2048!"
tim: "how is that funny"
bob: "it's just spam"
tim: "stop sending me
i'm thinking it would be helpful to mutate the downloader so it shows
its stream offset adjacent to its node id, and not much else.
then i can compare its stream offset calculations to what i see in the data.
maybe output both offsets.
oh actually it's kinda ok!
the first two children are 300k large, and then there are 3 children
each 100k large.
since it just appended a 100k block, doing its streaming thing, this
makes sense. this is the algorithm i wrote, where it only consolidates
when it reaches the full number of
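the consolidate-when-full behavior can be sketched like this. everything here is hypothetical: the real code's trigger timing may differ, and the (height, size) records are made up, but the 100k appends producing 300k children match the observation above.

```python
# sketch of streaming append with deferred consolidation: keep a list
# of (height, size) children and merge only once a full run of
# CHILDREN_PER_NODE same-height siblings has accumulated.
CHILDREN_PER_NODE = 3

def append_chunk(children, size):
    children.append((0, size))  # new leaf at height 0
    # consolidate trailing runs of CHILDREN_PER_NODE equal-height nodes
    while len(children) >= CHILDREN_PER_NODE and \
            len({h for h, _ in children[-CHILDREN_PER_NODE:]}) == 1:
        merged = children[-CHILDREN_PER_NODE:]
        del children[-CHILDREN_PER_NODE:]
        children.append((merged[0][0] + 1, sum(s for _, s in merged)))
    return children

children = []
for _ in range(6):
    append_chunk(children, 100_000)
print(children)  # [(1, 300000), (1, 300000)]
```

after six 100k appends the children are two 300k subtrees; three more appends would consolidate everything into one 900k root.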
id of possibly-balanced root: nIOsVBh6IM0EWq10P882TH9OhFw272ByHFJZD8G6rrA
size: 900k. suboffset: 0.
>>> len(stream.dataitem_json(stream.tail[0][1]['ditem'][0],
...     '-p20J8zfYeZn8jFYiV-X4I62ubge3RW-2pthuB_hN5LrKqA2L4tvX55fgwSoAatG'))
5
it has 5 children. so it's possibly all messed up.
it might
[the test children per node is 3 here]
so, if the first index is 900k bytes long, it should be a balanced
tree if made correctly, because each chunk is 100k large.
i can check that ... in theory ...
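the check is just arithmetic; a tiny sketch, taking the 100k chunk size and branching factor of 3 from the notes above (the function name is made up):

```python
# sketch: a total size corresponds to a balanced tree only when it is a
# whole number of chunks and that count is a power of the branching
# factor. chunk size and branching factor are from the notes above.
CHUNK_SIZE = 100_000
CHILDREN_PER_NODE = 3

def is_balanced_size(total_bytes):
    chunks, rem = divmod(total_bytes, CHUNK_SIZE)
    if rem:
        return False
    # a perfectly balanced tree holds CHILDREN_PER_NODE**depth chunks
    while chunks % CHILDREN_PER_NODE == 0:
        chunks //= CHILDREN_PER_NODE
    return chunks == 1

print(is_balanced_size(900_000))  # 9 chunks = 3**2, so True
```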
FICTION
Cultist-or-whatever: "Oh, I used to do software, but my cult didn't
like that. Oh no! I went to training programs where we went near computer
programs and held our breath and whipped each other with rubber hoses
as soon as we looked at them, and then left and breathed again when we
disavowed
i'll show the parts of the root/tail index
[
[
1,// type: index
{ // a way to locate it on the chain. relates to how it was uploaded
'ditem': ['nIOsVBh6IM0EWq10P882TH9OhFw272ByHFJZD8G6rrA'],
'min_block': [
995159,
>>> stream.tail
[[1, {'ditem': ['nIOsVBh6IM0EWq10P882TH9OhFw272ByHFJZD8G6rrA'],
'min_block': [995159,
'-p20J8zfYeZn8jFYiV-X4I62ubge3RW-2pthuB_hN5LrKqA2L4tvX55fgwSoAatG'],
'api_block': 995559}, 0, 90], [0, {'capture': {'ditem':
['rWTfslX9PzbtNeTjlrHmCHQXuW16nZg7iQ7WAY3Y-ZM']}, 'min_block':
spamming this list
{ CAN OF HAM } ~ flies through air
i opened up a repl. i can import the download script and call its parts. [...]
here's the root content: {"ditem":
["u1PDzuiGDbIRpQEFw6OZx6Dm_0tkmNTyY1jT2lkXQTc"], "min_block": [995159,
"-p20J8zfYeZn8jFYiV-X4I62ubge3RW-2pthuB_hN5LrKqA2L4tvX55fgwSoAatG"],
"api_block": 995559}
maybe i'll write a little script to walk the data and verify that none
of the branches overlap
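a minimal sketch of such a walker. the tree shape here is made up (bytes for leaf chunks, lists for inner nodes), not the real record layout:

```python
# sketch: recursively verify that sibling regions sit back-to-back,
# with no overlaps, no gaps, and no zero-length branches.
def check_regions(node, start=0):
    """Return the exclusive end offset of the region `node` covers."""
    if isinstance(node, bytes):      # leaf: covers len(node) bytes
        return start + len(node)
    offset = start
    for child in node:               # children must be adjacent
        end = check_regions(child, offset)
        assert end > offset, "zero-length branch"
        offset = end                 # next sibling starts where this ended
    return offset

print(check_regions([[b"aa", b"bb"], [b"cc"]]))  # 6 bytes total
```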
back
it's nice to have found a way of engaging my puzzle where i can
consider it a little bit without stimulating wonky issues in myself.
i'm taking a break with intention of returning. we will see whether i
do, or how long it takes me.
the reason is i'm out of water, and i like drinking water when
here's more of the start of the while loop:
# a stack to perform a depth-first enumeration of the tree
# atm, the child offset is not tracked as inner nodes are traversed.
# instead, the global offset is tracked, and children are enumerated
# again when
some people smash bugs.
i do it too. out of community habit.
but we need to respect our bugs, and learn to understand them.
i want to understand these bugs.
bugs are great.
i've always loved bugs.
in software development, we attend to the bugs and take care of them.
we protect them from system problems that spew them all over people's
bank accounts, like when a plague of locusts eats all your crops.
when a plague of locusts eats all your crops, it
yupyup
it's not all clear of course.
the function has like 5 parts.
it could be simpler. it's what i've written atm. it has bugs.
all the workings are a flattened recursion:
51    while len(indices):
it's a while loop. everything's in it.
looping on the length of the indices means it runs so long as more
data is queued.
data is pushed to the indices to process it next. this happens a few
lines down. the line numbers
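the flattened-recursion pattern can be sketched generically like this; nothing here is the real function, the tree shape is made up:

```python
# sketch: a list used as an explicit stack, looping while entries
# remain, pushing children to be processed next instead of recursing.
def iterate_leaves(root):
    indices = [root]                 # the explicit stack
    while len(indices):              # runs so long as more data is queued
        node = indices.pop()
        if isinstance(node, list):   # inner node: queue its children
            indices.extend(reversed(node))  # reversed keeps left-to-right order
        else:
            yield node               # leaf: emit it

print(list(iterate_leaves([[1, 2], 3, [4]])))  # [1, 2, 3, 4]
```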
here's the top of the function commented:
def iterate(self):
# this function is the guts of a class that wraps a tree root record
# indexing binary data on a blockchain. it is intended to yield the
# chunks in order when called.
# the number of bytes that have
I'm guessing the reason for my name is that I have not been able to
find therapy that will talk about
slavery/trafficking/brainwashing/mind control . I don't even believe
in mind control, which I understand demonstrates that I have been
exposed to it.
I'd like a therapist who's interested in
spam spam spam spam
i bring you: _digital spam_. it is like canned ham except you need a
VR suit to eat it.
more accurate
# a stack to perform a depth-first enumeration of the tree
# atm, the child offset is not tracked as inner nodes are traversed.
# instead, the global offset is tracked, and children are enumerated
# again when backtracking, under the idea that total
this quirk is left over from my strategy of building this loop off of
a previous loop:
# a stack to perform a depth-first enumeration of the tree
# atm, the offset is not updated as inner nodes are traversed. instead
# they are simply traversed again when backtracked,
i think it was in response to this
> I didn't realise i was traversing the tree in a depth-first manner
until I wrote the comment.
the expression seems scary in maybe an unfamiliar way ... other things ...
> Studying depth-first traversal implementations would likely simplify
this problem for me.
i'm also unsure whether i want to leave the function and study this,
or keep engaging it.
the pressure to stop doing it would of course prefer motions and plans
that switch tasks.
i don't know which
i'm just taking some time for whatever parts of me want to tense and
writhe to have some of their thing
they could be trying to be part of my consciousness for all i know
i'm having some issues i'm not grasping well, patterns of facial and
manual dyskinesia i haven't learned to describe yet
this is cool to me:
# a stack to perform a depth-first enumeration of the tree
# index stream offset index offset region size
indices = [(self.tail, 0, 0, total_size)]
I didn't realise i was traversing the tree in a depth-first
i'm thinking a little about adding a data structure just for debugging
i'm thinking there are probably ways i could actually do that even
more for now here, to debug more effectively
i'm not sure what those ways are
here's the head that i commented:
def iterate(self):
# this function is the guts of a class that wraps a tree root record
# indexing binary data on a blockchain. it is intended to yield the
# chunks in order when called.
# the number of bytes that have been
# for debugging: tracks nodes that should only be visited once, to check this
visited = set()
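the visited-set pattern in isolation, with made-up node ids: each node may only be traversed once, and the assertion points straight at any revisit.

```python
# sketch: a correct tree traversal never reaches the same node twice;
# recording ids in a set lets an assertion catch any revisit.
visited = set()

def visit(ditem):
    assert ditem not in visited, f"node visited twice: {ditem}"
    visited.add(ditem)

visit("node-a")
visit("node-b")
try:
    visit("node-a")          # revisiting trips the assertion
except AssertionError as exc:
    print(exc)               # node visited twice: node-a
```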
here are some more lines from later in the function, just to have them:
print('adding', ditem)
indices.append((ditem, index_offset_in_stream,
index_substart, index_subsize))
break
else:
here's what i've written:
# for debugging:
visited = set()
i know what it's for has to do with its name.
i'm thinking a comment that mentions it's for debugging, and what it's
for. it's hard to really say what it's for because i'm still
struggling to engage it, but it's not complicated.
this is the next line:
visited = set()
i haven't commented it yet. it's for debugging.
here it is together with the previous line:
# the size of all the chunks: the sum of the sizes of each child node
total_size = len(self)
visited = set()
here's the first one:
# the number of bytes that have been yielded, increased every chunk
stream_output_offset = 0
maybe i'll post them separately, kind of get used to them.
here's the second one:
# the size of all the chunks: the sum of the sizes of each child node
total_size = len(self)
here are two code snippets i've already posted:
# the number of bytes that have been yielded, increased every chunk
stream_output_offset = 0
# the size of all the chunks: the sum of the sizes of each child node
total_size = len(self)
here's some more spam, maybe the next spam can have a code snippet in it.
maybe i'll just practice spamming a little bit, since i don't really
know how bad it is for certain
spam spam spam spam spam
i would of course rather not need to spam. but it doesn't seem the end
of the world now mostly because i have already done it so much.
i'm experiencing inhibition around my coping strategy of spamming the
list to move forward on goals.
one reason for this strategy is to store a log of my behavior.
something highly valued when there is a lot of amnesia in one's life.
another one is to support pretending to spam and disrupt the
# the size of all the chunks: the sum of the sizes of each child node
total_size = len(self)
#note:
#def __len__(self):
# return sum((size for type, data, start, size in self.tail))
# the number of bytes that have been yielded, increased every chunk
stream_output_offset = 0
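the commented-out __len__ sketched standalone; the (type, data, start, size) layout is from the note above, the record values are made up:

```python
# sketch: the tail is a list of (type, data, start, size) records, and
# the object's length is the sum of the sizes of each child node.
class Tail:
    def __init__(self, tail):
        self.tail = tail

    def __len__(self):
        return sum(size for type, data, start, size in self.tail)

tail = Tail([(1, {}, 0, 300_000),
             (1, {}, 300_000, 300_000),
             (0, {}, 600_000, 100_000)])
print(len(tail))  # 700000
```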
def iterate(self):
# this function is the guts of a class that wraps a tree root record
# indexing binary data on a blockchain. it yields the chunks in order
# when called.
maybe i'll walk through my function until it gets complicated
it's nice to think of there being more slightly-different ways of
thinking about approaching this.
really i don't usually use many strategies to do my stuff. i just kind
of push and defend hard with the approaches i know, and when i burn
out with those i just kind of wait.
i like this idea of
this is part of a function that uses a stack to behave similarly to a
recursive function.
here's where the stack ('indices = [') is initialised:
(Pdb) list 38
33 # index stream offset index offset region size
34 indices = [(self.tail, 0, 0,
let's look just around the area where the assertion is raised.
the debugger is actually paused there right now.
(Pdb) list
56 assert length > 0
57 yield index, header, stream, length
58
as i spend time around it, and see things, i remember it a little better.
i think i actually resolved the assertion on line 42, by accommodating
data with zero-length branches. this separates the concerns.
then i think i ran into the assertion on line 61 as a remaining issue.
so i could think
the two issues i'm aware of:
42 assert ditem not in visited
this assertion is getting hit.
a third. i have pdb open and this line shows:
61 -> assert index_offset_in_stream < stream_output_offset
this assertion is getting hit too
[1, {'ditem':
i'm now at the window with some code.
i encountered a bug here. it looked likely there were multiple bugs. i
feel confusion around engaging the datastructure i made, as if it is
complex to engage. the confusion seems to increase when i look at
parts of it, variables that hold importance for it.
i made it to the system where i worked on it most recently.
it feels hard to be on this system, notably while holding this intention.
i feel like the current code is very messy. when i think of how that's
similar to my difficulty being there, maybe it makes sense to head to
the code and try
maybe i can just study a small part of it
one of my big issues coding is the development of errors. i don't see
them. i make them, and don't see them , and am right now having issues
debugging it. -> focus on how to do it, not the issues. looking for
moving forward on it.
so -> debugging this, or redesigning it, both are great.
maybe
one cool thing about the second chapter of that cult book is how the
author, just like me, had to struggle so hard to just read single lines of
non-cult things they wanted to read.
and then they learned to read pages. and to demonstrate comprehension of
what was on the pages.
and then they got a