The way to make that available is as an addon; then use the wiki to give
examples.

Thanks for writing it.  The pyx primitives are like nitroglycerine: just
look at a result and it blows up on you. They need a good set of cover
functions.
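
For instance, something as simple as this might be one such cover (a
sketch, untested; openifready is a made-up name, and it assumes, as the
status verb in your class does, that 4 T. pyx is greater than 0 while the
pyx is still unfilled):

NB. x openifready y : open pyx y if its value is ready, else return default x
openifready =: {{
if. 0 < 4 T. y do. x else. >y end.
}}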

Henry Rich

On Tue, May 3, 2022, 7:37 PM Jan-Pieter Jacobs <[email protected]>
wrote:

> That is excellent Henry, thank you for taking the time to work out this
> example. I will try to put it up on the NuVoc page if I can find a moment
> of time.
>
> I see the generality of these pyx-stuffing primitives, but I still find
> them very tricky to work with and very error-prone.
> So I wrote up a class for coroutines that more or less does what I wanted,
> without having to keep track of pyxes in two directions etc.
>
> The model is quite simple:
> the new_coro_ adverb takes a coroutine verb and turns it into a verb that
> starts the coroutine in parallel. This derived verb takes an optional
> delay as x and, as y, any parameters to be passed to the freshly created
> coroutine. It returns an object reference (locale) that should be used in
> any calls to yield (in the coroutine verb) and resume (in the controlling
> context).
> The coroutine verb receives this object as its left argument when it is
> started (to be used with yield as yield__x), so coroutine verbs are
> effectively limited to a single data argument, y. There's also a status
> verb, checking whether the coroutine can be resumed, and an end verb.
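>
> In short, the intended usage looks like this (just a sketch; myverb stands
> for any coroutine verb, and the real class plus working tests follow
> below):
>
> NB.   co =: 10 myverb new_coro_ 'initial args'   start the coroutine
> NB.   echo resume__co 'more data'                send data, get yielded data back
> NB.   status__co ''                              1 while resumable, 0 when done
> NB.   end__co ''                                 final return value; object destroyed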
>
> If there's any interest, I could put it as a library on github as well.
>
> Here is the code:
>
> NB. Class for coroutines
> NB. uses Henry's suggested semaphore mechanism, wrapped in an OOP object
> NB. to limit loose pieces hanging around.
>
> coclass 'coro'
> NB. creator function; does not do much but verify that at least 1 thread
> NB. is running.
> create =: {{
> 0&T.@''^:(-1&T.@'') 1 NB. ensure 1 thread
> NB. starting of the coro needs to be handled by an adverb, as it requires the verb u
> }}
> NB. entrypoint: adverb that takes the coroutine verb and returns a verb
> NB. taking a timeout as x and any initial arguments as y. This derived
> NB. verb in turn returns a reference to the coro object, to be used with
> NB. yield__x and resume__ref.
> new =: {{
>   30&$:
> :
>   co =. conew 'coro'
>   WAIT__co =: x
>   pyxback__co =: 5 T. WAIT
>   pyxfin__co =: co u t. '' y
>   co
> }}
> NB. naming convention:
> NB. - forw suffix is for passing stuff from main to coro
> NB. - back suffix is for passing stuff from coro to main
> NB. Mutex needed?
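> NB. x is the pyx to fill, y the data to send; the result is a fresh pyx on
> NB. which the reply will arrive (same protocol as Henry's sendsema quoted below)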
> sendsema =:{{ pl [ 6 T. x,<(pl =. 5 T. WAIT);<y}}
> NB. yield: only called by the coro. Takes values to pass back to the "main"
> NB. thread; returns the values passed to resume in the "main" thread.
> yield  =: {{r [ pyxback=:p [ 'p r'=. > pyxforw =: pyxback sendsema y}}
> NB. resume: only called by the "main" thread. Passes arguments to the coro
> NB. and returns the values yielded by the coro.
> resume =: {{r [ pyxforw=:p [ pyxback =: pyxforw sendsema y [ 'p r'=.>pyxback}}
>
> NB. status returns 1 when the coro is active and 0 when it has ended,
> NB. based on whether pyxfin is still pending, i.e. whether 4 T. pyxfin is > 0.
> NB. Takes as optional y a tiny delay needed to allow the coroutine to
> NB. return if it is done; default 0.001.
> status =: {{0 < 4 T. pyxfin [ 6!:3]{.y,0.001 }}
> destroy =: {{
> NB. if anything is hanging/waiting, send them an error.
> for_p. pyxforw,pyxback,pyxfin do.
>   if. 0 < 4 T. p do. 7 T. p,<18 end.
> end.
> codestroy '' NB. remove locale
> }}
> NB. shortcut for getting return value and disposing of object.
> end =: {{r [ destroy '' [ r=.>pyxfin}}
>
> NB. back to base for a test
> cocurrent'base'
> NB. test coroutine; only limitation: x must be the coro object reference
> NB. passed by the new_coro_ adverb
> test =: {{
> echo 'C: coro init done'
> echo 'C: test init args ', y NB. presume literal args
> args =. yield__x |. y
> echo 'C: got new args: ', args
> args =. yield__x |. args
> echo 'C: sigh ... I''ve had enough of this: ',args
> 'ret: so long and thanks for all the fish'
> }}
>
> NB. first test, simple
> main =: {{
> c =: 10 test new_coro_ 'init args for C'
> echo 'M: got from coro: ', resume__c 'what the ...'
> echo 'M: got from coro(2): ', resume__c 'last warning'
> echo pyxfin__c
> end__c
> }}
>
> NB. second test, using loop and status.
> main2 =: {{
> c =: 10 test new_coro_ 'init 2'
> while. status__c'' do.
> echo resume__c 'abcde' {~ 3 ?@$ 5
> end.
> end__c ''
> }}
>
> The output of main'' and main2'' is:
>    main''
> C: coro init done
> C: test init args init args for C
> C: got new args: what the ...
> M: got from coro: C rof sgra tini
> M: got from coro(2): ... eht tahw
> C: sigh ... I've had enough of this: last warning
> ret: so long and thanks for all the fish
>    main2''
> C: coro init done
> C: test init args init 2
> 2 tini
> C: got new args: dbe
> ebd
> C: sigh ... I've had enough of this: abb
> ret: so long and thanks for all the fish
>
> I'm curious to hear any comments, improvements, ... It looks reasonable to
> me, but I have not tested it enough to know it will not fail somewhere.
>
> Jan-Pieter
>
> On Sun, May 1, 2022 at 23:15, Henry Rich <[email protected]> wrote:
>
> > Maybe I oughta do it:
> >
> > NB. *** semaphore verbs
> >
> > NB. x is remote pyx, y is data to send
> > NB. result is local pyx (for recv data)
> > sendsema =: {{ 6 T. x,<(localpyx =. 5 T. 30);<y
> > localpyx }}
> >
> > NB. y is local pyx, result is remote pyx ; recv data
> > recvsema =: {{ >y }}
> >
> > NB. *** create a thread
> >
> > 0 T. ''
> >
> > NB. *** the conjoined routines
> >
> > coroutine =: {{
> > 'remotepyx arg'=.y
> > smoutput arg
> > localpyx =.  remotepyx sendsema 'got data 1'
> > 'remotepyx arg' =. recvsema localpyx
> >
> > smoutput arg
> > localpyx =.  remotepyx sendsema 'got data 2'
> > 'remotepyx arg' =. recvsema localpyx
> >
> > smoutput arg
> > localpyx =.  remotepyx sendsema 'got data 3'
> > 'remotepyx arg' =. recvsema localpyx
> > arg
> > }}
> >
> > mainroutine =: {{
> > rc =. coroutine t. '' (localpyx=. 5 T. 30);'data 1'
> >
> > 'remotepyx arg' =. recvsema localpyx
> > smoutput arg
> > localpyx =.  remotepyx sendsema 'data 2'
> >
> > 'remotepyx arg' =. recvsema localpyx
> > smoutput arg
> > localpyx =.  remotepyx sendsema 'data 3'
> >
> > 'remotepyx arg' =. recvsema localpyx
> > smoutput arg
> > localpyx =.  remotepyx sendsema 'finished'
> >
> >  >rc }}
> >
> >
> >
> > NB. Now run it
> >
> >     mainroutine''
> > data 1
> > got data 1
> > data 2
> > got data 2
> > data 3
> > got data 3
> > finished
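> >
> > The coroutine side could equally be written as a loop over the same two
> > verbs (an untested sketch, same handshake as above):
> >
> > coroutineloop =: {{
> > 'remotepyx arg'=.y
> > for_k. i. 3 do.
> >   smoutput arg
> >   localpyx =. remotepyx sendsema 'got data ' , ": >: k
> >   'remotepyx arg' =. recvsema localpyx
> > end.
> > arg
> > }}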
> >
> > Henry Rich
> >
> > On 5/1/2022 4:20 PM, Jan-Pieter Jacobs wrote:
> > > Thanks for those answers, Henry.
> > >
> > > With regard to point 3, I find it hard to see how the example on the
> > > T. vocabulary page works; I don't understand it (also because there is
> > > a sequence described, but no actual code). What I am currently missing
> > > is a way of using non-throw-away tasks that can be supplied with new
> > > data, can return data and continue working afterwards. An example I had
> > > in mind when writing the remark you responded to could be written like
> > > this in Lua (note, coroutines are actually not parallel in Lua, but I
> > > don't think that's a conceptual problem; yield transfers control to the
> > > function calling resume, and allows passing results; resume does the
> > > opposite):
> > >
> > > co = coroutine.create(
> > >          function(a,b)
> > >                  print("co got initial arguments",a,b)
> > >                  for k=1,3 do
> > >                          local res = a+b
> > >                          print("co sends",res)
> > >                          a,b = coroutine.yield(res)
> > >                          print("co received",a,b)
> > >                  end
> > >                  print"last iteration"
> > >                  return 100*(a+b)
> > >          end
> > >          )
> > > print"created coroutine"
> > > print("first resume", coroutine.resume(co,1,2))
> > > print("second resume", coroutine.resume(co,2,3))
> > > print("third resume", coroutine.resume(co,3,4))
> > > print("fourth resume", coroutine.resume(co,4,5))
> > > print("fifth resume", coroutine.resume(co,5,6))
> > >
> > > I think it's fairly readable without knowing the intricacies of Lua. It
> > > would return the output below:
> > > created coroutine
> > > co got initial arguments        1       2
> > > co sends        3
> > > first resume    true    3
> > > co received     2       3
> > > co sends        5
> > > second resume   true    5
> > > co received     3       4
> > > co sends        7
> > > third resume    true    7
> > > co received     4       5
> > > last iteration
> > > fourth resume   true    900
> > > fifth resume    false   cannot resume dead coroutine
> > >
> > > I tried imagining how to write it with the 5, 6 and 10-14 T. verbs but
> > > couldn't figure it out, and certainly not in a way that is
> > > user-friendly. At best it would require locales, or updating random
> > > global variables, and it would be far from convenient.
> > > I could imagine an interface to Lua-like behaviour implemented in J as
> > > follows (note: I'm not an expert language implementer, just a user):
> > >
> > > Coroutines in the J concept described below are just tasks running on
> > > threads.
> > > Tasks containing y. are coroutines (y. for yield).
> > > y. takes its y argument, puts those arguments into the pyx that is
> > > returned when the verb resulting from u t. '' (referred to as the
> > > "corresponding verb" below) is called, and blocks the coroutine until
> > > it is resumed.
> > > A coroutine is resumed when its corresponding verb is called, which
> > > also passes the new arguments to the coroutine as the result of the
> > > call to y. .
> > >
> > > t. could be extended to the following cases: add the possibility to
> > > give t. a gerund array as left argument, starting a plurality of tasks
> > > and returning a gerund of verbs for accessing them. n can be a literal
> > > for options one might have in mind ('' for now). Boxes can be used if
> > > options are needed for multiple threads, or in case options and thread
> > > numbers are combined. If u/m does not have the same length (#) as n, it
> > > should throw a length error.
> > > The corresponding verb for each task returns a pyx for each task,
> > > filled with either the results returned or, in the case of coroutines,
> > > the results of a call to y. . For instance, c0`c1`c2"1 i. 3 3 passes
> > > 0 1 2 to c0, 3 4 5 to c1 and 6 7 8 to c2, returning 3 pyxes that will
> > > contain the results yielded/returned by c0, c1 and c2.
> > >
> > > Another adverb/conjunction could be used to check whether threads are
> > > resumable, based on their corresponding verb (e.g. t: ; if T. were a
> > > conjunction/adverb it could as well be a case of T. when called with a
> > > verb as left argument): e.g. 0 dead (task not running or not containing
> > > y.); 1 waiting for input (yielded). Perhaps this could be extended to
> > > other statuses, e.g. idle or running, which could be interesting for
> > > other purposes, I guess.
> > >
> > > If implemented as such, it could provide for a lot of interesting
> > > patterns, I think:
> > > - generators/iterators, especially if iterators were allowed in e.g.
> > > for/while loops instead of arrays, e.g. like
> > >    (gen =: genverb t. '') initial_args
> > >    for_k. gen do. ... would continue calling the generated coroutine
> > > (e.g. with the default argument '') until it is dead, or only once when
> > > the verb did not yield and just returned normally. See also the last
> > > example below.
> > > - scatter/gather operations where the verb's state could be kept from
> > > previous iterations.
> > >
> > > This would also make it easy to communicate in two directions with
> > > running tasks, as well as allow easy translation from other languages
> > > having these constructs. The Lua case shows that coroutines can also be
> > > a useful abstraction when running in a single thread (Lua is
> > > single-threaded, and its coroutines are just an abstraction, see e.g.
> > > https://www.lua.org/pil/9.3.html).
> > >
> > > With the above interface, the Lua example above could be written as
> > > follows:
> > > 0 T. '' NB. create thread 1
> > > co_fun =: {{
> > >    'a b'=. y   NB. received from the creating thread when it first calls the verb resulting from u t. ''
> > >    echo 'co got initial arguments ', ": a,b
> > >    for. i. 3 do.
> > >            res=. a + b
> > >            echo 'co sends ',":res
> > >            'a b' =. y. res NB. blocks, so no pyxes needed
> > >            echo 'co received ',": a,b
> > >    end.
> > >    echo 'last iteration'
> > >    100*a+b
> > > }}
> > > NB. the example here would print about the same as the Lua version:
> > >
> > > co=: co_fun t. ''  NB. create new task (in this case a coroutine, since it calls y.); co is the corresponding verb
> > > echo 'created coroutine'
> > > echo 'first  resume ', ": > co 1 2 NB. corresponds to initially starting co_fun, setting the y argument
> > > echo 'second resume ', ": > co 2 3
> > > echo 'third  resume ', ": > co 3 4
> > > echo 'fourth resume ', ": > co 4 5
> > > echo 'fifth  resume ', ": > co 5 6 NB. opening the pyx signals domain error? resuming a dead coroutine is impossible
> > >
> > > NB. Another example, using t: to check whether coroutine co is still
> > > NB. alive (1) or dead (0):
> > > co =. co_fun t. ''
> > > NB. as long as co is alive, resume it
> > > (1, >@co)^:(co t:)^:(_) 1 2
> > > NB. would print:
> > > NB. co got initial arguments 1 2
> > > NB. co sends 3
> > > NB. co received 1 3
> > > NB. co sends 4
> > > NB. co received 1 4
> > > NB. co sends 5
> > > NB. co received 1 5
> > > NB. last iteration
> > > NB. 1 600 NB. return value, not printed
> > >
> > > A last example loops over the different permutations without having to
> > > pre-calculate them all (mildly adapted from section 9.3 of the Lua book
> > > linked above):
> > > permgen =: {{ NB. takes a ; n as input
> > >    'a nn'=.y
> > >    if. nn = 0 do. y. a else.
> > >      for_i. i. nn do.
> > >        a =. (a {~ i,<:nn) ((<:nn),i)} a NB. swap index i and nn-1
> > >        permgen a;<:nn                   NB. generate all permutations of the other elements
> > >        a =. (a {~ i,<:nn) ((<:nn),i)} a NB. swap back
> > >      end.
> > >    end.
> > > }}
> > > perm =: {{ NB. adverb creating a verb corresponding to the permgen coroutine, created with initial arguments m and #m
> > >    co =. permgen t. ''
> > >    co (;#) m
> > >    co
> > > }}
> > >
> > > NB. now the following, with a for loop adapted to allow a verb to be
> > > NB. passed, calling it with '' until its t: is 0:
> > >
> > > for_p. 'abc' perm do.
> > >    echo p
> > > end.
> > >
> > > NB. would, if I did everything correctly, print:
> > > bca
> > > cba
> > > cab
> > > acb
> > > bac
> > > abc
> > >
> > > I hope that I didn't bore people to death and that I did not make any
> > > reasoning errors; it's hard to verify when it's not executable. One
> > > thing that could be an issue is that if the returned pyxes are not
> > > opened before the coroutine is called again, you could have multiple
> > > pyxes waiting before previous ones are opened. I don't know whether
> > > this could lead to trouble...
> > > The main advantage I see in the above proposal is that it would make it
> > > very easy to translate generator/iterator patterns from other languages
> > > (e.g. Python, Lua), thereby lowering the entry threshold for people
> > > acquainted with such languages. I think that, as these multi-tasking
> > > primitives are being introduced now, providing low-level control is
> > > good, but it would be beneficial if they were also general and easy to
> > > use in common cases.
> > > Let me know what you think. Anyhow, I had fun with this little thought
> > > experiment.
> > >
> > > Jan-Pieter.
> > >
> > >
> > >
> > > On Wed, Apr 27, 2022 at 16:47, Henry Rich <[email protected]> wrote:
> > >
> > >> 0. Space accounting is wrong in beta-b when memory is allocated in a
> > >> task and freed in the master.  This will be fixed in beta-c.
> > >>
> > >> 1. Yes, if you write > u t. '' y   it will block immediately.
> > >>
> > >> 2. You started a task and then lost interest in the result.  I guess
> > >> we will need a way to terminate a task but I don't see that it needs
> > >> to be automatic.
> > >>
> > >> 3. beta-c has new T.-verbs that allow you to create semaphores. This
> > >> would do what you are describing.  More verbs will certainly be needed
> > >> (mutexes, for example) and we welcome suggestions.
> > >>
> > >> 4. Globals should 'work' in that when you read a global that is being
> > >> modified in another thread you will get either the old or the new
> > >> value; same for files and mapped files. wd may have problems.  Without
> > >> synchronization primitives that's not enough for useful work.  There
> > >> was a synchronization error in beta-b (that took several days to
> > >> find), to be fixed in beta-c.
> > >>
> > >> Henry Rich
> > >>
> > >>
> > >>
> > >> On 4/27/2022 10:28 AM, Jan-Pieter Jacobs wrote:
> > >>> Hi there,
> > >>>
> > >>> I finally managed to find an example where the parallel processing
> > >>> made a measurable improvement (that is, previously I did not find
> > >>> verbs where the overhead was less than the computational improvement).
> > >>> Doing so, I also wrote some cover verbs that some might find useful.
> > >>> Note that these leave chunking up the input to the user, as opposed
> > >>> to the verbs Elijah Stone recently posted.
> > >>>
> > >>> startpool =: (0 T. ''"_)^:] NB. starts a thread pool of y threads
> > >>> peach =: (t.'')(&>)        NB. parallel version of each; still looking for an apricot verb.
> > >>> pevery =: (t.'')(&>)(>@:)  NB. parallel version of every (note @:, not @, for the speed improvement)
> > >>>
> > >>> For instance:
> > >>>      startpool 8 NB. start 8 threads (note, I have a core i5 with 2 physical cores, 4 virtual ones).
> > >>> 8
> > >>>      in =: <"2]30 300 300 ?@$100 NB. 30 boxed matrices
> > >>> NB. calculating determinants
> > >>>      10 timespacex 'outseq =: -/ .* every in'  NB. normal every
> > >>> 1.22102 3.14651e7
> > >>>      10 timespacex 'outpar =: -/ .* pevery in' NB. parallel version
> > >>> 0.578477 3.1583e6
> > >>>      outseq -: outpar
> > >>> 1
> > >>>      10 timespacex 'outseq =: -/ .* each in'  NB. normal each
> > >>> 1.21831 3.14669e7
> > >>>      10 timespacex 'outpar =: -/ .* peach in' NB. cheating, due to asynchronous threads.
> > >>> 0.555217 4.20666e6
> > >>>      outseq -: outpar
> > >>> 1
> > >>> NB. inverting 30 matrices
> > >>>      10 timespacex 'outseq =: %. every in'
> > >>> 0.526015 3.90624e7
> > >>>      10 timespacex 'outpar =: %. pevery in'
> > >>> 0.30344 4.30001e7
> > >>>      outseq -: outpar
> > >>> 1
> > >>>
> > >>> A few things I noted:
> > >>> 0. Why does "-/ .* pevery" consume 10 times less memory than
> > >>> "-/ .* every"? It is not the case with %. .
> > >>>
> > >>> 1. Note that in the definition of pevery, using @: instead of @ is
> > >>> necessary for better performance. Otherwise, it appears that the
> > >>> created pyxes are immediately opened: for instance "%. pevery"
> > >>> generates the verb >@:(%. t.''&>), in which the part in parentheses
> > >>> has verb rank 0 0 0. Using >@ would execute > on each pyx
> > >>> individually, which apparently in the current implementation is
> > >>> immediately after its creation. This causes the master to block, and
> > >>> effectively executes all calls sequentially (if I understood
> > >>> correctly). I don't know whether this is considered an implementation
> > >>> detail not specified in the description of the parsing, but it's
> > >>> important to know in this case.
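> > >>>
> > >>> Concretely, the two variants I mean (pevery_blk is a made-up name,
> > >>> just to make the contrast explicit; untested):
> > >>> pevery_blk =: (t.'')(&>)(>@)  NB. rank 0 0 0: each pyx is opened (and waited on) as soon as it is produced
> > >>> pevery     =: (t.'')(&>)(>@:) NB. rank _: all tasks are launched first, the pyxes are opened afterwards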
> > >>>
> > >>> 2. What happens with busy threads whose pyxes are no longer
> > >>> accessible? Should they be terminated, i.e. in the spirit of garbage
> > >>> collection of unreachable values? Perhaps they could be, but only if
> > >>> they can be assured not to have side effects (e.g. do not have
> > >>> explicit components in the verb being executed that could cause side
> > >>> effects, or certain foreigns, e.g. file operations; or perhaps there
> > >>> could be an option to t. letting the user "promise" the supplied verb
> > >>> does not have side effects).
> > >>>
> > >>> 3. Would it make sense to have threads that can return values
> > >>> multiple times? I'm thinking of generators and iterators that could
> > >>> be written more easily, or perhaps used as in Lua's coroutines
> > >>> (https://www.lua.org/manual/5.3/manual.html#2.6) which, albeit not
> > >>> being parallel, allow the user to resume a coroutine, and the
> > >>> function to yield values, continuing after the resume where it left
> > >>> off. Currently this is possible only by using locales containing some
> > >>> state and jumping around in the next function using goto., which
> > >>> feels more clumsy than it should (for an example, see the classes
> > >>> "permutations" and "altflipsgen" in my fannkuch implementation here:
> > >>> http://www.jsoftware.com/pipermail/beta/2021-September/010048.html).
> > >>> I think this could be integrated with the pyx concept, where
> > >>> successive openings of a pyx could be equivalent to successive
> > >>> "resume" (or "next") calls. This would however make the content of a
> > >>> pyx a read-only-once value that is changed by the thread executing in
> > >>> parallel after it is opened the first time. It would also need a
> > >>> primitive (and/or a control word for explicit definitions) for
> > >>> yielding a value to the calling/opening thread. This way, though, it
> > >>> would be impossible to pass values to the thread when resuming.
> > >>> Another way of implementing this, without modifying the behaviour of
> > >>> pyxes and allowing new values to be passed to the thread, would be to
> > >>> have a separate "resume" verb that can be called on a thread number.
> > >>> Perhaps this does not even have to be integrated with the true
> > >>> concurrency, so that it could as well be used as a conceptual
> > >>> abstraction (as in Lua, see the link above) on systems not supporting
> > >>> true multi-threading (as appears to be the case for 32-bit J at the
> > >>> moment).
> > >>> Aside from allowing the implementation of generators and whatnot, it
> > >>> would also allow dedicating specific threads to executing a specific
> > >>> verb repetitively, on different data, which might (or might not, I'm
> > >>> far from an expert) be more efficient as it could keep the routine it
> > >>> is executing in memory.
> > >>>
> > >>> 4. Did anyone try to combine this new threading model with
> > >>> reading/writing globals (or, even more fun, memory-mapped files)?
> > >>> With the C FFI, or the wd subsystem?
> > >>>
> > >>> Just my 5 cents.
> > >>>
> > >>> Jan-Pieter
> > >>>