Sorry for the late answer, I was unavailable in the last few days. About send() and receive(): it depends on whether the communication is local or remote. For local communication, anything can be passed, since only a reference is sent; this is the base model for Stackless channels. For remote communication (between two interpreters), any picklable object can be sent (a copy is then made), and that includes channels and tasklets, for which a reference is created automatically.
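To make the local case concrete, here is a minimal sketch using the standard stackless channel API (plain Stackless / PyPy's "stackless" module, not the distributed prototype); it only shows the reference-passing behaviour described above:

    import stackless

    def producer(ch, payload):
        # Local send: only a reference travels over the channel.
        ch.send(payload)

    def consumer(ch, original):
        received = ch.receive()
        # Same object identity -- no copy was made.
        assert received is original

    data = {"answer": 42}
    ch = stackless.channel()
    stackless.tasklet(producer)(ch, data)
    stackless.tasklet(consumer)(ch, data)
    stackless.run()

    # In the remote case described above, the payload would instead be
    # pickled and a copy delivered to the other interpreter.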
The PyPy proxy object space is used to make remote communication more Stackless-like by passing objects by reference. If a ref_object is made, only a reference is passed when a tasklet is moved or when the object is sent on a channel; the object itself always resides where it was created. A move() operation will also be implemented on those objects so they can be moved around like tasklets (a small illustrative sketch of the underlying transparent-proxy idea follows the quoted message below).

I hope it helps,

Gabriel

2010/7/29 Kevin Ar18 <[email protected]>

> > Hello Kevin,
> >
> > I don't know if it can be a solution to your problem but for my
> > Master Thesis I'm working on making Stackless Python distributed. What
> > I did is working but not complete and I'm right now in the process of
> > writing the thesis (in French unfortunately). My code currently works
> > with PyPy's "stackless" module only and uses some PyPy-specific
> > things. Here's what I added to Stackless:
> >
> > - Possibility to move tasklets easily (ref_tasklet.move(node_id)). A
> >   node is an instance of an interpreter.
> > - Each tasklet has its own global namespace (to avoid sharing of data).
> >   The state is also easier to move to another interpreter this way.
> > - Distributed channels: All requests are known by all nodes using the
> >   channel.
> > - Distributed objects: When a reference is sent to a remote node, the
> >   object is not copied; a reference is created using PyPy's proxy
> >   object space.
> > - Automated dependency recovery when an object or a tasklet is loaded
> >   on another interpreter.
> >
> > With a proper scheduler, many tasklets could be automatically spread
> > across multiple interpreters to use multiple cores or multiple
> > computers. A bit like the N:M threading model, where N lightweight
> > threads/coroutines can be executed on M threads.
>
> Was able to have a look at the API...
> If others don't mind my asking this on the mailing list:
>
> * .send() and .receive()
> What type of data can you send and receive between the tasklets? Can you
> pass entire Python objects?
>
> * .send() and .receive() memory model
> When you send data between tasklets (pass messages), or whatever you want
> to call it, how is this implemented under the hood? Does it use shared
> memory or does it involve a more costly copying of the data? I realize
> that if it is on another machine you have to copy the data, but what
> about between two threads? You mentioned PyPy's proxy object... guess I'll
> need to read up on that.
>
> _______________________________________________
> [email protected]
> http://codespeak.net/mailman/listinfo/pypy-dev

--
Gabriel Lavoie
[email protected]
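Since the ref_object/move() API above is part of the thesis prototype and not publicly available, the following is only a rough sketch of the underlying mechanism, PyPy's transparent proxies (tputil.make_proxy). It assumes a PyPy build with transparent-proxy support enabled; the forward_to_remote() helper is hypothetical and only marks where a distributed layer could hook in:

    # Sketch only: needs a PyPy interpreter built with transparent-proxy
    # support; forward_to_remote() is hypothetical.
    from tputil import make_proxy

    def make_remote_controller(node_id, obj_id):
        def controller(operation):
            # Every attribute access or method call on the proxy arrives
            # here as an 'operation' (opname, args, kwargs).
            # A distributed layer could forward it to the owning node:
            #     return forward_to_remote(node_id, obj_id, operation)
            return operation.delegate()  # here: just perform it locally
        return controller

    # To ordinary Python code the proxy looks like a real list, which is
    # what allows passing objects "by reference".
    ref_list = make_proxy(make_remote_controller("node-1", 42), type=list, obj=[])
    ref_list.append("hello")  # routed through the controller
    print(type(ref_list))     # still reports the list type

Because every operation goes through the controller, the real object can stay on the node where it was created, which matches the behaviour described above.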
