I have to assume the following is either not useful, not practical, or both -- but I have to ask...
Version I. It seems to me that PyPy might be useful as a kind of "Co-Python" -- viz. the PyPy VM running as a CPython extension, as a way to utilise a second core from a single CPython process. I'm realistic about the level of integration that could be achieved, but even something that amounted to (fast) RPC inside a process might be useful, given that the second VM is arguably "free for the using" on a multicore system. If this were in fact useful, it might be an on-ramp into the Python mainstream for PyPy. PyPy needs on-ramps, imho.

Version II. Is "multiple isolated (but bridged) VMs in a single process" a possible general technique? If multiple cooperating processes is a reasonable response to the "single-threaded VM meets multi-core architecture" problem, then is "multiple VMs in a single process" a better one? I'm talking about an ugly, brute-force alternative to a multi-threaded VM: simply duplicating / munging / mangling / tagging as much of the interpreter code as necessary to allow, say, two (but not n) threads to run safely at once. Two heaps, two GILs, and any given thread only ever running against one of these "sub-VMs". Then perhaps add explicitly managed opportunities for memory sharing and other integration strategies between the two.

It strikes me as something you wouldn't want to attempt in C, but that might be possible in the PyPy translation.

"Dual-core Python" .. "Hydra" -- or is this just crazy talk?

Pete F
_______________________________________________
[email protected]
http://codespeak.net/mailman/listinfo/pypy-dev
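[For concreteness, the "multiple cooperating processes" baseline the post compares against can be sketched in a few lines with CPython's multiprocessing module: each worker is a fully isolated interpreter with its own heap and its own GIL, and the only integration is explicit message passing over a pipe -- roughly the "(fast) RPC" shape, minus the shared address space. The helper names (rpc_server, square) are hypothetical, for illustration only.]

```python
from multiprocessing import Process, Pipe

def square(x):
    return x * x

def rpc_server(conn):
    # Serve requests until the client sends the sentinel None.
    # This loop runs in a separate process: its own heap, its own GIL.
    while True:
        msg = conn.recv()
        if msg is None:
            break
        conn.send(square(msg))
    conn.close()

if __name__ == "__main__":
    parent, child = Pipe()
    worker = Process(target=rpc_server, args=(child,))
    worker.start()
    parent.send(7)
    print(parent.recv())  # 49
    parent.send(None)     # shut the worker down
    worker.join()
```

[The in-process "sub-VM" idea would keep this message-passing discipline but replace the process boundary with two tagged interpreter instances in one address space, so that "memory sharing" becomes an explicit opt-in rather than an OS-level impossibility.]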
