> 
> I haven't tried getting the SciPy stack running with PyParallel yet.

That would be essential for my use. I would assume a lot of potential 
PyParallel users are in the same boat.

Thanks for the info about the PyPy-STM limits. You have a really interesting project. 

-- 

Gary Robinson
gary...@me.com
http://www.garyrobinson.net

> On Sep 9, 2015, at 7:02 PM, Trent Nelson <tr...@snakebite.org> wrote:
> 
> On Wed, Sep 09, 2015 at 04:52:39PM -0400, Gary Robinson wrote:
>> I’m going to seriously consider installing Windows or using a
>> dedicated hosted windows box next time I have this problem so that I
>> can try your solution. It does seem pretty ideal, although the STM
>> branch of PyPy (using http://codespeak.net/execnet/ to access SciPy)
>> might also work at this point.
> 
> I'm not sure how up-to-date this is:
> 
> http://pypy.readthedocs.org/en/latest/stm.html
> 
> But it sounds like there's a 1.5GB memory limit (or maybe 2.5GB now; I
> just peeked at core.h, linked on that page) and a 4-core segment limit.
> 
> PyParallel has no memory limit (although it actually does have support
> for throttling back memory pressure by not accepting new connections
> when the system hits 90% physical memory used) and no core limit, and it
> scales linearly with cores+concurrency.
> 
> PyPy-STM and PyParallel are both pretty bleeding-edge and experimental,
> though, so I'm sure we both crash as much as each other when exercised
> outside of our comfort zones :-)
> 
> I haven't tried getting the SciPy stack running with PyParallel yet.
> 
>    Trent.
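
[Editor's note: the memory-pressure throttling Trent mentions — refusing new connections once the system hits 90% physical memory used — can be sketched roughly as below. This is an illustrative sketch only, not PyParallel's actual implementation; the 90% threshold comes from the message above, while the function names and the accept-gate structure are assumptions.]

```python
import time

# Threshold from the message above: stop accepting new connections
# once physical memory usage reaches 90%.
MEMORY_PRESSURE_THRESHOLD = 90.0

def should_accept_connection(memory_used_percent):
    """Return True while physical-memory usage is below the threshold."""
    return memory_used_percent < MEMORY_PRESSURE_THRESHOLD

def handle(conn):
    # Placeholder: a real server would hand the connection to a worker.
    conn.close()

def accept_loop(listener, memory_probe):
    # memory_probe() returns current physical-memory usage as a percent,
    # e.g. psutil.virtual_memory().percent on a real system.
    while True:
        if should_accept_connection(memory_probe()):
            conn, _addr = listener.accept()
            handle(conn)
        else:
            # Under memory pressure: back off instead of accepting,
            # letting in-flight work drain before taking on more.
            time.sleep(0.1)
```

The key design point is that the gate sits in front of `accept()`: load shedding happens at admission, so existing connections keep their memory while new work is deferred until pressure drops.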

_______________________________________________
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev