Hi,

you can run it as a daemon/server (for example, a little Flask app). This
optimization also helps CPython apps if you want to avoid the
startup/import time, and with PyPy it additionally keeps the JIT warm.
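A minimal sketch of the daemon idea, assuming Flask is installed; the
`process_file` function here is a hypothetical placeholder for your real
XML munging:

```python
from flask import Flask, request

app = Flask(__name__)

def process_file(path):
    # Placeholder for the real regex/string-munging work.
    with open(path) as f:
        return len(f.read())

@app.route("/process", methods=["POST"])
def process():
    # Each request reuses the same long-lived process, so imports and
    # JIT warm-up are paid only once.
    path = request.get_json()["path"]
    return {"result": process_file(path)}

if __name__ == "__main__":
    app.run(port=5000)
```

You would then POST one file path per request from a small client script
instead of launching a fresh interpreter per run.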

Can the work be split up per XML file easily? If so, multiprocessing
may work nicely for you.
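A sketch of the per-file parallelism with the stdlib multiprocessing
module, assuming each XML file can be processed independently (the
per-file work shown is a stand-in):

```python
import glob
import re
from multiprocessing import Pool

def process_file(path):
    # Stand-in for the real per-file work (regexes, string munging, ...);
    # here we just count XML tags as an example.
    with open(path) as f:
        text = f.read()
    return path, len(re.findall(r"<[^>]+>", text))

if __name__ == "__main__":
    files = glob.glob("*.xml")
    with Pool() as pool:  # one worker per CPU core by default
        for path, n in pool.map(process_file, files):
            print(path, n)
```

With PyPy each worker process still pays its own JIT warm-up, so this
combines well with keeping the workers alive (e.g. a `Pool` inside the
daemon above).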

Do you need to process all the files each time, or can you avoid some of
the work?
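One way to avoid rework is to cache results keyed by each file's
modification time and skip files that have not changed since the last
run. A sketch, with hypothetical names (`CACHE`, `process_all`):

```python
import json
import os

CACHE = "results.json"

def load_cache():
    try:
        with open(CACHE) as f:
            return json.load(f)
    except FileNotFoundError:
        return {}

def process_all(paths, process_file):
    cache = load_cache()
    for path in paths:
        mtime = os.path.getmtime(path)
        entry = cache.get(path)
        if entry and entry["mtime"] == mtime:
            continue  # unchanged since last run: skip the expensive work
        cache[path] = {"mtime": mtime, "result": process_file(path)}
    with open(CACHE, "w") as f:
        json.dump(cache, f)
    return cache
```

On a second run only the files that changed get reprocessed, which also
shrinks the amount of code the JIT has to warm up on.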

cheers,


On Wed, Feb 13, 2019 at 3:57 PM Joseph Reagle <joseph.2...@reagle.org>
wrote:

>
> On 2/13/19 9:38 AM, Maciej Fijalkowski wrote:
> > My first intuition would be to run it for a bit longer (can you run
> > it in a loop couple times and see if it speeds up?) 2s might not be
> > enough for JIT to kick in on something as complicated
>
> It's a single-use utility where each run processes about 100 XML files,
> doing things like string regexing and munging thousands of times.
>
> Is it possible for pypy to remember optimizations across instantiations?
> _______________________________________________
> pypy-dev mailing list
> pypy-dev@python.org
> https://mail.python.org/mailman/listinfo/pypy-dev
>
