Hi folks.

I'm looking at a test process that takes about 16 minutes for a full run.

Naturally, I'd like to speed it up. We've already parallelized it, mostly.

It seems like the next thing to look at is setting up a local PyPI, and
pre-building the packages that are currently compiled from C/C++ on every
full test run.  (We're using Docker and building dependencies from scratch
for each full test run.)
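
As a sketch of the wheel-caching idea (image names and paths here are
placeholders, not anything we've actually deployed), the C/C++ packages
could be built once into a wheelhouse in an early Docker layer, so the
compile step only reruns when the requirements file changes:

```dockerfile
FROM python:3 AS wheelbuilder
# Copy only the requirements file so this layer stays cached
# until the requirements actually change.
COPY requirements.txt /tmp/requirements.txt
# Build all dependencies (including the C/C++ extensions) into wheels once.
RUN pip wheel --wheel-dir /wheels -r /tmp/requirements.txt

FROM python:3
COPY --from=wheelbuilder /wheels /wheels
COPY requirements.txt /tmp/requirements.txt
# Install from the local wheelhouse only: no compilation, no network.
RUN pip install --no-index --find-links /wheels -r /tmp/requirements.txt
```

With Docker's layer cache, the slow `pip wheel` step is skipped entirely
on runs where requirements.txt is unchanged.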

Also, we could conceivably set up a web proxy...?

Does having a local PyPI obviate the need for a web proxy?

And what local PyPI servers do folks recommend for speed?
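
For context, pointing the test image at whatever local index we pick
should just be a pip config change (the hostname below is a placeholder):

```ini
; /etc/pip.conf (or ~/.pip/pip.conf) inside the test image.
; "pypi.internal" stands in for wherever the local index ends up.
[global]
index-url = http://pypi.internal/simple/
trusted-host = pypi.internal
```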

We need support mostly for CPython 3.x, but we still have a little CPython
2.x code we require, and it's possible we'll need the 2.x support for a
while.

Thanks!
-- 
https://mail.python.org/mailman/listinfo/python-list
