I have given this some thought and honestly don't know if splitting into separate jobs will help.
- I have seen race conditions when running setuptools in parallel, so more isolation is better. OTOH, if 2 setuptools scripts run at the same time on the same machine they might still race (same home dir, same /tmp dir).
- Retrying due to flakes will be faster, but if something is broken you'll need to write 4x the number of "run python precommit" phrases.
- Increased parallelism may also run into quota issues with Dataflow.
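One way to get that extra isolation on a shared machine, as a sketch only: wrap each setuptools invocation so it sees a private HOME and TMPDIR. This assumes the races come from shared caches and temp files; `run_isolated` is a hypothetical helper, not something from the Beam repo.

```shell
#!/bin/sh
# Sketch (assumption, not from the thread): run each build with a
# private HOME and TMPDIR so two concurrent runs on the same machine
# cannot race on shared caches or temp files.
run_isolated() {
  workdir="$(mktemp -d)"                 # private scratch area per run
  mkdir -p "$workdir/home" "$workdir/tmp"
  HOME="$workdir/home" TMPDIR="$workdir/tmp" "$@"
  status=$?
  rm -rf "$workdir"                      # clean up the scratch area
  return "$status"
}

# e.g. two builds that would otherwise share ~/.cache and /tmp:
#   run_isolated python setup.py sdist &
#   run_isolated python setup.py sdist &
#   wait
run_isolated sh -c 'echo "isolated TMPDIR: $TMPDIR"'
```

Each invocation gets a fresh `mktemp -d` directory, so even concurrent runs never see the same paths.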
What benefits do you see from splitting up the jobs?

On Mon, Dec 9, 2019 at 4:17 PM Chad Dombrova <chad...@gmail.com> wrote:

> After this PR goes in should we revisit breaking up the python tests into
> separate jenkins jobs by python version? One of the problems with that
> plan originally was that we lost the parallelism that gradle provides,
> because we were left with only one tox task per jenkins job, and so the
> total time to complete all python jenkins jobs went up a lot. With
> pytest + xdist we should hopefully be able to keep the parallelism even
> with just one tox task. This could be a big win. I feel like I'm spending
> more time monitoring and re-queuing timed-out jenkins jobs lately than I am
> writing code.
>
> On Mon, Dec 9, 2019 at 10:32 AM Udi Meiri <eh...@google.com> wrote:
>
>> This PR <https://github.com/apache/beam/pull/10322> (in review) migrates
>> py27-gcp to using pytest.
>> It reduces the testPy2Gcp task down to ~13m
>> <https://scans.gradle.com/s/kj7ogemnd3toe/timeline?details=ancsbov425524>
>> (from ~45m). This speedup will probably be lower once all 8 tasks are using
>> pytest.
>> It also adds 5 previously uncollected tests.
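For reference, the "parallelism with just one tox task" idea could look like the fragment below: a single tox env that fans tests out across CPUs via pytest-xdist's `-n auto`. This is a hypothetical sketch; the env name, deps, and test path are illustrative, not the actual Beam tox config.

```ini
# Hypothetical tox.ini fragment: one env per Python version, with
# in-machine parallelism coming from pytest-xdist instead of gradle.
[testenv:py37-suite]
deps =
    pytest
    pytest-xdist
commands =
    pytest -n auto {posargs:apache_beam}
```

With one env like this per Jenkins job, splitting jobs by Python version would no longer cost the fan-out that gradle's parallel tox tasks used to provide.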