Hi,

I haven't spent a lot of time working on the python side of spark before, so
apologies if this is a basic question, but I'm trying to figure out the
best way to run a small subset of python tests in a tight loop while
developing.  The closer I can get to sbt's "~test-only *FooSuite -- -z
test-blah" the better.

I'm familiar with the "--modules" option in python/run-tests, but even running
one module takes a long time when I just want to run one teeny test
repeatedly.  Is there a way to run just one file?  And a way to run only
one test within a file?
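
For reference, what I'm running today is along these lines (the module name
is just an example):

    # runs everything in one module -- still far too slow for a tight
    # edit/run loop on a single test
    ./python/run-tests --modules=pyspark-sql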

So far, I know I can assemble my own command line like run-tests does, with
all the env vars like PYSPARK_SUBMIT_ARGS etc., and just pass in one test
file.  That seems tedious.  Would it be helpful to add a "--single-test"
option (or something) to run-tests.py?
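
For concreteness, the hand-assembled version I have in mind is roughly this
(untested sketch -- I'm guessing at which env vars run-tests actually exports,
and the conf value is just an example):

    # rough sketch, not verified; assumes SPARK_HOME points at a built checkout
    cd "$SPARK_HOME"
    export SPARK_TESTING=1
    export PYSPARK_SUBMIT_ARGS="--conf spark.ui.enabled=false pyspark-shell"
    export PYTHONPATH="$SPARK_HOME/python:$PYTHONPATH"
    # py4j zip name depends on the version bundled under python/lib
    export PYTHONPATH="$(echo "$SPARK_HOME"/python/lib/py4j-*-src.zip):$PYTHONPATH"
    python python/pyspark/tests.py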

And for running one test within a file: I know that for the unit test files
(like tests.py), I could modify the "main" section to have it run just one
test, but it would be nice to be able to do that from the command line.
(Maybe there is something similar for doctests, not sure.)  Again, we could
add a command line option to run-tests for that, though it would be more work
to plumb it through to each suite.
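
If tests.py really does just end in a plain unittest.main(), then I think the
standard unittest command line would already let me pick one test without
editing anything -- something like this (SomeSuite.test_blah is a placeholder,
and it reuses the env from the sketch above):

    # placeholder names; assumes the env vars from the previous sketch are set
    python -m unittest pyspark.tests.SomeSuite.test_blah

    # or, if tests.py's __main__ just calls unittest.main(), pass the test
    # name to the file directly and let unittest's argv handling select it
    python python/pyspark/tests.py SomeSuite.test_blah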

thanks,
Imran
