thanks, that helps!

So run-tests.py sets a bunch of additional env variables:
https://github.com/apache/spark/blob/master/python/run-tests.py#L74-L97

I'm guessing those don't matter in most cases?
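
In the meantime, here's the tight loop I've settled on, in case it's
useful to others.  This is just a sketch based on Hyukjin's command below;
"test_foo" is a placeholder for a real test method, and some of the env
vars from the link above (e.g. PYSPARK_SUBMIT_ARGS) may still be needed if
your test depends on extra jars or packages:

    cd python
    # run one test class:
    SPARK_TESTING=1 ../bin/pyspark pyspark.sql.tests VectorizedUDFTests
    # tests.py calls unittest.main(), which parses positional args as test
    # names and accepts the Class.method form, so this should narrow it
    # down to a single test:
    SPARK_TESTING=1 ../bin/pyspark pyspark.sql.tests VectorizedUDFTests.test_foo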

On Sun, Aug 19, 2018 at 11:54 PM Hyukjin Kwon <gurwls...@gmail.com> wrote:

> There's an informal way to run specific tests. For instance:
>
> SPARK_TESTING=1 ../bin/pyspark pyspark.sql.tests VectorizedUDFTests
>
> I have a partial fix for our testing script to support this in my local
> branch, but I haven't had enough time to make a PR for it yet.
>
>
> On Mon, Aug 20, 2018 at 11:08 AM, Imran Rashid <iras...@cloudera.com.invalid>
> wrote:
>
>> Hi,
>>
>> I haven't spent a lot of time working on the python side of spark before,
>> so apologies if this is a basic question, but I'm trying to figure out the
>> best way to run a small subset of python tests in a tight loop while
>> developing.  The closer I can get to sbt's "~test-only *FooSuite -- -z
>> test-blah" the better.
>>
>> I'm familiar with the "--modules" option in python/run-tests, but even running
>> one module takes a long time when I want to just run one teeny test
>> repeatedly.  Is there a way to run just one file?  And a way to run only
>> one test within a file?
>>
>> So far, I know I can assemble my own command line like run-tests does,
>> with all the env vars like PYSPARK_SUBMIT_ARGS etc., and just pass in one
>> test file, but that seems tedious.  Would it be helpful to add a
>> "--single-test" option (or something) to run-tests.py?
>>
>> And for running one test within a file, I know that for the unit test files
>> (like tests.py), I could modify the "main" section to have it run just one
>> test, but it would be nice to be able to do that from the command line.
>> (Maybe there is something similar for doctests; I'm not sure.)  Again, we
>> could add a command-line option to run-tests for that, though it would be
>> more work to plumb it through to each suite.
>>
>> thanks,
>> Imran
>>
>
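
One more note for the archives: doctests.  If I'm reading bin/pyspark
correctly, it execs "python -m <module>" when SPARK_TESTING is set, and
the doctest-bearing modules run their doctests via a _test() function
under "if __name__ == '__main__'".  If that's right, something like this
should run a single module's doctests (again just a sketch, not verified
against every module):

    cd python
    SPARK_TESTING=1 ../bin/pyspark pyspark.sql.functions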
