Github user HyukjinKwon commented on the issue:

    https://github.com/apache/spark/pull/20204
  
    @icexelloss, for
https://github.com/apache/spark/pull/20204#issuecomment-356431026, yes, that's
the way I usually use too. My worry is whether this is a proper, official way to
do it, because I have been thinking this approach is rather meant to be
internal.
    
    For this reason, I left a comment in this script for now:
    
    ```diff
    +# If you'd like to run a specific unittest class, you could do such as
    +# SPARK_TESTING=1 ../bin/pyspark pyspark.sql.tests VectorizedUDFTests
    +./run-tests $@
    ```
    
    If I remember correctly, I also had a short talk about this with
@nchammas and @BryanCutler before (I think it was about more fine-grained
control, though).
    
    I think we had better fix the `run_tests` script to accept the unittest
class as an option. I took a look at this before, and I guess it won't be too
difficult to introduce another option. Once we fix it, it will also be
available in this script, because the script here wraps `run_tests`.
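    To sketch what I mean, an option like this could be added to the option parser in `python/run-tests.py` (the option name `--testnames` and the helper below are just illustrative assumptions, not the actual implementation):

```python
# Hypothetical sketch: accept specific unittest classes via an option in
# run-tests.py. The option name "--testnames" is an assumption for
# illustration, not Spark's actual interface.
from optparse import OptionParser


def parse_opts(args=None):
    parser = OptionParser(usage="usage: ./run-tests [options]")
    parser.add_option(
        "--testnames", type="string", default=None,
        help=("A comma-separated list of specific tests to run, e.g. "
              "'pyspark.sql.tests VectorizedUDFTests'. If set, only these "
              "tests are run instead of discovering whole modules."))
    opts, _ = parser.parse_args(args)
    return opts


# Example invocation: ./run-tests --testnames 'pyspark.sql.tests VectorizedUDFTests'
opts = parse_opts(["--testnames", "pyspark.sql.tests VectorizedUDFTests"])
print(opts.testnames)
```

    The idea would be that, when `--testnames` is given, `run-tests.py` skips module discovery and passes the given class straight through to the existing per-test runner (which already sets `SPARK_TESTING=1`).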
    
    I will probably try to take a look again and open a separate PR for it
(maybe within the following week?).


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
