[ https://issues.apache.org/jira/browse/SPARK-4746?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14279524#comment-14279524 ]

Imran Rashid commented on SPARK-4746:
-------------------------------------

This doesn't work as well as I thought -- all of the junit tests get skipped.  
The problem is a mismatch between the way test args are handled by the junit 
test runner and the scalatest runner.

I think our options are:

1) abandon a tag-based approach: just use directories / file names to separate 
out unit tests & integration tests

2) change all of our junit tests to scalatest.  (it's perfectly fine to test 
java code w/ scalatest.)

3) See if we can get scalatest to also run our junit tests

4) change the sbt task to first run scalatest, with all junit tests turned off, 
and then just run the junit tests, so that we can pass in different args to 
each one (rough sketch of per-framework args after this list).

5) just live w/ the fact that the junit tests never match the tags, so they are 
effectively considered integration tests.
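
For reference, a rough sketch of the tag-based filtering (the tag name, suite, 
and test names are made up for illustration; the relevant piece is that sbt's 
Tests.Argument can be scoped to a single test framework, which is what would 
keep the filtering flag away from the junit runner):

    // ScalaTest side: define a tag and mark the slow tests with it
    import org.scalatest.{FunSuite, Tag}

    object IntegrationTest extends Tag("org.apache.spark.tags.IntegrationTest")

    class FooSuite extends FunSuite {
      test("fast thing") {
        assert(1 + 1 === 2)
      }

      test("slow end-to-end thing", IntegrationTest) {
        // long-running assertions here
      }
    }

    // sbt side: hand the exclude flag only to the ScalaTest framework,
    // so the junit runner never sees an argument it can't interpret
    testOptions in Test += Tests.Argument(TestFrameworks.ScalaTest,
      "-l", "org.apache.spark.tags.IntegrationTest")

Scoping the argument per framework is basically a lighter version of option 4: 
the junit tests would all still run in the "fast" pass, but at least they 
wouldn't be silently skipped.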

Note that junit has a notion similar to tags, called categories: 
https://github.com/junit-team/junit/wiki/Categories
The main problem here is still the difference in the args expected by the two 
test runners.
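
For comparison, the categories version would look something like this (written 
in scala here just to keep the sketches in one language, though the annotations 
work the same from java; the marker trait and test names are made up):

    import org.junit.Test
    import org.junit.experimental.categories.Category

    // hypothetical marker type; a junit category is just a class or interface reference
    trait IntegrationTests

    class BarSuite {
      @Test
      def fastThing(): Unit = {
        assert(1 + 1 == 2)
      }

      @Category(Array(classOf[IntegrationTests]))
      @Test
      def slowEndToEndThing(): Unit = {
        // long-running assertions here
      }
    }

Even with categories in place, we'd still need to pass the include/exclude 
selection to each runner in the format it expects, which is exactly the arg 
mismatch above.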

> integration tests should be separated from faster unit tests
> ------------------------------------------------------------
>
>                 Key: SPARK-4746
>                 URL: https://issues.apache.org/jira/browse/SPARK-4746
>             Project: Spark
>          Issue Type: Bug
>            Reporter: Imran Rashid
>            Priority: Trivial
>
> Currently there isn't a good way for a developer to skip the longer 
> integration tests.  This can slow down local development.  See 
> http://apache-spark-developers-list.1001551.n3.nabble.com/Spurious-test-failures-testing-best-practices-td9560.html
> One option is to use scalatest's notion of test tags to tag all integration 
> tests, so they could easily be skipped.



