I noticed the PR builder builds are all failing with:

[info] - correctly builds R packages included in a jar with --packages !!! IGNORED !!!
[info] - include an external JAR in SparkR *** FAILED *** (32 milliseconds)
[info]   new java.io.File(rScriptDir).exists() was false (SparkSubmitSuite.scala:531)
[info]   org.scalatest.exceptions.TestFailedException:
[info]   at org.scalatest.Assertions$class.newAssertionFailedException(Assertions.scala:500)
[info]   at org.scalatest.FunSuite.newAssertionFailedException(FunSuite.scala:1555)
[info]   at org.scalatest.Assertions$AssertionsHelper.macroAssert(Assertions.scala:466)
[info]   at org.apache.spark.deploy.SparkSubmitSuite$$anonfun$23.apply$mcV$sp(SparkSubmitSuite.scala:531)
...
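The failing assertion boils down to a plain file-existence check on the R script directory. A minimal sketch of that check, for anyone poking at this locally (the default path here is a placeholder; the real rScriptDir is computed inside SparkSubmitSuite):

```java
import java.io.File;

public class CheckRScriptDir {
    public static void main(String[] args) {
        // Placeholder path; substitute whatever rScriptDir resolves to in the suite.
        String rScriptDir = args.length > 0 ? args[0] : ".";
        // This mirrors the assertion at SparkSubmitSuite.scala:531:
        //   assert(new java.io.File(rScriptDir).exists())
        boolean exists = new File(rScriptDir).exists();
        System.out.println("exists=" + exists);
    }
}
```

If this prints `exists=false` on the Jenkins workers for the path the suite uses, the test's working directory or the R package layout changed, not the test itself.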

It seems to affect only the SBT builds; in the Maven builds the test is
cancelled because R isn't installed:

- correctly builds R packages included in a jar with --packages !!! IGNORED !!!
- include an external JAR in SparkR !!! CANCELED !!!
  org.apache.spark.api.r.RUtils.isSparkRInstalled was false SparkR is not installed in this build. (SparkSubmitSuite.scala:528)

It seems to have started after:

https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-sbt-hadoop-2.6/3081/

but I don't see how the changes in that build relate.

Did anything change w.r.t. the R tests or the build environment in the last day?
