+1 (non-binding) with reference to all k8s tests passing for 2.11
(including SparkR tests, with R version 3.4.1):

*[INFO] --- scalatest-maven-plugin:1.0:test (integration-test) @
spark-kubernetes-integration-tests_2.11 ---*
*Discovery starting.*
*Discovery completed in 202 milliseconds.*
*Run starting. Expected test count is: 15*
*KubernetesSuite:*
*- Run SparkPi with no resources*
*- Run SparkPi with a very long application name.*
*- Use SparkLauncher.NO_RESOURCE*
*- Run SparkPi with a master URL without a scheme.*
*- Run SparkPi with an argument.*
*- Run SparkPi with custom labels, annotations, and environment variables.*
*- Run extraJVMOptions check on driver*
*- Run SparkRemoteFileTest using a remote data file*
*- Run SparkPi with env and mount secrets.*
*- Run PySpark on simple pi.py example*
*- Run PySpark with Python2 to test a pyfiles example*
*- Run PySpark with Python3 to test a pyfiles example*
*- Run PySpark with memory customization*
*- Run SparkR on simple dataframe.R example*
*- Run in client mode.*
*Run completed in 6 minutes, 47 seconds.*
*Total number of tests run: 15*
*Suites: completed 2, aborted 0*
*Tests: succeeded 15, failed 0, canceled 0, ignored 0, pending 0*
*All tests passed.*

Sean, in reference to your issues: the comment you linked is correct in
that you would need to build a Kubernetes distribution, i.e.:
*dev/make-distribution.sh --pip --r --tgz -Psparkr -Phadoop-2.7 -Pkubernetes*
then set up minikube, i.e.:
*minikube start --insecure-registry=localhost:5000 --cpus 6 --memory 6000*
and then run the appropriate tests, i.e.:
*dev/dev-run-integration-tests.sh --spark-tgz .../spark-2.4.0-bin-2.7.3.tgz*
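
Not strictly required, but between the *minikube start* and the test run
above I usually add a quick sanity check that the cluster the suite will
target is actually up (plain minikube/kubectl commands, nothing
Spark-specific):

  # confirm the cluster is running and that kubectl points at it
  minikube status
  kubectl cluster-info
  kubectl config current-context   # should print "minikube"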

The newest PR that you linked lets us point at a local Kubernetes cluster
deployed via docker-for-mac instead of minikube, which gives us another way
to test but, AFAICT, does not change the testing workflow.
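
If you do want to exercise that path, my understanding (an assumption on my
end; I have only run the minikube route above, and the flag name plus the
docker-for-desktop value are taken from that PR, so treat them as
unverified) is that only the deploy mode passed to the same test script
changes, along the lines of:

  # same tarball, same script; only the target cluster differs
  dev/dev-run-integration-tests.sh --deploy-mode docker-for-desktop \
    --spark-tgz .../spark-2.4.0-bin-2.7.3.tgz

The make-distribution step stays exactly the same.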

On Tue, Oct 23, 2018 at 9:14 AM Sean Owen <sro...@gmail.com> wrote:

> (I should add, I only observed this with the Scala 2.12 build. It all
> seemed to work with 2.11. Therefore I'm not too worried about it. I
> don't think it's a Scala version issue, but perhaps something looking
> for a spark 2.11 tarball and not finding it. See
> https://github.com/apache/spark/pull/22805#issuecomment-432304622 for
> a change that might address this kind of thing.)
>
> On Tue, Oct 23, 2018 at 11:05 AM Sean Owen <sro...@gmail.com> wrote:
> >
> > Yeah, that's maybe the issue here. This is a source release, not a git
> > checkout, and it still needs to work in this context.
> >
> > I just added -Pkubernetes to my build and didn't do anything else. I
> > think the ideal is for a "mvn -P... -P... install" to work from a source
> > release; that's a good expectation and consistent with docs.
> >
> > Maybe these tests simply don't need to run with the normal suite of
> > tests, and can be considered tests run manually by developers running these
> > scripts? Basically, KubernetesSuite shouldn't run in a normal mvn install?
> >
> > I don't think this has to block the release even if so; just trying to
> > get to the bottom of it.
>