To be clear, I'm currently +1 on this release, with much commentary.

OK, the explanation for the Kubernetes tests makes sense. Yes, I think we need
to propagate the scala-2.12 build profile to make it work. Go for it, if
you have a lead on what the change is.
This doesn't block the release, as it's an issue only for tests, and only
affects 2.12. However, if we had a clean fix for this and there were another
RC, I'd include it.
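
To illustrate what I mean (the module path is the real one, but whether the
wrapper script's Maven build would pick up extra profiles this way is an
assumption on my part), something along these lines:

./dev/change-scala-version.sh 2.12
# Run the integration-tests module's own Maven build with the same profiles
# used for the distribution, so the _2.12 artifacts get resolved:
./build/mvn -Pscala-2.12 -Pkubernetes \
  -pl resource-managers/kubernetes/integration-tests -am \
  integration-test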

Dongjoon has a good point about the spark-kubernetes-integration-tests
artifact. That doesn't sound like it should be published in this way,
though, of course, we publish the test artifacts from every module already.
This is only a bit odd in being a non-test artifact meant for testing. But
it's special testing! So I also don't think that needs to block a release.

This happens because the integration-tests module is also enabled by the
'kubernetes' profile, and its output is copied into the release tarball at
kubernetes/integration-tests/tests. Do we need that in a binary release?
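
(As a quick sanity check on that, using the tarball name from Stavros's build
below; substitute whatever your distribution is called:

tar -tzf spark-2.4.0-bin-test.tgz | grep -i 'integration-tests'

should show whether any integration-test files actually end up in the binary
distro.)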

If these integration tests are meant to be run ad hoc, manually, and not as
part of a normal test cycle, then I think we can just not enable them with
-Pkubernetes. If they are meant to run every time, then it sounds like we need
the little bit of extra work shown in recent PRs to make that easier; but
then, this test code should just be the 'test' artifact parts of the
kubernetes module, no?
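
In the meantime, a possible workaround sketch, using nothing more than Maven's
standard module exclusion (and assuming nothing else in the distribution build
depends on that module): keep -Pkubernetes but drop the integration-tests
module from the reactor, e.g.

# Requires Maven 3.2.1+ for the '!' exclusion syntax in -pl:
./build/mvn -Pkubernetes -DskipTests \
  -pl '!resource-managers/kubernetes/integration-tests' \
  package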


On Tue, Oct 23, 2018 at 1:46 PM Dongjoon Hyun <dongjoon.h...@gmail.com>
wrote:

> BTW, for that integration suite, I saw the related artifacts in the RC4
> staging directory.
>
> Does Spark 2.4.0 need to start releasing these
> `spark-kubernetes-integration-tests` artifacts?
>
>    - https://repository.apache.org/content/repositories/orgapachespark-1290/org/apache/spark/spark-kubernetes-integration-tests_2.11/
>    - https://repository.apache.org/content/repositories/orgapachespark-1290/org/apache/spark/spark-kubernetes-integration-tests_2.12/
>
> Historically, Spark released `spark-docker-integration-tests` in the Spark
> 1.6.x era and stopped as of Spark 2.0.0.
>
>    - http://central.maven.org/maven2/org/apache/spark/spark-docker-integration-tests_2.10/
>    - http://central.maven.org/maven2/org/apache/spark/spark-docker-integration-tests_2.11/
>
>
> Bests,
> Dongjoon.
>
> On Tue, Oct 23, 2018 at 11:43 AM Stavros Kontopoulos <
> stavros.kontopou...@lightbend.com> wrote:
>
>> Sean,
>>
>> OK, makes sense; I'm using a cloned repo. I built with the Scala 2.12
>> profile using the related tag v2.4.0-rc4:
>>
>> ./dev/change-scala-version.sh 2.12
>> ./dev/make-distribution.sh --name test --r --tgz -Pscala-2.12 -Psparkr \
>>   -Phadoop-2.7 -Pkubernetes -Phive
>>
>> I pushed the images to Docker Hub (previous email) since I didn't use the
>> minikube daemon (the default behavior).
>>
>> Then I ran the tests successfully against minikube:
>>
>> TGZ_PATH=$(pwd)/spark-2.4.0-bin-test.tgz
>> cd resource-managers/kubernetes/integration-tests
>>
>> ./dev/dev-run-integration-tests.sh --spark-tgz $TGZ_PATH \
>>   --service-account default --namespace default \
>>   --image-tag k8s-scala-12 --image-repo skonto
>>
>
