I will vote for this. It's pretty helpful to have managed Spark images.
Currently, users have to download the Spark binaries and build their own
images. With this supported, the user journey will be simplified: we only
need to build an application image on top of the base image provided by
the community.
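To illustrate the simplified journey, a downstream application image could then be a few lines on top of the community base image. This is only a sketch; the base image name/tag and the application path are hypothetical, since no official coordinates exist yet:

```dockerfile
# Hypothetical community-published base image; the actual repository
# and tag scheme would depend on what the project decides to publish.
FROM apache/spark-base:3.0.0-python

# Users would only add their application code on top of the base image.
COPY my_app.py /opt/app/my_app.py

# Entry point is illustrative; a real image would follow the
# Spark-on-Kubernetes entrypoint conventions.
CMD ["/opt/spark/bin/spark-submit", "--master", "local[*]", "/opt/app/my_app.py"]
```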

Do we have different OS or architecture support? If not, there will be
three container images per release: Java, R, and Python.


On Wed, Feb 5, 2020 at 2:56 PM Sean Owen <sro...@gmail.com> wrote:

> What would the images have - just the image for a worker?
> We wouldn't want to publish N permutations of Python, R, OS, Java, etc.
> But if we don't then we make one or a few choices of that combo, and
> then I wonder how many people find the image useful.
> If the goal is just to support Spark testing, that seems fine and
> tractable, but does it need to be 'public' as in advertised as a
> convenience binary? vs just some image that's hosted somewhere for the
> benefit of project infra.
>
> On Wed, Feb 5, 2020 at 12:16 PM Dongjoon Hyun <dongjoon.h...@gmail.com>
> wrote:
> >
> > Hi, All.
> >
> > From 2020, shall we have an official Docker image repository as an
> > additional distribution channel?
> >
> > I'm considering the following images.
> >
> >     - Public binary release (no snapshot image)
> >     - Public non-Spark base image (OS + R + Python)
> >       (This can be used in GitHub Action jobs and Jenkins K8s
> > Integration Tests to speed up jobs and to have more stable environments)
> >
> > Bests,
> > Dongjoon.
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
>

-- 
Best Regards!
Jiaxin Shan
Tel:  412-230-7670
Address: 470 2nd Ave S, Kirkland, WA
