The whole idea of creating a Docker container is to have a deployable, self-contained utility. A Docker container image is a lightweight, standalone, executable package of software that includes everything needed to run an application: code, runtime, system tools, system libraries and settings. The concepts are explained in the Spark community Slack <http://sparkcommunitytalk.slack.com/>, under <https://sparkcommunitytalk.slack.com/archives/C051KFWK9TJ>.
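As an illustration, here is a minimal sketch of building such a self-contained Spark image. The app.py, image name and version tag are hypothetical; bitnami/spark is the base image Ken mentions below, and spark-submit is assumed to be on the image's PATH:

    # Everything the app needs (code, runtime, libraries, settings)
    # is declared in one Dockerfile
    cat > Dockerfile <<'EOF'
    # Base image supplies the runtime: Spark, the JVM, OS libraries
    FROM bitnami/spark:3.4.0
    # The application code itself
    COPY app.py /opt/spark-apps/app.py
    # Settings baked in: the same command runs wherever the image is pulled
    CMD ["spark-submit", "/opt/spark-apps/app.py"]
    EOF

    docker build -t spark-app:3.4.0 .

Once built, that one image can be pushed to any registry and pulled onto any k8s cluster unchanged, which is exactly why it works across clouds.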
Back to the AWS/GCP use case: we are currently creating an Istio mesh for GCP-to-AWS k8s fail-over, using the same Docker image in both GCR <https://cloud.google.com/container-registry> and ECR <https://docs.aws.amazon.com/AmazonECR/latest/userguide/Registries.html> (the Google and AWS container registries). A tag-and-push sketch follows below the quoted message.

HTH

Mich Talebzadeh,
Lead Solutions Architect/Engineering Lead
Palantir Technologies
London, United Kingdom

view my Linkedin profile <https://www.linkedin.com/in/mich-talebzadeh-ph-d-5205b2/>

On Wed, 5 Apr 2023 at 10:59, Ken Peng <k.p...@t-online.de> wrote:

> ashok34...@yahoo.com.INVALID wrote:
> > Is it possible to use Spark docker built on GCP on AWS without
> > rebuilding from new on AWS?
>
> I am using the spark image from bitnami for running on k8s.
> And yes, it's deployed by helm.
>
> --
> https://kenpeng.pages.dev/
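To make the fail-over point concrete, here is a sketch of pushing one locally built image to both registries. The GCP project ID, AWS account ID, region and image name below are placeholders:

    IMAGE=spark-app:3.4.0

    # GCR: authenticate Docker through gcloud, then tag and push
    gcloud auth configure-docker
    docker tag ${IMAGE} gcr.io/my-gcp-project/${IMAGE}
    docker push gcr.io/my-gcp-project/${IMAGE}

    # ECR: the repository must exist first; login uses a short-lived token
    aws ecr create-repository --repository-name spark-app --region eu-west-2
    aws ecr get-login-password --region eu-west-2 | \
      docker login --username AWS --password-stdin 123456789012.dkr.ecr.eu-west-2.amazonaws.com
    docker tag ${IMAGE} 123456789012.dkr.ecr.eu-west-2.amazonaws.com/${IMAGE}
    docker push ${IMAGE%%:*} && docker push 123456789012.dkr.ecr.eu-west-2.amazonaws.com/${IMAGE}

The image bits are identical in both registries; only the registry path in each cluster's pod spec differs. And per Ken's reply, deploying the bitnami image with helm is roughly:

    helm repo add bitnami https://charts.bitnami.com/bitnami
    helm install my-spark bitnami/spark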