Thanks, Rob, for spotting the error!
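
For the record, the corrected settings per your note (renaming docker back
to container, same image for both) would be:

(*spark.kubernetes.driver.container.image*,
eu.gcr.io/axial-glow-224522/spark-py:3.1.1-scala_2.12-8-jre-slim-buster-container)

(*spark.kubernetes.executor.container.image*,
eu.gcr.io/axial-glow-224522/spark-py:3.1.1-scala_2.12-8-jre-slim-buster-container)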

   view my LinkedIn profile
<https://www.linkedin.com/in/mich-talebzadeh-ph-d-5205b2/>

*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.

On Fri, 10 Dec 2021 at 10:02, Rob Vesse <rve...@dotnetrdf.org> wrote:

> Mich
>
> I think you may just have a typo in your configuration.
>
> These properties all have "container" in the name, e.g.
> spark.kubernetes.driver.container.image, BUT you seem to be replacing
> "container" with "docker" in your configuration files, so Spark doesn’t
> recognise the property (i.e. you have spark.kubernetes.driver.docker.image,
> which isn’t a valid property).
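>
> For instance, a minimal spark-submit sketch with the valid property names
> (the master URL and image name here are purely illustrative):
>
> spark-submit \
>     --master k8s://https://<k8s-apiserver>:443 \
>     --deploy-mode cluster \
>     --conf spark.kubernetes.driver.container.image=example.com/repo/spark:v1.0.0 \
>     --conf spark.kubernetes.executor.container.image=example.com/repo/spark:v1.0.0 \
>     local:///opt/spark/examples/src/main/python/pi.py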
>
> Hope this helps,
>
> Rob
>
> *From: *Mich Talebzadeh <mich.talebza...@gmail.com>
> *Date: *Friday, 10 December 2021 at 08:57
> *To: *Spark dev list <dev@spark.apache.org>
> *Subject: *In Kubernetes Must specify the driver container image
>
> Hi,
>
> In the Spark Kubernetes configuration documentation
> <https://spark.apache.org/docs/latest/running-on-kubernetes.html#configuration>,
> it states:
>
> spark.kubernetes.container.image, default None, meaning: Container image
> to use for the Spark application. This is usually of the form
> example.com/repo/spark:v1.0.0. *This configuration is required and must
> be provided by the user, unless explicit images are provided for each
> different container type.*
>
> I interpret this to mean that if you specify both the driver and executor
> container images, then you don't need to specify the container image
> itself. However, if both the driver and executor images are provided with
> NO container image, the job fails.
>
> Spark config:
>
> (*spark.kubernetes.driver.docker.image*,
> eu.gcr.io/axial-glow-224522/spark-py:3.1.1-scala_2.12-8-jre-slim-buster-container)
>
> (*spark.kubernetes.executor.docker.image*,
> eu.gcr.io/axial-glow-224522/spark-py:3.1.1-scala_2.12-8-jre-slim-buster-container)
>
> Using Spark's default log4j profile:
> org/apache/spark/log4j-defaults.properties
>
> 21/12/10 08:24:03 INFO SparkKubernetesClientFactory: Auto-configuring K8S
> client using current context from users K8S config file
>
> *Exception in thread "main" org.apache.spark.SparkException: Must specify
> the driver container image*
>
> It sounds like, regardless, you still have to specify the container image
> explicitly.
>
> HTH
