shkolar-sisense opened a new issue #12700:
URL: https://github.com/apache/airflow/issues/12700
**Apache Airflow version**: 1.10.12
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): v1.18.10
**Environment**: Minikube 1.14.2
**What happened**:
I have Airflow running a KubernetesPodOperator in order to perform a spark-submit call (I built the Spark image myself):
```
spark_image = f'{getenv("REGISTRY")}/myApp:{getenv("TAG")}'
j2g = KubernetesPodOperator(
    dag=dag,
    task_id='myApp',
    name='myApp',
    namespace='data',
    image=spark_image,
    cmds=['/opt/spark/bin/spark-submit'],
    configmaps=["data"],
    arguments=[
        '--master k8s://https://10.96.0.1:443',
        '--deploy-mode cluster',
        '--name myApp',
        f'--conf spark.kubernetes.container.image={spark_image}',
        'local:///app/run.py'
    ],
)
```
However, I'm getting the following error:
`Error: Unrecognized option: --master k8s://https://10.96.0.1:443`
**What you expected to happen**:
I expect the arguments to be passed through to the command line as written. When I `bash` into a running container and run the spark-submit command with the same arguments, it works. Therefore, it looks like there is a problem with how the arguments are being passed to the command.
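This behavior is consistent with each element of `arguments` being handed to the container as a single argv token, so `'--master k8s://https://10.96.0.1:443'` arrives as one argument that spark-submit does not recognize (running the command interactively in a shell works because the shell splits on whitespace). A minimal sketch of the likely fix, assuming that is the cause, is to split every flag and its value into separate list items (the `spark_image` default values below are hypothetical placeholders):

```python
from os import getenv

# Hypothetical fallback values standing in for the REGISTRY/TAG env vars.
spark_image = f'{getenv("REGISTRY", "registry.example.com")}/myApp:{getenv("TAG", "latest")}'

# Each argv token must be its own list element; when a flag and its value
# are joined in one string, spark-submit sees a single unrecognized option.
arguments = [
    '--master', 'k8s://https://10.96.0.1:443',
    '--deploy-mode', 'cluster',
    '--name', 'myApp',
    '--conf', f'spark.kubernetes.container.image={spark_image}',
    'local:///app/run.py',
]

# Sanity check: no element carries an embedded space.
assert all(' ' not in a for a in arguments)
```

This list can then be passed as the `arguments` parameter of the `KubernetesPodOperator` call above in place of the space-joined strings.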
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]