GitHub user YasaTjoe added a comment to the discussion: SparkKubernetesOperator: FileNotFoundError: [Errno 2] No such file or directory

For some reason, the application file is being opened twice here:

run_etl = SparkKubernetesOperator(
    task_id="sparkies",
    namespace=NAMESPACE,
    application_file="/opt/airflow/dags/spark-apps/spark_job_template.txt",
    kubernetes_conn_id="kubernetes_default",
)

Instead of using a .json or .yaml file for application_file, convert it to .txt. I converted my JSON to .txt and it worked fine. There is probably another solution, something to do with template_spec or template_ext, but I have not figured that out. Anyone who has, kindly leave some pointers here. Thank you.
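A minimal sketch of why the rename works (this is illustrative code, not Airflow's actual implementation): Airflow treats any templated string field whose value ends in one of the operator's template_ext extensions as a file path to open and render through Jinja. SparkKubernetesOperator lists YAML/JSON extensions there, so a .yaml/.json application_file gets opened by the templating machinery on top of the operator's own read, while a .txt path is passed through untouched. The exact extension tuple below is an assumption based on the provider's documented behavior; verify it against your installed apache-airflow-providers-cncf-kubernetes version.

```python
# Assumed template_ext tuple, mimicking the SparkKubernetesOperator's
# declared template file extensions -- check your provider version.
TEMPLATE_EXT = (".yaml", ".yml", ".json")

def is_templated_file(value: str, template_ext=TEMPLATE_EXT) -> bool:
    """Mimics Airflow's extension check on templated string fields:
    a matching extension means the value is opened as a template file."""
    return value.endswith(template_ext)

# A .yaml path triggers the template-file open; a .txt path does not.
print(is_templated_file("/opt/airflow/dags/spark-apps/job.yaml"))  # True
print(is_templated_file("/opt/airflow/dags/spark-apps/job.txt"))   # False
```

Subclassing the operator and setting an empty template_ext (a standard Airflow pattern for disabling file templating on a field) may be the cleaner route the comment hints at, but as noted above, that has not been confirmed here.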

GitHub link: 
https://github.com/apache/airflow/discussions/39098#discussioncomment-15051647

----
This is an automatically sent email for [email protected].
To unsubscribe, please send an email to: [email protected]