devscheffer opened a new issue, #40657:
URL: https://github.com/apache/airflow/issues/40657

   ### Apache Airflow version
   
   2.9.2
   
   ### If "Other Airflow 2 version" selected, which one?
   
   _No response_
   
   ### What happened?
   
   I'm trying to use the SparkKubernetesOperator, but I'm getting this error:
   
   File "/home/airflow/.local/lib/python3.10/site-packages/jinja2/loaders.py", 
line 204, in get_source
       raise TemplateNotFound(template)
   jinja2.exceptions.TemplateNotFound: 
/opt/airflow/dags/repo/airflow/dags/manual/pocs/sparkoperatortask/sparkapplication.yaml
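   For context, the lookup failure can be reproduced with Jinja alone. My assumption (not verified against the operator's internals) is that `application_file` is rendered through a `FileSystemLoader` rooted at the template search path, so an absolute path is treated as a name *inside* that path; the folder and file names below are only for the sketch:

    ```python
    import tempfile
    from pathlib import Path

    from jinja2 import Environment, FileSystemLoader
    from jinja2.exceptions import TemplateNotFound

    with tempfile.TemporaryDirectory() as dag_folder:
        manifest = Path(dag_folder) / "sparkapplication.yaml"
        manifest.write_text("kind: SparkApplication\n")

        # The loader's search path stands in for Airflow's template search path.
        env = Environment(loader=FileSystemLoader(dag_folder))

        # A name relative to the search path is found:
        relative_ok = env.get_template("sparkapplication.yaml") is not None

        # The absolute path is interpreted as a name inside the search path,
        # so the lookup fails with the same TemplateNotFound:
        try:
            env.get_template(str(manifest))
            absolute_ok = True
        except TemplateNotFound:
            absolute_ok = False
    ```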
   
   The manifest is in the same folder as my DAG file.
   Here is my DAG:
   
   from datetime import timedelta
    from pathlib import Path
    import pendulum
   from airflow.providers.cncf.kubernetes.operators.spark_kubernetes import (
       SparkKubernetesOperator,
   )
   from airflow import DAG
   
   default_args = {
       "owner": "airflow",
       "email_on_failure": False,
       "email_on_retry": False,
       "retries": 0,
       "retry_delay": timedelta(minutes=1),
   }
   
   base_folder = Path(__file__).parents[0]
   spark_app_manifest = str(base_folder.joinpath("sparkapplication.yaml"))
   
   with DAG(
       dag_id="kubernetes_spark_operator_example",
       default_args=default_args,
       description="An example DAG using Kubernetes Spark Operator",
       schedule=None,
       catchup=False,
       render_template_as_native_obj=True,
       max_active_runs=1,
       max_active_tasks=20,
       start_date=pendulum.datetime(2023, 1, 1, tz="UTC"),
       is_paused_upon_creation=False,
   ) as dag:
       spark_task = SparkKubernetesOperator(
           task_id="spark_pi_task",
           namespace="airflow-dags-dai-gigahorse",
           application_file=spark_app_manifest,
           kubernetes_conn_id="kubernetes_default",
           delete_on_termination=False,
       )
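   What seems to work around it for me (an assumption based on the traceback, not a confirmed fix): let Jinja resolve the manifest by name relative to the DAG folder instead of by absolute path:

    ```python
    # Option 1: pass a bare file name; Airflow puts the DAG's own folder on
    # the Jinja template search path, so a relative name resolves there.
    spark_app_manifest = "sparkapplication.yaml"

    # Option 2 (sketch, mirroring the DAG above): keep building the absolute
    # path but widen the search path so Jinja can find the folder:
    #   base_folder = Path(__file__).parents[0]
    #   with DAG(..., template_searchpath=str(base_folder)) as dag:
    #       ...
    ```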
   
   ### What you think should happen instead?
   
   The operator should find the manifest file without errors.
   
   ### How to reproduce
   
   Create a DAG with a SparkKubernetesOperator and put a SparkApplication YAML file in the same folder as the DAG file.
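   For reference, a minimal manifest along the lines of the upstream spark-pi example should do (the image, jar path, versions, and service account below are placeholders; adjust to your cluster):

    ```yaml
    apiVersion: sparkoperator.k8s.io/v1beta2
    kind: SparkApplication
    metadata:
      name: spark-pi
    spec:
      type: Scala
      mode: cluster
      image: apache/spark:3.5.0
      mainClass: org.apache.spark.examples.SparkPi
      mainApplicationFile: local:///opt/spark/examples/jars/spark-examples_2.12-3.5.0.jar
      sparkVersion: 3.5.0
      driver:
        cores: 1
        memory: 512m
        serviceAccount: spark
      executor:
        instances: 1
        cores: 1
        memory: 512m
    ```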
   
   ### Operating System
   
   Ubuntu
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow = "^2.8.1"
   apache-airflow-providers-cncf-kubernetes = "^7.14.0"
   
   ### Deployment
   
   Official Apache Airflow Helm Chart
   
   ### Deployment details
   
   _No response_
   
   ### Anything else?
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
