xs2tarunkukreja commented on issue #34987:
URL: https://github.com/apache/airflow/issues/34987#issuecomment-1766133929

   I am just creating a Spark job. When I pass the file name as application_file = 
"pipeline.yaml" in SparkKubernetesOperator, it works fine. But when I pass the 
YAML as a string, e.g. application_file = load_template_v1(), it starts to fail.
   
   I am getting the following error:
   [2023-10-17 10:23:24,375] Task failed with exception
   Traceback (most recent call last):
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/providers/cncf/kubernetes/operators/spark_kubernetes.py", line 114, in execute
       body = _load_body_to_dict(self.application_file)
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/providers/cncf/kubernetes/hooks/kubernetes.py", line 46, in _load_body_to_dict
       body_dict = yaml.safe_load(body)
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/yaml.py", line 46, in safe_load
       return orig(stream, SafeLoader)
     File "/home/airflow/.local/lib/python3.8/site-packages/yaml/__init__.py", line 79, in load
       loader = Loader(stream)
     File "/home/airflow/.local/lib/python3.8/site-packages/yaml/cyaml.py", line 26, in __init__
       CParser.__init__(self, stream)
     File "yaml/_yaml.pyx", line 288, in yaml._yaml.CParser.__init__
   TypeError: a string or stream input is required
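
   This TypeError is raised by PyYAML when the value handed to yaml.safe_load is neither a string nor a readable stream, which suggests load_template_v1() is returning something other than YAML text (for example, an already-parsed dict). A minimal sketch of that hypothesis, with a hypothetical stand-in for the helper, assuming the fix is to serialize back to a YAML string before passing it to the operator:

```python
import yaml  # PyYAML, the library used by Airflow's _load_body_to_dict

# Hypothetical stand-in for load_template_v1(), assumed here to return an
# already-parsed dict rather than YAML text.
def load_template_v1():
    return {
        "apiVersion": "sparkoperator.k8s.io/v1beta2",
        "kind": "SparkApplication",
    }

template = load_template_v1()

# Feeding the dict straight into safe_load fails. With libyaml installed
# (CSafeLoader) this is the exact "a string or stream input is required"
# TypeError from the traceback; the pure-Python loader raises AttributeError
# instead when it tries to call .read() on the dict.
try:
    yaml.safe_load(template)
except (TypeError, AttributeError) as exc:
    print(f"safe_load rejected a {type(template).__name__}: {exc}")

# Serializing to YAML text first gives the operator the string it expects.
application_file = yaml.safe_dump(template)
assert isinstance(application_file, str)
assert yaml.safe_load(application_file) == template
```

   If the helper already returns a string in some code paths, an isinstance(application_file, str) check before constructing the operator would narrow down which path produces the bad value.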

