motherhubbard opened a new issue #16140:
URL: https://github.com/apache/airflow/issues/16140


   
   
   **Apache Airflow version**:
   2.1.0
   
   **Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
   N/A
   
   **Environment**:
   
   3.10.0-1062.1.2.el7.x86_64 #1 SMP Mon Sep 30 14:19:46 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux
   
   **What happened**:
   
   Nasty stack trace:
   
   ```
   [2021-05-28 14:30:03,379] {dagbag.py:320} ERROR - Failed to import: /airflow/dags/dag1.py
   Traceback (most recent call last):
     File "/airflow_venv/lib/python3.6/site-packages/airflow/models/dagbag.py", line 317, in _load_modules_from_file
       loader.exec_module(new_module)
     File "<frozen importlib._bootstrap_external>", line 678, in exec_module
     File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
     File "/airflow/dags/dag1.py", line 3, in <module>
       from airflow.contrib.operators.spark_submit_operator import SparkSubmitOperator
     File "/airflow_venv/lib/python3.6/site-packages/airflow/contrib/operators/spark_submit_operator.py", line 23, in <module>
       from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator  # noqa
     File "/airflow_venv/lib/python3.6/site-packages/airflow/providers/apache/spark/operators/spark_submit.py", line 28, in <module>
       class SparkSubmitOperator(BaseOperator):
     File "/airflow_venv/lib/python3.6/site-packages/airflow/providers/apache/spark/operators/spark_submit.py", line 149, in SparkSubmitOperator
       ) -> None:
     File "/airflow_venv/lib/python3.6/site-packages/airflow/utils/decorators.py", line 44, in apply_defaults
       stacklevel=3,
     File "/python-3.6.4/lib/python3.6/warnings.py", line 99, in _showwarnmsg
       msg.file, msg.line)
     File "/airflow_venv/lib/python3.6/site-packages/airflow/settings.py", line 115, in custom_show_warning
       write_console.print(msg, soft_wrap=True)
     File "/airflow_venv/lib/python3.6/site-packages/rich/console.py", line 1130, in print
       self._buffer.extend(new_segments)
     File "/airflow_venv/lib/python3.6/site-packages/rich/console.py", line 553, in __exit__
       self._exit_buffer()
     File "/airflow_venv/lib/python3.6/site-packages/rich/console.py", line 531, in _exit_buffer
       self._check_buffer()
     File "/airflow_venv/lib/python3.6/site-packages/rich/console.py", line 1262, in _check_buffer
       self.file.flush()
   OSError: [Errno 5] Input/output error
   ```
   
   
   **What you expected to happen**:
   
   I expected the DAG to import cleanly. The same `SparkSubmitOperator` DAG worked in 2.0.2 but fails to import in 2.1.0. I suspect it's related to the deprecation of `apply_defaults`.
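   To illustrate the suspicion: the traceback dies inside the machinery that *displays* the deprecation warning, not inside the operator itself. A minimal stdlib-only sketch (hypothetical names, not Airflow code) shows how a warning-display hook that writes to a dead stream turns a harmless `DeprecationWarning` into an import-time exception:
   
   ```python
   import warnings
   
   def fragile_showwarning(message, category, filename, lineno, file=None, line=None):
       # Stands in for a console printer whose underlying fd errors,
       # like the rich console flush in the traceback above.
       raise OSError(5, "Input/output error")
   
   with warnings.catch_warnings():
       warnings.simplefilter("always")
       warnings.showwarning = fragile_showwarning
       try:
           # Mimics the warning emitted by the deprecated apply_defaults path.
           warnings.warn("apply_defaults is deprecated", DeprecationWarning)
           outcome = "warning displayed"
       except OSError as exc:
           # The exception raised while *showing* the warning propagates
           # out of warnings.warn(), so the surrounding import fails.
           outcome = f"import-time failure: {exc}"
   
   print(outcome)
   ```
   
   In other words, any stream error during warning display surfaces as a failure of whatever code merely triggered the warning.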
   
   ```
   from datetime import datetime, timedelta
   from airflow import DAG
   from airflow.contrib.operators.spark_submit_operator import SparkSubmitOperator
   from airflow.models import Variable
   
   from airflow.hooks.base_hook import BaseHook
   
   default_args = {
       'owner': 'me',
       'depends_on_past': False,
       'start_date': datetime(2021, 5, 25),
       'email': ['[email protected]'],
       'email_on_failure': False,
       'email_on_retry': False,
       'retries': 0,
       'retry_delay': timedelta(minutes=5)
   }
   
   dag = DAG(
       dag_id='pi',
       concurrency=1,
       default_args=default_args,
       catchup=True,
       schedule_interval="*/5 * * * *",
   )
   
   rollup_jobby = SparkSubmitOperator(
       task_id='pi_task',
       conn_id='spark',
       application='/home/me/pi.py',
       executor_cores=4,
       num_executors=4,
       executor_memory='1g',
       driver_memory='1g',
       name='pi',
       execution_timeout=timedelta(minutes=60),
       dag=dag,
   )
   ```
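   As a possible workaround while the bug stands: the traceback itself shows that `airflow.contrib.operators.spark_submit_operator` is only a deprecated shim that re-imports from the provider package (line 23 of the shim), so importing directly from the provider path should skip the deprecation-warning code path that crashes. A sketch of the updated imports for the DAG above (the `airflow.hooks.base` path is my assumption for the 2.x replacement of `base_hook`):
   
   ```python
   # Import the operator from the provider package directly, bypassing the
   # deprecated contrib shim that emits the warning seen in the traceback.
   from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator
   
   # Assumed 2.x replacement for the deprecated airflow.hooks.base_hook import.
   from airflow.hooks.base import BaseHook
   ```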
   

