maganaluis opened a new issue #8177: provide_context=True not working with CustomPythonVirtualenvOperator
URL: https://github.com/apache/airflow/issues/8177
 
 
   
   
   **Apache Airflow version**: 1.10.9
   
   
    **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): 1.14.8
   
   **Environment**: Docker (Ubuntu 18.4 - Python 3.7)
   
   - **Cloud provider or hardware configuration**: Azure
   - **OS** (e.g. from /etc/os-release): Docker (Ubuntu 18.4 - Python 3.7)
   - **Kernel** (e.g. `uname -a`): Docker (Ubuntu 18.4 - Python 3.7)
   - **Install tools**: N/A
    - **Others**: N/A

    **What happened**:

    When we enable `provide_context=True` for `PythonVirtualenvOperator`, we get the error below.
   
   ```
    [2020-04-07 15:08:51,940] {taskinstance.py:1128} ERROR - can't pickle module objects
    Traceback (most recent call last):
      File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 966, in _run_raw_task
        result = task_copy.execute(context=context)
      File "/usr/local/lib/python3.7/site-packages/airflow/operators/python_operator.py", line 113, in execute
        return_value = self.execute_callable()
      File "/usr/local/lib/python3.7/site-packages/airflow/operators/python_operator.py", line 297, in execute_callable
        self._write_args(input_filename)
      File "/usr/local/lib/python3.7/site-packages/airflow/operators/python_operator.py", line 339, in _write_args
        pickle.dump(arg_dict, f)
    TypeError: can't pickle module objects
   ```
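For context, the likely root cause (our reading of the traceback, not confirmed against the Airflow source) is that the full template context contains values the stdlib `pickle` module refuses to serialize, such as module objects (e.g. the `macros` entry). A minimal, stdlib-only sketch of the same failure:

```python
import pickle
import sys

# Stand-in for the Airflow context: the real one includes module
# objects (e.g. the `macros` entry), which pickle cannot serialize.
context_like = {'ds': '2020-04-07', 'macros': sys}

try:
    pickle.dumps(context_like)
except TypeError as exc:
    print(exc)  # e.g. "can't pickle module objects" on Python 3.7
```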
   
    One way to work around this issue is to create your own `CustomPythonVirtualenvOperator` and override `_write_args`, but this should not be necessary. Feel free to use this if you're encountering the same issue:
   
   ```python
    class CustomPythonVirtualenvOperator(PythonVirtualenvOperator):
        def _write_args(self, input_filename):
            # serialize args to file
            if self._pass_op_args():
                with open(input_filename, 'wb') as f:
                    # we only need dag_run to access conf at run time
                    arg_dict = {'args': self.op_args,
                                'kwargs': {'dag_run': self.op_kwargs['dag_run']}}
                    if self.use_dill:
                        dill.dump(arg_dict, f)
                    else:
                        pickle.dump(arg_dict, f)
   ```
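A more general variant of the same idea, sketched here as a hypothetical helper (`picklable_subset` is not part of the Airflow API), probes each context value and keeps only what pickle accepts, instead of hardcoding `dag_run`:

```python
import pickle

def picklable_subset(kwargs):
    """Return only the context entries that pickle can serialize.

    Hypothetical helper, not Airflow API: modules, loggers and other
    unpicklable values are silently dropped.
    """
    safe = {}
    for key, value in kwargs.items():
        try:
            pickle.dumps(value)
        except Exception:
            continue  # skip unpicklable entries (e.g. the `macros` module)
        safe[key] = value
    return safe
```

Inside an overridden `_write_args`, `arg_dict['kwargs']` would then be `picklable_subset(self.op_kwargs)`.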
   
   
   **What you expected to happen**:
   
    Ideally, we should be able to use the context so that we can run these tasks with run-time arguments via the CLI or the REST API.
   
   **How to reproduce it**:
   
   ```python
    from airflow.operators.python_operator import PythonOperator, PythonVirtualenvOperator
   from airflow.utils.dates import days_ago
   from datetime import timedelta
   from airflow import DAG
   import pickle
   import dill
   
   default_args = {
       'owner': 'Luis M',
       'depends_on_past': False,
       'start_date': days_ago(0),
       'email': ['[email protected]'],
       'email_on_failure': False,
       'email_on_retry': False,
       'retries': 0,
       'retry_delay': timedelta(minutes=5),
       # 'queue': 'bash_queue'
   }
   dag = DAG(
       'bug',
       default_args=default_args,
       description='bug',
       schedule_interval=timedelta(days=1))
   
   
   class CustomPythonVirtualenvOperator(PythonVirtualenvOperator):
       def _write_args(self, input_filename):
           # serialize args to file
           if self._pass_op_args():
               with open(input_filename, 'wb') as f:
                    arg_dict = {'args': self.op_args,
                                'kwargs': {'dag_run': self.op_kwargs['dag_run']}}
                   if self.use_dill:
                       dill.dump(arg_dict, f)
                   else:
                       pickle.dump(arg_dict, f)
   
   
   def passf(**kwargs):
       pass
   
   def failf(**kwargs):
       pass
    
   task1 = CustomPythonVirtualenvOperator(
           task_id='task1',
           python_callable=passf,
           python_version='3',
           dag=dag,
           provide_context=True
   )
   
   task2 = PythonVirtualenvOperator(
           task_id='task2',
           python_callable=failf,
           python_version='3',
           dag=dag,
           provide_context=True
   )
   ```
   
   
   
   **Anything else we need to know**:
   
   
