ks233ever opened a new issue, #36738:
URL: https://github.com/apache/airflow/issues/36738

   ### Apache Airflow version
   
   Other Airflow 2 version (please specify below)
   
   ### If "Other Airflow 2 version" selected, which one?
   
   2.7.2
   
   ### What happened?
   
   I have the following PythonVirtualenvOperator DAG that runs without issue in Airflow 2.2. In Airflow 2.7.2, however, it raises: `ValueError: The key 'data_interval_end' in args is a part of kwargs and therefore reserved`. I've tried playing around with how `'{{ data_interval_end }}'` is passed in and am unable to figure it out.
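   If it helps triage, my understanding (an assumption on my part, not a reading of the operator's source) is that context keys matching the callable's signature get injected as kwargs, so a positional arg for the same parameter becomes a duplicate. A minimal stdlib-only sketch of that collision:
   
   ```python
   import inspect
   
   # Hypothetical sketch (not Airflow's actual code) of why the ValueError
   # appears: keys in the task context that match the callable's signature
   # are injected as kwargs, so a positional arg for the same parameter
   # ends up being supplied twice.
   def process(data_interval_end, dag_run):
       pass
   
   context = {"data_interval_end": "2022-01-03T08:00:00+00:00", "dag_run": object()}
   op_args = ["{{ data_interval_end }}"]
   
   sig = inspect.signature(process)
   injected = {k: v for k, v in context.items() if k in sig.parameters}
   
   try:
       sig.bind(*op_args, **injected)
       collision = None
   except TypeError as exc:  # "multiple values for argument 'data_interval_end'"
       collision = exc
   
   print(collision)
   ```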
   
   I've instead removed the entire `op_args=['{{ data_interval_end }}']` from the PythonVirtualenvOperator, and the corresponding `data_interval_end` parameter from `process()`. It now keeps returning:
   
   `file.write_bytes(self.pickling_library.dumps({"args": self.op_args, "kwargs": self.op_kwargs})) TypeError: cannot pickle 'module' object`
   
   But I'm not passing it any op_args or op_kwargs, so I'm not sure why it throws this.
   
   ```python
   import datetime
   from functools import partial
   
   from airflow import DAG
   from airflow.operators.python import PythonVirtualenvOperator
   
   # DEFAULT_ARGS, stage, on_failure, and sla_miss are defined elsewhere in the file.
   
   def process(data_interval_end, dag_run):
       pass
   
   
   with DAG(
       dag_id='da_survey_presentation',
       default_args=DEFAULT_ARGS,
       schedule_interval='0 8 * * 1',  # once a week at midnight Pacific every Monday morning
       max_active_runs=1,
       catchup=False,
       concurrency=1,
       on_failure_callback=partial(on_failure, stage),
       sla_miss_callback=partial(sla_miss, stage),
       start_date=datetime.datetime(2022, 1, 1),
       is_paused_upon_creation=(stage != 'prod'),
       tags=['da'],
   ) as dag:
       PythonVirtualenvOperator(
           task_id="virtualenv_task",
           python_callable=process,
           op_args=['{{ data_interval_end }}'],
           provide_context=True,
           requirements=["google-api-python-client==1.6.7", "df2gspread==1.0.4"],
           system_site_packages=True,
           dag=dag,
       )
   ```
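   
   The pickle error itself can be reproduced outside Airflow: the stdlib pickler refuses module objects, which would be consistent with something module-valued ending up in the serialized kwargs (that's an assumption; the `"macros"` key below is purely illustrative). A minimal sketch:
   
   ```python
   import pickle
   import types
   
   # Minimal reproduction of the TypeError, outside Airflow entirely:
   # the stdlib pickler cannot serialize module objects, so any
   # module-valued entry in the kwargs dict fails the same way.
   def can_pickle(obj):
       try:
           pickle.dumps(obj)
           return True
       except TypeError:  # "cannot pickle 'module' object"
           return False
   
   print(can_pickle({"args": [], "kwargs": {}}))     # plain data pickles fine
   print(can_pickle({"kwargs": {"macros": types}}))  # a module value does not
   ```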
   
   ### What you think should happen instead?
   
   op_args and op_kwargs shouldn't be serialized when none are passed.
   
   ### How to reproduce
   
   See above
   
   ### Operating System
   
   mac
   
   ### Versions of Apache Airflow Providers
   
   _No response_
   
   ### Deployment
   
   Docker-Compose
   
   ### Deployment details
   
   _No response_
   
   ### Anything else?
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
