AndrewTsao opened a new issue, #28146:
URL: https://github.com/apache/airflow/issues/28146
### Apache Airflow version
2.5.0
### What happened
After upgrading to 2.5.0, a test DAG using dynamic task mapping with `@task.virtualenv` fails.
```py
from airflow.decorators import task, dag
import pendulum as pl


@dag(
    dag_id='test-dynamic-tasks',
    schedule=None,
    start_date=pl.today().add(days=-3),
    tags=['example'])
def test_dynamic_tasks():
    @task.virtualenv(requirements=[])
    def sum_it(values):
        print(values)

    @task.virtualenv(requirements=[])
    def add_one(value):
        return value + 1

    added_values = add_one.expand(value=[1, 2])
    sum_it(added_values)


dag = test_dynamic_tasks()
```
```log
*** Reading local file:
/home/andi/airflow/logs/dag_id=test-dynamic-tasks/run_id=manual__2022-12-06T10:07:41.355423+00:00/task_id=sum_it/attempt=1.log
[2022-12-06, 18:07:53 CST] {taskinstance.py:1087} INFO - Dependencies all
met for <TaskInstance: test-dynamic-tasks.sum_it
manual__2022-12-06T10:07:41.355423+00:00 [queued]>
[2022-12-06, 18:07:53 CST] {taskinstance.py:1087} INFO - Dependencies all
met for <TaskInstance: test-dynamic-tasks.sum_it
manual__2022-12-06T10:07:41.355423+00:00 [queued]>
[2022-12-06, 18:07:53 CST] {taskinstance.py:1283} INFO -
--------------------------------------------------------------------------------
[2022-12-06, 18:07:53 CST] {taskinstance.py:1284} INFO - Starting attempt 1
of 1
[2022-12-06, 18:07:53 CST] {taskinstance.py:1285} INFO -
--------------------------------------------------------------------------------
[2022-12-06, 18:07:53 CST] {taskinstance.py:1304} INFO - Executing
<Task(_PythonVirtualenvDecoratedOperator): sum_it> on 2022-12-06
10:07:41.355423+00:00
[2022-12-06, 18:07:53 CST] {standard_task_runner.py:55} INFO - Started
process 25873 to run task
[2022-12-06, 18:07:53 CST] {standard_task_runner.py:82} INFO - Running:
['airflow', 'tasks', 'run', 'test-dynamic-tasks', 'sum_it',
'manual__2022-12-06T10:07:41.355423+00:00', '--job-id', '41164', '--raw',
'--subdir', 'DAGS_FOLDER/andi/test-dynamic-task.py', '--cfg-path',
'/tmp/tmphudvake2']
[2022-12-06, 18:07:53 CST] {standard_task_runner.py:83} INFO - Job 41164:
Subtask sum_it
[2022-12-06, 18:07:53 CST] {task_command.py:389} INFO - Running
<TaskInstance: test-dynamic-tasks.sum_it
manual__2022-12-06T10:07:41.355423+00:00 [running]> on host
sh-dataops-airflow.jinde.local
[2022-12-06, 18:07:53 CST] {taskinstance.py:1511} INFO - Exporting the
following env vars:
[email protected]
AIRFLOW_CTX_DAG_OWNER=andi
AIRFLOW_CTX_DAG_ID=test-dynamic-tasks
AIRFLOW_CTX_TASK_ID=sum_it
AIRFLOW_CTX_EXECUTION_DATE=2022-12-06T10:07:41.355423+00:00
AIRFLOW_CTX_TRY_NUMBER=1
AIRFLOW_CTX_DAG_RUN_ID=manual__2022-12-06T10:07:41.355423+00:00
[2022-12-06, 18:07:53 CST] {process_utils.py:179} INFO - Executing cmd:
/home/andi/airflow/venv38/bin/python -m virtualenv /tmp/venv7lc4m6na
--system-site-packages
[2022-12-06, 18:07:53 CST] {process_utils.py:183} INFO - Output:
[2022-12-06, 18:07:54 CST] {process_utils.py:187} INFO - created virtual
environment CPython3.8.0.final.0-64 in 220ms
[2022-12-06, 18:07:54 CST] {process_utils.py:187} INFO - creator
CPython3Posix(dest=/tmp/venv7lc4m6na, clear=False, no_vcs_ignore=False,
global=True)
[2022-12-06, 18:07:54 CST] {process_utils.py:187} INFO - seeder
FromAppData(download=False, pip=bundle, setuptools=bundle, wheel=bundle,
via=copy, app_data_dir=/home/andi/.local/share/virtualenv)
[2022-12-06, 18:07:54 CST] {process_utils.py:187} INFO - added seed
packages: pip==22.2.1, setuptools==63.2.0, wheel==0.37.1
[2022-12-06, 18:07:54 CST] {process_utils.py:187} INFO - activators
BashActivator,CShellActivator,FishActivator,NushellActivator,PowerShellActivator,PythonActivator
[2022-12-06, 18:07:54 CST] {process_utils.py:179} INFO - Executing cmd:
/tmp/venv7lc4m6na/bin/pip install -r /tmp/venv7lc4m6na/requirements.txt
[2022-12-06, 18:07:54 CST] {process_utils.py:183} INFO - Output:
[2022-12-06, 18:07:55 CST] {process_utils.py:187} INFO - Looking in indexes:
http://pypi:8081
[2022-12-06, 18:08:00 CST] {process_utils.py:187} INFO -
[2022-12-06, 18:08:00 CST] {process_utils.py:187} INFO - [notice] A new
release of pip available: 22.2.1 -> 22.3.1
[2022-12-06, 18:08:00 CST] {process_utils.py:187} INFO - [notice] To update,
run: python -m pip install --upgrade pip
[2022-12-06, 18:08:00 CST] {taskinstance.py:1772} ERROR - Task failed with
exception
Traceback (most recent call last):
File
"/home/andi/airflow/venv38/lib/python3.8/site-packages/airflow/decorators/base.py",
line 217, in execute
return_value = super().execute(context)
File
"/home/andi/airflow/venv38/lib/python3.8/site-packages/airflow/operators/python.py",
line 356, in execute
return super().execute(context=serializable_context)
File
"/home/andi/airflow/venv38/lib/python3.8/site-packages/airflow/operators/python.py",
line 175, in execute
return_value = self.execute_callable()
File
"/home/andi/airflow/venv38/lib/python3.8/site-packages/airflow/operators/python.py",
line 553, in execute_callable
return self._execute_python_callable_in_subprocess(python_path, tmp_path)
File
"/home/andi/airflow/venv38/lib/python3.8/site-packages/airflow/operators/python.py",
line 397, in _execute_python_callable_in_subprocess
self._write_args(input_path)
File
"/home/andi/airflow/venv38/lib/python3.8/site-packages/airflow/operators/python.py",
line 367, in _write_args
file.write_bytes(self.pickling_library.dumps({"args": self.op_args,
"kwargs": self.op_kwargs}))
_pickle.PicklingError: Can't pickle <class
'sqlalchemy.orm.session.Session'>: it's not the same object as
sqlalchemy.orm.session.Session
[2022-12-06, 18:08:00 CST] {taskinstance.py:1322} INFO - Marking task as
FAILED. dag_id=test-dynamic-tasks, task_id=sum_it,
execution_date=20221206T100741, start_date=20221206T100753,
end_date=20221206T100800
[2022-12-06, 18:08:00 CST] {warnings.py:109} WARNING -
/home/andi/airflow/venv38/lib/python3.8/site-packages/airflow/utils/email.py:120:
RemovedInAirflow3Warning: Fetching SMTP credentials from configuration
variables will be deprecated in a future release. Please set credentials using
a connection instead.
send_mime_email(e_from=mail_from, e_to=recipients, mime_msg=msg,
conn_id=conn_id, dryrun=dryrun)
[2022-12-06, 18:08:00 CST] {configuration.py:635} WARNING - section/key
[smtp/smtp_user] not found in config
[2022-12-06, 18:08:00 CST] {email.py:229} INFO - Email alerting: attempt 1
[2022-12-06, 18:08:01 CST] {email.py:241} INFO - Sent an alert email to
['[email protected]']
[2022-12-06, 18:08:01 CST] {standard_task_runner.py:100} ERROR - Failed to
execute job 41164 for task sum_it (Can't pickle <class
'sqlalchemy.orm.session.Session'>: it's not the same object as
sqlalchemy.orm.session.Session; 25873)
[2022-12-06, 18:08:01 CST] {local_task_job.py:159} INFO - Task exited with
return code 1
[2022-12-06, 18:08:01 CST] {taskinstance.py:2582} INFO - 0 downstream tasks
scheduled from follow-on schedule check
```
### What you think should happen instead
I expect this sample DAG to run successfully.
### How to reproduce
_No response_
### Operating System
centos 7.9 3.10.0-1160.el7.x86_64
### Versions of Apache Airflow Providers
```
airflow-code-editor==5.2.2
apache-airflow-providers-celery==3.0.0
apache-airflow-providers-microsoft-mssql==3.1.0
apache-airflow-providers-microsoft-psrp==2.0.0
apache-airflow-providers-microsoft-winrm==3.0.0
apache-airflow-providers-mysql==3.0.0
apache-airflow-providers-redis==3.0.0
apache-airflow-providers-samba==4.0.0
apache-airflow-providers-sftp==3.0.0
autopep8==1.6.0
brotlipy==0.7.0
chardet==3.0.4
pip-chill==1.0.1
pyopenssl==19.1.0
pysocks==1.7.1
python-ldap==3.4.2
requests-credssp==2.0.0
swagger-ui-bundle==0.0.9
tqdm==4.51.0
virtualenv==20.16.2
yapf==0.32.0
```
### Deployment
Official Apache Airflow Helm Chart
### Deployment details
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)