michal-cech opened a new issue, #25061:
URL: https://github.com/apache/airflow/issues/25061
### Apache Airflow version
2.3.3 (latest released)
### What happened
When a task returns a dictionary, its values can be accessed at the DAG level
with `return_value['key']`. This does not seem to be possible with the newly
added `expand` functionality: the DAG fails to load with
`ValueError: cannot map over XCom with custom key 'list' from <Task(_PythonDecoratedOperator): params>`
(see the example below).
### What you think should happen instead
I think Airflow should behave consistently between
```
task(return_value['something'])
```
and
```
task.expand(arg=return_value['something'])
```
### How to reproduce
```
import pendulum

from airflow.decorators import dag, task


@dag(
    default_args={
        'owner': 'airflow',
        'start_date': pendulum.datetime(2022, 7, 14, tz="Europe/Prague"),
        'depends_on_past': False,
        'retries': 0,
    },
    schedule_interval="* * * * *",
    max_active_runs=1,
)
def test():
    @task(multiple_outputs=True)
    def params():
        return {"a": 1, "b": 1, "list": [1, 2, 3]}

    parameters = params()

    @task
    def print_list(list):
        return list

    @task
    def print_elements(element):
        return element

    # this works
    print_list(parameters['list'])

    # this also works
    print_elements.expand(element=[1, 2, 3])

    # this does not: the DAG fails to load with the ValueError above
    print_elements.expand(element=parameters['list'])


test_dag = test()
```
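
As a possible workaround (a sketch, not from the original report, assuming the
limitation is that `expand` can only map over the default `return_value` XCom
key and not a custom key such as `list`), the list can be routed through an
intermediate task so that it lands under `return_value`; the `extract_list`
task below is hypothetical:

```
from airflow.decorators import task


@task
def extract_list(d):
    # Returning the list directly pushes it under the default
    # 'return_value' XCom key, which expand() can map over.
    return d["list"]


# Inside the DAG body above, replacing the failing line:
# print_elements.expand(element=extract_list(parameters))
```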
### Operating System
Ubuntu 20.04
### Versions of Apache Airflow Providers
_No response_
### Deployment
Docker-Compose
### Deployment details
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)