GitHub user Dev-iL added a comment to the discussion: Cannot schedule a DAG on a DatasetAlias when using a clean Airflow docker image (for CI)
## Possible solution:
Modify `airflow/datasets/__init__.py` as follows:
```diff
@@ -26 +26 @@
-from sqlalchemy import select
+from sqlalchemy import exc, select
@@ -142 +142 @@
 @internal_api_call
 @provide_session
 def expand_alias_to_datasets(
     alias: str | DatasetAlias, *, session: Session = NEW_SESSION
 ) -> list[BaseDataset]:
     """Expand dataset alias to resolved datasets."""
     from airflow.models.dataset import DatasetAliasModel

     alias_name = alias.name if isinstance(alias, DatasetAlias) else alias
-    dataset_alias_obj = session.scalar(
-        select(DatasetAliasModel).where(DatasetAliasModel.name == alias_name).limit(1)
-    )
+    try:
+        dataset_alias_obj = session.scalar(
+            select(DatasetAliasModel).where(DatasetAliasModel.name == alias_name).limit(1)
+        )
+    except exc.OperationalError:
+        return []
     if dataset_alias_obj:
         return [Dataset(uri=dataset.uri, extra=dataset.extra) for dataset in dataset_alias_obj.datasets]
     return []
```
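The patch treats a database error while looking up the alias table as "no resolved datasets" instead of crashing, which covers a fresh metadata DB where migrations have not yet created the `dataset_alias` table. A minimal sketch of the same pattern using only the stdlib `sqlite3` module (the table name and function are illustrative, not Airflow's actual schema; SQLAlchemy's `exc.OperationalError` plays the role that `sqlite3.OperationalError` plays here):

```python
import sqlite3


def expand_alias(conn: sqlite3.Connection, alias_name: str) -> list[str]:
    """Return dataset URIs linked to an alias; empty list if the table is missing."""
    try:
        rows = conn.execute(
            "SELECT uri FROM dataset_alias WHERE name = ?", (alias_name,)
        ).fetchall()
    except sqlite3.OperationalError:
        # Table not created yet (e.g. a clean DB before migrations ran).
        return []
    return [uri for (uri,) in rows]


# Fresh in-memory DB with no tables at all: the query would normally raise,
# but the guard turns it into an empty result.
conn = sqlite3.connect(":memory:")
print(expand_alias(conn, "my-alias"))  # → []
```

The design choice matches the patch: a missing table during DAG parsing is treated as "alias not yet resolved" rather than a fatal error, so a clean CI image can still import the DAG file.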
GitHub link:
https://github.com/apache/airflow/discussions/45236#discussioncomment-11676187