atul-astronomer opened a new issue, #63442:
URL: https://github.com/apache/airflow/issues/63442
### Apache Airflow version
main (development)
### If "Other Airflow 3 version" selected, which one?
_No response_
### What happened?
```text
Error updating Task Instance state. Setting the task to failed.
[airflow.api_fastapi.execution_api.routes.task_instances]
correlation_id=019ce219-95ba-7385-a884-654528bf7b8a loc=task_instances.py:356
payload=TISuccessStatePayload(state=<TerminalTIState.SUCCESS: 'success'>,
end_date=datetime.datetime(2026, 3, 12, 12, 50, 53, 497531, tzinfo=Timezone('UTC')),
task_outlets=[AssetProfile(name='file://aip76/hourly_asset.csv',
uri='file://aip76/hourly_asset.csv', type='Asset')], outlet_events=[],
rendered_map_index=None)
ti_id=019ce219-8f39-7197-a3e5-03e7b52c48ff
Traceback (most recent call last):
  File "/opt/airflow/airflow-core/src/airflow/api_fastapi/execution_api/routes/task_instances.py", line 346, in ti_update_state
    query, updated_state = _create_ti_state_update_query_and_update_state(
  File "/opt/airflow/airflow-core/src/airflow/api_fastapi/execution_api/routes/task_instances.py", line 425, in _create_ti_state_update_query_and_update_state
    TI.register_asset_changes_in_db(
  File "/opt/airflow/airflow-core/src/airflow/utils/session.py", line 98, in wrapper
    return func(*args, **kwargs)
  File "/opt/airflow/airflow-core/src/airflow/models/taskinstance.py", line 1390, in register_asset_changes_in_db
    asset_manager.register_asset_change(
  File "/opt/airflow/airflow-core/src/airflow/assets/manager.py", line 295, in register_asset_change
    cls._queue_dagruns(
  File "/opt/airflow/airflow-core/src/airflow/assets/manager.py", line 355, in _queue_dagruns
    cls._queue_partitioned_dags(
  File "/opt/airflow/airflow-core/src/airflow/assets/manager.py", line 419, in _queue_partitioned_dags
    ).to_downstream(partition_key)
  File "/opt/airflow/airflow-core/src/airflow/partition_mappers/temporal.py", line 40, in to_downstream
    dt = datetime.strptime(key, self.input_format)
  File "/usr/python/lib/python3.10/_strptime.py", line 568, in _strptime_datetime
    tt, fraction, gmtoff_fraction = _strptime(data_string, format)
  File "/usr/python/lib/python3.10/_strptime.py", line 349, in _strptime
    raise ValueError("time data %r does not match format %r" %
ValueError: time data '2026-01-01T00' does not match format '%Y-%m-%dT%H:%M:%S'
```
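The failure is reproducible outside Airflow with plain `datetime.strptime`: the hourly partition key carries only a date and an hour, while the format applied by the mapper (per the traceback, `'%Y-%m-%dT%H:%M:%S'`) also expects minutes and seconds. A minimal sketch of just the parsing step:

```python
from datetime import datetime

key = "2026-01-01T00"        # hourly partition key, as seen in the traceback
fmt = "%Y-%m-%dT%H:%M:%S"    # input_format the mapper applied, per the traceback

try:
    datetime.strptime(key, fmt)
except ValueError as err:
    print(err)  # time data '2026-01-01T00' does not match format '%Y-%m-%dT%H:%M:%S'

# The same key parses once the format matches its hourly granularity:
print(datetime.strptime(key, "%Y-%m-%dT%H"))  # 2026-01-01 00:00:00
```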
### What you think should happen instead?
_No response_
### How to reproduce
Trigger the DAG below with partition key `'2026-01-01T00'`:
```python
from datetime import datetime

from airflow import DAG
from airflow.decorators import task
from airflow.sdk import (
    Asset,
    CronPartitionTimetable,
    DailyMapper,
    HourlyMapper,
    PartitionedAssetTimetable,
    YearlyMapper,
    get_current_context,
)

hourly_asset = Asset(uri="file://aip76/hourly_asset.csv")

with DAG(
    dag_id="test_hourly_mapper_producer",
    start_date=datetime(2026, 1, 1),
    schedule=CronPartitionTimetable("0 * * * *", timezone="UTC"),
    catchup=False,
) as producer:

    @task(outlets=[hourly_asset])
    def produce():
        ctx = get_current_context()
        print("Producer partition:", ctx["dag_run"].partition_key)

    produce()

with DAG(
    dag_id="test_hourly_mapper_consumer",
    start_date=datetime(2026, 1, 1),
    schedule=PartitionedAssetTimetable(
        assets=hourly_asset,
        default_partition_mapper=YearlyMapper(),
    ),
    catchup=False,
) as consumer:

    @task
    def consume():
        ctx = get_current_context()
        print("Consumer partition:", ctx["dag_run"].partition_key)

    consume()
```
### Operating System
Linux
### Versions of Apache Airflow Providers
_No response_
### Deployment
Other
### Deployment details
_No response_
### Anything else?
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [x] I agree to follow this project's [Code of
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)