GitHub user guocity created a discussion: Best Way to Track Step Execution Like Airflow Tasks in a Python Data Workflow

In Airflow 3, I want to process a large DataFrame entirely in memory but still 
track each step as a separate Airflow task. When I split the work into multiple 
tasks, each task runs in a separate process, so the DataFrame can't stay in 
memory. How can I track each step as a task while keeping everything in the 
same process?
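As a rough illustration of the in-process tracking the question is after (plain Python, not an Airflow feature), a decorator can record each step's name and duration while the data never leaves the process; `track_step` and `STEP_LOG` are hypothetical names, and a plain list stands in for the large DataFrame:

```python
import time
from functools import wraps

STEP_LOG = []  # hypothetical in-process registry of completed steps

def track_step(func):
    """Record each decorated step's name and runtime, task-style."""
    @wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        STEP_LOG.append((func.__name__, time.perf_counter() - start))
        return result
    return wrapper

@track_step
def load():
    # stand-in for loading a large DataFrame
    return list(range(10))

@track_step
def transform(data):
    # stand-in for an in-memory transformation
    return [x * 2 for x in data]

data = transform(load())
print([name for name, _ in STEP_LOG])  # → ['load', 'transform']
```

Each "step" here shares the same interpreter, so nothing is serialized between steps; mapping this onto Airflow's own task/UI tracking is exactly the open question.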


GitHub link: https://github.com/apache/airflow/discussions/58088
