Hi,

Not directly, no, but you could perhaps write into the existing `models.Log` table, assuming you're inside a PythonOperator.
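For illustration, here's a minimal, hypothetical sketch of timing sub-steps inside a PythonOperator callable. The in-memory `metrics` list is a stand-in: in a real DAG you'd persist each measurement with your Airflow version's API (e.g. a `models.Log` row, an XCom, or a StatsD gauge) rather than keep it in a list:

```python
import time
from contextlib import contextmanager

# Collected (name, seconds) pairs. Stand-in for real persistence:
# in practice you'd write each measurement to models.Log or XCom.
metrics = []

@contextmanager
def timed(name):
    """Record the wall-clock duration of a named sub-step."""
    start = time.monotonic()
    try:
        yield
    finally:
        metrics.append((name, time.monotonic() - start))

def my_callable(**context):
    # The three sub-steps from the question below.
    with timed("download"):
        time.sleep(0.01)  # stand-in for the real download
    with timed("process"):
        time.sleep(0.01)
    with timed("store"):
        time.sleep(0.01)

my_callable()
for name, seconds in metrics:
    print("%s: %.3fs" % (name, seconds))
```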
We've discussed before having some sort of `TaskChainOperator` that would execute a set of tasks with the guarantee that they run atomically on the same worker. This operator would receive other tasks and somehow run them inside its own `execute` method, as opposed to through the executor, guaranteeing local, atomic execution. I'm not sure whether this potential solution would fit your use case.

Max

On Tue, Oct 4, 2016 at 3:11 PM, Brandon White <[email protected]> wrote:
> Hello!
>
> Airflow does a great job of tracking metrics at the task level, and I am
> wondering if there is any support for tracking metrics within a task. Say I
> have a task which downloads data, processes it, then stores it. Are there
> any Airflow features which allow me to track how long these subtasks take?
>
> Brandon
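For what it's worth, the idea could look roughly like the sketch below. To be clear, `TaskChainOperator` is hypothetical (it does not exist in Airflow), and the BaseOperator machinery is stubbed out; a real implementation would subclass `airflow.models.BaseOperator`:

```python
class TaskChainOperator:
    """Hypothetical sketch: runs child tasks' execute() methods
    sequentially inside its own execute(), bypassing the executor so
    the whole chain runs atomically on one worker."""

    def __init__(self, tasks):
        self.tasks = tasks

    def execute(self, context):
        results = []
        for task in self.tasks:
            # Each child runs in-process, in order, on this worker,
            # instead of being scheduled through the executor.
            results.append(task.execute(context))
        return results


class EchoTask:
    """Stand-in for a real operator with an execute() method."""

    def __init__(self, name):
        self.name = name

    def execute(self, context):
        return self.name


chain = TaskChainOperator(
    [EchoTask("download"), EchoTask("process"), EchoTask("store")]
)
print(chain.execute({}))  # -> ['download', 'process', 'store']
```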
