Hi,

We have a use case where we run some R/Python-based data science models on a
single node. The training time of these models keeps growing, so we are now
planning to split the model training by a partition key and distribute the
workload across multiple machines.

Does Airflow provide a simple way to split a task into multiple tasks, each
of which works on a specific value of the partition key?
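
For context, a rough sketch of the kind of fan-out we have in mind is below.
The partition values and the training body are placeholders, and it assumes
the dynamic task mapping feature from Airflow 2.3+ (the "schedule" argument
needs 2.4+):

from datetime import datetime
from airflow.decorators import dag, task

@dag(start_date=datetime(2023, 1, 1), schedule=None, catchup=False)
def partitioned_training():

    @task
    def list_partitions():
        # Placeholder: look up the distinct values of our partition key
        return ["region_a", "region_b", "region_c"]

    @task
    def train_model(partition: str):
        # Placeholder: run the existing training code against the rows
        # belonging to this one partition value
        print(f"training model for partition {partition}")

    # Fan out: one mapped task instance per partition value, which the
    # scheduler can place on different workers
    train_model.expand(partition=list_partitions())

partitioned_training()

Ideally each mapped instance would be scheduled independently, so the
per-partition trainings could land on different worker machines.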

--
Regards,
Ashish


