Hi - Can anyone please provide some pointers for this use case in Airflow?
> On 03-Aug-2017, at 9:13 PM, Ashish Rawat <ashish.ra...@myntra.com> wrote:
> We have a use case where we are running some R/Python based data science
> models, which execute over a single node. The execution time of the models is
> constantly increasing and we are now planning to split the model training by
> a partition key and distribute the workload over multiple machines.
> Does Airflow provide a simple way to split a task into multiple tasks, each
> of which works on a specific value of the partition key?