Hi everyone, I'm curious what people usually use to build ETL workflows based on DataFrames and the Spark API.
In the Hadoop/Hive world, people usually use Oozie. Is it different in the Spark world?