I am trying to build a prototype of a workflow application, where each task
in the workflow is a Spark application. For this I am using the Data and
Service Grids. (I can't use the Compute Grid due to limitations from the
customer.)
What I am trying to do is encapsulate the Spark execution inside an Ignite
service and deploy it on an Ignite node, so that the different services
(each executing as a Spark application) can share RDDs among themselves.
This works fine with the Spark master set to local.
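To make the approach concrete, here is a minimal sketch of what I mean by
"encapsulating Spark inside an Ignite service". The class name
(WorkflowTaskService) and the job body are illustrative placeholders; only
the Ignite Service lifecycle methods and the SparkSession builder calls are
real APIs:

```java
import org.apache.ignite.services.Service;
import org.apache.ignite.services.ServiceContext;
import org.apache.spark.sql.SparkSession;

// Sketch: one workflow task wrapped in an Ignite service.
public class WorkflowTaskService implements Service {
    // SparkSession is not serializable, so it must be transient and
    // created on the node where the service is deployed.
    private transient SparkSession spark;

    @Override
    public void init(ServiceContext ctx) {
        // "local[*]" is what the prototype uses; in production this
        // would become "yarn" plus the matching Hadoop/YARN settings.
        spark = SparkSession.builder()
                .appName("workflow-task-" + ctx.name())
                .master("local[*]")
                .getOrCreate();
    }

    @Override
    public void execute(ServiceContext ctx) {
        // Placeholder for the actual Spark job this workflow task runs.
        long count = spark.range(100).count();
        System.out.println("Rows processed: " + count);
    }

    @Override
    public void cancel(ServiceContext ctx) {
        if (spark != null)
            spark.stop();
    }
}
```

The sketch cannot run standalone since it needs an Ignite cluster and the
Spark jars on the classpath; it is only meant to show the lifecycle wiring.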
I just wanted to know whether this approach could run into any challenges
in a production environment, where I will be using YARN for Spark.
Also, how do I configure Ignite on HDP without IGFS? Is that possible?
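For reference, what I have in mind is a plain Ignite node configuration
with no IGFS section at all. This is only a sketch; the discovery address
is a placeholder for the actual HDP node addresses:

```xml
<!-- Minimal Ignite Spring config with no igfsConfiguration section. -->
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
                           http://www.springframework.org/schema/beans/spring-beans.xsd">
    <bean class="org.apache.ignite.configuration.IgniteConfiguration">
        <property name="discoverySpi">
            <bean class="org.apache.ignite.spi.discovery.tcp.TcpDiscoverySpi">
                <property name="ipFinder">
                    <bean class="org.apache.ignite.spi.discovery.tcp.ipfinder.vm.TcpDiscoveryVmIpFinder">
                        <property name="addresses">
                            <list>
                                <!-- Placeholder; replace with real host:port ranges. -->
                                <value>127.0.0.1:47500..47509</value>
                            </list>
                        </property>
                    </bean>
                </property>
            </bean>
        </property>
    </bean>
</beans>
```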
Thanks & Regards,