Why do you think the ignite-spark module is not needed in our Hadoop build?

On Fri, Nov 18, 2016 at 5:44 PM, Valentin Kulichenko < valentin.kuliche...@gmail.com> wrote:
> Folks,
>
> Is there anyone who understands the purpose of including the ignite-spark
> module in the Hadoop Accelerator build? I can't figure out a use case for
> which it's needed.
>
> In case we actually need it there, there is an issue. We actually have
> two ignite-spark modules, for Scala 2.10 and 2.11. In the Fabric build
> everything is fine: we put both in the 'optional' folder and the user can
> enable either one. But the Hadoop Accelerator includes only the 2.11
> version, which means that the build doesn't work with Scala 2.10 out of
> the box.
>
> We should either remove the module from the build, or fix the issue.
>
> -Val