Vladimir,

Update - I think I solved the ClassNotFoundException. It looks like the
Ignite installation document for Spark and CDH is outdated and doesn't
contain complete information on integrating Ignite with Spark running in
YARN (cluster) mode on CDH, which is my setup. Here is what I did; I can
now run my Spark program (although with the exception described below):
went to
went to
Cloudera Manager -> YARN (MR2 Included) -> Configuration -> Service Wide ->
Advanced -> Spark Client Advanced Configuration Snippet (Safety Valve) for
spark-conf/spark-defaults.conf
and added /etc/ignite-fabric-1.5.0/libs/* to the already present
spark.executor.extraClassPath=/opt/cloudera/parcels/CDH-5.5.2-1.cdh5.5.2.p0.4/jars/htrace-core-3.2.0-incubating.jar.
 
The combined line looks like this:
spark.executor.extraClassPath=/opt/cloudera/parcels/CDH-5.5.2-1.cdh5.5.2.p0.4/jars/htrace-core-3.2.0-incubating.jar:/etc/ignite-fabric-1.5.0/libs/*

However, after starting Ignite and submitting my Spark program (in another
window) I see the following exception:
java.lang.OutOfMemoryError: GC overhead limit exceeded

The only line of code that puts anything into the Ignite cache, where I
try to save the RDD, is:
cacheIgRDD.savePairs(partRDD, true)
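For context, that call sits in a setup roughly like the sketch below. The cache name "partitioned", the key/value types, and the contents of partRDD are my placeholders, not taken from my actual program:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.ignite.configuration.IgniteConfiguration
import org.apache.ignite.spark.IgniteContext

object SavePairsSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("ignite-save-pairs"))

    // The configuration closure is evaluated on each executor,
    // so the IgniteConfiguration itself is not shipped in a task closure.
    val igniteContext = new IgniteContext[Int, String](sc, () => new IgniteConfiguration())

    // "partitioned" is a placeholder; use the cache name defined
    // in your Ignite configuration.
    val cacheIgRDD = igniteContext.fromCache("partitioned")

    // Placeholder RDD of key/value pairs to save.
    val partRDD = sc.parallelize(1 to 1000).map(i => (i, s"value-$i"))

    // overwrite = true replaces existing values for the same keys.
    cacheIgRDD.savePairs(partRDD, true)
  }
}
```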

One thing to note: it looks like IgniteConfiguration is not Serializable,
so I had to create MyIgniteConfiguration extends IgniteConfiguration with
Serializable.
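A minimal sketch of that workaround, assuming nothing beyond what Ignite 1.5 ships:

```scala
import org.apache.ignite.configuration.IgniteConfiguration

// Workaround: IgniteConfiguration does not implement java.io.Serializable,
// so Spark cannot ship it inside a task closure as-is. Mixing in
// Serializable makes the wrapper class nominally serializable.
class MyIgniteConfiguration extends IgniteConfiguration with Serializable
```

An alternative that may avoid the wrapper entirely: IgniteContext accepts a configuration closure (() => IgniteConfiguration) that is invoked on each executor, so the configuration object itself is never serialized.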




--
View this message in context: 
http://apache-ignite-users.70518.x6.nabble.com/Ignite-Installation-with-Spark-under-CDH-tp4457p4624.html
Sent from the Apache Ignite Users mailing list archive at Nabble.com.
