Hi,

First question:
Can we create an RDD with Spark and store it in an IgniteRDD, or can we only create an RDD with Ignite and then share it with a Spark job?
Second question:
What exactly does the piece of code below do?
import org.apache.ignite.configuration.IgniteConfiguration
import org.apache.ignite.spark.{IgniteContext, IgniteRDD}
import org.apache.spark.{SparkConf, SparkContext}

object RDDProducer extends App {
  val conf = new SparkConf().setAppName("SparkIgnite")
  val sc = new SparkContext(conf)
  // Start an IgniteContext on top of the SparkContext with a default Ignite configuration
  val ic = new IgniteContext[Int, Int](sc, () => new IgniteConfiguration())
  // Get a shared IgniteRDD backed by the Ignite cache named "partitioned"
  val sharedRDD: IgniteRDD[Int, Int] = ic.fromCache("partitioned")
  // Save 100,000 (i, i) pairs from a plain Spark RDD into the Ignite cache
  sharedRDD.savePairs(sc.parallelize(1 to 100000, 10).map(i => (i, i)))
}
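
Regarding the first question, my current understanding is that another Spark job could read the same data back from the cache; a rough consumer sketch, assuming the same "partitioned" cache and a hypothetical RDDConsumer object, might look like this:

import org.apache.ignite.configuration.IgniteConfiguration
import org.apache.ignite.spark.{IgniteContext, IgniteRDD}
import org.apache.spark.{SparkConf, SparkContext}

object RDDConsumer extends App {
  val sc = new SparkContext(new SparkConf().setAppName("SparkIgniteConsumer"))
  // Attach to the same Ignite cluster and the same "partitioned" cache
  val ic = new IgniteContext[Int, Int](sc, () => new IgniteConfiguration())
  val sharedRDD: IgniteRDD[Int, Int] = ic.fromCache("partitioned")
  // IgniteRDD is a regular Spark pair RDD, so standard transformations/actions apply
  println(sharedRDD.filter(_._2 > 50000).count())
}

As far as I understand, the data lives in the Ignite cache rather than in either Spark job, so it should be visible across separate Spark applications.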