Folks,
I have the following program:

SparkConf conf = new SparkConf()
        .setMaster("local")
        .setAppName("Indexer")
        .set("spark.driver.maxResultSize", "2g");
conf.set("es.index.auto.create", "true");
conf.set("es.nodes", "localhost");
conf.set("es.port", "9200");
conf.set("es.write.operation",

You don't need an explicit association between your JavaEsSpark and the
SparkConf. Actually, once you run your transformations/filtering/etc. on your
"sc", you can store the final RDD in your ES cluster. Example:
JavaRDD<Map<String, Object>> generatedRDD = sc.parallelize(Arrays.asList(SOME_STUFF));
JavaEsSpark.saveToEs(generatedRDD, "foo");
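
For completeness, here is a minimal self-contained sketch of the whole flow. It assumes elasticsearch-hadoop is on the classpath and ES is reachable on localhost:9200; the class name, the sample documents, and the "spark/docs" resource are made up for illustration:

import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.elasticsearch.spark.rdd.api.java.JavaEsSpark;

public class Indexer {
    public static void main(String[] args) {
        // The es.* settings live on the SparkConf; JavaEsSpark reads them
        // from the context, so no explicit wiring is needed.
        SparkConf conf = new SparkConf()
                .setMaster("local")
                .setAppName("Indexer")
                .set("es.index.auto.create", "true")
                .set("es.nodes", "localhost")
                .set("es.port", "9200");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Hypothetical sample documents, one Map per document.
        Map<String, Object> doc1 = new HashMap<>();
        doc1.put("title", "first");
        Map<String, Object> doc2 = new HashMap<>();
        doc2.put("title", "second");

        // Build an RDD, transform/filter as needed, then write it out.
        JavaRDD<Map<String, Object>> rdd = sc.parallelize(Arrays.asList(doc1, doc2));
        JavaEsSpark.saveToEs(rdd, "spark/docs");

        sc.stop();
    }
}

The same pattern applies to any other es.* setting (e.g. es.write.operation): set it on the conf before creating the context.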