You don't need an explicit association between your JavaEsSpark and the
SparkConf.
Actually, once you have applied your transformations/filters/etc. through your
"sc", you can save the final RDD to Elasticsearch directly. Example:

// SOME_STUFF stands in for your documents (e.g. Map<String, ?> instances)
JavaRDD<?> generatedRDD = sc.parallelize(Arrays.asList(SOME_STUFF));
JavaEsSpark.saveToEs(generatedRDD, "foo");

That's it...
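If it helps, here is a more complete, self-contained sketch of the same idea
with the Java API (the class name, the field names and the "foo/docs"
index/type are just placeholders, not anything from your program):

import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.elasticsearch.spark.rdd.api.java.JavaEsSpark;

public class Indexer {
    public static void main(String[] args) {
        // The es.* settings live on the SparkConf; JavaEsSpark reads them
        // through the SparkContext, so no extra wiring is needed.
        SparkConf conf = new SparkConf().setMaster("local").setAppName("Indexer");
        conf.set("es.index.auto.create", "true");
        conf.set("es.nodes", "localhost");   // use the real ES host/IP if ES is remote
        conf.set("es.port", "9200");

        JavaSparkContext sc = new JavaSparkContext(conf);

        // Each document is just a Map; the connector serializes it to JSON.
        Map<String, Object> doc1 = new HashMap<>();
        doc1.put("title", "first");
        doc1.put("views", 10);
        Map<String, Object> doc2 = new HashMap<>();
        doc2.put("title", "second");
        doc2.put("views", 20);

        JavaRDD<Map<String, Object>> docs = sc.parallelize(Arrays.asList(doc1, doc2));

        // "foo/docs" is the index/type resource; the conf set above is picked up here.
        JavaEsSpark.saveToEs(docs, "foo/docs");

        sc.stop();
    }
}
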
At last, be careful while defining the settings on your "conf". For instance,
you may need to replace localhost with the real IP address of your
Elasticsearch node...
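With the conf from your program that would just be something like (the IP
below is purely a placeholder):

conf.set("es.nodes", "192.168.1.50");   // real IP or hostname of the ES node
conf.set("es.port", "9200");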

Ali Gouta.

On Mon, Dec 14, 2015 at 1:52 PM, Spark Enthusiast <sparkenthusi...@yahoo.in>
wrote:

> Folks,
>
> I have the following program :
>
> SparkConf conf = new SparkConf().setMaster("local")
> .setAppName("Indexer").set("spark.driver.maxResultSize", "2g");
> conf.set("es.index.auto.create", "true");
> conf.set("es.nodes", "localhost");
> conf.set("es.port", "9200");
> conf.set("es.write.operation", "index");
> JavaSparkContext sc = new JavaSparkContext(conf);
>
>           .
>           .
>
> JavaEsSpark.saveToEs(filteredFields, "foo");
>
> I get an error saying cannot find storage. Looks like the driver program
> cannot reach the Elasticsearch server. Looking at the program, I have not
> associated JavaEsSpark with the SparkConf.
>
> Question: How do I associate JavaEsSpark to SparkConf?
>
>
>
