Folks,
I have the following program :
SparkConf conf = new SparkConf()
    .setMaster("local")
    .setAppName("Indexer")
    .set("spark.driver.maxResultSize", "2g");
conf.set("es.index.auto.create", "true");
conf.set("es.nodes", "localhost");
conf.set("es.port", "9200");
conf.set("es.write.operation", "index");
JavaSparkContext sc = new JavaSparkContext(conf);
    .          .
JavaEsSpark.saveToEs(filteredFields, "foo");
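For reference, here is a minimal self-contained sketch of the program above (the class name, imports, and the sample RDD standing in for filteredFields are my assumptions, since that part of the program was omitted):

```java
import java.util.Collections;
import java.util.Map;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.elasticsearch.spark.rdd.api.java.JavaEsSpark;

public class Indexer {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
            .setMaster("local")
            .setAppName("Indexer")
            .set("spark.driver.maxResultSize", "2g")
            .set("es.index.auto.create", "true")
            .set("es.nodes", "localhost")
            .set("es.port", "9200")
            .set("es.write.operation", "index");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Hypothetical stand-in for the real filteredFields RDD,
        // which was elided in the original message
        Map<String, ?> doc = Collections.singletonMap("field", "value");
        JavaRDD<Map<String, ?>> filteredFields =
            sc.parallelize(Collections.singletonList(doc));

        // saveToEs reads the es.* settings from the SparkConf carried
        // by the SparkContext that created the RDD
        JavaEsSpark.saveToEs(filteredFields, "foo");
        sc.stop();
    }
}
```

Running this requires spark-core and the elasticsearch-hadoop (elasticsearch-spark) jar on the classpath, plus an Elasticsearch instance listening on localhost:9200.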

I get an error saying "cannot find storage". It looks like the driver program cannot reach the Elasticsearch server. Looking at the program, I have not associated JavaEsSpark with the SparkConf.
Question: how do I associate JavaEsSpark with the SparkConf?
