Does anyone have an idea?
Even just knowing whether this is possible or not would be a great help.

Thanks.



On Wednesday, October 1, 2014 3:46:51 PM UTC+5:30, Preeti Raj - Buchhada 
wrote:
>
> I am using ES version 1.3.2, and Spark 1.1.0.
> I can successfully read and write records from/to ES using newAPIHadoopRDD() 
> and saveAsNewAPIHadoopDataset().
> However, I am struggling to find a way to update records. Even if I specify 
> a 'key' in ESOutputFormat, it gets ignored, as clearly documented.
> So my question is: is there a way to specify the document ID and custom 
> routing values when writing to ES using Spark? If yes, how?
>
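One possible approach, assuming the version of elasticsearch-hadoop in use supports the `es.mapping.id` and `es.mapping.routing` settings: instead of passing a key through ESOutputFormat (which is ignored), point the connector at fields *inside* each document that hold the ID and routing value. A minimal sketch in Scala (the index name `myindex/mytype` and the field names `docId` and `userId` are placeholders for illustration):

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.io.NullWritable
import org.elasticsearch.hadoop.mr.{EsOutputFormat, LinkedMapWritable}

val conf = new Configuration()
conf.set("es.nodes", "localhost:9200")
conf.set("es.resource", "myindex/mytype")
// Use the value of the "docId" field in each document as the ES document ID,
// and the "userId" field as the custom routing value.
// (Both field names are placeholders; they must exist in the written documents.)
conf.set("es.mapping.id", "docId")
conf.set("es.mapping.routing", "userId")

// rdd: RDD[(NullWritable, LinkedMapWritable)] where each LinkedMapWritable
// contains "docId" and "userId" entries alongside the document payload.
rdd.saveAsNewAPIHadoopDataset(conf)
```

Whether writes then behave as updates (overwriting the document with that ID) rather than inserts may also depend on the `es.write.operation` setting; check the elasticsearch-hadoop configuration docs for the version you are running.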
