?
Thanks
Regards,
Laurent T
----- Original Message -----
From: Mayur Rustagi mayur.rust...@gmail.com
To: laurent thoulon laurent.thou...@ldmobile.net
Sent: Wednesday, May 21, 2014 13:51:46
Subject: Re: Ignoring S3 0 files exception
You can try the new Hadoop API on the SparkContext. Should be able
://+existingFilenamePattern)
JavaRDD aPlusB = a.union(b);
aPlusB.reduceByKey(MyReducer); // -- This throws the error
I'd like to ignore the exception caused by a so I can process b without
trouble. Thanks
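One way around this, sketched below, is to skip empty inputs before the union so no RDD is ever built from a path that matches nothing. This is only a minimal illustration: local directories stand in for the S3 prefixes, a List of lines stands in for an RDD, and the existence check plays the role that a glob over s3n:// URIs would play in the real job; the class and method names are hypothetical.

```java
import java.io.IOException;
import java.nio.file.*;
import java.util.*;
import java.util.stream.*;

// Sketch of the skip-empty-inputs idea: only inputs that actually
// contain files contribute to the combined result, so missing or
// empty inputs never trigger an exception.
public class SkipEmptyInputs {

    // Returns the lines of every input directory that contains at least
    // one file, silently skipping directories that are missing or empty.
    static List<String> unionExisting(List<Path> dirs) throws IOException {
        List<String> combined = new ArrayList<>();
        for (Path dir : dirs) {
            if (!Files.isDirectory(dir)) continue;          // input missing: skip
            try (Stream<Path> files = Files.list(dir)) {
                List<Path> found = files.collect(Collectors.toList());
                if (found.isEmpty()) continue;              // input empty: skip
                for (Path f : found) {
                    combined.addAll(Files.readAllLines(f));
                }
            }
        }
        return combined;
    }

    public static void main(String[] args) throws IOException {
        Path root = Files.createTempDirectory("skip-empty");
        Path a = Files.createDirectory(root.resolve("a"));  // stays empty, like the failing S3 path
        Path b = Files.createDirectory(root.resolve("b"));
        Files.write(b.resolve("part-00000"), Arrays.asList("x", "y"));

        // "a" matches no files, so only "b" contributes; nothing throws.
        List<String> result = unionExisting(Arrays.asList(a, b));
        System.out.println(result);                         // prints [x, y]
    }
}
```

In a real Spark job the same pre-filtering can be done by testing each path with the Hadoop FileSystem API before calling textFile/union, so the union only ever sees patterns that matched files.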
with the RDDs that actually found files?
Thanks
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Ignoring-S3-0-files-exception-tp6101.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.