to ignore 0 files?
Thanks
Regards,
Laurent T
----- Original Message -----
From: "Mayur Rustagi"
To: "laurent thoulon"
Sent: Wednesday, May 21, 2014 13:51:46
Subject: Re: Ignoring S3 0 files exception
You can try the new Hadoop API in SparkContext. Should be able to c
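A minimal sketch of what that suggestion might look like, assuming a Scala Spark job of that era reading text from S3 through `SparkContext.newAPIHadoopFile` (the bucket path, input format, and key/value types here are illustrative, not from the thread):

```scala
import org.apache.hadoop.io.{LongWritable, Text}
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat
import org.apache.spark.{SparkConf, SparkContext}

object IgnoreEmptyS3Files {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("IgnoreEmptyS3Files"))
    // newAPIHadoopFile goes through the newer mapreduce InputFormat API,
    // which can behave differently from sc.textFile when a glob matches
    // zero (or zero-byte) files on S3.
    val lines = sc
      .newAPIHadoopFile[LongWritable, Text, TextInputFormat]("s3n://bucket/path/*")
      .map(_._2.toString)
    println(lines.count())
    sc.stop()
  }
}
```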
Hi,
I've been trying to run my newly created Spark job on my local master instead
of just running it using Maven, and I haven't been able to make it work. My main
issue seems to be related to this error:
14/05/14 09:34:26 ERROR EndpointWriter: AssociationError
[akka.tcp://sparkMaster@devsrv:
(I never actually received my previous mail, so I'm resending it. Sorry if
it creates a duplicate.)
Hi,
I'm quite new to Spark (and Scala), but has anyone ever successfully compiled
and run a Spark job using Java and Maven?
Packaging seems to go fine, but when I try to execute the job using
mvn package
java -Xmx4g -cp target/jobs-1.4.0.0-jar-with-dependencies.jar
my.jobs.spark.TestJob
I
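For reference, a `jar-with-dependencies` artifact like the one in the command above is typically produced by the maven-assembly-plugin. A hedged pom.xml fragment showing one common setup (the plugin version is an assumption; `my.jobs.spark.TestJob` is the main class named in the thread):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <version>2.4</version>
  <configuration>
    <descriptorRefs>
      <!-- built-in descriptor that bundles all dependencies into one jar -->
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
    <archive>
      <manifest>
        <mainClass>my.jobs.spark.TestJob</mainClass>
      </manifest>
    </archive>
  </configuration>
  <executions>
    <execution>
      <!-- run the assembly during `mvn package` -->
      <phase>package</phase>
      <goals><goal>single</goal></goals>
    </execution>
  </executions>
</plugin>
```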