Hi,
Thanks for your responses.
We already tried the single-jar approach and it worked - but it is a real pain to
compile ~15 projects every time we need to make a small change in one of them.
Just to make sure I understand you correctly - below is what we've tried to
pass in our test constructor:
JavaSparkContext sc = new JavaSparkContext(
    "yarn-client",
    "SPARK YARN TEST",
    "/app/spark/",
    new String[] {"/app/iot/test/test_kafka.jar"}
);
Despite the above, the executor / mapper function cannot find the classes
inside the above jar (test_kafka.jar).
Are we doing something wrong in the constructor?
Is there a code change / fix which we can quickly apply?
Thanks,
Ido
From: Liu, Raymond [mailto:[email protected]]
Sent: Tuesday, December 24, 2013 07:53
To: [email protected]
Subject: RE: Unable to load additional JARs in yarn-client mode
Ido, when you say add external JARs, do you mean via -addJars, which adds some
jars for the SparkContext to use in the AM env?
If so, I think you don't need it for yarn-client mode at all: in yarn-client
mode the SparkContext runs locally, so I think you just need to make sure those
jars are on the java classpath.
And for those needed by executors / tasks, I think you can package them as Matei
said. Or maybe we could expose some env variable for yarn-client mode to allow
adding multiple jars as needed.
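One quick way to verify the driver-side classpath point above is a small plain-JDK diagnostic (a sketch for illustration only, not part of Spark; the jar and class names passed in are just examples):

```java
// Diagnostic sketch: check whether a jar appears on java.class.path and
// whether a given class can actually be loaded by the driver JVM.
public class ClasspathCheck {

    // Returns true if the given jar file name appears anywhere on java.class.path.
    public static boolean onClasspath(String jarName) {
        String cp = System.getProperty("java.class.path");
        return cp != null && cp.contains(jarName);
    }

    // Returns true if the class can be loaded by the current class loader.
    public static boolean classLoadable(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Example names only; substitute the jar/classes from your own application.
        System.out.println("test_kafka.jar on classpath: " + onClasspath("test_kafka.jar"));
        System.out.println("java.lang.String loadable:   " + classLoadable("java.lang.String"));
    }
}
```

Running this in the same JVM that creates the SparkContext shows whether the jar ever made it onto the driver's classpath; note it says nothing about the executor side.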
Best Regards,
Raymond Liu
From: Matei Zaharia [mailto:[email protected]]
Sent: Tuesday, December 24, 2013 1:17 PM
To: [email protected]<mailto:[email protected]>
Subject: Re: Unable to load additional JARs in yarn-client mode
I'm surprised by this, but one way that will definitely work is to assemble
your application into a single JAR. If passing them to the constructor doesn't
work, that's probably a bug.
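For reference, if the projects build with Maven, one common way to produce such a single (uber) JAR is the maven-shade-plugin. A minimal sketch, assuming a standard Maven layout (the version number is only an example):

```xml
<!-- In the application module's pom.xml, under <build><plugins> -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.1</version>
  <executions>
    <execution>
      <!-- Bind the shade goal to the package phase -->
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

With this in place, `mvn package` produces a merged JAR containing the module and its dependencies, which can then be passed to Spark as a single artifact.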
Matei
On Dec 23, 2013, at 12:03 PM, Karavany, Ido
<[email protected]> wrote:
Hi All,
For our application we need to use the yarn-client mode featured in 0.8.1.
(Yarn 2.0.5)
We've successfully executed our java applications in both yarn-client and
yarn-standalone modes.
While in yarn-standalone mode there is a way to add external JARs, we couldn't
find a way to do so in yarn-client mode.
Adding jars in the spark context constructor or setting SPARK_CLASSPATH didn't
work either.
Are we missing something?
Can you please advise?
If it is currently impossible - can you advise a patch / workaround?
It is crucial for us to get it working with external dependencies.
Many Thanks,
Ido
---------------------------------------------------------------------
Intel Electronics Ltd.
This e-mail and any attachments may contain confidential material for
the sole use of the intended recipient(s). Any review or distribution
by others is strictly prohibited. If you are not the intended
recipient, please contact the sender and delete all copies.
---------------------------------------------------------------------