Looks like I'm stuck then, since I am using Mesos. Adding these 2 jars to all executors might be a problem for me, so I will probably try to remove the dependency on the otj-logging lib and just use log4j.
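For anyone following along, here is a minimal sketch (not from the thread, and the paths are hypothetical) of what setting the executor classpath looks like in Scala. It assumes the jars already exist at the same location on every executor host, since spark.executor.extraClassPath only appends to the classpath and does not copy files to the cluster:

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("classpath-example")
      // ":"-separated like a regular Java classpath, NOT comma-separated like --jars.
      // These paths are placeholders; the jars must be present at these exact
      // locations on every executor machine.
      .set("spark.executor.extraClassPath",
           "/opt/libs/first-dependency.jar:/opt/libs/second-dependency.jar")

    val sc = new SparkContext(conf)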
On Tue, Aug 25, 2015 at 2:15 PM, Marcelo Vanzin <van...@cloudera.com> wrote:
> On Tue, Aug 25, 2015 at 1:50 PM, Utkarsh Sengar <utkarsh2...@gmail.com> wrote:
> > So do I need to manually copy these 2 jars on my spark executors?
>
> Yes. I can think of a way to work around that if you're using YARN,
> but not with other cluster managers.
>
> > On Tue, Aug 25, 2015 at 10:51 AM, Marcelo Vanzin <van...@cloudera.com> wrote:
> >>
> >> On Tue, Aug 25, 2015 at 10:48 AM, Utkarsh Sengar <utkarsh2...@gmail.com> wrote:
> >> > Now I am going to try it out on our mesos cluster.
> >> > I assumed "spark.executor.extraClassPath" takes a comma-separated list of jars
> >> > the way "--jars" does, but it should be ":"-separated like a regular classpath.
> >>
> >> Ah, yes, those options are just raw classpath strings. Also, they
> >> don't cause jars to be copied to the cluster. You'll need the jar to
> >> be available at the same location on all cluster machines.
>
> --
> Marcelo

--
Thanks,
-Utkarsh