Are you able to import any class from your jars within spark-shell?
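A minimal way to check is sketched below; com.example.MyClass is only a hypothetical stand-in for whatever class my.jar actually contains:

  spark-shell.cmd --jars my.jar

  scala> import com.example.MyClass   // fails with "not found" if the jar never reached the classpath
  scala> val c = new MyClass          // hypothetical class; substitute one from my.jar

If the import itself fails, the jar is not on the driver classpath at all.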
-----Original Message-----
From: Marcelo Vanzin [mailto:van...@cloudera.com]
Sent: Wednesday, June 11, 2014 9:36 PM
To: user@spark.apache.org
Subject: Re: Adding external jar to spark-shell classpath in spark 1.0

Ah, not that it should matter, but I'm on Linux and you seem to be on Windows... maybe there is something weird going on with the Windows launcher?

On Wed, Jun 11, 2014 at 10:34 AM, Marcelo Vanzin <van...@cloudera.com> wrote:
> Just tried this and it worked fine for me:
>
>   ./bin/spark-shell --jars jar1,jar2,etc,etc
>
> On Wed, Jun 11, 2014 at 10:25 AM, Ulanov, Alexander <alexander.ula...@hp.com> wrote:
>> Hi,
>>
>> I am currently using Spark 1.0 locally on Windows 7. I would like to use classes from an external jar in the spark-shell. I followed the instructions in:
>> http://mail-archives.apache.org/mod_mbox/spark-user/201402.mbox/%3CCALrNVjWWF6k=c_jrhoe9w_qaacjld4+kbduhfv0pitr8h1f...@mail.gmail.com%3E
>>
>> I have set ADD_JARS="my.jar" SPARK_CLASSPATH="my.jar" in spark-shell.cmd, but this didn't work.
>>
>> I also tried running "spark-shell.cmd --jars my.jar --driver-class-path my.jar --driver-library-path my.jar" and it didn't work either.
>>
>> I cannot load any class from my jar into the spark shell. By the way, my.jar contains a simple Scala class.
>>
>> Best regards, Alexander
>
> --
> Marcelo

--
Marcelo
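For what it is worth, a jar containing even a trivial class is enough to reproduce this; a sketch of such a class follows, with purely hypothetical package, class, and file names:

  // src/com/example/Hello.scala (hypothetical contents of my.jar)
  package com.example

  class Hello {
    // returns a greeting; exists only so there is something to call from the shell
    def greet(name: String): String = "Hello, " + name
  }

Assuming it is compiled against the same Scala version as the Spark build (2.10 for Spark 1.0), e.g. with "scalac -d classes src/com/example/Hello.scala", and packaged via "jar cf my.jar -C classes .", it can be exercised in the shell with import com.example.Hello followed by new Hello().greet("Spark"). If that works with ./bin/spark-shell --jars my.jar on Linux but not with spark-shell.cmd on Windows, the problem is in the Windows launcher rather than in the jar.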