What if you run ./bin/spark-shell
--driver-class-path=/path/to/your/jar.jar

I think either this or the --jars flag should work, but it's possible there
is a bug with the --jars flag when invoking the REPL.
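
For example (a sketch; the paths are placeholders, and exact flag syntax may
vary by Spark version):

  # put the jar on the driver's classpath only
  ./bin/spark-shell --driver-class-path /path/to/your/jar.jar

  # or pass it via --jars, which should also ship it to the executors
  ./bin/spark-shell --jars /path/to/your/jar.jar

If the flag is picked up, "import etl.IP2IncomeJob" should then resolve
inside the shell.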


On Mon, Apr 28, 2014 at 4:30 PM, Roger Hoover <roger.hoo...@gmail.com> wrote:

> A couple of issues:
> 1) The jar doesn't show up on the classpath even though SparkSubmit had it
> in the --jars option.  I tested this by running :cp in spark-shell.
> 2) After adding it to the classpath using (:cp
> /Users/rhoover/Work/spark-etl/target/scala-2.10/spark-etl_2.10-1.0.jar), it
> still fails.  When I do the same thing in the plain Scala REPL, it works.
>
> BTW, I'm using the latest code from the master branch
> (8421034e793c0960373a0a1d694ce334ad36e747)
>
>
> On Mon, Apr 28, 2014 at 3:40 PM, Roger Hoover <roger.hoo...@gmail.com> wrote:
>
>> Matei, thank you.  That seemed to work, but I'm not able to import a
>> class from my jar.
>>
>> Using the verbose option, I can see that my jar should be included:
>>
>> Parsed arguments:
>> ...
>>   jars
>>  /Users/rhoover/Work/spark-etl/target/scala-2.10/spark-etl_2.10-1.0.jar
>>
>> And I see the class I want to load in the jar:
>>
>> jar -tf
>> /Users/rhoover/Work/spark-etl/target/scala-2.10/spark-etl_2.10-1.0.jar |
>> grep IP2IncomeJob
>> etl/IP2IncomeJob$$anonfun$1.class
>> etl/IP2IncomeJob$$anonfun$4.class
>> etl/IP2IncomeJob$.class
>> etl/IP2IncomeJob$$anonfun$splitOverlappingRange$1.class
>> etl/IP2IncomeJob.class
>> etl/IP2IncomeJob$$anonfun$3.class
>> etl/IP2IncomeJob$$anonfun$2.class
>>
>> But the import fails:
>>
>> scala> import etl.IP2IncomeJob
>> <console>:10: error: not found: value etl
>>        import etl.IP2IncomeJob
>>
>> Any ideas?
>>
>>
>>
>> On Sun, Apr 27, 2014 at 3:46 PM, Matei Zaharia 
>> <matei.zaha...@gmail.com> wrote:
>>
>>> Hi Roger,
>>>
>>> You should be able to use the --jars argument of spark-shell to add JARs
>>> onto the classpath and then work with those classes in the shell. (A recent
>>> patch, https://github.com/apache/spark/pull/542, made spark-shell use the
>>> same command-line arguments as spark-submit.) But this is a great question;
>>> we should test it out and see whether anything else would make development
>>> easier.
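>>>
>>> For example (a sketch; the jar path and package name are placeholders):
>>>
>>>   ./bin/spark-shell --jars /path/to/your-app.jar
>>>   scala> import yourpackage.YourJob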
>>>
>>> SBT also has an interactive shell where you can run classes in your
>>> project, but unfortunately Spark can't deal with closures typed directly
>>> into that shell the right way. However, if you write your Spark logic in a
>>> method and just call that method from the SBT shell, that should work.
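>>>
>>> Roughly like this (a sketch, not something I've tested; the names are
>>> made up):
>>>
>>>   import org.apache.spark.SparkContext
>>>
>>>   object DevJobs {
>>>     // closures inside a compiled method serialize fine
>>>     def wordCount(sc: SparkContext, path: String): Long =
>>>       sc.textFile(path).flatMap(_.split("\\s+")).count()
>>>   }
>>>
>>> and then from the sbt "console" task:
>>>
>>>   scala> val sc = new SparkContext("local[2]", "dev-shell")
>>>   scala> DevJobs.wordCount(sc, "README.md")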
>>>
>>> Matei
>>>
>>> On Apr 27, 2014, at 3:14 PM, Roger Hoover <roger.hoo...@gmail.com>
>>> wrote:
>>>
>>> > Hi,
>>> >
>>> > From the meetup talk about the 1.0 release, I saw that spark-submit
>>> will be the preferred way to launch apps going forward.
>>> >
>>> > How do you recommend launching such jobs in a development cycle?  For
>>> example, how can I load an app that's expecting to be given to spark-submit
>>> into spark-shell?
>>> >
>>> > Also, can anyone recommend other tricks for rapid development?  I'm
>>> new to Scala, sbt, etc.  I think sbt can watch for changes in source files
>>> and compile them automatically.
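>>> >
>>> > For instance, sbt's "triggered execution" does this; prefixing a task
>>> > with ~ reruns it on every source change:
>>> >
>>> >   sbt ~compile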
>>> >
>>> > I want to be able to make code changes and quickly get into a
>>> spark-shell to play around with them.
>>> >
>>> > I appreciate any advice.  Thanks,
>>> >
>>> > Roger
>>>
>>>
>>
>
