+1 to the question about serialization. The SparkContext still lives in the
driver process (even if you submit jobs to it from several threads).
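
For what it's worth, here is a minimal sketch of what I mean (the names and
the job logic are just placeholders, not your code): several threads share
the one SparkContext in the driver and nothing about the context needs to be
serializable for this to work.

```
import java.util.concurrent.{Executors, TimeUnit}

import org.apache.spark.{SparkConf, SparkContext}

object MultiThreadedDriver {
  def main(args: Array[String]): Unit = {
    // One SparkContext, created once in the driver process.
    // setMaster("local[*]") is only here so the sketch runs standalone.
    val sc = new SparkContext(
      new SparkConf().setAppName("multi-threaded-driver").setMaster("local[*]"))

    val pool = Executors.newFixedThreadPool(4)

    // Each Runnable closes over the same SparkContext reference. The threads
    // all live in the driver JVM, so the context itself is never serialized;
    // only the closures shipped to the executors (here, `_ * factor`) are.
    (1 to 4).foreach { factor =>
      pool.execute(new Runnable {
        override def run(): Unit = {
          val total = sc.parallelize(1 to 1000).map(_ * factor).reduce(_ + _)
          println(s"Job with factor $factor finished, total = $total")
        }
      })
    }

    pool.shutdown()
    pool.awaitTermination(10, TimeUnit.MINUTES)
    sc.stop()
  }
}
```
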
As for the problem, check your classpath, Scala version, Spark version, etc.
Such errors usually happen when there is a conflict on the classpath. Maybe
you compiled your jar against different versions?
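
If it helps, this is the kind of thing to double-check in the build; a
hypothetical build.sbt is below (the exact versions are only illustrative).
The point is that the Scala/Spark pair is consistent and Spark itself is
marked "provided" so you don't bundle a second copy in your jar.

```
// Hypothetical build.sbt (illustrative versions only): one Scala version,
// one Spark version, and Spark marked "provided" so the cluster's own jars
// are used at runtime instead of a second copy baked into the assembly.
scalaVersion := "2.10.5"

libraryDependencies ++= Seq(
  "org.apache.spark"   %% "spark-core"                % "1.3.1" % "provided",
  "com.datastax.spark" %% "spark-cassandra-connector" % "1.3.0"
)
```
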

On 5 June 2015 at 21:55, Lee McFadden <splee...@gmail.com> wrote:

> You can see an example of the constructor for the class which executes a
> job in my opening post.
>
> I'm attempting to instantiate and run the class using the code below:
>
> ```
>     import java.util.concurrent.Executors
>
>     import com.datastax.spark.connector.cql.CassandraConnector
>     import org.apache.spark.{SparkConf, SparkContext}
>
>     val conf = new SparkConf()
>       .setAppName(appNameBase.format("Test"))
>
>     val connector = CassandraConnector(conf)
>
>     val sc = new SparkContext(conf)
>
>     // Set up the threadpool for running Jobs.
>     val pool = Executors.newFixedThreadPool(10)
>
>     pool.execute(new SecondRollup(sc, connector, start))
> ```
>
> There is some surrounding code that then waits for all the jobs submitted
> to the thread pool to complete, although it's not really needed at the
> moment as I am only submitting one job until I get this issue straightened
> out :)
>
> Thanks,
>
> Lee
>
> On Fri, Jun 5, 2015 at 11:50 AM Marcelo Vanzin <van...@cloudera.com>
> wrote:
>
>> On Fri, Jun 5, 2015 at 11:48 AM, Lee McFadden <splee...@gmail.com> wrote:
>>
>>> Initially I had issues passing the SparkContext to other threads as it
>>> is not serializable.  Eventually I found that adding the @transient
>>> annotation prevents a NotSerializableException.
>>>
>>
>> This is really puzzling. How are you passing the context around such that
>> you need to serialize it?
>>
>> Threads run all in the same process so serialization should not be needed
>> at all.
>>
>> --
>> Marcelo
>>
>
