Yes, you can call sc.addJar multiple times, once for each dependent jar.
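For example, a minimal sketch in Scala (the jar paths and app name here are hypothetical placeholders, not from the original thread):

```scala
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setAppName("MyApp")
val sc = new SparkContext(conf)

// Call addJar once per dependent jar; each jar is distributed
// to the executors for tasks run on this SparkContext.
// Replace these placeholder paths with your actual jars.
Seq("/path/to/dep1.jar", "/path/to/dep2.jar", "/path/to/dep3.jar")
  .foreach(sc.addJar)
```

Alternatively, when launching with spark-submit, passing a comma-separated list of jars via the --jars option achieves the same effect.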

Thanks
Best Regards

On Mon, Jun 22, 2015 at 8:33 PM, Murthy Chelankuri <kmurt...@gmail.com>
wrote:

> I have more than one jar. can we set sc.addJar multiple times with each
> dependent jar ?
>
> On Mon, Jun 22, 2015 at 8:30 PM, Akhil Das <ak...@sigmoidanalytics.com>
> wrote:
>
>> Try sc.addJar instead of setJars
>>
>> Thanks
>> Best Regards
>>
>> On Mon, Jun 22, 2015 at 8:24 PM, Murthy Chelankuri <kmurt...@gmail.com>
>> wrote:
>>
>>> I have been using Spark for the last 6 months, on version 1.2.0.
>>>
>>> I am trying to migrate to 1.3.0, but the same code I wrote is no longer
>>> working.
>>>
>>> It gives a class-not-found error when I try to load some dependent jars
>>> from the main program.
>>>
>>> This used to work in 1.2.0 when I set the array of all dependent jars on
>>> the Spark context, but it is not working in 1.3.0.
>>>
>>>
>>> Please help me resolve this.
>>>
>>>
>>> Thanks,
>>> Murthy Chelankuri
>>>
>>
>>
>
