So the assembly will be required if the master and the worker sit on
different machines?
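
To summarize the thread, a minimal sketch of the two sbt invocations being compared, assuming the commands are run from the top of a Spark source checkout (the directory name is an assumption):

```shell
# From the Spark source root:
cd spark

# Fast incremental build: recompiles only changed files and produces
# the per-module jars, but not the single assembly jar.
sbt/sbt package

# Full build: also produces the assembly jar, which is needed when
# Spark runs on a cluster (e.g. master and workers on different machines).
sbt/sbt assembly
```

Both targets compile incrementally; the extra time in `assembly` is mostly spent repackaging everything into the one large jar.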


On Fri, Oct 25, 2013 at 4:17 AM, Sergey Soldatov
<[email protected]> wrote:

> In this case package would be enough. If you run a cluster, the assembly
> may be required.
>
>
> On Fri, Oct 25, 2013 at 3:08 PM, Umar Javed <[email protected]> wrote:
>
>> I'm sorry, I didn't quite understand that. I'm using and running Spark on
>> my local machine. What did you mean by infrastructure?
>>
>>
>> On Fri, Oct 25, 2013 at 4:01 AM, Sergey Soldatov <
>> [email protected]> wrote:
>>
>>> For building Spark only, you may use sbt/sbt package. It's much faster.
>>> But if you use Spark in some infrastructure that requires the assembly,
>>> the only way is to use the assembly target.
>>>
>>>
>>> On Fri, Oct 25, 2013 at 2:34 PM, Umar Javed <[email protected]> wrote:
>>>
>>>> Thanks. This takes a lot of time, though. It takes me 10 minutes for the
>>>> build to finish after changing a single line of code. Is there a quicker
>>>> way?
>>>>
>>>>
>>>> On Fri, Oct 25, 2013 at 2:35 AM, Sergey Soldatov <
>>>> [email protected]> wrote:
>>>>
>>>>> Hi Umar,
>>>>> Exactly. You need to use sbt/sbt assembly.
>>>>> It will compile only the changed files and build the assembly.
>>>>>
>>>>> Thanks,
>>>>> Sergey
>>>>>
>>>>>
>>>>> On Fri, Oct 25, 2013 at 1:29 PM, Umar Javed <[email protected]> wrote:
>>>>>
>>>>>> Hi,
>>>>>>
>>>>>> After making a change to a .scala file in the Spark source code, how
>>>>>> do I build it using sbt? Do I have to run 'sbt/sbt assembly' again?
>>>>>>
>>>>>> cheers,
>>>>>> Umar
>>>>>>
>>>>>
>>>>>
>>>>
>>>
>>
>
