On Sun, May 3, 2015 at 3:40 PM, Mark Hamstra <m...@clearstorydata.com> wrote:

> On Sun, May 3, 2015 at 2:54 PM, Pramod Biligiri <pramodbilig...@gmail.com>
> wrote:
>> This is great. I didn't know about the mvn script in the build
>> directory.

Brennon York <brennon.y...@capitalone.com> wrote:

>>> Following what Ted said, if you leverage the `mvn` from within the
>>> `build/` directory of Spark you'll get zinc for free.
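Brennon's tip in practice, as a minimal sketch: in a Spark checkout of that era, the bundled `build/mvn` script bootstraps its own Maven, Scala compiler, and Zinc server, so no separate Zinc install is needed.

```shell
# From the root of a Spark source checkout: build/mvn downloads its own
# Maven, Scala, and Zinc, so repeated builds reuse a warm incremental
# compiler instead of cold-starting scalac each time.
./build/mvn -DskipTests package
# Later invocations talk to the already-running Zinc server and skip
# recompiling unchanged sources.
```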
-----Original Message-----
From: Pramod Biligiri [mailto:pramodbilig...@gmail.com]
Sent: Friday, May 01, 2015 1:46 AM
To: dev@spark.apache.org
Subject: Speeding up Spark build during development

Hi,
I'm making some small changes to the Spark codebase and trying it out on a
cluster. I was wondering if there's a faster way to build than running the
package target each time.
Currently I'm using: mvn -DskipTests package
All the nodes have the same filesystem mounted at the same mount point.
On 5/1/15, 9:45 AM, "Ted Yu" wrote:

> Pramod:
> Please remember to run Zinc so that the build is faster.
>
> Cheers
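Ted's Zinc tip, sketched for a standalone Zinc install (the port and the shutdown step are illustrative details, not from the thread; Spark's Maven build looks for a Zinc server on port 3030 by default):

```shell
# Start a standalone Zinc server once; Spark's Maven build detects it
# on the default port and uses it for incremental Scala compilation.
zinc -start -port 3030

# Then build as usual; compiles are incremental across invocations.
mvn -DskipTests package

# Stop the server when you are done for the day.
zinc -shutdown
```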
Alexander wrote:

> Hi Pramod,
>
> For cluster-like tests you might want to use the same code as in
> mllib's LocalClusterSparkContext. You can rebuild only the package
> that you change and then run this main class.
>
> Best regards, Alexander
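Alexander's "rebuild only the package that you change" maps onto Maven's `-pl` (project list) flag; a sketch, with `mllib` standing in for whichever module you actually edited:

```shell
# Rebuild and locally install just the module you touched (here mllib),
# skipping tests for speed; -pl limits the reactor to that one project.
./build/mvn -pl mllib -DskipTests install

# Then run your main class (e.g. a test driver built around
# LocalClusterSparkContext) against the freshly installed artifacts.
```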
Hi Pramod,

If you are using sbt as your build, then you need to do sbt assembly once
and then use sbt ~compile. Also export SPARK_PREPEND_CLASSES=1 in your
shell and on all nodes.

Maybe you can try this out?

Thanks,
Prashant Sharma
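Prashant's sbt workflow as a sketch (assuming the `build/sbt` wrapper present in Spark checkouts of that time; older trees use `sbt/sbt`):

```shell
# One-time: build the full assembly so a complete deployable jar exists.
./build/sbt assembly

# Tell Spark's launch scripts to prepend freshly compiled classes to the
# classpath, so the assembly need not be rebuilt after every change.
# Set this in your shell and on all cluster nodes.
export SPARK_PREPEND_CLASSES=1

# Leave sbt watching your sources and recompiling on every save.
./build/sbt ~compile
```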
On Fri, May 1, 2015 at 2:16 PM, Pramod Biligiri wrote:

> Hi,
> I'm making some small changes to the Spark codebase and trying it out on a
> cluster. I was wondering if there's a faster way to build than running the
> package target each time.
> Currently I'm using: mvn -DskipTests package
> All the nodes have the same filesystem mounted at the same mount point.