Re: How to incrementally compile spark examples using mvn

2014-12-05 Thread Ted Yu
I tried the following:

  rm -rf ~/.m2/repository/org/apache/spark/spark-core_2.10/1.3.0-SNAPSHOT/
  mvn -am -pl streaming package -DskipTests

[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM .. SUCCESS [4.976s]
[INFO] Spark Project Networking .. SUCCESS [1.279s]
[INFO] Spark Project Shuffle Streaming Service ... SUCCESS [0.499s]
[INFO] Spark Project Core  SUCCESS [1:03.302s]
[INFO] Spark Project Streaming ... SUCCESS [26.777s]
[INFO]

[INFO] BUILD SUCCESS

Cheers

On Fri, Dec 5, 2014 at 4:53 PM, Marcelo Vanzin  wrote:

> I've never used it, but reading the help it seems the "-am" option
> might help here.
>
> On Fri, Dec 5, 2014 at 4:47 PM, Sean Owen  wrote:
> > Maven definitely compiles "what is needed", but not if you tell it to
> > only compile one module alone. Unless you have previously built and
> > installed the other local snapshot artifacts it needs, that invocation
> > can't proceed because you have restricted it to build one module whose
> > dependencies don't exist.
> >
> > On Fri, Dec 5, 2014 at 6:44 PM, Koert Kuipers  wrote:
> >> i think what changed is that core now has dependencies on other sub
> >> projects. ok... so i am forced to install stuff because maven cannot
> compile
> >> "what is needed". i will install
> >>
> >> On Fri, Dec 5, 2014 at 7:12 PM, Koert Kuipers 
> wrote:
> >>>
> >>> i suddenly also run into the issue that maven is trying to download
> >>> snapshots that dont exists for other sub projects.
> >>>
> >>> did something change in the maven build?
> >>>
> >>> does maven not have capability to smartly compile the other
> sub-projects
> >>> that a sub-project depends on?
> >>>
> >>> i rather avoid "mvn install" since this creates a local maven repo. i
> have
> >>> been stung by that before (spend a day trying to do something and got
> weird
> >>> errors because some toy version i once build was stuck in my local
> maven
> >>> repo and it somehow got priority over a real maven repo).
> >>>
> >>> On Fri, Dec 5, 2014 at 5:28 PM, Marcelo Vanzin 
> >>> wrote:
> >>>>
> >>>> You can set SPARK_PREPEND_CLASSES=1 and it should pick your new mllib
> >>>> classes whenever you compile them.
> >>>>
> >>>> I don't see anything similar for examples/, so if you modify example
> >>>> code you need to re-build the examples module ("package" or "install"
> >>>> - just "compile" won't work, since you need to build the new jar).
> >>>>
> >>>> On Thu, Dec 4, 2014 at 10:23 PM, MEETHU MATHEW <
> meethu2...@yahoo.co.in>
> >>>> wrote:
> >>>> > Hi all,
> >>>> >
> >>>> > I made some code changes  in mllib project and as mentioned in the
> >>>> > previous
> >>>> > mails I did
> >>>> >
> >>>> > mvn install -pl mllib
> >>>> >
> >>>> > Now  I run a program in examples using run-example, the new code is
> not
> >>>> > executing.Instead the previous code itself is running.
> >>>> >
> >>>> > But if I do an  "mvn install" in the entire spark project , I can
> see
> >>>> > the
> >>>> > new code running.But installing the entire spark takes a lot of time
> >>>> > and so
> >>>> > its difficult to do this each time  I make some changes.
> >>>> >
> >>>> > Can someone tell me how to compile mllib alone and get the changes
> >>>> > working?
> >>>> >
> >>>> > Thanks & Regards,
> >>>> > Meethu M
> >>>> >
> >>>> >
> >>>> > On Friday, 28 November 2014 2:39 PM, MEETHU MATHEW
> >>>> > 
> >>>> > wrote:
> >>>> >
> >>>> >
> >>>> > Hi,
> >>>> > I have a similar problem.I modified the code in mllib and examples.
> >>>> > I did
> >>>> > mvn install -pl mllib
> >>>> > mvn install -pl examples
> >>>> >
> >>>> > B

Re: How to incrementally compile spark examples using mvn

2014-12-05 Thread Marcelo Vanzin
I've never used it, but reading the help it seems the "-am" option
might help here.
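
For reference, -am is Maven's --also-make switch: combined with -pl it builds
the selected module plus the in-tree modules it depends on in a single reactor
run, so nothing has to be installed or downloaded first. A hedged sketch for
the examples sub-project:

  mvn -am -pl :spark-examples_2.10 package -DskipTests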

On Fri, Dec 5, 2014 at 4:47 PM, Sean Owen  wrote:
> Maven definitely compiles "what is needed", but not if you tell it to
> only compile one module alone. Unless you have previously built and
> installed the other local snapshot artifacts it needs, that invocation
> can't proceed because you have restricted it to build one module whose
> dependencies don't exist.
>
> On Fri, Dec 5, 2014 at 6:44 PM, Koert Kuipers  wrote:
>> i think what changed is that core now has dependencies on other sub
>> projects. ok... so i am forced to install stuff because maven cannot compile
>> "what is needed". i will install
>>
>> On Fri, Dec 5, 2014 at 7:12 PM, Koert Kuipers  wrote:
>>>
>>> i suddenly also run into the issue that maven is trying to download
>>> snapshots that dont exists for other sub projects.
>>>
>>> did something change in the maven build?
>>>
>>> does maven not have capability to smartly compile the other sub-projects
>>> that a sub-project depends on?
>>>
>>> i rather avoid "mvn install" since this creates a local maven repo. i have
>>> been stung by that before (spend a day trying to do something and got weird
>>> errors because some toy version i once build was stuck in my local maven
>>> repo and it somehow got priority over a real maven repo).
>>>
>>> On Fri, Dec 5, 2014 at 5:28 PM, Marcelo Vanzin 
>>> wrote:
>>>>
>>>> You can set SPARK_PREPEND_CLASSES=1 and it should pick your new mllib
>>>> classes whenever you compile them.
>>>>
>>>> I don't see anything similar for examples/, so if you modify example
>>>> code you need to re-build the examples module ("package" or "install"
>>>> - just "compile" won't work, since you need to build the new jar).
>>>>
>>>> On Thu, Dec 4, 2014 at 10:23 PM, MEETHU MATHEW 
>>>> wrote:
>>>> > Hi all,
>>>> >
>>>> > I made some code changes  in mllib project and as mentioned in the
>>>> > previous
>>>> > mails I did
>>>> >
>>>> > mvn install -pl mllib
>>>> >
>>>> > Now  I run a program in examples using run-example, the new code is not
>>>> > executing.Instead the previous code itself is running.
>>>> >
>>>> > But if I do an  "mvn install" in the entire spark project , I can see
>>>> > the
>>>> > new code running.But installing the entire spark takes a lot of time
>>>> > and so
>>>> > its difficult to do this each time  I make some changes.
>>>> >
>>>> > Can someone tell me how to compile mllib alone and get the changes
>>>> > working?
>>>> >
>>>> > Thanks & Regards,
>>>> > Meethu M
>>>> >
>>>> >
>>>> > On Friday, 28 November 2014 2:39 PM, MEETHU MATHEW
>>>> > 
>>>> > wrote:
>>>> >
>>>> >
>>>> > Hi,
>>>> > I have a similar problem.I modified the code in mllib and examples.
>>>> > I did
>>>> > mvn install -pl mllib
>>>> > mvn install -pl examples
>>>> >
>>>> > But when I run the program in examples using run-example,the older
>>>> > version
>>>> > of  mllib (before the changes were made) is getting executed.
>>>> > How to get the changes made in mllib while  calling it from examples
>>>> > project?
>>>> >
>>>> > Thanks & Regards,
>>>> > Meethu M
>>>> >
>>>> >
>>>> > On Monday, 24 November 2014 3:33 PM, Yiming (John) Zhang
>>>> > 
>>>> > wrote:
>>>> >
>>>> >
>>>> > Thank you, Marcelo and Sean, "mvn install" is a good answer for my
>>>> > demands.
>>>> >
> >>>> > -----Original Message-----
> >>>> > From: Marcelo Vanzin [mailto:van...@cloudera.com]
> >>>> > Sent: 21 November 2014 1:47
> >>>> > To: yiming zhang
> >>>> > Cc: Sean Owen; user@spark.apache.org
> >>>> > Subject: Re: How to incrementally compile spark examples using mvn
>>>> >
>>>> > Hi Yiming,
>>>> >
>>>> >

Re: How to incrementally compile spark examples using mvn

2014-12-05 Thread Sean Owen
Maven definitely compiles "what is needed", but not if you tell it to
only compile one module alone. Unless you have previously built and
installed the other local snapshot artifacts it needs, that invocation
can't proceed because you have restricted it to build one module whose
dependencies don't exist.
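
Concretely, that leaves two ways forward: install the sibling snapshot
artifacts once, or build them in the same run with -am. A hedged sketch of the
install-once flow:

  mvn -DskipTests install
  mvn -pl :spark-examples_2.10 compile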

On Fri, Dec 5, 2014 at 6:44 PM, Koert Kuipers  wrote:
> i think what changed is that core now has dependencies on other sub
> projects. ok... so i am forced to install stuff because maven cannot compile
> "what is needed". i will install
>
> On Fri, Dec 5, 2014 at 7:12 PM, Koert Kuipers  wrote:
>>
>> i suddenly also run into the issue that maven is trying to download
>> snapshots that dont exists for other sub projects.
>>
>> did something change in the maven build?
>>
>> does maven not have capability to smartly compile the other sub-projects
>> that a sub-project depends on?
>>
>> i rather avoid "mvn install" since this creates a local maven repo. i have
>> been stung by that before (spend a day trying to do something and got weird
>> errors because some toy version i once build was stuck in my local maven
>> repo and it somehow got priority over a real maven repo).
>>
>> On Fri, Dec 5, 2014 at 5:28 PM, Marcelo Vanzin 
>> wrote:
>>>
>>> You can set SPARK_PREPEND_CLASSES=1 and it should pick your new mllib
>>> classes whenever you compile them.
>>>
>>> I don't see anything similar for examples/, so if you modify example
>>> code you need to re-build the examples module ("package" or "install"
>>> - just "compile" won't work, since you need to build the new jar).
>>>
>>> On Thu, Dec 4, 2014 at 10:23 PM, MEETHU MATHEW 
>>> wrote:
>>> > Hi all,
>>> >
>>> > I made some code changes  in mllib project and as mentioned in the
>>> > previous
>>> > mails I did
>>> >
>>> > mvn install -pl mllib
>>> >
>>> > Now  I run a program in examples using run-example, the new code is not
>>> > executing.Instead the previous code itself is running.
>>> >
>>> > But if I do an  "mvn install" in the entire spark project , I can see
>>> > the
>>> > new code running.But installing the entire spark takes a lot of time
>>> > and so
>>> > its difficult to do this each time  I make some changes.
>>> >
>>> > Can someone tell me how to compile mllib alone and get the changes
>>> > working?
>>> >
>>> > Thanks & Regards,
>>> > Meethu M
>>> >
>>> >
>>> > On Friday, 28 November 2014 2:39 PM, MEETHU MATHEW
>>> > 
>>> > wrote:
>>> >
>>> >
>>> > Hi,
>>> > I have a similar problem.I modified the code in mllib and examples.
>>> > I did
>>> > mvn install -pl mllib
>>> > mvn install -pl examples
>>> >
>>> > But when I run the program in examples using run-example,the older
>>> > version
>>> > of  mllib (before the changes were made) is getting executed.
>>> > How to get the changes made in mllib while  calling it from examples
>>> > project?
>>> >
>>> > Thanks & Regards,
>>> > Meethu M
>>> >
>>> >
>>> > On Monday, 24 November 2014 3:33 PM, Yiming (John) Zhang
>>> > 
>>> > wrote:
>>> >
>>> >
>>> > Thank you, Marcelo and Sean, "mvn install" is a good answer for my
>>> > demands.
>>> >
> >>> > -----Original Message-----
> >>> > From: Marcelo Vanzin [mailto:van...@cloudera.com]
> >>> > Sent: 21 November 2014 1:47
> >>> > To: yiming zhang
> >>> > Cc: Sean Owen; user@spark.apache.org
> >>> > Subject: Re: How to incrementally compile spark examples using mvn
>>> >
>>> > Hi Yiming,
>>> >
>>> > On Wed, Nov 19, 2014 at 5:35 PM, Yiming (John) Zhang 
>>> > wrote:
>>> >> Thank you for your reply. I was wondering whether there is a method of
>>> >> reusing locally-built components without installing them? That is, if
>>> >> I have
>>> >> successfully built the spark project as a whole, how should I
>>> >> configure it
>>> >> so that I can incrementally build (only) the "spark-examples" sub
>>> >> project
>>> >> without the need of downloading or installation?
>>> >
>>> > As Sean suggest, you shouldn't need to install anything. After "mvn
>>> > install", your local repo is a working Spark installation, and you can
>>> > use
>>> > spark-submit and other tool directly within it.
>>> >
>>> > You just need to remember to rebuild the assembly/ project when
>>> > modifying
>>> > Spark code (or the examples/ project when modifying examples).
>>> >
>>> >
>>> > --
>>> > Marcelo
>>> >
>>> >
>>> > -
>>> > To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>>> > For additional commands, e-mail: user-h...@spark.apache.org
>>> >
>>> >
>>> >
>>> >
>>>
>>>
>>>
>>> --
>>> Marcelo
>>>
>>> -
>>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>>> For additional commands, e-mail: user-h...@spark.apache.org
>>>
>>
>

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: How to incrementally compile spark examples using mvn

2014-12-05 Thread Koert Kuipers
I think what changed is that core now has dependencies on other sub-projects.
OK... so I am forced to install stuff because Maven cannot compile "what is
needed". I will install.

On Fri, Dec 5, 2014 at 7:12 PM, Koert Kuipers  wrote:

> i suddenly also run into the issue that maven is trying to download
> snapshots that dont exists for other sub projects.
>
> did something change in the maven build?
>
> does maven not have capability to smartly compile the other sub-projects
> that a sub-project depends on?
>
> i rather avoid "mvn install" since this creates a local maven repo. i have
> been stung by that before (spend a day trying to do something and got weird
> errors because some toy version i once build was stuck in my local maven
> repo and it somehow got priority over a real maven repo).
>
> On Fri, Dec 5, 2014 at 5:28 PM, Marcelo Vanzin 
> wrote:
>
>> You can set SPARK_PREPEND_CLASSES=1 and it should pick your new mllib
>> classes whenever you compile them.
>>
>> I don't see anything similar for examples/, so if you modify example
>> code you need to re-build the examples module ("package" or "install"
>> - just "compile" won't work, since you need to build the new jar).
>>
>> On Thu, Dec 4, 2014 at 10:23 PM, MEETHU MATHEW 
>> wrote:
>> > Hi all,
>> >
>> > I made some code changes  in mllib project and as mentioned in the
>> previous
>> > mails I did
>> >
>> > mvn install -pl mllib
>> >
>> > Now  I run a program in examples using run-example, the new code is not
>> > executing.Instead the previous code itself is running.
>> >
>> > But if I do an  "mvn install" in the entire spark project , I can see
>> the
>> > new code running.But installing the entire spark takes a lot of time
>> and so
>> > its difficult to do this each time  I make some changes.
>> >
>> > Can someone tell me how to compile mllib alone and get the changes
>> working?
>> >
>> > Thanks & Regards,
>> > Meethu M
>> >
>> >
>> > On Friday, 28 November 2014 2:39 PM, MEETHU MATHEW <
>> meethu2...@yahoo.co.in>
>> > wrote:
>> >
>> >
>> > Hi,
>> > I have a similar problem.I modified the code in mllib and examples.
>> > I did
>> > mvn install -pl mllib
>> > mvn install -pl examples
>> >
>> > But when I run the program in examples using run-example,the older
>> version
>> > of  mllib (before the changes were made) is getting executed.
>> > How to get the changes made in mllib while  calling it from examples
>> > project?
>> >
>> > Thanks & Regards,
>> > Meethu M
>> >
>> >
>> > On Monday, 24 November 2014 3:33 PM, Yiming (John) Zhang <
>> sdi...@gmail.com>
>> > wrote:
>> >
>> >
>> > Thank you, Marcelo and Sean, "mvn install" is a good answer for my
>> demands.
>> >
> >> > -----Original Message-----
> >> > From: Marcelo Vanzin [mailto:van...@cloudera.com]
> >> > Sent: 21 November 2014 1:47
> >> > To: yiming zhang
> >> > Cc: Sean Owen; user@spark.apache.org
> >> > Subject: Re: How to incrementally compile spark examples using mvn
>> >
>> > Hi Yiming,
>> >
>> > On Wed, Nov 19, 2014 at 5:35 PM, Yiming (John) Zhang 
>> > wrote:
>> >> Thank you for your reply. I was wondering whether there is a method of
>> >> reusing locally-built components without installing them? That is, if
>> I have
>> >> successfully built the spark project as a whole, how should I
>> configure it
>> >> so that I can incrementally build (only) the "spark-examples" sub
>> project
>> >> without the need of downloading or installation?
>> >
>> > As Sean suggest, you shouldn't need to install anything. After "mvn
>> > install", your local repo is a working Spark installation, and you can
>> use
>> > spark-submit and other tool directly within it.
>> >
>> > You just need to remember to rebuild the assembly/ project when
>> modifying
>> > Spark code (or the examples/ project when modifying examples).
>> >
>> >
>> > --
>> > Marcelo
>> >
>> >
>> > -
>> > To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> > For additional commands, e-mail: user-h...@spark.apache.org
>> >
>> >
>> >
>> >
>>
>>
>>
>> --
>> Marcelo
>>
>> -
>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> For additional commands, e-mail: user-h...@spark.apache.org
>>
>>
>


Re: How to incrementally compile spark examples using mvn

2014-12-05 Thread Koert Kuipers
I suddenly also ran into the issue that Maven is trying to download snapshots
that don't exist for other sub-projects.

Did something change in the Maven build?

Does Maven not have the capability to smartly compile the other sub-projects
that a sub-project depends on?

I'd rather avoid "mvn install" since this creates a local Maven repo. I have
been stung by that before (spent a day trying to do something and got weird
errors because some toy version I once built was stuck in my local Maven repo
and it somehow got priority over a real Maven repo).
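
If a stale local snapshot ever does shadow the real artifact, the recovery is
the one shown earlier in this archive: delete it from ~/.m2, e.g. (the path is
just an example, adjust to the artifact in question):

  rm -rf ~/.m2/repository/org/apache/spark/spark-core_2.10/1.3.0-SNAPSHOT/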

On Fri, Dec 5, 2014 at 5:28 PM, Marcelo Vanzin  wrote:

> You can set SPARK_PREPEND_CLASSES=1 and it should pick your new mllib
> classes whenever you compile them.
>
> I don't see anything similar for examples/, so if you modify example
> code you need to re-build the examples module ("package" or "install"
> - just "compile" won't work, since you need to build the new jar).
>
> On Thu, Dec 4, 2014 at 10:23 PM, MEETHU MATHEW 
> wrote:
> > Hi all,
> >
> > I made some code changes  in mllib project and as mentioned in the
> previous
> > mails I did
> >
> > mvn install -pl mllib
> >
> > Now  I run a program in examples using run-example, the new code is not
> > executing.Instead the previous code itself is running.
> >
> > But if I do an  "mvn install" in the entire spark project , I can see the
> > new code running.But installing the entire spark takes a lot of time and
> so
> > its difficult to do this each time  I make some changes.
> >
> > Can someone tell me how to compile mllib alone and get the changes
> working?
> >
> > Thanks & Regards,
> > Meethu M
> >
> >
> > On Friday, 28 November 2014 2:39 PM, MEETHU MATHEW <
> meethu2...@yahoo.co.in>
> > wrote:
> >
> >
> > Hi,
> > I have a similar problem.I modified the code in mllib and examples.
> > I did
> > mvn install -pl mllib
> > mvn install -pl examples
> >
> > But when I run the program in examples using run-example,the older
> version
> > of  mllib (before the changes were made) is getting executed.
> > How to get the changes made in mllib while  calling it from examples
> > project?
> >
> > Thanks & Regards,
> > Meethu M
> >
> >
> > On Monday, 24 November 2014 3:33 PM, Yiming (John) Zhang <
> sdi...@gmail.com>
> > wrote:
> >
> >
> > Thank you, Marcelo and Sean, "mvn install" is a good answer for my
> demands.
> >
> > -----Original Message-----
> > From: Marcelo Vanzin [mailto:van...@cloudera.com]
> > Sent: 21 November 2014 1:47
> > To: yiming zhang
> > Cc: Sean Owen; user@spark.apache.org
> > Subject: Re: How to incrementally compile spark examples using mvn
> >
> > Hi Yiming,
> >
> > On Wed, Nov 19, 2014 at 5:35 PM, Yiming (John) Zhang 
> > wrote:
> >> Thank you for your reply. I was wondering whether there is a method of
> >> reusing locally-built components without installing them? That is, if I
> have
> >> successfully built the spark project as a whole, how should I configure
> it
> >> so that I can incrementally build (only) the "spark-examples" sub
> project
> >> without the need of downloading or installation?
> >
> > As Sean suggest, you shouldn't need to install anything. After "mvn
> > install", your local repo is a working Spark installation, and you can
> use
> > spark-submit and other tool directly within it.
> >
> > You just need to remember to rebuild the assembly/ project when modifying
> > Spark code (or the examples/ project when modifying examples).
> >
> >
> > --
> > Marcelo
> >
> >
> > -
> > To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> > For additional commands, e-mail: user-h...@spark.apache.org
> >
> >
> >
> >
>
>
>
> --
> Marcelo
>
> -
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>


Re: How to incrementally compile spark examples using mvn

2014-12-05 Thread Marcelo Vanzin
You can set SPARK_PREPEND_CLASSES=1 and it should pick your new mllib
classes whenever you compile them.

I don't see anything similar for examples/, so if you modify example
code you need to re-build the examples module ("package" or "install"
- just "compile" won't work, since you need to build the new jar).

On Thu, Dec 4, 2014 at 10:23 PM, MEETHU MATHEW  wrote:
> Hi all,
>
> I made some code changes  in mllib project and as mentioned in the previous
> mails I did
>
> mvn install -pl mllib
>
> Now  I run a program in examples using run-example, the new code is not
> executing.Instead the previous code itself is running.
>
> But if I do an  "mvn install" in the entire spark project , I can see the
> new code running.But installing the entire spark takes a lot of time and so
> its difficult to do this each time  I make some changes.
>
> Can someone tell me how to compile mllib alone and get the changes working?
>
> Thanks & Regards,
> Meethu M
>
>
> On Friday, 28 November 2014 2:39 PM, MEETHU MATHEW 
> wrote:
>
>
> Hi,
> I have a similar problem.I modified the code in mllib and examples.
> I did
> mvn install -pl mllib
> mvn install -pl examples
>
> But when I run the program in examples using run-example,the older version
> of  mllib (before the changes were made) is getting executed.
> How to get the changes made in mllib while  calling it from examples
> project?
>
> Thanks & Regards,
> Meethu M
>
>
> On Monday, 24 November 2014 3:33 PM, Yiming (John) Zhang 
> wrote:
>
>
> Thank you, Marcelo and Sean, "mvn install" is a good answer for my demands.
>
> -----Original Message-----
> From: Marcelo Vanzin [mailto:van...@cloudera.com]
> Sent: 21 November 2014 1:47
> To: yiming zhang
> Cc: Sean Owen; user@spark.apache.org
> Subject: Re: How to incrementally compile spark examples using mvn
>
> Hi Yiming,
>
> On Wed, Nov 19, 2014 at 5:35 PM, Yiming (John) Zhang 
> wrote:
>> Thank you for your reply. I was wondering whether there is a method of
>> reusing locally-built components without installing them? That is, if I have
>> successfully built the spark project as a whole, how should I configure it
>> so that I can incrementally build (only) the "spark-examples" sub project
>> without the need of downloading or installation?
>
> As Sean suggest, you shouldn't need to install anything. After "mvn
> install", your local repo is a working Spark installation, and you can use
> spark-submit and other tool directly within it.
>
> You just need to remember to rebuild the assembly/ project when modifying
> Spark code (or the examples/ project when modifying examples).
>
>
> --
> Marcelo
>
>
> -
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>
>
>



-- 
Marcelo

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: How to incrementally compile spark examples using mvn

2014-12-04 Thread MEETHU MATHEW
Hi all,
I made some code changes in the mllib project and, as mentioned in the previous
mails, I did

mvn install -pl mllib

Now when I run a program in examples using run-example, the new code is not
executing. Instead the previous code itself is running.
But if I do an "mvn install" on the entire Spark project, I can see the new
code running. But installing all of Spark takes a lot of time, so it is
difficult to do this each time I make some changes.
Can someone tell me how to compile mllib alone and get the changes working?
Thanks & Regards,
Meethu M

 On Friday, 28 November 2014 2:39 PM, MEETHU MATHEW 
 wrote:
   

Hi,
I have a similar problem. I modified the code in mllib and examples. I did

mvn install -pl mllib
mvn install -pl examples

But when I run the program in examples using run-example, the older version of
mllib (before the changes were made) is getting executed. How do I get the
changes made in mllib when calling it from the examples project?
Thanks & Regards,
Meethu M

 On Monday, 24 November 2014 3:33 PM, Yiming (John) Zhang 
 wrote:
   

 Thank you, Marcelo and Sean, "mvn install" is a good answer for my demands. 

-----Original Message-----
From: Marcelo Vanzin [mailto:van...@cloudera.com]
Sent: 21 November 2014 1:47
To: yiming zhang
Cc: Sean Owen; user@spark.apache.org
Subject: Re: How to incrementally compile spark examples using mvn

Hi Yiming,

On Wed, Nov 19, 2014 at 5:35 PM, Yiming (John) Zhang  wrote:
> Thank you for your reply. I was wondering whether there is a method of 
> reusing locally-built components without installing them? That is, if I have 
> successfully built the spark project as a whole, how should I configure it so 
> that I can incrementally build (only) the "spark-examples" sub project 
> without the need of downloading or installation?

As Sean suggest, you shouldn't need to install anything. After "mvn install", 
your local repo is a working Spark installation, and you can use spark-submit 
and other tool directly within it.

You just need to remember to rebuild the assembly/ project when modifying Spark 
code (or the examples/ project when modifying examples).


--
Marcelo


-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org




   

Re: How to incrementally compile spark examples using mvn

2014-11-28 Thread MEETHU MATHEW
Hi,
I have a similar problem. I modified the code in mllib and examples. I did

mvn install -pl mllib
mvn install -pl examples

But when I run the program in examples using run-example, the older version of
mllib (before the changes were made) is getting executed. How do I get the
changes made in mllib when calling it from the examples project?
Thanks & Regards,
Meethu M

 On Monday, 24 November 2014 3:33 PM, Yiming (John) Zhang 
 wrote:
   

 Thank you, Marcelo and Sean, "mvn install" is a good answer for my demands. 

-----Original Message-----
From: Marcelo Vanzin [mailto:van...@cloudera.com]
Sent: 21 November 2014 1:47
To: yiming zhang
Cc: Sean Owen; user@spark.apache.org
Subject: Re: How to incrementally compile spark examples using mvn

Hi Yiming,

On Wed, Nov 19, 2014 at 5:35 PM, Yiming (John) Zhang  wrote:
> Thank you for your reply. I was wondering whether there is a method of 
> reusing locally-built components without installing them? That is, if I have 
> successfully built the spark project as a whole, how should I configure it so 
> that I can incrementally build (only) the "spark-examples" sub project 
> without the need of downloading or installation?

As Sean suggest, you shouldn't need to install anything. After "mvn install", 
your local repo is a working Spark installation, and you can use spark-submit 
and other tool directly within it.

You just need to remember to rebuild the assembly/ project when modifying Spark 
code (or the examples/ project when modifying examples).


--
Marcelo


-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org


   

re: How to incrementally compile spark examples using mvn

2014-11-24 Thread Yiming (John) Zhang
Thank you, Marcelo and Sean, "mvn install" is a good answer for my demands. 

-----Original Message-----
From: Marcelo Vanzin [mailto:van...@cloudera.com]
Sent: 21 November 2014 1:47
To: yiming zhang
Cc: Sean Owen; user@spark.apache.org
Subject: Re: How to incrementally compile spark examples using mvn

Hi Yiming,

On Wed, Nov 19, 2014 at 5:35 PM, Yiming (John) Zhang  wrote:
> Thank you for your reply. I was wondering whether there is a method of 
> reusing locally-built components without installing them? That is, if I have 
> successfully built the spark project as a whole, how should I configure it so 
> that I can incrementally build (only) the "spark-examples" sub project 
> without the need of downloading or installation?

As Sean suggest, you shouldn't need to install anything. After "mvn install", 
your local repo is a working Spark installation, and you can use spark-submit 
and other tool directly within it.

You just need to remember to rebuild the assembly/ project when modifying Spark 
code (or the examples/ project when modifying examples).


--
Marcelo


-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: How to incrementally compile spark examples using mvn

2014-11-20 Thread Marcelo Vanzin
Hi Yiming,

On Wed, Nov 19, 2014 at 5:35 PM, Yiming (John) Zhang  wrote:
> Thank you for your reply. I was wondering whether there is a method of 
> reusing locally-built components without installing them? That is, if I have 
> successfully built the spark project as a whole, how should I configure it so 
> that I can incrementally build (only) the "spark-examples" sub project 
> without the need of downloading or installation?

As Sean suggest, you shouldn't need to install anything. After "mvn
install", your local repo is a working Spark installation, and you can
use spark-submit and other tool directly within it.

You just need to remember to rebuild the assembly/ project when
modifying Spark code (or the examples/ project when modifying
examples).
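
In command form, a hedged sketch of that rebuild step (module paths taken from
the 1.x tree, not verified here):

  mvn -pl mllib install -DskipTests       # after changing Spark/MLlib code
  mvn -pl assembly package -DskipTests    # refresh the assembly jar used by spark-submit/run-example
  mvn -pl examples package -DskipTests    # only when the examples themselves changed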


-- 
Marcelo

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



re: How to incrementally compile spark examples using mvn

2014-11-19 Thread Sean Owen
Why not install them? It doesn't take any work and is the only correct way
to do it.  mvn install is all you need.
On Nov 20, 2014 2:35 AM, "Yiming (John) Zhang"  wrote:

> Hi Sean,
>
> Thank you for your reply. I was wondering whether there is a method of
> reusing locally-built components without installing them? That is, if I
> have successfully built the spark project as a whole, how should I
> configure it so that I can incrementally build (only) the "spark-examples"
> sub project without the need of downloading or installation?
>
> Thank you!
>
> Cheers,
> Yiming
>
> -----Original Message-----
> From: Sean Owen [mailto:so...@cloudera.com]
> Sent: 17 November 2014 17:40
> To: yiming zhang
> Cc: Marcelo Vanzin; user@spark.apache.org
> Subject: Re: How to incrementally compile spark examples using mvn
>
> The downloads just happen once so this is not a problem.
>
> If you are just building one module in a project, it needs a compiled copy
> of other modules. It will either use your locally-built and
> locally-installed artifact, or, download one from the repo if possible.
>
> This isn't needed if you are compiling all modules at once. If you want to
> compile everything and reuse the local artifacts later, you need 'install'
> not 'package'.
>
> On Mon, Nov 17, 2014 at 12:27 AM, Yiming (John) Zhang 
> wrote:
> > Thank you Marcelo. I tried your suggestion (# mvn -pl
> :spark-examples_2.10 compile), but it required to download many spark
> components (as listed below), which I have already compiled on my server.
> >
> > Downloading:
> > https://repo1.maven.org/maven2/org/apache/spark/spark-core_2.10/1.1.0/
> > spark-core_2.10-1.1.0.pom
> > ...
> > Downloading:
> > https://repo1.maven.org/maven2/org/apache/spark/spark-streaming_2.10/1
> > .1.0/spark-streaming_2.10-1.1.0.pom
> > ...
> > Downloading:
> > https://repository.jboss.org/nexus/content/repositories/releases/org/a
> > pache/spark/spark-hive_2.10/1.1.0/spark-hive_2.10-1.1.0.pom
> > ...
> >
> > This problem didn't happen when I compiled the whole project using ``mvn
> -DskipTests package''. I guess some configurations have to be made to tell
> mvn the dependencies are local. Any idea for that?
> >
> > Thank you for your help!
> >
> > Cheers,
> > Yiming
> >
> > -----Original Message-----
> > From: Marcelo Vanzin [mailto:van...@cloudera.com]
> > Sent: 16 November 2014 10:26
> > To: sdi...@gmail.com
> > Cc: user@spark.apache.org
> > Subject: Re: How to incrementally compile spark examples using mvn
> >
> > I haven't tried scala:cc, but you can ask maven to just build a
> particular sub-project. For example:
> >
> >   mvn -pl :spark-examples_2.10 compile
> >
> > On Sat, Nov 15, 2014 at 5:31 PM, Yiming (John) Zhang 
> wrote:
> >> Hi,
> >>
> >>
> >>
> >> I have already successfully compile and run spark examples. My
> >> problem is that if I make some modifications (e.g., on SparkPi.scala
> >> or
> >> LogQuery.scala) I have to use “mvn -DskipTests package” to rebuild
> >> the whole spark project and wait a relatively long time.
> >>
> >>
> >>
> >> I also tried “mvn scala:cc” as described in
> >> http://spark.apache.org/docs/latest/building-with-maven.html, but I
> >> could only get infinite stop like:
> >>
> >> [INFO] --- scala-maven-plugin:3.2.0:cc (default-cli) @ spark-parent
> >> ---
> >>
> >> [INFO] wait for files to compile...
> >>
> >>
> >>
> >> Is there any method to incrementally compile the examples using mvn?
> >> Thank you!
> >>
> >>
> >>
> >> Cheers,
> >>
> >> Yiming
> >
> >
> >
> > --
> > Marcelo
> >
> >
> > -
> > To unsubscribe, e-mail: user-unsubscr...@spark.apache.org For
> > additional commands, e-mail: user-h...@spark.apache.org
> >
>
>


re: How to incrementally compile spark examples using mvn

2014-11-19 Thread Yiming (John) Zhang
Hi Sean,

Thank you for your reply. I was wondering whether there is a method of reusing 
locally-built components without installing them? That is, if I have 
successfully built the spark project as a whole, how should I configure it so 
that I can incrementally build (only) the "spark-examples" sub project without 
the need of downloading or installation? 

Thank you!

Cheers,
Yiming

-----Original Message-----
From: Sean Owen [mailto:so...@cloudera.com]
Sent: 17 November 2014 17:40
To: yiming zhang
Cc: Marcelo Vanzin; user@spark.apache.org
Subject: Re: How to incrementally compile spark examples using mvn

The downloads just happen once so this is not a problem.

If you are just building one module in a project, it needs a compiled copy of 
other modules. It will either use your locally-built and locally-installed 
artifact, or, download one from the repo if possible.

This isn't needed if you are compiling all modules at once. If you want to 
compile everything and reuse the local artifacts later, you need 'install' not 
'package'.

On Mon, Nov 17, 2014 at 12:27 AM, Yiming (John) Zhang  wrote:
> Thank you Marcelo. I tried your suggestion (# mvn -pl :spark-examples_2.10 
> compile), but it required to download many spark components (as listed 
> below), which I have already compiled on my server.
>
> Downloading: 
> https://repo1.maven.org/maven2/org/apache/spark/spark-core_2.10/1.1.0/
> spark-core_2.10-1.1.0.pom
> ...
> Downloading: 
> https://repo1.maven.org/maven2/org/apache/spark/spark-streaming_2.10/1
> .1.0/spark-streaming_2.10-1.1.0.pom
> ...
> Downloading: 
> https://repository.jboss.org/nexus/content/repositories/releases/org/a
> pache/spark/spark-hive_2.10/1.1.0/spark-hive_2.10-1.1.0.pom
> ...
>
> This problem didn't happen when I compiled the whole project using ``mvn 
> -DskipTests package''. I guess some configurations have to be made to tell 
> mvn the dependencies are local. Any idea for that?
>
> Thank you for your help!
>
> Cheers,
> Yiming
>
> -----Original Message-----
> From: Marcelo Vanzin [mailto:van...@cloudera.com]
> Sent: 16 November 2014 10:26
> To: sdi...@gmail.com
> Cc: user@spark.apache.org
> Subject: Re: How to incrementally compile spark examples using mvn
>
> I haven't tried scala:cc, but you can ask maven to just build a particular 
> sub-project. For example:
>
>   mvn -pl :spark-examples_2.10 compile
>
> On Sat, Nov 15, 2014 at 5:31 PM, Yiming (John) Zhang  wrote:
>> Hi,
>>
>>
>>
>> I have already successfully compile and run spark examples. My 
>> problem is that if I make some modifications (e.g., on SparkPi.scala 
>> or
>> LogQuery.scala) I have to use “mvn -DskipTests package” to rebuild 
>> the whole spark project and wait a relatively long time.
>>
>>
>>
>> I also tried “mvn scala:cc” as described in 
>> http://spark.apache.org/docs/latest/building-with-maven.html, but I 
>> could only get infinite stop like:
>>
>> [INFO] --- scala-maven-plugin:3.2.0:cc (default-cli) @ spark-parent
>> ---
>>
>> [INFO] wait for files to compile...
>>
>>
>>
>> Is there any method to incrementally compile the examples using mvn?
>> Thank you!
>>
>>
>>
>> Cheers,
>>
>> Yiming
>
>
>
> --
> Marcelo
>
>
> -
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org For 
> additional commands, e-mail: user-h...@spark.apache.org
>


-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: How to incrementally compile spark examples using mvn

2014-11-17 Thread Sean Owen
The downloads just happen once so this is not a problem.

If you are just building one module in a project, it needs a compiled
copy of other modules. It will either use your locally-built and
locally-installed artifact, or, download one from the repo if
possible.

This isn't needed if you are compiling all modules at once. If you
want to compile everything and reuse the local artifacts later, you
need 'install' not 'package'.
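
Put as commands, a hedged sketch of the two-step flow being described:

  mvn -DskipTests install                 # once, from the top level: installs every module's snapshot locally
  mvn -pl :spark-examples_2.10 compile    # afterwards: rebuilds just the examples against those artifacts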

On Mon, Nov 17, 2014 at 12:27 AM, Yiming (John) Zhang  wrote:
> Thank you Marcelo. I tried your suggestion (# mvn -pl :spark-examples_2.10 
> compile), but it required to download many spark components (as listed 
> below), which I have already compiled on my server.
>
> Downloading: 
> https://repo1.maven.org/maven2/org/apache/spark/spark-core_2.10/1.1.0/spark-core_2.10-1.1.0.pom
> ...
> Downloading: 
> https://repo1.maven.org/maven2/org/apache/spark/spark-streaming_2.10/1.1.0/spark-streaming_2.10-1.1.0.pom
> ...
> Downloading: 
> https://repository.jboss.org/nexus/content/repositories/releases/org/apache/spark/spark-hive_2.10/1.1.0/spark-hive_2.10-1.1.0.pom
> ...
>
> This problem didn't happen when I compiled the whole project using ``mvn 
> -DskipTests package''. I guess some configurations have to be made to tell 
> mvn the dependencies are local. Any idea for that?
>
> Thank you for your help!
>
> Cheers,
> Yiming
>
> -----Original Message-----
> From: Marcelo Vanzin [mailto:van...@cloudera.com]
> Sent: 16 November 2014 10:26
> To: sdi...@gmail.com
> Cc: user@spark.apache.org
> Subject: Re: How to incrementally compile spark examples using mvn
>
> I haven't tried scala:cc, but you can ask maven to just build a particular 
> sub-project. For example:
>
>   mvn -pl :spark-examples_2.10 compile
>
> On Sat, Nov 15, 2014 at 5:31 PM, Yiming (John) Zhang  wrote:
>> Hi,
>>
>>
>>
>> I have already successfully compile and run spark examples. My problem
>> is that if I make some modifications (e.g., on SparkPi.scala or
>> LogQuery.scala) I have to use “mvn -DskipTests package” to rebuild the
>> whole spark project and wait a relatively long time.
>>
>>
>>
>> I also tried “mvn scala:cc” as described in
>> http://spark.apache.org/docs/latest/building-with-maven.html, but I
>> could only get infinite stop like:
>>
>> [INFO] --- scala-maven-plugin:3.2.0:cc (default-cli) @ spark-parent
>> ---
>>
>> [INFO] wait for files to compile...
>>
>>
>>
>> Is there any method to incrementally compile the examples using mvn?
>> Thank you!
>>
>>
>>
>> Cheers,
>>
>> Yiming
>
>
>
> --
> Marcelo
>
>
> -
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



re: How to incrementally compile spark examples using mvn

2014-11-16 Thread Yiming (John) Zhang
Thank you Marcelo. I tried your suggestion (# mvn -pl :spark-examples_2.10
compile), but it required downloading many Spark components (as listed below),
which I have already compiled on my server.

Downloading: 
https://repo1.maven.org/maven2/org/apache/spark/spark-core_2.10/1.1.0/spark-core_2.10-1.1.0.pom
...
Downloading: 
https://repo1.maven.org/maven2/org/apache/spark/spark-streaming_2.10/1.1.0/spark-streaming_2.10-1.1.0.pom
...
Downloading: 
https://repository.jboss.org/nexus/content/repositories/releases/org/apache/spark/spark-hive_2.10/1.1.0/spark-hive_2.10-1.1.0.pom
...

This problem didn't happen when I compiled the whole project using ``mvn 
-DskipTests package''. I guess some configurations have to be made to tell mvn 
the dependencies are local. Any idea for that?

Thank you for your help!

Cheers,
Yiming
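
As the later replies in this thread suggest, two hedged ways to keep that
resolution local are to install the sibling snapshots once with "mvn
-DskipTests install", or to build them in the same run:

  mvn -am -pl :spark-examples_2.10 compile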

-----Original Message-----
From: Marcelo Vanzin [mailto:van...@cloudera.com]
Sent: 16 November 2014 10:26
To: sdi...@gmail.com
Cc: user@spark.apache.org
Subject: Re: How to incrementally compile spark examples using mvn

I haven't tried scala:cc, but you can ask maven to just build a particular 
sub-project. For example:

  mvn -pl :spark-examples_2.10 compile

On Sat, Nov 15, 2014 at 5:31 PM, Yiming (John) Zhang  wrote:
> Hi,
>
>
>
> I have already successfully compile and run spark examples. My problem 
> is that if I make some modifications (e.g., on SparkPi.scala or 
> LogQuery.scala) I have to use “mvn -DskipTests package” to rebuild the 
> whole spark project and wait a relatively long time.
>
>
>
> I also tried “mvn scala:cc” as described in 
> http://spark.apache.org/docs/latest/building-with-maven.html, but I 
> could only get infinite stop like:
>
> [INFO] --- scala-maven-plugin:3.2.0:cc (default-cli) @ spark-parent 
> ---
>
> [INFO] wait for files to compile...
>
>
>
> Is there any method to incrementally compile the examples using mvn? 
> Thank you!
>
>
>
> Cheers,
>
> Yiming



--
Marcelo


-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: How to incrementally compile spark examples using mvn

2014-11-15 Thread Marcelo Vanzin
I haven't tried scala:cc, but you can ask maven to just build a
particular sub-project. For example:

  mvn -pl :spark-examples_2.10 compile
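
A hedged note on the question quoted below: "mvn scala:cc" is the
scala-maven-plugin's continuous-compilation mode, so "[INFO] wait for files to
compile..." means it is watching for source changes rather than hanging; run
against the parent POM alone (as in the quoted log) there is presumably
nothing for it to compile. Running it inside a sub-project after a one-time
top-level install is one possible (untested here) workflow:

  mvn -DskipTests install
  cd examples && mvn scala:cc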

On Sat, Nov 15, 2014 at 5:31 PM, Yiming (John) Zhang  wrote:
> Hi,
>
>
>
> I have already successfully compile and run spark examples. My problem is
> that if I make some modifications (e.g., on SparkPi.scala or LogQuery.scala)
> I have to use “mvn -DskipTests package” to rebuild the whole spark project
> and wait a relatively long time.
>
>
>
> I also tried “mvn scala:cc” as described in
> http://spark.apache.org/docs/latest/building-with-maven.html, but I could
> only get infinite stop like:
>
> [INFO] --- scala-maven-plugin:3.2.0:cc (default-cli) @ spark-parent ---
>
> [INFO] wait for files to compile...
>
>
>
> Is there any method to incrementally compile the examples using mvn? Thank
> you!
>
>
>
> Cheers,
>
> Yiming



-- 
Marcelo

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org