Yeah, I'm fine with that.

On Mon, Feb 9, 2015 at 10:09 PM, Patrick Wendell <pwend...@gmail.com> wrote:

> Mark was involved in adding this code (IIRC) and has also been the
> most active in maintaining it. So I'd be interested in hearing his
> thoughts on that proposal. Mark - would you be okay with deprecating this
> and having Spark instead work with the upstream projects that focus on
> packaging?
>
> My feeling is that it's better to just have nothing than to have
> something that isn't usable out of the box (which, to your point, is a
> lot more work).
>
> On Mon, Feb 9, 2015 at 4:10 PM,  <n...@reactor8.com> wrote:
> > If the Spark community doesn't want to maintain debs/rpms directly in
> > the project, it could direct interested efforts towards Apache Bigtop.
> > Right now, debs/rpms of Bigtop components, as well as related tests,
> > are a focus there.
> >
> > Something that would be great is if at least one Spark committer with
> > an interest in config/packaging/testing could act as liaison and point
> > of contact for Bigtop efforts.
> >
> > Right now the focus is on Bigtop 0.9, which currently includes Spark
> > 1.2. The JIRA for items included in 0.9 can be found here:
> >
> > https://issues.apache.org/jira/browse/BIGTOP-1480
> >
> >
> >
> > -----Original Message-----
> > From: Sean Owen [mailto:so...@cloudera.com]
> > Sent: Monday, February 9, 2015 3:52 PM
> > To: Nicholas Chammas
> > Cc: Patrick Wendell; Mark Hamstra; dev
> > Subject: Re: Keep or remove Debian packaging in Spark?
> >
> > What about this straw man proposal: deprecate in 1.3 with some kind
> > of message in the build, and remove in 1.4? And add a pointer to any
> > third-party packaging that might provide similar functionality?
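> >
> > As a rough illustration (hypothetical sbt task names here, not Spark's
> > actual build code), a minimal sketch of such a build-time message
> > might look like:
> >
> >   // Hypothetical task key standing in for the existing packaging step.
> >   lazy val packageDebian = taskKey[File]("Build the Spark .deb package")
> >
> >   packageDebian := {
> >     // Warn at build time that this packaging path is deprecated.
> >     streams.value.log.warn(
> >       "Debian packaging is deprecated in 1.3 and will be removed in " +
> >         "1.4; see Apache Bigtop for maintained packages.")
> >     // ... the existing packaging invocation would go here ...
> >     target.value / "spark.deb"
> >   }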
> >
> > On Mon, Feb 9, 2015 at 6:47 PM, Nicholas Chammas
> > <nicholas.cham...@gmail.com> wrote:
> >> +1 to an "official" deprecation + redirecting users to some other
> >> project that will or already is taking this on.
> >>
> >> Nate?
> >>
> >>
> >>
> >> On Mon Feb 09 2015 at 10:08:27 AM Patrick Wendell <pwend...@gmail.com>
> >> wrote:
> >>>
> >>> I have wondered whether we should deprecate it more officially,
> >>> since otherwise I think people have the reasonable expectation,
> >>> based on the current code, that Spark intends to support "complete"
> >>> Debian packaging as part of the upstream build. Having something
> >>> that's sort-of maintained, but where no one is helping review and
> >>> merge patches or make it fully functional, IMO doesn't benefit us
> >>> or our users. There are a bunch of other projects that are
> >>> specifically devoted to packaging, so it seems like there is a
> >>> clear separation of concerns here.
> >>>
> >>> On Mon, Feb 9, 2015 at 7:31 AM, Mark Hamstra
> >>> <m...@clearstorydata.com>
> >>> wrote:
> >>> >>
> >>> >> it sounds like nobody intends these to be used to actually deploy
> >>> >> Spark
> >>> >
> >>> >
> >>> > I wouldn't go quite that far.  What we have now can serve as useful
> >>> > input to a deployment tool like Chef, but the user is then going to
> >>> > need to add some customization or configuration within the context
> >>> > of that tooling to get Spark installed just the way they want.  So
> >>> > it is not so much that the current Debian packaging can't be used
> >>> > as that it has never really been intended to be a completely
> >>> > finished product that a newcomer could, for example, use to install
> >>> > Spark quickly and completely on Ubuntu and have a fully functional
> >>> > environment in which they could then run all of the examples,
> >>> > tutorials, etc.
> >>> >
> >>> > Getting to that level of packaging (and maintenance) is something
> >>> > that I'm not sure we want to do since that is a better fit with
> >>> > Bigtop and the efforts of Cloudera, Hortonworks, MapR, etc. to
> >>> > distribute Spark.
> >>> >
> >>> > On Mon, Feb 9, 2015 at 2:41 AM, Sean Owen <so...@cloudera.com>
> >>> > wrote:
> >>> >
> >>> >> This is a straw poll to assess whether there is support to keep
> >>> >> and fix, or remove, the Debian packaging-related config in Spark.
> >>> >>
> >>> >> I see several oldish outstanding JIRAs relating to problems in the
> >>> >> packaging:
> >>> >>
> >>> >> https://issues.apache.org/jira/browse/SPARK-1799
> >>> >> https://issues.apache.org/jira/browse/SPARK-2614
> >>> >> https://issues.apache.org/jira/browse/SPARK-3624
> >>> >> https://issues.apache.org/jira/browse/SPARK-4436
> >>> >> (and a similar idea about making RPMs)
> >>> >> https://issues.apache.org/jira/browse/SPARK-665
> >>> >>
> >>> >> The original motivation seems related to Chef:
> >>> >>
> >>> >>
> >>> >> https://issues.apache.org/jira/browse/SPARK-2614?focusedCommentId=14070908&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-14070908
> >>> >>
> >>> >> Mark's recent comments cast some doubt on whether it is essential:
> >>> >>
> >>> >> https://github.com/apache/spark/pull/4277#issuecomment-72114226
> >>> >>
> >>> >> and in recent conversations I didn't hear dissent to the idea of
> >>> >> removing this.
> >>> >>
> >>> >> Is this still useful enough to fix up? All else equal I'd like to
> >>> >> start to walk back some of the complexity of the build, but I
> >>> >> don't know how all-else-equal it is. Certainly, it sounds like
> >>> >> nobody intends these to be used to actually deploy Spark.
> >>> >>
> >>> >> I don't doubt it's useful to someone, but can they maintain the
> >>> >> packaging logic elsewhere?
> >>> >>
> >>
> >
>
