Liane Praza wrote:
> (Trimmed the cc list.)
>
> John Plocher wrote:
>> Garrett D'Amore wrote:
>>> unison as documented for this case, IMO, does *not* belong in any 
>>> default distribution of Solaris, but in some value add location 
>>> where folks who want it can get at it easily.  
>>
>> I tend to agree.
>
> Ah, the value-add location like we had with /usr/sfw/gcc or value-add 
> repository of /opt/sfw?

No, not necessarily in a different path!  Just a different *repository* 
for the package.  Where the package installs is not relevant (IMO) to 
this particular discussion.

>
> Seriously, please consider the gcc example illustrative.  Even though 
> it was *bundled* people didn't (still don't) think Solaris shipped 
> with it.

That's because it wasn't in /usr/bin.  Note that I'm not suggesting 
where a component is installed (/usr/bin vs. some other location), but 
rather whether it is installed *at all* (at least in the "default" 
installation).

>
> Making software harder to get or harder to find isn't necessarily the 
> right answer.  But I'm not even convinced the ARC should be in the 
> business of deciding which software is 'bundled' in the distro, versus 
> what's in an 'alternate' repository.  (And having ARC banish it to 
> another consolidation is equally bad -- consolidation boundaries 
> should be able to be set based on development efficiencies.)
>
> Shouldn't ARC just be in the business of reviewing a project team's 
> definition of how stable and architecturally integrated an included 
> piece of software is?  If the software is developed externally, it's a 
> matter of making sure the project team communicates to users (not by 
> location, but by documentation) when they're building on quicksand.

Well, the problem is that if we lower the bar at ARC for 3rd-party FOSS, 
then what value exactly does ARC review of FOSS have?  I'm not sure I 
believe that ARC review at this point should be only documentation 
verification.


>
> Heck, isn't that even true of OpenSolaris-specific software?  Why 
> should ARC spend much of its time second-guessing whether interfaces 
> should be at the stability the project team defined them at?

Stability is part of the architecture.  Defining an interface as stable 
when change may be expected, or when the interface should be avoided by 
users if at all possible (such as with obsolete software), is a mistake.  
Part of ARC's job (at least as I understand it) is making sure that 
these definitions are done sanely.

> We're getting to a world where creating distributions is much easier.  
> I hope that ARC will spend its time classifying interfaces and 
> components so that administrators, developers, users and distro 
> builders alike can make educated decisions, rather than having ARC 
> attempt to define the exact contents of a distribution.

Well, yes.  The problem is that right now part of what ARC does is help 
define the overall system.  We have no way to indicate via ARC (at the 
moment) which parts make up the "core" bits that users can depend on 
(and which most distributions would hopefully therefore include) and 
which bits are just random FOSS that may or may not be present, and 
which have only the most tenuous level of support guarantees, if any.

>
> John, I think the rest of your mail is a good way to start attacking 
> the problem.  But I'm very wary of the idea that banishing non-ideal 
> software to another location solves anything.  Focusing on what ARC 
> *can* do to classify seems like a much better approach.  (Especially 
> when those classifications can be categorized and fast-pathed so we 
> don't see the same discussion in every ARC case.)

Yes, we need to have a better way to simplify dealing with these.  An 
external repository of "non-core" platform software (and there may be 
further separation that could be done as well) and some standard way to 
indicate (both to ARC and in user documentation) that software is 
delivered strictly on a "caveat emptor" basis would be most helpful here.

    -- Garrett