On 30.09.2010 22:06, Enrico Weigelt wrote:
> hmm, one thing we could do is:
>
> #1: adding download protocol specifiers, eg.:
>
>      <file buildtool.mk>
>       Protocol        curl
>          URL          http://....
>      </file>
>
> (if no protocol is given, falling back to cvs)
>
>
> #2: adding a new object type "tree"
>
> <tree src>
>      Protocol git-tree
>      URL              git://pubgit.metux.de/oss-qm/mypackage.git
>      Ref              refs/tags/VENDOR/leaf/mypackage-1.2.3.77
> </tree>
>
> That means: fetch the _tree_ (not tarball) via git from given url
> and ref-name.
Surely. But I'm afraid you're missing the point about the main issue the 
leaf project has been facing from day one (even before that, back when 
it wasn't yet called "leaf") - to me, that issue has always been 
finding somebody to do the work. Or rather, finding more than one or two 
people (or 3 or 5 - you get my point) to do the work. If all the work 
rests on the shoulders of a few people all the time, many things don't 
get taken care of, simply because other things are more important at 
that point in time, and things that could help long term never get 
addressed, because nobody has the time.

Adding a handler for a new protocol (like "curl", which could simply be 
a wrapper around the curl executable - though that adds another host 
dependency for the build system in the process) is easy, and should only 
require a few lines of code. But it takes somebody to write and test it.
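To illustrate how little code such a handler would need, here is a sketch of a "curl" protocol handler. The function name and argument layout are my own; buildtool's actual plumbing from the <file> block to the handler is omitted:

```shell
#!/bin/sh
# Hypothetical "curl" protocol handler - just a thin wrapper around the
# curl executable (which becomes a new host dependency of the build system).
# fetch_curl URL DESTFILE
fetch_curl() {
    url=$1
    dest=$2
    # -f: fail on HTTP errors, -L: follow redirects, -s: quiet, -o: output file
    curl -f -s -L -o "$dest" "$url"
}
```

Error handling, retries, and checksum verification would still have to be layered on top, which is where the "somebody to write and test it" part comes in.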

Your approach of fetching a "tree" rather than a tarball might clash 
with the standard buildtool approach (where buildtool.cfg specifies what 
gets downloaded, and buildtool.mk decides how it's extracted), but it 
could probably be integrated without any awful hacks, provided that 
somebody is willing to give it a shot.
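For what it's worth, the "git-tree" handler itself wouldn't be much code either. A minimal sketch, assuming the URL and Ref fields from the <tree> block get passed in as arguments (the function name and interface are my invention, not buildtool's):

```shell
#!/bin/sh
# Hypothetical "git-tree" protocol handler:
# fetch the tree (not a tarball) for a given ref from a git URL.
# fetch_tree URL REF DESTDIR
fetch_tree() {
    url=$1
    ref=$2
    dest=$3
    git init -q "$dest"
    # Fetch only the requested ref; it lands in FETCH_HEAD
    git -C "$dest" fetch -q "$url" "$ref"
    # Check out the fetched tree (detached HEAD is fine for a build source)
    git -C "$dest" checkout -q FETCH_HEAD
}
```

The real work would be on the buildtool.mk side, where the extraction step currently assumes a downloaded tarball rather than an already-unpacked tree.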

But extending buildtool to handle git seems like a waste of time to me 
- (to me, IMHO and all the other disclaimers) CVS offers everything 
buildtool needs, so changing buildtool to fetch sources from git does 
nothing other than replace tested, proven code with code that still 
needs to be written and tested.

I have no doubt that I'm rather ignorant about what git can do - but 
from the buildtool point of view, CVS is just storage: if the specified 
version can be downloaded reliably, that's pretty much all that counts 
for buildtool. For other, more advanced build systems, this may not be 
the case, but for buildtool, the underlying SCM is irrelevant.
So, to cut a long story short, switching to git to store the upstream 
source tarballs (or even source trees) is not going to change anything 
other than using git instead of CVS. I have no deep hate for CVS (and 
trust me, I've gotten my hands "dirty" fixing my share of corrupted CVS 
repositories), but the switch wouldn't make the build process one bit 
better either. As I said, I'm pretty ignorant about what git can do, and 
hence may not see the big picture.
If migrating to git (and offering a CVS repository for "legacy" 
applications) is easy enough, maybe it would be more worthwhile to 
switch the build system first - that's where the current problems are 
(libs from the host leaking in during the build process, and so on).

Which "SCM" the sources originally came from (be it FTP/HTTP download, 
download from CVS, or a simple local file copy) has never been an issue 
for me over the past years. While working on leaf stuff, I've also never 
had issues with what people complain about most regarding CVS 
(branches work just fine for me, and I use them a lot - I guess if one 
started out with rcs and then moved up to SCCS, CVS was a real blessing. 
Oh, and I had a brief brush with early versions of "Visual SourceSafe" 
as well, and lost my share of code in the process). But I digress - in 
short, none of the things that CVS lacks seem (to me) to be things that 
hinder effective development of leaf images/packages.

Again, that conclusion might be due to my ignorance of git - but until I 
see a case where CVS prevents an effective development process, and 
where things would have been just dandy with git, I remain sceptical. No 
doubt, git is better in some situations (my point is _not_ that CVS can 
do everything that git can; my point is simply that we've not been 
held back by the shortcomings of CVS in any way, as far as I can tell, 
and that for this reason, switching to another SCM just because it's 
"better" might be a waste of resources).

> There could also be other protocol types, eg. directly fetching
> via web and automatic decompression.
Absolutely.

> In longer terms, we should put all the separate files directly
> into the source tree (which is then managed via git) - no patching,
> no additional files.
I actually liked the idea of using upstream sources and adding our 
patches through the build system. But maybe that's because I've been 
messing with RedHat srpms for too many years (going back to the days 
when Linus was still "happy" with using BitKeeper, or maybe even before 
that - I'm getting old, and things like that tend to get blurry over the 
years) :-)
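That srpm-style workflow - pristine upstream tarball plus a stack of local patches applied at build time - can be sketched in a few lines of shell. The function name and directory layout here are illustrative, not what buildtool actually uses:

```shell
#!/bin/sh
# Sketch of the "upstream sources + our patches" approach:
# unpack the pristine upstream source, then apply local patches in order.
# apply_patches SRCDIR PATCHDIR
apply_patches() {
    srcdir=$1
    patchdir=$2
    for p in "$patchdir"/*.patch; do
        [ -e "$p" ] || continue       # no patches present - nothing to do
        # -p1 strips the leading a/ b/ path component, as in git-style diffs
        patch -d "$srcdir" -p1 < "$p"
    done
}
```

The nice property is that the upstream tarball stays byte-identical to what upstream ships, so our local changes are always visible as an explicit, reviewable patch stack.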

Martin

P.S. Sorry - that turned out much longer than I planned


_______________________________________________
leaf-devel mailing list
leaf-devel@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/leaf-devel
