Carlo Calica wrote:
> On 5/5/07, Isaac Dupree <[EMAIL PROTECTED]> wrote:
>> Noticing:
>> One important use of forcing an install attempt despite unmatched
>> dependencies is to check if those are indeed (still) needed dependencies.
>>
> That's great for automated testing.  Makes sure libraries are minimal
> enough.
> 
>> Often there are dependencies for which the build process will _fail_
>> if they are unmet (or perhaps a run-time process for runtime
>> dependencies, including packages not installed from recipe).  What
>> would be really neat is if most dependencies were like that (are
>> they?) and were annotated as such, so there could be an automatic
>> build process that checks the accuracy of those dependencies (for
>> each architecture, maybe).  (The same process could also check
>> whether the other required dependencies actually fall under that
>> classification, and say so if they do.)
>> Naturally there will be some build processes that don't fail when
>> they should, and some things that a human would agree are
>> dependencies even when they can't be checked automatically... if
>> those are few enough, perhaps comments could be added explaining how
>> those dependencies are necessary/important (for the required
>> dependencies, that is, not necessarily the optional or recommended
>> ones).
>>

Testing the _version_ requirements is probably more difficult, since
things can break in subtle ways with the wrong version, but at least
it could complain if the _build process_ ends in an error with any
combination of versions allowed by the stated dependencies (not just
the oldest of everything; it might be nice to test randomized
combinations of possibilities, perhaps).
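
Something like this untested sketch, say (the "BuildWithDeps" command
and the dependency list below are made-up placeholders, not existing
tools); the idea is just "pick a random allowed version of each
dependency, attempt the build, and record which combinations break":

    -- Untested sketch: try N randomized combinations of the versions a
    -- recipe claims to accept, and print the combinations whose build
    -- failed.  "BuildWithDeps" and the deps list are hypothetical.
    import System.Exit (ExitCode (..))
    import System.Process (readProcessWithExitCode)
    import System.Random (randomRIO)

    -- hypothetical: each dependency paired with its allowed versions
    deps :: [(String, [String])]
    deps = [ ("Glibc",   ["2.5", "2.6"])
           , ("ZLib",    ["1.2.3"])
           , ("OpenSSL", ["0.9.7", "0.9.8"])
           ]

    pickOne :: [a] -> IO a
    pickOne xs = do
      i <- randomRIO (0, length xs - 1)
      return (xs !! i)

    -- one randomized attempt; Just combo means that combination failed
    tryCombination :: String -> IO (Maybe [(String, String)])
    tryCombination package = do
      combo <- mapM (\(name, versions) -> do
                       v <- pickOne versions
                       return (name, v)) deps
      let args = package : [ n ++ "=" ++ v | (n, v) <- combo ]
      (code, _out, _err) <- readProcessWithExitCode "BuildWithDeps" args ""
      return (if code == ExitSuccess then Nothing else Just combo)

    main :: IO ()
    main = do
      results <- mapM (const (tryCombination "SomePackage")) [1 .. 10 :: Int]
      mapM_ print [ combo | Just combo <- results ]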

> 
> It sounds like you're talking about a build farm.  Such a thing would
be great.  Implementers, talk to me.  I'd really like to help with the
> efforts.

I may look into this sometime.

Unfortunately multiple architectures (even x86 vs. x86-64) tend to
multiply the amount of work excessively.  Hmm... since most things
work the same on all architectures, possibly some combination of
randomization, marking known-troublesome packages, and spreading the
workload over different-architecture systems (if there's an error then
go test everywhere) could help.  New package versions come out often
enough compared to how often something changes w.r.t.
architecture-specificness or dependencies (I think) that it should be
okay for most builds to be tested on just one architecture at a time.

Reminds me of Haskell's QuickCheck, see e.g.
http://lambda-the-ultimate.org/node/2064#comment-25290 - let's try to
search out those lurking bugs :)
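
(For anyone who hasn't seen it, a minimal QuickCheck property looks
roughly like the following; it's plain standard QuickCheck, nothing
GoboLinux-specific.  The library throws random inputs at the property
and reports any counterexample, which is basically the same spirit as
randomly generated dependency combinations.)

    -- Standard QuickCheck usage: state a property, let the library
    -- generate random inputs and report any counterexample it finds.
    import Test.QuickCheck

    prop_reverseTwice :: [Int] -> Bool
    prop_reverseTwice xs = reverse (reverse xs) == xs

    main :: IO ()
    main = quickCheck prop_reverseTwice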

Also, the presence of random additional packages (probably symlinked)
should not cause a problem.


> Going to be a student next year summer?

Yes

> Would be great to get a SoC.

We'll see what I'm feeling like then (and what Google thinks).  Usually
in the summer I rest and recover from the school year (but then, that
"resting" tends to involve feverishly programming / using my computer
in any case :-)

Isaac