Hello Adam,

I was thinking this morning about a simple way to do D library
distribution. Let me run it by you guys, and we'll discuss whether
it's a good plan. The short version: it takes a package -> URL
mapping file and downloads over HTTP, based on the .d file being in
an expected folder.

The plan is simple. The program will take a file and parse out its
dependencies, using dmd -v, like rdmd does. Next, it opens a package
mapping file. If this file doesn't exist locally, it can be
downloaded from a central server.
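The dependency-extraction step could look roughly like this (a sketch
in Python for brevity; it assumes the "import <module> (<path>)" line
format that dmd -v prints, and in practice you would capture the
output of `dmd -v -o- yourfile.d` rather than use a hard-coded
sample):

```python
import re

def parse_deps(dmd_verbose_output):
    # Pull (module, path) pairs out of the "import" lines that
    # dmd -v emits; all other lines are ignored.
    deps = []
    for line in dmd_verbose_output.splitlines():
        m = re.match(r'import\s+([\w.]+)\s+\((.+)\)', line)
        if m:
            deps.append((m.group(1), m.group(2)))
    return deps

# Sample of what dmd -v output might contain.
sample = """\
import    arsd.dom\t(arsd/dom.d)
import    std.stdio\t(/usr/include/dmd/phobos/std/stdio.d)
"""
print(parse_deps(sample))
```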

The file looks like this:

arsd   http://arsdnet.net/dcode
mylib  http://dsource.org/libs
mylib.container  http://domain.com
gui    http://dsource.org/libs
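Resolving a module against that file could work with a
longest-prefix rule, so mylib.container.list goes to the
mylib.container entry rather than the mylib one. That rule is my
assumption, not something you specified; here is a Python sketch of
it:

```python
def load_map(text):
    # Each non-blank line is "package  url", whitespace separated.
    mapping = {}
    for line in text.splitlines():
        parts = line.split()
        if len(parts) == 2:
            mapping[parts[0]] = parts[1]
    return mapping

def resolve(mapping, module):
    # Longest matching package prefix wins, so mylib.container
    # shadows mylib for modules under it (an assumed rule).
    best = None
    for pkg, url in mapping.items():
        if module == pkg or module.startswith(pkg + "."):
            if best is None or len(pkg) > len(best[0]):
                best = (pkg, url)
    return best

m = load_map("""\
arsd   http://arsdnet.net/dcode
mylib  http://dsource.org/libs
mylib.container  http://domain.com
gui    http://dsource.org/libs
""")
print(resolve(m, "mylib.container.list"))
print(resolve(m, "arsd.dom"))
```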


Very clean, very simple, but a few thoughts:

1) Allow more than http
I'd also allow things like ftp, svn (revision-locked and head), git,
hg, etc. Ideally this would be done via a simple "plugin" system: if
the xxx:// scheme is unknown to the base program, look for an xxx
executable alongside the base program and pass it the URL and the
destination dir.
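The plugin dispatch could be as small as this (a Python sketch; the
helper-naming convention is the one described above, and the
"builtin" return value just stands in for whatever the base program's
HTTP/FTP download path would do):

```python
import os
import shutil
from urllib.parse import urlparse

def fetch(url, dest_dir):
    # Handle known schemes in the base program; otherwise look for
    # an executable named after the scheme next to this program and
    # pass it the URL and the destination dir.
    scheme = urlparse(url).scheme
    if scheme in ("http", "ftp"):
        return ("builtin", url, dest_dir)
    helper = os.path.join(os.path.dirname(os.path.abspath(__file__)),
                          scheme)
    if shutil.which(helper):
        return ("plugin", helper, url, dest_dir)
    raise ValueError("no handler for scheme: " + scheme)

print(fetch("http://arsdnet.net/dcode", "cache/arsd"))
```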

2) Allow local references.
Don't make local sources second-class citizens. Think of people who
are working on a patched version of a lib.

3) Use a staging approach
You weren't clear on whether everything would be globbed into one dir
and passed as a single import path, or whether each source gets its
own dir and is passed individually. I'd advocate the second, as I
suspect it will make resolving conflicts easier (for instance, if the
map file changes between downloads).
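Under the per-source layout, the cache root and each package dir
would become separate compiler import flags, so remapping one package
in the map file touches only that one directory. A tiny sketch
(names assumed):

```python
import os

def import_flags(cache_root, packages):
    # One -I flag per staged package directory, e.g.
    # -I<cache>/arsd -I<cache>/mylib.container ...
    return ["-I" + os.path.join(cache_root, pkg) for pkg in packages]

print(import_flags("~/.dpkg-cache", ["arsd", "mylib.container"]))
```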

4) have an "include" system
Allow one map file to refer to another via any URL that is valid for
referring to a code source. For security reasons, there should be a
way to white/black-list sources and also to force the use of only
cached copies.
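The white/black-list check for included sources could be a simple
prefix filter, sketched here in Python (the "blacklist always wins"
and "no whitelist means allow all" policies are my assumptions):

```python
def source_allowed(url, whitelist=None, blacklist=()):
    # Blacklist always wins; with no whitelist, everything else
    # is allowed (assumed policy, not part of the proposal).
    if any(url.startswith(b) for b in blacklist):
        return False
    if whitelist is None:
        return True
    return any(url.startswith(w) for w in whitelist)

print(source_allowed("http://dsource.org/libs",
                     whitelist=["http://dsource.org/"]))
```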

5) Have an "OK each download" mode and make it the default.
Most builds will not download new code, and those that do deserve
extra scrutiny.

--
... <IXOYE><
