At the company where I helped set up a Maven-based build environment, a
public site like ibiblio.org is considered a potentially unsafe source.
Like it or not. Only JARs that have been approved internally may be used
for production. (BTW, this was in the finance industry, which is quite
sensitive about such things.)
In practice, that meant we had to set up an internal remote repository and
deploy all existing JARs to it. This worked fine in the end, but was
difficult at first - probably due to the lack of good documentation and
examples.
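As an illustration, such an internal repository is typically declared in the (parent) POM so that builds resolve approved artifacts from it. The id and URL below are made-up examples, not real hosts:

```xml
<!-- POM fragment: declare the internal repository as the source of
     approved artifacts. The id and URL are invented examples. -->
<repositories>
  <repository>
    <id>internal</id>
    <name>Internally approved artifacts</name>
    <url>http://repo.internal.example.com/maven</url>
  </repository>
</repositories>
```

Existing third-party JARs can then be pushed into that repository one at a time with the deploy:deploy-file goal, passing the file, its coordinates, and -DrepositoryId plus -Durl for the internal repository.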
All the deployed artifacts have MD5 sums. Your department could have
validated that all artifacts used were safe by comparing the MD5 sums of
the approved artifacts with the ones used locally and the ones downloaded.
> Why can't you solve this with an in-house remote repository, rather than
> zipping up jars for local repos?
>
You can do it with an in-house remote plugin repository, but how would you
populate it with all the plugins that you need? OK, you can set up a proxy
for this purpose. Then you somehow have to make sure that Maven downloads
all the relevant plugins (which is not trivial either). Afterwards you can
disconnect the proxy from the Internet, or use the proxy's cache as your
internal plugin repository. (Keeping the proxy alive and connected to the
Internet might be unacceptable, because you want to evaluate new plugins
before you release them for internal usage.)
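Once the internal repository is populated, you can force all downloads through it with a mirror entry in settings.xml (the * wildcard requires a reasonably recent Maven 2.x; the id and URL here are examples):

```xml
<!-- settings.xml fragment: route all repository traffic to the
     internal repository. The id and URL are invented examples. -->
<settings>
  <mirrors>
    <mirror>
      <id>internal-mirror</id>
      <mirrorOf>*</mirrorOf>
      <url>http://repo.internal.example.com/maven</url>
    </mirror>
  </mirrors>
</settings>
```

With this in place, Maven never contacts ibiblio.org directly; anything not in the internal repository simply fails to resolve.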
This stems from an anal configuration management policy, one that does
not trust your development team to "do the right thing (tm)". It locks
things down to prevent people from doing the wrong thing instead of
trusting them to do the right thing and then verifying that they did.
After all, if you can't trust your team to stick to approved versions of
artifacts, how can you trust them to write your precious business code?
So, how do you verify instead of lock?
You have a parent POM that declares and defines all versions of artifacts
and plugins, and in your module POMs you declare that you want a plugin
but provide no version information. Then your configuration team only
needs to check the parent POM against the internal standards.
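A sketch of that setup, with invented coordinates and versions - the parent POM pins everything via pluginManagement and dependencyManagement, and module POMs omit the version elements:

```xml
<!-- Parent POM: central place to pin plugin and dependency versions.
     groupId/artifactId/versions below are made-up examples. -->
<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>parent</artifactId>
  <version>1.0</version>
  <packaging>pom</packaging>

  <build>
    <pluginManagement>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-compiler-plugin</artifactId>
          <version>2.0.2</version>
        </plugin>
      </plugins>
    </pluginManagement>
  </build>

  <dependencyManagement>
    <dependencies>
      <dependency>
        <groupId>xerces</groupId>
        <artifactId>xercesImpl</artifactId>
        <version>2.8.1</version>
      </dependency>
    </dependencies>
  </dependencyManagement>
</project>
```

A module POM then declares the plugin or dependency with no version element at all, and the version is inherited from the parent - so the configuration team only has to audit one file.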
When I wanted to use maven-proxy for this purpose, I soon ran into
problems because it did not support NTLM authentication, so I could not
get through the firewall. So we helped ourselves by zipping up a local
repo as a workaround.
NTLM support would help, but hey, it's a closed, proprietary,
undocumented authentication scheme from Microsoft; you can't expect
everyone to support it. As a workaround you can use NTLMAPS (on
SourceForge) as a local proxy for any apps that don't know how to
support NTLM.
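With NTLMAPS handling the NTLM handshake locally, Maven itself only needs a plain HTTP proxy entry in settings.xml. As far as I recall 5865 is NTLMAPS's default listen port, so adjust host and port to your setup:

```xml
<!-- settings.xml fragment: point Maven at a local NTLMAPS instance.
     5865 is assumed to be the NTLMAPS default listen port. -->
<settings>
  <proxies>
    <proxy>
      <id>ntlmaps</id>
      <active>true</active>
      <protocol>http</protocol>
      <host>localhost</host>
      <port>5865</port>
    </proxy>
  </proxies>
</settings>
```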
Maybe it becomes clearer with an example: let's say your build uses plugin
X, which has a dependency on Xerces. You will then have Xerces in your
local repository after running your build, because it is downloaded from
the internal plugin repository to your local repository.
Thereafter, you are able to declare a dependency on Xerces in your project,
and the build will succeed - even if Xerces has not been released to the
internal remote repository, and there's no connection to the Internet.
Builds should only be able to use artifacts that have explicitly been
released to the internal remote repository. Nothing else. Not even JARs
from an internal plugin repository.
Same argument as above: verify correctness, don't try to enforce it.