Gregg, with annotations that's certainly a possibility. Annotations are
preserved in the class files and can be read from bytecode at analysis
time, and our Package API analysis would typically be performed on
freshly compiled class files prior to generating a bundle (jar file).
Where it could get a bit unwieldy is granularity: classes within a
package are fine-grained subcomponents, whereas bundles export an entire
package. So it's best to document these decisions in a packageinfo file,
one for each package; bnd uses the packageinfo files from all the
packages within a bundle to create a Manifest.
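As a rough illustration (the package path and version here are invented), a packageinfo file is just a single line, e.g. src/net/jini/core/lookup/packageinfo containing:

version 2.1.1

bnd can then pick that version up when it writes the Export-Package entries in the Manifest.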
However, if a particular version of an imported package caused a bug
related only to one class, it would make sense to document it in that
class with an annotation. If that class was not reachable at runtime,
the bug wouldn't occur, and it would be possible to pick this up while
resolving dependencies prior to loading, using PackageAPI metadata.
Finer-grained bug exclusion would result in more positive matches. It
might also be easier to capture an @IgnorePackageVersion or
@ExcludePackageVersion annotation using our tool. This information
might find its way into the Manifest, perhaps by creating a bnd plugin
that reads our PackageAPI metadata, allowing the bug documentation to
stay with the class in question. If that class gets refactored in a
future version, the bug might no longer be a problem.
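If we went down that path, the annotation itself could be trivial; the sketch below is only illustrative (the name, elements and retention policy are all assumptions at this stage), with CLASS retention so the tool can read it from the bytecode without loading the class:

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

/**
 * Hypothetical marker: the annotated class is known to misbehave when the
 * named imported package resolves to the given version.
 */
@Retention(RetentionPolicy.CLASS) // kept in the class file, readable at analysis time
@Target(ElementType.TYPE)
public @interface ExcludePackageVersion {
    String pkg();      // the imported package, e.g. "net.jini.core.lookup"
    String version();  // the version known to be buggy
    String reason() default ""; // optional note for the maintainer
}

A class would then carry something like @ExcludePackageVersion(pkg = "net.jini.core.lookup", version = "2.1.0", reason = "concurrency issue") and the analysis tool could fold that into the PackageAPI metadata.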
The packageinfo files live with the source code.
Some more info about bnd can be found here:
http://www.aqute.biz/Code/Bnd
N.B. OSGi modularity is a low-level component of OSGi and doesn't
require the use of OSGi services for those who don't need or want them.
We'd need to make it very easy for application developers to utilise the
OSGi bundle format.
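To show how little a developer would need to write, a minimal bnd descriptor is only a few properties (the names and versions below are invented), from which bnd generates the OSGi Manifest headers:

Bundle-SymbolicName: net.jini.core
Bundle-Version: 2.1.1
Export-Package: net.jini.core.lookup;version=2.1.1, net.jini.core.entry;version=2.1.1
Import-Package: *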
Cheers,
Peter.
Gregg Wonderly wrote:
I'm not familiar with all the details of how bnd configuration can be
used. What I'm wondering is if we want to support something in the
source code itself that the developer can specify.
@RequiresVersion(pkg = "net.jini", version = "4.0")
@IgnoreVersion(pkg = "net.jini", version = "4.1")
is the kind of thing I am thinking of. The upcoming features for Java
that will make some of this supported directly by the JVM might
provide enough controls. But I'm just thinking out loud, wondering
what we might want to try to start providing the ability to do, up
front, so that we can later roll it into any future features of the JVM
or wherever else this lands, without being dependent on something
that works for one platform (OSGi for example) but for which there is
no control point on another platform. I do understand that you are
saying it can be in the JAR using the same mechanism that bnd uses.
I'm suggesting that something more visible in the source code might
be helpful as a starting point, while allowing customization to happen
in the jar through "assembly" mechanisms particular to a type of
deployment.
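For example, the annotations might be declared like the sketch below; everything here is invented (the element names, RUNTIME retention and the check method), it's just meant to show the shape of the control point a deployment tool would get:

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.TYPE)
@interface RequiresVersion {
    String pkg();     // "package" is a reserved word, hence "pkg"
    String version(); // minimum acceptable version
}

@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.TYPE)
@interface IgnoreVersion {
    String pkg();
    String version(); // a version this class should never be resolved against
}

class VersionPolicy {
    /** Returns false if the class explicitly rules out the candidate version. */
    static boolean acceptable(Class<?> c, String pkg, String candidateVersion) {
        IgnoreVersion ignore = c.getAnnotation(IgnoreVersion.class);
        return ignore == null
                || !ignore.pkg().equals(pkg)
                || !ignore.version().equals(candidateVersion);
    }
}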
Gregg Wonderly
Peter Firmstone wrote:
Sorry Gregg, my apologies, I was still caught in the explain loop ;)
An app developer would have to specify package versions as per the
OSGi bnd tool, including the packages to export, which is very simple;
they don't need to utilise OSGi services if they don't want that
functionality.
The Package API signatures should be auto-generated by a tool at
build time as an ant target, prior to generating jar files; it would
be nice to create a tool that displayed Package API metadata visually
as well.
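As a sketch of that build-time hook (the target, its arguments and the PackageApiAnalyzer class are all invented), the ant target would simply run the analyser over the freshly compiled classes before the jar target fires:

<!-- hypothetical: runs the Package API analysis after compile, before jar -->
<target name="package-api" depends="compile">
    <java classname="org.apache.river.tool.PackageApiAnalyzer"
          fork="true" failonerror="true">
        <classpath refid="build.classpath"/>
        <arg value="build/classes"/>      <!-- analysis scope -->
        <arg value="build/package-api"/>  <!-- where the metadata is written -->
    </java>
</target>

and the jar target would declare depends="package-api".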
It's relatively straightforward to exclude a particular version of a
bundle (jar file) in the bundle metadata (in the file format used by
bnd); a developer might want to do this in case of bugs or
concurrency issues, and that's perfectly valid, as we have no way of
measuring the bugginess of code.
Class metadata (its API and package version, or package name and
version at the very least) could be sent with the marshalled object,
so the latest compatible bytecode can be discovered and retrieved.
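A minimal sketch of what might travel with the marshalled object, assuming package name, version and an API digest are enough to locate compatible bytecode (all names here are invented):

import java.io.Serializable;

/** Hypothetical per-class metadata sent alongside a marshalled object. */
final class ClassOrigin implements Serializable {
    private static final long serialVersionUID = 1L;

    private final String className;      // e.g. "net.jini.core.lookup.ServiceItem"
    private final String packageName;    // e.g. "net.jini.core.lookup"
    private final String packageVersion; // OSGi-style version, e.g. "2.1.1"
    private final byte[] apiDigest;      // e.g. SHA-1 of the PackageAPI metadata

    ClassOrigin(String className, String packageName,
                String packageVersion, byte[] apiDigest) {
        this.className = className;
        this.packageName = packageName;
        this.packageVersion = packageVersion;
        this.apiDigest = apiDigest.clone();
    }

    String className()      { return className; }
    String packageName()    { return packageName; }
    String packageVersion() { return packageVersion; }
    byte[] apiDigest()      { return apiDigest.clone(); }
}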
Thanks,
Peter.
Gregg Wonderly wrote:
What I was trying to get at was whether this was something that we
wanted the developer to have no part of specifying explicitly, or if
we might want to provide some ways for developers to draw lines in
the sand where API compatibility is not enough to suggest that a
class is okay for use.
Gregg Wonderly
Peter Firmstone wrote:
Hi Gregg,
The metadata represents package and class API signatures; it's a
property of the code that already exists, which we capture. We can
then make the information available without needing to load a class
file first. Because Java doesn't allow reloading of class files in
the same classloader (well, not yet at least), I'd like to verify
that the class being loaded is compatible prior to loading.
Once you identify a package's API, it makes sense to catalogue it
against a version number of some sort; OSGi already supports
versioning metadata, so it seemed logical to use that convention.
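For reference, OSGi versions follow a major.minor.micro.qualifier scheme and the framework provides org.osgi.framework.Version to parse and order them; a quick sketch (the version strings are arbitrary):

import org.osgi.framework.Version;

class VersionDemo {
    public static void main(String[] args) {
        Version installed = Version.parseVersion("2.1.1");
        Version required  = Version.parseVersion("2.1.0.beta");
        // Ordered numerically on major, minor, micro, then by qualifier.
        System.out.println(installed.compareTo(required) >= 0); // prints true
    }
}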
There are some language changes in Java 7 to support modules; JSR
294 from memory, I think.
Thanks,
Peter.
Gregg Wonderly wrote:
Is this metadata something we want to extract, or is it something
that we want developers to designate the existence of, either through
annotations or some other visible meta information?
Gregg Wonderly
Peter Firmstone wrote:
Dennis Reedy wrote:
So if this information is produced at build time and added into a
jar's manifest, what is the difference between that and using a
convention like a Maven artifact to provide version and update
support?
A very brief statement is that it would describe the actual API
of a package: the methods and fields of public class files. This
information would be generated from the classpath, without
programmer input, and stored as files in the jar file, in
addition to the current bundle metadata; see below.
I haven't worked out how to represent the metadata in textual
form yet; perhaps XML? The tool uses a number of options to
define its analysis / search scope and classpath, from which it
builds a collection of DependencyRelationship objects that
contain class metadata. Each DependencyRelationship object
contains two collections, one for dependants and one for
providers, and each DependencyRelationship forms a node in the
dependency graph.
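Reading that back, a DependencyRelationship might look roughly like the sketch below; the field and method names are my guesses, not the actual implementation:

import java.util.Collections;
import java.util.HashSet;
import java.util.Set;

/** One node in the dependency graph: a single class file and its edges. */
class DependencyRelationship {
    private final String className;   // binary name, e.g. "net.jini.core.lookup.ServiceItem"
    private final String packageName; // e.g. "net.jini.core.lookup"

    // Classes that depend on this class.
    private final Set<DependencyRelationship> dependants =
            new HashSet<DependencyRelationship>();
    // Classes this class depends on.
    private final Set<DependencyRelationship> providers =
            new HashSet<DependencyRelationship>();

    DependencyRelationship(String className, String packageName) {
        this.className = className;
        this.packageName = packageName;
    }

    String className()   { return className; }
    String packageName() { return packageName; }

    void addDependant(DependencyRelationship d) { dependants.add(d); }
    void addProvider(DependencyRelationship p)  { providers.add(p); }

    Set<DependencyRelationship> dependants() {
        return Collections.unmodifiableSet(dependants);
    }
    Set<DependencyRelationship> providers() {
        return Collections.unmodifiableSet(providers);
    }
}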
The DependencyRelationship objects can then be grouped into
collections, one for each package. You then iterate over each
collection to get all providers that are external to the
package. These DependencyRelationship objects each represent a
class from an external (imported) package. They can then be
grouped into collections by package of origin; each of these
collections represents a dependency on an imported package.
The Package API itself is determined by class visibility. When
I refer to classes here, I'm referring to class files, so this
includes interfaces and abstract classes.
Only public classes represent the Package API; from there
you drill down to the public class details (a small example
follows the list):
* all public method and field signatures.
* all protected method and field signatures.
* serialVersionUID
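For instance, in the (made-up) class below, only the marked members would appear in the Package API signature:

package com.example.lookup; // hypothetical package

public class Registrar implements java.io.Serializable { // public class: part of the Package API
    private static final long serialVersionUID = 2L;     // captured as part of the API

    public String name;      // public field: in the API
    protected int retries;   // protected field: in the API
    private long cacheSize;  // private: not part of the API

    public void register(String item) { }  // public method: in the API
    protected void retry() { }             // protected method: in the API
    void sweep() { }                       // package-private: not part of the API
}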
The DependencyPackageAPI signatures can be checked against the
PackageAPI signatures.
I guess this could be done with a method such as:

public interface PackageMirror {
    // true if this package's API satisfies everything the dependant requires
    public boolean satisfies(DependencyPackageAPI depends);
}
The devil's in the implementation details of the above method; that
devil is set out in Chapter 13, Binary Compatibility, of the Java
Language Specification, Third Edition.
Where this analysis gets really interesting is when a dependency
is published in the return values of an exported PackageAPI; in this
case a client bundle importing a package from this bundle will
also have to import a package that satisfies the common
dependency. The runtime will need to resolve all package
dependencies prior to loading.
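A contrived example of that situation (package names invented): the exported interface below publishes a type from another package in its return value, so any client of com.example.registry must also import a compatible com.example.lease:

// com/example/lease/Lease.java
package com.example.lease;

public interface Lease {
    long expiration();
}

// com/example/registry/Registry.java
package com.example.registry;

import com.example.lease.Lease;

public interface Registry {
    // The return type publishes com.example.lease in this package's API,
    // so the dependency must appear in the client requirements.
    Lease register(Object service);
}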
To generate the metadata I might persist to XML all the
DependencyRelationship objects for a bundle's exported packages,
as well as the subset of DependencyRelationship objects from the
import package requirements.
The metadata might be stored in three files in the jar:
depends.xml
provides.xml
client_requirements.xml - packages that use this package X will
also depend on these other packages, Y and Z; full PackageAPI
signatures must be captured for Y and Z as well.
Or to be consistent with OSGi:
imports.xml
exports.xml
client_requirements.xml
If the metadata is too large (though I think it'll be ok), a
SHA-1 checksum of these files might suffice, so they could be
looked up as required.
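Purely as a strawman for that textual form (the element and attribute names are invented), an exports.xml fragment might capture one exported package and its public signatures like this:

<exports>
    <package name="com.example.registry" version="2.1.1">
        <class name="com.example.registry.Registry" kind="interface">
            <method access="public" name="register"
                    descriptor="(Ljava/lang/Object;)Lcom/example/lease/Lease;"/>
        </class>
    </package>
</exports>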
Currently my implementation only has the dependency
relationships; it doesn't yet harvest all the method and field
signatures, although this is quite straightforward with ASM.
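To illustrate why it's straightforward (a minimal sketch, not the actual tool, assuming a recent ASM on the classpath):

import java.io.IOException;
import java.io.InputStream;
import org.objectweb.asm.ClassReader;
import org.objectweb.asm.ClassVisitor;
import org.objectweb.asm.FieldVisitor;
import org.objectweb.asm.MethodVisitor;
import org.objectweb.asm.Opcodes;

/** Prints the public and protected member signatures of one class file. */
class SignatureHarvester extends ClassVisitor {

    SignatureHarvester() {
        super(Opcodes.ASM9);
    }

    private static boolean exported(int access) {
        return (access & (Opcodes.ACC_PUBLIC | Opcodes.ACC_PROTECTED)) != 0;
    }

    @Override
    public MethodVisitor visitMethod(int access, String name, String descriptor,
                                     String signature, String[] exceptions) {
        if (exported(access)) {
            System.out.println("method " + name + descriptor);
        }
        return null; // method bodies aren't needed
    }

    @Override
    public FieldVisitor visitField(int access, String name, String descriptor,
                                   String signature, Object value) {
        if (exported(access)) {
            System.out.println("field " + name + " " + descriptor);
        }
        return null;
    }

    static void harvest(InputStream classFile) throws IOException {
        // SKIP_CODE: we only want declarations, not bytecode instructions.
        new ClassReader(classFile).accept(new SignatureHarvester(), ClassReader.SKIP_CODE);
    }
}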
As a result of defining the API that way, a typical implementation
will only depend upon the interface, not package-private classes or
implementation details. A later version of a package could change a
public class to a public interface and reimplement all methods in
several package-private classes, without altering the dependency
metadata, and still satisfy backward compatibility.
Just a quick question: does this "Package API" differentiate
between a bundle simply using an interface vs one that implements
it? Just wondering.
Yes, very much so, see above; one that simply uses it would not
publish it in its client requirements, whereas one that implements
it will.