Hi everyone,

I'm a bit concerned about how we're approaching the byte-code
instrumentation of JDK and framework classes.

There are basically two ways:
* we either replace existing methods/classes with new implementations; or
* we modify existing methods/classes by looking for patterns in the
byte code (a sketch of this follows below)
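
As a purely hypothetical illustration of the second approach, here's what
such a pattern-matching visitor could look like with the ASM API (I'm
using the current ASM visitor signatures; the target pattern and the
com/example/Clock replacement class are made up for the example):

import org.objectweb.asm.MethodVisitor;
import org.objectweb.asm.Opcodes;

class ClockCallRewriter extends MethodVisitor {
    ClockCallRewriter(MethodVisitor mv) {
        super(Opcodes.ASM9, mv);
    }

    @Override
    public void visitMethodInsn(int opcode, String owner, String name,
                                String descriptor, boolean isInterface) {
        // Pattern match on a specific call site in the byte code...
        if (opcode == Opcodes.INVOKESTATIC
                && "java/lang/System".equals(owner)
                && "currentTimeMillis".equals(name)) {
            // ...and redirect it to our own implementation.
            super.visitMethodInsn(Opcodes.INVOKESTATIC, "com/example/Clock",
                                  "currentTimeMillis", descriptor, isInterface);
        } else {
            super.visitMethodInsn(opcode, owner, name, descriptor, isInterface);
        }
    }
}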

Both of these approaches are tightly tied to the exact version of the
instrumented class. When incompatible versions of those classes appear
in later JDK or framework releases, we basically have to rely on tests
failing (which rarely happens, since we don't test against every
possible framework version) or on users reporting that something
doesn't work as expected.

I'm wondering whether we should be stricter about which class versions
we actually support for each instrumentation.

An initial idea I have is to compute checksums of the byte-code arrays
for each supported class and to map those to the adaptors. This would
also give us clearly isolated adaptors for classes that have the same
name but different implementations. When a new JDK or framework version
changes the implementation of an instrumented class, we can fail fast,
since its checksum will not be among the ones we support. I personally
think that this is much better than just hoping that it will work and
waiting until something bad happens at runtime. One downside is that
computing the checksums at runtime for each class with custom
instrumentation could be quite expensive.
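
To make the idea concrete, here's a minimal sketch of such a
checksum-gated transformer. ClassAdaptor and the wiring are hypothetical
stand-ins, not our actual adaptor interfaces. One wrinkle: the JVM
silently ignores exceptions thrown from transform(), so the fail-fast
has to be an explicit report rather than a throw:

import java.lang.instrument.ClassFileTransformer;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.security.ProtectionDomain;
import java.util.Map;

public class ChecksumGatedTransformer implements ClassFileTransformer {
    /** Hypothetical stand-in for our real adaptors. */
    public interface ClassAdaptor {
        byte[] adapt(byte[] classBytes);
    }

    // hex SHA-1 of each supported class version -> its adaptor
    private final Map<String, ClassAdaptor> supported;

    public ChecksumGatedTransformer(Map<String, ClassAdaptor> supported) {
        this.supported = supported;
    }

    public byte[] transform(ClassLoader loader, String className,
                            Class<?> classBeingRedefined,
                            ProtectionDomain pd, byte[] classfileBuffer) {
        ClassAdaptor adaptor = supported.get(checksum(classfileBuffer));
        if (adaptor == null) {
            // Fail fast: this implementation of the class is unknown to us.
            // Report loudly and leave the class untransformed (return null).
            System.err.println("Unsupported version of " + className
                               + "; skipping instrumentation");
            return null;
        }
        return adaptor.adapt(classfileBuffer);
    }

    private static String checksum(byte[] classBytes) {
        try {
            StringBuilder sb = new StringBuilder();
            for (byte b : MessageDigest.getInstance("SHA-1").digest(classBytes)) {
                sb.append(String.format("%02x", b));
            }
            return sb.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new AssertionError(e); // SHA-1 is always available
        }
    }
}

On the cost concern: this only pays for one digest per class load, on
top of the byte-code parsing and rewriting pass we already do for
instrumented classes, so my gut feeling is the overhead would be small
in comparison. Worth measuring, though, before committing to it.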

Any thoughts?

Geert

--
Geert Bevin
Terracotta - http://www.terracotta.org
Uwyn "Use what you need" - http://uwyn.com
RIFE Java application framework - http://rifers.org
Music and words - http://gbevin.com
