Stuart - comments inlined...
On Fri, Jan 27, 2012 at 12:15 AM, Stuart Monteith <stuk...@stoo.me.uk> wrote:
>-8 snip
1. While the API is a good idea, there is no demand from the community.
I agree that is true now - it wasn't when we started.
Yes - I think the answer is to produce something that would be in
demand. Kato won't graduate without a viable community.
2. The target audience is very limited.
3. Oracle's resources are limited, and the JSR is targeted at a limited
audience.
4. Would need more feedback from the community - but wouldn't be done in
the JDK 8 time frame.
We've been through the process of getting community feedback. The response
was generally positive. Steve can fill us in further.
The promise of the tools we put together did attract a lot of attention -
the long delay in delivering them has obviously turned people off.
The idea of the Java equivalent of "dbx for corefiles" is always
attractive.
Yes - I have my doubts, though, about making "dbx for corefiles" produce
useful information while also being reliable and having low overhead.
Obviously we'll exploit any improvements to JVM diagnostics as and when
they come along.
The target audience for the API was limited - tool vendors and JDK
support. The target audience for the tools that were to be based on the
Kato API were Java developers, support, and operations.
My view is that while the API is an interesting concept, it is at too low a
level for there ever to be community interest in its development - there
are so few vendors. When we started this there was at least Sun, Oracle
(who had just bought BEA), Intel and IBM. Now we are reduced to Oracle and
IBM, which is a very small community indeed, and as they are both
cooperating somewhat through OpenJDK it is effectively even smaller.
Yes agreed.
I see us continuing in the following form:
-8 snip
I agree with all these points, but I would point out that we said at the
beginning of Kato that the JSR 326 spec would just "fall out" of the Kato
implementation - i.e. a code-led spec. That's still true, so I think we
should reserve judgement on disconnecting JSR 326 from Kato. I'd
rather ignore it for now. We should just focus on producing the tools and
supporting API we want and worry about any form of standardisation later.
I agree - let's ignore the JSR for now. The API is now a means to an
end, rather than an end in itself. The goal of the Kato project then
changes to producing post-mortem debugging tooling and other offline JVM
debugging tooling.
I propose we should vote on changing the project's goals. We do want to
be clear about what we are trying to achieve - otherwise people won't know
whether they should be interested.
Can we produce something that is compelling?
-8 Snip
But is there anything else?
Creating a connector between JDI and a dump is good and useful, since it
makes use of a familiar interface. I would like to go further and produce
more comprehensive diagnostic tools that would not fit into the JDI model.
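To make the connector idea concrete: JDI already models debugger transports as pluggable Connectors, so a dump-backed implementation would appear alongside the stock socket and process connectors. The dump connector itself is hypothetical - this sketch only lists the connectors a standard JDK registers, to show the extension point such a tool would plug into.

```java
import com.sun.jdi.Bootstrap;
import com.sun.jdi.VirtualMachineManager;
import com.sun.jdi.connect.Connector;

// Lists the JDI connectors registered with this JDK. A hypothetical
// "core dump" connector would show up in this same list, letting any
// JDI-based debugger attach to a dump through the familiar interface.
public class ListConnectors {
    public static void main(String[] args) {
        VirtualMachineManager vmm = Bootstrap.virtualMachineManager();
        for (Connector c : vmm.allConnectors()) {
            System.out.println(c.name() + " - " + c.description());
        }
    }
}
```

On a stock JDK this prints entries such as the socket-attach and command-line-launch connectors; a dump connector would simply be one more entry.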
I mentioned serialisation before as one example. Take a serialisation
stream and present it in a graphical view. For deserialisation failures,
take a set of classes and a serialisation stream and show where and how the
stream doesn't match the classes provided. That one really bugs me. At my
serialisation talk at JavaOne last year I got a ton of people asking how to
debug these sorts of problems.
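As a rough sketch of where such a tool would start: every Java serialisation stream opens with a fixed header (magic 0xACED, version 5) followed by typed records (TC_STRING, TC_OBJECT, TC_CLASSDESC, ...) defined in java.io.ObjectStreamConstants. A stream visualiser, or a tool that diffs a stream against a set of classes, would walk those records. The class below is just an illustration of reading the header, not part of any Kato API.

```java
import java.io.*;

// Minimal illustration: serialise a value, then read back the fixed
// stream header and the first record tag that a stream-inspection tool
// would dispatch on.
public class StreamHeader {
    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject("example payload");
        }
        DataInputStream in = new DataInputStream(
                new ByteArrayInputStream(bytes.toByteArray()));
        int magic = in.readUnsignedShort();   // always 0xACED
        int version = in.readUnsignedShort(); // always 5
        int firstTag = in.readUnsignedByte(); // 0x74 = TC_STRING here
        System.out.printf("magic=0x%04X version=%d first-record=0x%02X%n",
                magic, version, firstTag);
    }
}
```

This prints `magic=0xACED version=5 first-record=0x74`; a real tool would keep consuming tags and cross-check the embedded class descriptors against the classes the user supplies.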
Other ideas:
Change or complement the API with something more SAX-style (rather
than the DOM style we have today) and add a good query mechanism on
top.
Implement the snapshot dump API so that we can get smaller dumps.
Create Eclipse plugins for these new things so that developers have a
better experience than a command line tool.
I'm sure I will think of others :-)
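To make the SAX-versus-DOM contrast above concrete, a push-style dump reader might look something like the toy below: events are streamed to a visitor as the dump is scanned, instead of materialising the whole image as an object graph first. None of these names come from the actual Kato API - they are purely illustrative.

```java
// Hypothetical SAX-style (push/streaming) dump-reader interface.
interface DumpVisitor {
    void onThread(String name);
    void onHeapObject(long address, String className, long size);
}

// Toy reader that pushes events as it "scans" a dump. In a real
// implementation the events would come from parsing the dump file
// sequentially; here they are hard-coded.
class ToyDumpReader {
    void scan(DumpVisitor visitor) {
        visitor.onThread("main");
        visitor.onHeapObject(0x7f00L, "java.lang.String", 24);
        visitor.onHeapObject(0x7f20L, "java.util.HashMap", 48);
    }
}

public class SaxStyleDemo {
    public static void main(String[] args) {
        final long[] total = {0};
        // A simple query ("total bytes seen") expressed as a visitor,
        // without ever holding the whole dump in memory.
        new ToyDumpReader().scan(new DumpVisitor() {
            public void onThread(String name) {
                System.out.println("thread: " + name);
            }
            public void onHeapObject(long addr, String cls, long size) {
                total[0] += size;
            }
        });
        System.out.println("total heap bytes seen: " + total[0]); // 72
    }
}
```

A query mechanism layered on top would then be a matter of composing visitors, which is where the streaming style pays off for very large dumps.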
The modules we have for the CLI (i.e. Tomcat diagnosis) could be one
approach. I still like the idea of first-failure data capture (FFDC) tools
that perform FFDC without having to explicitly store information that might
be incomplete. I will have to think in more detail about what we have and
how I'd like to contribute.
Regards,
Stuart