Hi Alan,
My apologies; finalizer removal is a positive step. I had thought it
was motivated by "the code is trusted, so we don't need to worry about
finalizer attacks".
Also apologies for the long email, but it seems appropriate for
information handover.
Atomic construction guarantee: if invariant checks during construction
throw an exception before Object's zero-argument constructor is called,
no object instance is created. Creation of the object is atomic; either
it succeeds with all invariants satisfied, or no object is created, so
an object can never exist in a broken state where its invariants aren't
satisfied. We apply this constructor safety guarantee to
deserialization of data: the code performing deserialization is
required to check invariants, and we require that users are
authenticated. This avoids parsing untrusted data while still
validating user data. We enforce it using SM infrastructure.
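For anyone unfamiliar with the idiom, here is a minimal sketch (my own
class and method names, not JGDMS's actual API): the invariant check is a
static method evaluated as a constructor argument, so it runs before any
superclass constructor. If it throws, no instance ever exists, and there
is nothing for a finalizer attack to resurrect.

```java
// Sketch of the atomic construction idiom. check() runs while the
// arguments to this(...) are evaluated, i.e. before Object's
// constructor, so a failed check means no instance is ever created.
final class Period {
    private final long start;
    private final long end;

    public Period(long start, long end) {
        this(check(start, end), start, end);
    }

    // Private constructor only reachable after a successful check.
    private Period(boolean checked, long start, long end) {
        this.start = start;
        this.end = end;
    }

    private static boolean check(long start, long end) {
        if (end < start) {
            throw new IllegalArgumentException("end < start");
        }
        return true;
    }

    public long start() { return start; }
    public long end()   { return end; }
}
```

The same shape works for deserialization: have the deserialization
constructor route its wire data through a static validator before any
superclass constructor runs.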
I understand JEP 411 is a business decision: there wasn't enough
adoption following the fallout of applets, with businesses and users
running afoul of untrusted code and suffering ongoing, public pain. The
remaining attempts in JEP 411 to explain why POLP is a technical
failure apply only to the default implementation and are incorrect when
applied to other implementations. It is a commercial failure, as
suggested by the low adoption rates cited in JEP 411, but that is due
to a lack of investment in tooling. I suspect OpenJDK has
underestimated adoption, though probably not by a large margin, and
that removal will be more painful than OpenJDK anticipates. I have a
perfectly good, reliable, publicly available working example (for
years now), contrary to JEP 411's technical claims.
OpenJDK's decision has been made, and those affected must assess it and
make their own decisions. The following only serves to share my
thoughts and insights; no need to read further if it is not of
interest. Our Java programs are going into care and maintenance while
we assess suitable replacement development platforms.
<->
Applets relied on SM (perhaps SM only exists due to their success), but
applets themselves weren't the cause of their own demise; for that we
have Java Serialization to thank. Otherwise applets were a commercial
success, and had they remained so, SM would have remained as well. It
appears to be inexorably tied to the fate of applets now.
Serialization needed an atomic replacement before 2008, when it was
becoming obvious that Java serialization was insecure. OpenJDK could
still fix Java serialization without using whitelist filters
(ironically, whitelisting is a complication of SM that reduced
adoption, and the same will likely occur with serialization whitelists
if tooling isn't provided) by removing the ability to serialize
circular object graphs, or by disabling it by default. We had circular
object graphs in JGDMS (which heavily utilised Java serialization), but
we refactored these out after implementing atomic deserialization, and
we did so in a way that didn't require breaking the serial form
compatibility of existing classes (unless they contained circular
references). This keeps serial data invariant validation code with the
object implementation, rather than in a separate whitelist (and it is
more powerful and less complex than whitelisting), reducing complexity
and maintenance. Because failure is atomic, an attacker cannot
formulate a gadget chain; type safety is also read ahead and checked
prior to deserialization of data. The development of atomic
serialization started with atomic deserialization, which was completed
a few years ago. Atomic serialization itself was under current
development, with new explicit public API methods used for
serialization, to avoid any issues with reflection and module access
controls; we were still using Java serialization to serialize, but an
alternative AtomicObjectInputStream to deserialize.
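To illustrate the "validation lives with the class" point for readers
who haven't seen it: even plain Java serialization lets a class re-check
its own invariants in readObject, as in the minimal sketch below (Range
is a hypothetical class of mine, not JGDMS code). Atomic deserialization
goes further by rejecting bad data before any instance is constructed
at all, rather than after the fields have been populated.

```java
import java.io.IOException;
import java.io.InvalidObjectException;
import java.io.ObjectInputStream;
import java.io.Serializable;

// Sketch: invariant checks kept with the class itself instead of in an
// external whitelist filter. This is the standard readObject idiom.
final class Range implements Serializable {
    private static final long serialVersionUID = 1L;
    private final int low;
    private final int high;

    public Range(int low, int high) {
        if (high < low) throw new IllegalArgumentException("high < low");
        this.low = low;
        this.high = high;
    }

    private void readObject(ObjectInputStream in)
            throws IOException, ClassNotFoundException {
        in.defaultReadObject();
        // Re-validate the wire data; reject streams that would create a
        // broken instance.
        if (high < low) throw new InvalidObjectException("high < low");
    }

    public int low()  { return low; }
    public int high() { return high; }
}
```

The weakness of this idiom, and part of what motivated the atomic
approach, is that the object already exists by the time readObject
throws, whereas with atomic construction a validation failure means no
object was ever created.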
SM performance isn't an issue; my policy implementation is high scaling
and has no hotspots. Neither is deployment: we have tools to generate
policy files (more than one) and have been doing so for many years. The
first tool was written by Sun Microsystems circa 2004, I think; it
still required policy file editing, but it listed the permissions
required. The second tool was written approximately 8 years ago. Our
use cases have passed the tests of time. I don't believe people hand
author policy files in this age of computing: I've seen examples of
policy generation tools by other authors on GitHub. Sure, some
developers might grant AllPermission to get something running, or for
tests, but I haven't seen anyone serious about security in production
do that. I don't use the built-in policy provider (it has a blocking
permission cache that negatively impacts performance); my policy
implementation doesn't have a cache and is many magnitudes faster and
high scaling thanks to shared