A quick follow-up note on this: I was reading Microsoft's code-signing best
practices document, and one comment caught my eye:

  If code is signed automatically as part of a build process, it is highly
  recommended that any code that is submitted to that build process be
  strongly authenticated.
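
  As a minimal sketch of what "strongly authenticated" could mean at the
  submission step, the build front end might verify a MAC over each submitted
  artifact before it ever reaches the signer.  The shared-secret HMAC scheme
  and the key name below are assumptions for illustration only; a real
  deployment would more likely use per-developer credentials or client
  certificates rather than a single shared key.

  ```python
  import hashlib
  import hmac

  # Hypothetical per-developer secret (an assumption for this sketch).
  SUBMITTER_KEY = b"per-developer-secret"

  def authenticate_submission(code: bytes, tag: bytes) -> bool:
      """Accept a build input only if its MAC verifies."""
      expected = hmac.new(SUBMITTER_KEY, code, hashlib.sha256).digest()
      return hmac.compare_digest(expected, tag)

  # A well-formed submission verifies; a tampered one does not.
  code = b"driver binary contents"
  tag = hmac.new(SUBMITTER_KEY, code, hashlib.sha256).digest()
  assert authenticate_submission(code, tag)
  assert not authenticate_submission(code + b"tampered", tag)
  ```

  Note this only authenticates the submission channel; it does nothing about
  a compromised developer PC that holds a valid credential, which is exactly
  the gap discussed below.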

Given that Realtek produce huge numbers of products, and therefore even larger
numbers of drivers for different environments (and JMicron probably not much
less so), it's likely that they use a highly automated driver-signing process
to cope.  Even if they follow the guidelines and use "strong authentication",
for example by making the network share on the company-wide signing server
non-public (i.e. requiring Windows domain authentication), anyone who
compromises any PC on the network can still use the code-signing server as an
oracle.  So there may have been no need to steal the key at all: compromise
one developer PC and you can get your malware signed.  Perhaps a corollary
requirement might be:

  Your code-signing system should create a tamper-resistant audit trail [0] of
  every signature applied and what it's applied to.


[0] By this I don't mean the usual cryptographic Rube-Goldbergery, just
    logging the details to a separate server with a two-phase commit protocol,
    to minimise the chance of phantom, non-logged signatures being created.
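
    To make the two-phase idea concrete, here is a minimal sketch.  The
    class names and the in-process LogServer standing in for a separate
    audit server are assumptions for illustration, and the "signature" is a
    placeholder string rather than a real signing operation.  The point is
    the ordering: the audit record is prepared (and acknowledged) before any
    signature is issued, and committed afterwards, so a signature with no
    corresponding log entry should never occur, and an uncommitted pending
    entry flags an anomaly.

    ```python
    import hashlib
    import json
    import time

    class LogServer:
        """Stand-in for the separate audit server.  A record is final
        only once both the prepare and commit phases have been seen."""
        def __init__(self):
            self.pending = {}
            self.committed = []

        def prepare(self, record) -> str:
            rid = hashlib.sha256(
                json.dumps(record, sort_keys=True).encode()).hexdigest()
            self.pending[rid] = record
            return rid  # ack: only now is it safe for the signer to proceed

        def commit(self, rid: str) -> None:
            self.committed.append(self.pending.pop(rid))

    def sign(log: LogServer, artifact: bytes) -> str:
        record = {"sha256": hashlib.sha256(artifact).hexdigest(),
                  "time": time.time()}
        rid = log.prepare(record)        # phase 1: log before signing
        signature = "sig-over-" + record["sha256"]  # placeholder signing op
        log.commit(rid)                  # phase 2: confirm signature issued
        return signature

    log = LogServer()
    sign(log, b"driver.sys")
    assert len(log.committed) == 1  # every signature leaves an audit entry
    ```

    If the signer cannot reach the log server in phase 1, it refuses to
    sign; if it dies between the two phases, the orphaned pending entry is
    itself evidence that something went wrong.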

The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to majord...@metzdowd.com
