I agree with Gunnar Sjödin of CICS-Kista/Stockholm when he says:
"I think it is imperative that the decision to use a particular voting system
in a public election should be an informed one. Thus, it is the civic duty
of those possessing the knowledge to understand the security aspects
of such systems to inform the public."

However, I find Gunnar's proposed Principles 1 and 2 much less helpful than
he suggests, and in direct contradiction with old and well-known principles in
the design of secure communication systems. These are the two principles he
suggests:

> PRINCIPLE 1.
>
>   Any piece of information in a system, even if supposedly secret, must
>   be assumed known to an attacker unless proven otherwise.
>
> PRINCIPLE 2.
>
>   Unless there is a specific reason, information that could be used by an
>   attacker should be kept secret.

It has been customary in cryptography, at least since the work of de Vigenère in
1585, to assume that the enemy knows all about the encryption methods used
-- for example, through capture of an encryption machine, through collusion, or by
observation. The only secret information in a cryptographic system must
be the key itself, not the method used. Gunnar's principles, however, lump
both method and key together as "information" and fail to make this important
distinction.

Another problem with Gunnar's suggestion is that it is not possible to prove that
information is secret, whatever that information is.  Any successful breach of
security will, by definition, be unknown.  An unfaithful proxy may reveal a secret,
for example, and no one will ever know that it was even revealed.

Thus, much more useful is the traditional maxim that "All cryptographic methods
are supposed to be known by attackers; the only secret information in a cryptographic
system must be the key itself."

Another useful principle was introduced by Claude Shannon in 1949, in the paper
"Communication Theory of Secrecy Systems" [Sha49], which used the earlier-defined
concept of data entropy [Sha48] to revisit the concept of data secrecy.
According to Shannon, and now widely used as a security maxim, when all options are
equally probable, there is no information gained from knowing the list of options.
So, even more effective than keeping information secret (a secret that can be broken
by collusion, for example) is to keep information as a list of equally probable
choices. This is the principle behind the "one-time pad" encryption scheme, so named
because of the pads of paper used to implement it in WWII. The scheme is
unconditionally secure (if the key is perfectly random) -- Shannon noted that for any
alphabetic substitution cipher with a random key of length greater than or equal to
the length of the message, the plaintext can never be derived from the ciphertext
alone. This principle led to Shannon's definition of unicity [Sha49], which is also
useful for elections [Ger99].

Safevote's protocols use these principles in several different ways, together with
symmetric and asymmetric encryption techniques, as disclosed in
http://www.safevote.com/tech.htm.

In the case of the IP numbers of the Contra Costa Internet voting network defined
by Safevote, the range of IP numbers of the two election machines EM1.safevote.com
and EM2.safevote.com was kept secret by using Shannon's concept of high-entropy
encoding -- not because some employee of Safevote kept it secret. Safevote is not a
security threat, by design.

The public IP address of the router/hub was dynamic. It was randomly assigned by
whichever ISP was being used, from a list of 12 possible phone numbers. So even we
at Safevote did not know which IP address was in use at any given time, because the
address was assigned dynamically in different dial-up events, to different phone
numbers and different ISPs around the country (long-distance calls notwithstanding)
-- with millions of possible choices.

So, even though an attacker knows all the public IP addresses on the Internet, and
can even reduce that list to the addresses of ISPs physically located in the US (we
declared we would use ISP servers in the US), all such IP addresses are equally
probable and the list is extremely large (many millions even if only ISPs in
California are considered). Thus, there is no information gained by knowing this
list of IP addresses -- which is our design goal and illustrates the second
principle we use.
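To put rough numbers on this, the "no information gained" claim can be checked with
a short entropy calculation (the candidate count below is illustrative only, not
Safevote's actual figure):

```python
import math

# With N equally probable candidate addresses, the entropy is log2(N)
# bits and an attacker's best single guess succeeds with probability 1/N.
n_candidates = 5_000_000  # illustrative: dynamically assignable ISP addresses
entropy_bits = math.log2(n_candidates)
guess_probability = 1 / n_candidates

print(f"entropy: {entropy_bits:.1f} bits")            # ~22.3 bits
print(f"chance of one guess: {guess_probability:.2e}")

# Knowing the full candidate list adds nothing: a uniform distribution
# over the list already maximizes the attacker's uncertainty.
```

Shrinking the list (say, to California ISPs only) lowers the entropy, which is why
the design relies on the list staying very large.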

To be clear, Safevote uses two time-tested principles in its design:

Principle A (disclosure of all methods):

 All cryptographic methods are supposed to be known by attackers; the only secret
 information in a cryptographic system must be the key itself.

Principle B (maximum entropy):

 When all options are equally probable, there is no information gained from knowing
 all the options.

By combining Principles A and B, Safevote may use 100% open source software and
open peer review of protocols while not revealing which algorithms and which IP numbers
are being used, from a known list of choices.
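As an illustration of how Principles A and B combine, one might select the working
algorithm uniformly at random from a published list and keep only the choice secret
(a sketch; the algorithm names are hypothetical, not Safevote's certified list):

```python
import secrets

# Principle A: the candidate algorithms are public and open to review.
CERTIFIED_ALGORITHMS = ["AES", "Twofish", "Serpent"]  # hypothetical list

# Principle B: the one actually used is chosen uniformly at random and
# treated as secret key material. Since every option is equally probable,
# publishing the list itself gives an attacker nothing.
chosen = secrets.choice(CERTIFIED_ALGORITHMS)
assert chosen in CERTIFIED_ALGORITHMS
```

The attacker's uncertainty here comes from the uniform choice, not from hiding the
methods themselves.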

A beneficial by-product of Principle A is thus the motivation for the IVTA and for
Safevote's public disclosure of information:

Principle C (open source principle):

"Given enough eyeballs, all bugs are shallow."

Overall, these three principles allow a sound security design to be built, much more
reliably than simply using the "not telling" approach. However, "not telling" is
also in our toolbox and is used for secret keys. In tests, however, we may make it
easier for attackers and reduce the entropy of the IP numbers used, as we did.

Our statement quoted from http://www.safevote.com/tech.htm
is thus based on sound design rules:

> Why are we releasing this information? For several reasons, as we
> explain in the October 2000 issue of the newsletter The Bell, page 3.
> Of course, in a real election we would not release any information
> that could facilitate either type of attack -- network or data. We
> understand that such a strategy can be accomplished even with open
> source software and open peer review of protocols since, for example,
> we may not reveal which algorithm is being used for encryption (from
> a known list of certified algorithms). In a real election we would also not
> reveal which range of IP numbers is being used, and other configuration
> data.

Cheers,

Ed Gerck

-----------------------
REFERENCES:

[Sha48] Shannon, C., "A Mathematical Theory of Communication," Bell Syst. Tech. J.,
vol. 27, pp. 379-423, July 1948. See also
http://cm.bell-labs.com/cm/ms/what/shannonday/paper.html for a web copy.

[Sha49] Shannon, C., "Communication Theory of Secrecy Systems," Bell Syst. Tech. J.,
vol. 28, pp. 656-715, 1949. See also http://www3.edgenet.net/dcowley/docs.html for
readable scanned images of the complete original paper.

[Ger99] Gerck, E., "Unicity, DES Unicity, Open-Keys, Unknown-Keys," published
by the MCG at http://www.mcg.org.br/unicty.htm
