Tanner Lovelace <[EMAIL PROTECTED]> writes:

>If all you used was the databases you would be correct. However, if
>the databases disagreed, you shouldn't even try to guess which one is
>correct. Instead, you fallback to using the printed version of the ballot.
Technological considerations aside, what we're talking about would be a
checksum: something to indicate that what you think is your count may not be
correct.  But you've shifted the objective from "make sure the count is
correct" to "make sure the /first/ count is correct."  In effect, the checksum
flag becomes a "don't bother to actually count the printed ballots" flag.
Personally, I think that's the wrong signal to send for a democratic election.

>This would actually be much easier in the Diebold system than
>the one I was envisioning, but there are several methods to guard
>against this too.  The first is that you keep a count of the number
>of voters that actually vote and compare the totals to that.  If the
>totals are over the count, you've got a problem.

Being able to identify a problem is a good first step, but you have to be able
to move forward from that point, or knowing there's a problem buys you little
(a toy sketch of such a reconciliation check follows below).  In Florida in
2000, the problem of 'hanging chad' forced election officials to make the
difficult decision either to interpret individual ballots as votes for one
candidate or another, or to conclude that voter intent could not be adequately
determined and toss the ballot out, disenfranchising that voter.  Knowing
(even definitively) that a ballot box contains more (or fewer) ballots than it
should casts doubt on every ballot in that box.  That's what happened in
Carteret County, NC in 2004, so it's not just a theoretical problem.
(Extrapolating further, the use of standardized software or machines across
all booths in a precinct, all precincts in a district, and all districts in a
state creates the problem that if the software or machine becomes suspect, so
does every vote cast using that software or machine at the precinct, district,
or statewide level.)

>Have the machine doing the votes print out the ballot
>in a machine readable font and have the scan machine
>read it from there.

Agreed, with emphasis on two points: 1) the font should be /human readable/
first and /machine readable/ second.  2) The indication of voter intent (what
gets printed) can only be printed once; both human and machine have to look at
the same thing, to avoid the problem of a discrepancy between the two.

>> >Finally, all the code should be open. Perferably open source/free software,
>> >but I think at a minimum it should be easily independently auditable.
>>
>> Not helpful.  There's no way for most people to verify source code
>> does what it says anyway, no way to ensure that the compiler, loader,
>> hardware, etc hasn't been compromised, and no way to verify that
>> what's running in the system is the same as what was reviewed
>> beforehand.
>
>Even if most people won't be able to verify the source code, some will
>and that's what counts.

You're building a stack of trust: you trust that someone will do the spotting,
that the spotting will be effective, that it will be done to benefit you, that
the compiler has not been compromised, that the hardware has not been
compromised, that the software loaded onto the box is actually the software
which got reviewed, and so on.  There are strategies we can employ to ensure
that trust is well founded (opening the source, public discussion, commodity
hardware, cryptographic checksums, armed guards, etc.) but each one adds
complexity and cost.  In cases where there is no option but to trust, these
might be appropriate.  But there's no requirement for trust here.  Remember:
trust is the enemy of security.
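Take the "cryptographic checksums" item from that list as a concrete example.
Mechanically, checking a checksum is trivial; the cost and complexity are in
everything around it: who publishes the expected value, over what channel, who
re-checks it on election day, and what happens when it doesn't match.  A
minimal sketch in Python, using a hypothetical file path and a placeholder
published hash, not anyone's actual procedure:

```python
# Minimal sketch of the "cryptographic checksum" strategy mentioned above.
# The path and the expected hash are hypothetical; a real system still needs a
# trustworthy channel for publishing the expected value in the first place.
import hashlib

EXPECTED_SHA256 = "<hash of the reviewed build goes here>"  # placeholder

def image_matches_review(path: str) -> bool:
    """Hash the installed software image and compare it to the published value."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest() == EXPECTED_SHA256

# Even a match only tells you the file on disk is the reviewed one; it says
# nothing about the compiler, the loader, or the hardware underneath it.
```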
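And going back to the point about comparing the number of voters who actually
voted against the totals, here is the toy reconciliation sketch I promised.
All the names and numbers are made up; the point is only that raising the flag
is easy, while the flag itself tells you nothing about which ballot in the box
is the problem:

```python
# Illustrative only: a toy reconciliation check, not any real voting system's
# code.  All names (signed_in_voters, paper_ballots, machine_tally) are made up.

def reconcile(signed_in_voters: int, paper_ballots: int, machine_tally: dict) -> list:
    """Return a list of human-readable discrepancies; empty means the counts agree."""
    problems = []
    total_machine_votes = sum(machine_tally.values())

    if paper_ballots != signed_in_voters:
        problems.append(
            f"ballot box holds {paper_ballots} ballots but "
            f"{signed_in_voters} voters signed in"
        )

    if total_machine_votes != paper_ballots:
        problems.append(
            f"machine reports {total_machine_votes} votes but "
            f"{paper_ballots} paper ballots were found"
        )

    return problems


# Example: one ballot too many.  The check fires, but it cannot tell you which
# ballot is the extra one, which is exactly the Carteret County problem.
print(reconcile(412, 413, {"Candidate A": 210, "Candidate B": 203}))
```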
The only thing we really care about is that when the voter says "Candidate A"
the ballot says "Candidate A".  That's not so much Open Source as Open
Document, and all but the most exceptional of idiots can verify the
correctness of the system.

>Closing the source code brings no benefit
>at all to the voters and to the process itself.

Worse, employing closed source in a process demanding openness (such as an
election) raises doubts about the trustworthiness of the process itself.
(Hence this off-topic discussion.)  That fact alone should make the use of
closed-source software verboten in cases where trust is required.

>But I don't agree that technology cannot be used at all.

Technology should be used appropriately.  It's appropriate to use technology
to make voting easier for disabled people, to make vote tabulation more
accurate and more efficient for vote tabulators, and so on.  You are not alone
among FOSS advocates in wanting to see their shared FOSS (ultimately
democratic) values filter into the electoral system.

-- 
Steve Holton
[EMAIL PROTECTED]
"Convenience causes blindness.  Think about it."
