[Clips] Bruce Schneier talks cyber law
--- begin forwarded text

Delivered-To: [EMAIL PROTECTED]
Date: Wed, 19 Oct 2005 23:33:54 -0400
To: Philodox Clips List [EMAIL PROTECTED]
From: R.A. Hettinga [EMAIL PROTECTED]
Subject: [Clips] Bruce Schneier talks cyber law
Reply-To: [EMAIL PROTECTED]
Sender: [EMAIL PROTECTED]

http://www.theregister.co.uk/2005/10/19/schneier_talks_law/print.html

The Register » Security » Network Security
Original URL: http://www.theregister.co.uk/2005/10/19/schneier_talks_law/

Bruce Schneier talks cyber law
By John Oates in Vienna (john.oates at theregister.co.uk)
Published Wednesday 19th October 2005 10:01 GMT

RSA Europe 2005 - ISPs must be made liable for viruses and other bad network traffic, Bruce Schneier, security guru and founder and CTO of Counterpane Internet Security, told The Register yesterday.

He said: "It's about externalities - like a chemical company polluting a river - they don't live downstream and they don't care what happens. You need regulation to make it bad business for them not to care. You need to raise the cost of doing it wrong."

Schneier said there was a parallel with the success of the environmental movement - protests and court cases made it too expensive to keep polluting and made it better business to be greener.

Schneier said ISPs should offer consumers clean-pipe services: "Corporate ISPs do it, why don't they offer it to my Mum? We'd all be safer and it's in our interests to pay. This will happen, there's no other possibility."

He said there was no reason why legislators should do such a bad job of drafting technology laws. Schneier said short-sighted lobbyists were partly to blame. He said much cyber crime legislation was unnecessary because it should be covered by existing laws - theft is theft and trespass is still trespass.
But Schneier conceded that getting international agreements in place would be very difficult, and that we remain at risk from the country with the weakest laws - in the same way we remain at risk from the least well-protected computer on the network.

--
R. A. Hettinga mailto:[EMAIL PROTECTED]
The Internet Bearer Underwriting Corporation http://www.ibuc.com/
44 Farquhar Street, Boston, MA 02131 USA
"... however it may deserve respect for its usefulness and antiquity, [predicting the end of the world] has not been found agreeable to experience." -- Edward Gibbon, 'Decline and Fall of the Roman Empire'

___
Clips mailing list [EMAIL PROTECTED]
http://www.philodox.com/mailman/listinfo/clips

--- end forwarded text

---
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]
[Clips] Read two biometrics, get worse results - how it works
--- begin forwarded text

Delivered-To: [EMAIL PROTECTED]
Date: Wed, 19 Oct 2005 23:32:55 -0400
To: Philodox Clips List [EMAIL PROTECTED]
From: R.A. Hettinga [EMAIL PROTECTED]
Subject: [Clips] Read two biometrics, get worse results - how it works
Reply-To: [EMAIL PROTECTED]
Sender: [EMAIL PROTECTED]

http://www.theregister.co.uk/2005/10/19/daugman_multi_biometrics/print.html

The Register » Internet and Law » Digital Rights/Digital Wrongs
Original URL: http://www.theregister.co.uk/2005/10/19/daugman_multi_biometrics/

Read two biometrics, get worse results - how it works
By John Lettice (john.lettice at theregister.co.uk)
Published Wednesday 19th October 2005 14:47 GMT

A regular correspondent (thanks, you know who you are) points us to some calculations by John Daugman, originator of the Daugman algorithms for iris recognition. These ought to provide disturbing reading for Home Office Ministers who casually claim that by using multiple biometrics (http://www.theregister.co.uk/2005/10/17/mcnulty_fingers_id_problem/) you'll get a better result than by using just one. Although that may seem logical, it turns out that it isn't, necessarily.

Daugman presents (http://www.cl.cam.ac.uk/users/jgd1000/combine/combine.html) the two rival intuitions, then does the maths. On the one hand, a combination of different tests should improve performance, because more information is better than less information. But on the other, the combination of a strong test with a weak test to an extent averages the result, so the result should be less reliable than if one were relying solely on the strong test. (If Tony McNulty happens to be with us, we suggest he fetches the ice pack now.)
The key to resolving the apparent paradox, writes Daugman, is that when two tests are combined, one of the resulting error rates (the False Accept or the False Reject rate) becomes better than that of the stronger of the two tests, while the other error rate becomes worse even than that of the weaker test. If the two biometric tests differ significantly in their power, and each operates at its own cross-over point, then combining them gives significantly worse performance than relying solely on the stronger biometric.

This is of particular relevance to the Home Office's current case for the use of multiple biometrics, because its argument is based on the use of three types of biometric - fingerprint, facial and iris - which are substantially different in power.

Daugman produces the calculations governing the use of two hypothetical biometrics, one with both false accept and false reject rates of one in 100, and the second with both rates at one in 1,000. On its own, biometric one would produce 2,000 errors in 100,000 tests, while biometric two would produce 200. You can treat the use of two biometrics in one of two ways - the subject must be required to pass both (the 'AND' rule), or the subject need only pass one (the 'OR' rule). Daugman finds that under either rule there would be 1,100 errors, i.e. 5.5 times more errors than if the stronger test were used alone.

He concludes that a stronger biometric is therefore better used alone than in combination, but only when both are operating at their crossover points. If the false accept rate (when using the 'OR' rule) or the false reject rate (when using the 'AND' rule) is brought down sufficiently (to smaller than twice the crossover error rate of the stronger test, says Daugman), then use of two can improve results. If we recklessly attempt to put a non-mathematical gloss on that, we could think of the subject having to pass two tests (in the case of the 'AND' rule) of, say, facial and iris.
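Daugman's worked example above can be sketched in a few lines of Python. This is only an illustration of the arithmetic, assuming the two tests are statistically independent and each operates at its crossover point, with the rates taken from the article:

```python
# Sketch of Daugman's two-hypothetical-biometrics calculation.
# Assumes independent tests; rates are the ones quoted in the article.
def combined_rates(fa1, fr1, fa2, fr2, rule):
    """Return (false_accept, false_reject) rates for two combined tests."""
    if rule == "AND":                        # subject must pass both tests
        fa = fa1 * fa2                       # impostor must fool both
        fr = 1 - (1 - fr1) * (1 - fr2)       # genuine user rejected by either
    elif rule == "OR":                       # subject need only pass one
        fa = 1 - (1 - fa1) * (1 - fa2)       # impostor fools either
        fr = fr1 * fr2                       # genuine user rejected by both
    else:
        raise ValueError(rule)
    return fa, fr

TRIALS = 100_000
weak = (1 / 100, 1 / 100)      # biometric one: FA = FR = 1 in 100
strong = (1 / 1000, 1 / 1000)  # biometric two: FA = FR = 1 in 1,000

# Stronger test alone: 100 false accepts + 100 false rejects = 200 errors.
alone = TRIALS * (strong[0] + strong[1])

for rule in ("AND", "OR"):
    fa, fr = combined_rates(weak[0], weak[1], strong[0], strong[1], rule)
    errors = TRIALS * (fa + fr)
    print(rule, round(errors))  # 1100 errors under either rule
```

Under either rule the total comes to 1,100 errors per 100,000 tests, i.e. 5.5 times the 200 errors of the stronger test alone - the combination trades a large improvement in one error rate for a much larger degradation in the other.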
Dropping the false reject rate of the facial test (i.e. letting more people through) in line with Daugman's calculations would produce a better result than using iris alone - but if the facial system rejects fewer people wrongly, then it will presumably be accepting more people wrongly.

This suggests to us that treating a second or third biometric simply as a fallback, to be used only if earlier tests fail, constructs a scenario where the combined results will be worse than use of the single stronger test: the primary biometric would have to be strong enough to stand on its own, because the second or third test won't always be used.

The deployment of biometric testing equipment in the field is also likely to have a confusing effect on relative error rates, because environmental factors will tend to impact the different tests to different degrees. Poor lighting may have an effect on iris and facial but not on fingerprint, while the aircon breaking down may produce greasy fingers and puffy red faces, but leave iris intact - which would presumably mess up attempts to sync error rates. But we feel ourselves beginning to intuit, and had perhaps best
Practical Security Mailing List
Hello,

I would like to notify you all of a new mailing list forum which I have opened. It is called Practical Security and is aimed at discussing security measures in the context of real problems in real projects. It has a much narrower scope than the Cryptography mailing list and by no means intends to replace it or to compete with it.

From the mailing list info page:

"This forum discusses applications of cryptographic protocols as well as other security techniques, such as (but not limited to) methods for authentication, data protection, reverse-engineering protection, denial-of-service protection, and digital rights management. The forum also discusses implementation pitfalls to avoid. This forum does not discuss theoretical and/or mathematical aspects of cryptography. Neither does the forum discuss particular vulnerabilities of commercial products, such as what one may find in BugTraq."

Joining this mailing list is especially recommended to professionals who design security systems and to application designers who are also responsible for the security aspects of their products. I confess that at the moment of writing the list has just a few participants, but I expect it to grow much larger.

To subscribe, visit http://www.hbarel.com/practicalsecurity or send a blank message to [EMAIL PROTECTED]

Regards,
Hagai.
Re: [Clips] Read two biometrics, get worse results - how it works
RAH, et al.,

It is true that one can combine two diagnostic tests to worse effect than either alone, but it is not a foregone conclusion. To take a medical example: you screen first with a cheap test that has low/no false negatives, then for the remaining positives you screen with a potentially more expensive test that has low/no false positives. There is a whole health policy management literature on this. I reproduce the barest precis of same below, assuming the reader can manage to view it in a fixed-width font while respecting my hard carriage returns as writ.

--dan

Cheat sheet on the terminology of medical diagnostic testing:

                                  the true situation
                                     +         -
                                +---------+---------+
      what the             +   |    a    |    b    |   a+b
      diagnostic                +---------+---------+
      test returns          -   |    c    |    d    |   c+d
                                +---------+---------+
                                    a+c       b+d        t

      true positives   a = positive testers who have disease
      true negatives   d = negative testers who are without disease
      false positives  b = positive testers who are without disease
      false negatives  c = negative testers who have disease

      prevalence                 (a+c)/t = fraction of population that has disease
      sensitivity                a/(a+c) = what fraction of those with disease test positive
      specificity                d/(b+d) = what fraction of those without disease test negative
      predictive value positive  a/(a+b) = what fraction of positive tests have disease
      predictive value negative  d/(c+d) = what fraction of negative tests are without disease

Notes: Information retrieval people know sensitivity as recall and predictive value positive as precision. Screening with a cheap test with high sensitivity, then an expensive test with high specificity, is often the best (most cost-effective) strategy.
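The cheat sheet above translates directly into code. A minimal sketch, computing the quantities from the 2x2 table (the counts here are a made-up illustrative population, not from any real screening study):

```python
# Metrics from the 2x2 diagnostic-testing table above.
# a = true positives, b = false positives, c = false negatives, d = true negatives.
def metrics(a, b, c, d):
    t = a + b + c + d
    return {
        "prevalence":  (a + c) / t,   # fraction of population with disease
        "sensitivity": a / (a + c),   # IR folks call this recall
        "specificity": d / (b + d),
        "ppv":         a / (a + b),   # predictive value positive; IR: precision
        "npv":         d / (c + d),   # predictive value negative
    }

# Hypothetical cheap first-stage screen over a population of 10,000
# with 1% prevalence: high sensitivity (0.9), but many false positives.
m = metrics(a=90, b=990, c=10, d=8910)
print(m["sensitivity"])  # 0.9
print(m["ppv"])          # ~0.083: most positives are false, so a second,
                         # more specific test is run only on the positives
```

The low predictive value positive of the first test is exactly why the serial-screening strategy works: the expensive high-specificity test need only be run on the small fraction (here about 11%) that screened positive.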
Re: [Clips] Read two biometrics, get worse results - how it works
On 10/19/05, R.A. Hettinga [EMAIL PROTECTED] wrote:

> [EDIT] Daugman presents
> (http://www.cl.cam.ac.uk/users/jgd1000/combine/combine.html) the two
> rival intuitions, then does the maths. On the one hand, a combination
> of different tests should improve performance, because more information
> is better than less information. But on the other, the combination of a
> strong test with a weak test to an extent averages the result, so the
> result should be less reliable than if one were relying solely on the
> strong test.

I believe the Daugman results are correct only when one accepts results where the tests disagree. That is, if the first test returns positive and the second test returns negative, you choose the overall result to be positive or negative, as opposed to "do over" until they agree.

Of course, in real life, with knowledge of the physics of the tests and the ability to pull out non-boolean results, one may be able to remove many of the "do over" results to keep from annoying the test subjects.

-Michael Heyman
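Heyman's point can be sketched numerically. If a disagreement triggers a retry rather than a decision, the error rate that matters is the one conditioned on the two tests agreeing. A minimal sketch, assuming independent retries (optimistic for real biometrics, where errors tend to repeat for the same subject), using the same 1-in-100 and 1-in-1,000 rates as the article:

```python
# Error rate of two combined tests under a "do over until they agree" rule,
# assuming each attempt errs independently with the given per-test rates.
def until_agree_rate(p1, p2):
    """Probability of a wrong decision, given the two tests agree."""
    both_wrong = p1 * p2
    both_right = (1 - p1) * (1 - p2)
    # Disagreements are discarded (retried), so condition on agreement.
    return both_wrong / (both_wrong + both_right)

# Symmetric rates, so the same figure is both FA (impostor accepted only
# when both tests err) and FR (genuine user rejected only when both err).
rate = until_agree_rate(0.01, 0.001)
print(rate)  # ~1e-5, well below the stronger test's 1-in-1,000 rate
```

Under this rule both conditional error rates land near one in 100,000, better than either test alone - which is exactly why Daugman's worse-than-the-stronger-test result only holds when disagreements are forced into an accept or reject decision.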
Re: Cisco VPN password recovery program
* Perry E. Metzger:

> Via cryptome: http://evilscientists.de/blog/?page_id=343
>
> "The Cisco VPN Client uses weak encryption to store user and group
> passwords in your local profile file. I coded a little tool to reveal
> the saved passwords from a given profile file."
>
> If this is true, it doesn't sound like Cisco used a particularly smart
> design for this.

Why? In essence, this is the PSK that is used to authenticate the VPN gateway. It must be available in cleartext on the client. (Later versions offer asymmetric encryption as well.)
Re: Cisco VPN password recovery program
http://www.cisco.com/en/US/products/hw/vpndevc/ps2284/products_configuration_guide_chapter09186a00803ee1f0.html#wp2477015

- - -
Cisco Client Parameters

Allow Password Storage on Client - Check this box to allow IPSec clients to store their login passwords on their local client systems. If you do not allow password storage (the default), IPSec users must enter their password each time they seek access to the VPN. For maximum security, we recommend that you not allow password storage.
- - -

I really doubt that this affects the group password (PSK).

In some cases, network administrators used the password obfuscation to force their users to use Cisco's VPN client. Competing products, such as vpnc, do not enforce client-side policies. However, for quite some time now there has been a website where you can upload the obfuscated password and have it returned in clear text. It is implemented by running the Cisco client under a debugging tool and intercepting a memcpy call that copies the password.

In the end, the publication of the algorithm doesn't change the security of the system (there wasn't much to start with). But it certainly makes it easier to write interoperable software using this information.
Re: [fc-discuss] Financial Cryptography Update: On Digital Cash-like Payment Systems
On Thu, 20 Oct 2005, cyphrpunk wrote:

> system without excessive complications. Only the fifth point, the
> ability for outsiders to monitor the amount of cash in circulation, is
> not satisfied. But even then, the ecash mint software, and procedures
> and controls followed by the issuer, could be designed to allow third
> party audits similarly to how paper money cash issuers might be audited
> today.

One approach, investigated by Hal Finney, is to run the mint on a platform that allows remote attestation. Check out rpow.net - he has a working implementation of a proof-of-work payment system hosted on an IBM 4758.

-David Molnar