Especially note the article: ECHELON Technology

----- Original Message -----
From: Bruce Schneier <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Thursday, 16 December 1999 1:25 PM
Subject: CRYPTO-GRAM, December 15, 1999


>                  CRYPTO-GRAM
>
>               December 15, 1999
>
>               by Bruce Schneier
>                Founder and CTO
>       Counterpane Internet Security, Inc.
>            [EMAIL PROTECTED]
>           http://www.counterpane.com
>
>
> A free monthly newsletter providing summaries, analyses, insights, and
> commentaries on computer security and cryptography.
>
> Back issues are available at http://www.counterpane.com.  To subscribe or
> unsubscribe, see below.
>
>
> Copyright (c) 1999 by Bruce Schneier
>
>
> ** *** ***** ******* *********** *************
>
> In this issue:
>      "Security Is Not a Product; It's a Process"
>      Sarah Flannery's Public-Key Algorithm
>      ECHELON Technology
>      Counterpane -- Featured Research
>      News
>      New U.S. Crypto Export Regulations -- Draft
>      Counterpane Internet Security News
>      The Doghouse:  Egg
>      Fast Software Encryption 2000
>      European Cellular Encryption Algorithms
>      Comments from Readers
>
>
> ** *** ***** ******* *********** *************
>
>   "Security Is Not a Product; It's a Process"
>
>
>
> In April 1999, someone discovered a vulnerability in Microsoft Data Access
> Components (MDAC) that could let an attacker take control of a remote
> Windows NT system.  This vulnerability was initially reported on a public
> mailing list.  Although the list moderator withheld the details of that
> risk from the public for more than a week, some clever hacker
> reverse-engineered the available details to create an exploit.
>
> Then, an exploit script (written in PERL) was publicly posted on the
> Internet.  At about the same time, Microsoft created a patch and
> work-around to prevent attackers from exploiting the vulnerability on
> users' systems.  Microsoft also issued a security bulletin on the topic,
> as did several other security news outlets.
>
> But patches don't magically fix security vulnerabilities.  Over Halloween
> weekend, hackers attacked and defaced more than 25 NT-based Web sites.
> Seems like a bunch of security administrators didn't bother updating their
> configurations.
>
> This sort of thing goes on all the time.  Another example:  Microsoft
> issued a bulletin and a patch for a data access vulnerability in Internet
> Information Server (IIS) last year.  Recently, experts demonstrated that
> Compaq, Dell, CompuServe, PSINet, and NASDAQ-AMEX never bothered
> installing the patch and were still vulnerable.
>
> A vulnerability is reported and a patch is issued.  If you believe the
> news reports, that's the end of the story.  But in most cases patches never get
> installed.  This is why most systems on the Internet are vulnerable to
> known attacks for which fixes exist.
>
> Security is not a product; it's a process.  It's the process of paying
> attention to vendor updates for your products.  Not only network and
> network security products -- browsers, firewalls, network operating
> systems, Web server software -- but every piece of software you run.
> Vulnerabilities in your word processor can compromise the security of your
> network.
>
> It's the process of watching your systems, carefully, for signs of attack.
> Your firewall produces audit logs.  So do your UNIX and NT servers.  So do
> your routers and network servers.  Learn to read them, daily.  Learn what
> an attack looks like and how to recognize it.
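> Even a crude script helps here.  As an illustration only -- the log format
> below is invented, and real firewall and syslog formats vary by vendor --
> counting repeated authentication failures per source address is one way to
> start reading logs:

```python
import re
from collections import Counter

# Hypothetical log lines; real formats differ by vendor and platform.
log_lines = [
    "Dec 14 02:11:09 gw sshd[412]: Failed password for root from 10.0.0.5",
    "Dec 14 02:11:12 gw sshd[414]: Failed password for root from 10.0.0.5",
    "Dec 14 02:11:15 gw sshd[416]: Failed password for admin from 10.0.0.5",
    "Dec 14 08:30:01 gw sshd[501]: Accepted password for alice from 10.0.0.9",
]

failures = Counter()
for line in log_lines:
    m = re.search(r"Failed password for (\S+) from (\S+)", line)
    if m:
        failures[m.group(2)] += 1

# Flag any source with repeated failures -- a crude sign of password guessing.
for ip, count in failures.items():
    if count >= 3:
        print(f"possible brute-force from {ip}: {count} failures")
```

> This catches only the most obvious pattern, of course; the point is that
> even ten lines of script beat never looking at the logs at all.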
>
> No security product acts as magical security dust; they all require time
> and expertise to make them work properly.  You have to baby-sit them, every day.
>
> The Microsoft bug mentioned above:
> http://www.microsoft.com/security/bulletins/ms99-025.asp
> http://www.microsoft.com/security/bulletins/ms99-025faq.asp
>
> News report:
> http://www.fcw.com/pubs/fcw/1999/1101/fcw-newsfedwire-11-01-99.html
>
> Why vulnerabilities don't get fixed:
> http://www.computerworld.com/home/print.nsf/all/991122CD52
>
>
> ** *** ***** ******* *********** *************
>
>     Sarah Flannery's Public-Key Algorithm
>
>
>
> In January 1999, a 16-year-old Irish woman named Sarah Flannery made
> international news by announcing a new public-key algorithm, called
> Cayley-Purser, that was supposedly faster and better than RSA and ElGamal.
>
> The only problem is that no one knew what the algorithm was.
>
> Well, it's finally public.
>
> Flannery's paper, describing the Cayley-Purser algorithm, has been
> published on the Internet by an unknown source.  It's interesting work,
> but it's not secure.  Flannery herself publishes a break of the algorithm
> in an appendix.
>
> To me, this makes Flannery even more impressive as a young cryptographer.
> As I have said many times before, anyone can invent a new cryptosystem.
> Very few people are smart enough to be able to break them.  By breaking
> her own system, Flannery has shown even more promise as a cryptographer.  I
> look forward to more work from her.
>
> Flannery's paper:
> http://cryptome.org/flannery-cp.htm
>
> News stories from January:
> http://www.zdnet.com/zdnn/stories/news/0,4586,2189301,00.html?chkpt=zdnnsmsa
> http://www.wired.com/news/technology/0,1282,17330,00.html
>
>
> ** *** ***** ******* *********** *************
>
>              ECHELON Technology
>
>
>
> The NSA has been patenting, and publishing, technology that is relevant to
> ECHELON.
>
> ECHELON is a code word for an automated global interception system
> operated by the intelligence agencies of the U.S., the UK, Canada,
> Australia and New Zealand.  (The NSA takes the lead.)  According to
> reports, it is capable of intercepting and processing many types of
> transmissions, throughout the globe.
>
> Over the past few months, the U.S. House of Representatives has been
> investigating ECHELON.  As part of these investigations, the House Select
> Committee on Intelligence requested documents from the NSA regarding its
> operating standards for intelligence systems like ECHELON that may
> intercept communications of Americans.  To everyone's surprise, NSA
> officials invoked attorney-client privilege and refused to disclose the
> documents.  EPIC has taken the NSA to court.
>
> I've seen estimates that ECHELON intercepts as many as 3 billion
> communications every day, including phone calls, e-mail messages, Internet
> downloads, satellite transmissions, and so on.  The system gathers all of
> these transmissions indiscriminately, then sorts and distills the
> information through artificial intelligence programs.  Some sources have
> claimed that ECHELON sifts through 90% of the Internet's traffic.
>
> How does it do it?  Read U.S. Patent 5,937,422, "Automatically generating
> a topic description for text and searching and sorting text by topic
> using the same," assigned to the NSA.  Read two papers titled "Text
> Retrieval via Semantic Forests," written by NSA employees.
>
> Semantic Forests, patented by the NSA (the patent does not use the name),
> were developed to retrieve information "on the output of automatic
> speech-to-text (speech recognition) systems" and topic labeling.  It is
> described as a functional software program.
>
> The researchers tested this program on numerous pools of data, and
> improved the test results from one year to the next.  All this occurred in the
> window between when the NSA applied for the patent, more than two years
> ago, and when the patent was granted this year.
>
> One of the major technological barriers to implementing ECHELON is
> automatic searching tools for voice communications.  Computers need to
> "think" like humans when analyzing the often imperfect computer
> transcriptions of voice conversations.
>
> The patent claims that the NSA has solved this problem.  First, a computer
> automatically assigns a label, or topic description, to raw data.  This
> system is far more sophisticated than previous systems because it labels
> data based on meaning, not on keywords.
>
> Second, the patent includes an optional pre-processing step which cleans
> up text, much of which the agency appears to expect will come from human
> conversations.  This pre-processing will remove what the patent calls
> "stutter phrases."  These phrases "frequently occurs [sic] in text based
> on speech."  The pre-processing step will also remove "obvious stop words"
> such as the article "the."
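> The pre-processing the patent describes is easy to picture.  Here is a
> minimal sketch; the stop-word and stutter-phrase lists below are my own
> illustrative assumptions, since the patent's actual lists are not public:

```python
import re

# Illustrative lists only; the patent's actual stop words and
# stutter phrases are not public.
STOP_WORDS = {"the", "a", "an", "of", "to", "and", "is"}
STUTTER_WORDS = {"um", "uh", "er"}

def preprocess(text):
    # Strip punctuation, lowercase, then drop stop words and speech
    # artifacts before handing the tokens to a topic labeler.
    words = re.findall(r"[a-z']+", text.lower())
    return [w for w in words if w not in STOP_WORDS | STUTTER_WORDS]

tokens = preprocess("Um, the plan is to move the, uh, shipment tonight")
print(tokens)  # ['plan', 'move', 'shipment', 'tonight']
```

> What survives is the content-bearing vocabulary, which is all a
> meaning-based labeler cares about.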
>
> The invention is designed to sift through foreign language documents,
> either in text, or "where the text may be derived from speech and where
> the text may be in any language," in the words of the patent.
>
> The papers go into more detail on the implementation of this technology.
> The NSA team ran the software over several pools of documents, some of
> which were text from spoken words (called SDR), and some regular
> documents.  They ran the tests over each pool separately.  Some of the
> text documents analyzed appear to include data from "Internet discussion
> groups," though I can't quite determine if these were used to train the
> software program, or to illustrate results.
>
> The "30-document average precision" (whatever that is) on one test pool
> rose significantly in one year, from 19% in 1997 to 27% in 1998.  This
> shows that they're getting better.
>
> It appears that the tests on the pool of speech-to-text-based documents
> came in at between 20% and 23% accuracy (see Tables 5 and 6 of the
> "Semantic Forests TREC7" paper) at the 30-document average.  (A "document"
> in this definition can mean a topic query.  In other words, 30 documents
> can actually mean 30 questions to the database.)
>
> It's pretty clear to me that this technology can be used to support an
> ECHELON-like system.  I'm surprised the NSA hasn't classified this work.
>
> The Semantic Forest papers:
> http://trec.nist.gov/pubs/trec6/papers/nsa-rev.ps
> http://trec.nist.gov/pubs/trec7/papers/nsa-rev.pdf
>
> The patent:
> http://www.patents.ibm.com/details?&pn=US05937422__
>
> News reports on this:
> http://www.independent.co.uk/news/Digital/Features/spies151199.shtml
> http://www.independent.co.uk/news/Digital/Features/spies221199.shtml
>
> General information on ECHELON:
> http://www.echelonwatch.org
> http://www.wired.com/news/print/0,1294,32586,00.html
>
> Excellent article on ECHELON:
> http://mediafilter.org/caq/cryptogate/
>
> EPIC files lawsuit against NSA to get ECHELON document released:
> http://www.epic.org/open_gov/foia/nsa_suit_12_99.html
> EPIC's complaint:
> http://www.epic.org/open_gov/FOIA/nsa_comp.pdf
> NY Times article:
> http://www.nytimes.com/library/tech/99/12/cyber/articles/04spy.html
>
>
> ** *** ***** ******* *********** *************
>
>        Counterpane -- Featured Research
>
>
>
> "Ten Risks of PKI: What You're Not Being Told About Public-Key
Infrastructure"
>
> C. Ellison and B. Schneier, Computer Security Journal, vol. 16, n. 1,
2000,
> pp. 1-7.
>
> Public-key infrastructure has been oversold as the answer to many network
> security problems.  We discuss the problems that PKI doesn't solve, and
> that PKI vendors don't like to mention.
>
> http://www.counterpane.com/pki-risks.html
>
>
> ** *** ***** ******* *********** *************
>
>                     News
>
>
> There's a product, PawSense, that claims to detect when cats are stepping
> on your keyboard and a) require a password, just in case it's a human
> doing it, and b) make a noise that annoys the cat.  It's a bizarre form of
> biometrics, I suppose.
> http://www.newscientist.com/ns/19991204/newsstory9.html
> http://www.bitboost.com/pawsense/
>
> And on the more mundane biometrics front, a security system is being
> developed that can identify people by their gait.
> http://www.newscientist.com/ns/19991204/newsstory3.html
>
> Jon Carroll's essay on the FBI's new anti-terrorist strategy is pretty
> funny.  "Bob, show Mr. Carroll the attractive pen and pencil set we're
> offering just for a chance to talk to you about terrorism for a few
> minutes."
> <http://www.sfgate.com/cgi-bin/article.cgi?file=/chronicle/archive/1999/11/15/DD43291.DTL>
>
> The German government is going to help fund the GPG effort.  GPG is an
> open-source program that is compatible with (some versions of) PGP.
> http://www.nytimes.com/library/tech/99/11/cyber/articles/19encrypt.html
> http://www.gnupg.de/presse.en.html
>
> Risks of "anonymous" e-mail accounts:  Someone sent a bomb threat from an
> account from an account named [EMAIL PROTECTED]  The police
contacted
> Hotmail, and found that the Hotmail account had been accessed at a
> particular date and time, using an IP address owned by America Online.
> Using the AOL information, police identified exactly who was using that IP
> address at that time and were able to trace the sender to his apartment in
> Brooklyn.
> <http://www.zdnet.com/zdtv/cybercrime/news/story/0,3700,2324068,00.html>
> I posted this to comp.risks, and people pointed out that the police didn't
> need to contact Hotmail.  The information is in the e-mail header.
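> To see what the comp.risks readers meant, look at the Received lines in
> the header.  A short sketch using Python's standard email module (the
> message below is made up for illustration):

```python
import email

# A fabricated message for illustration; real Received chains are longer,
# and headers added by machines you don't control can be forged.
raw = (
    "Received: from spool.example.net (spool.example.net [192.0.2.7])\n"
    "\tby mx.example.com; Thu, 16 Dec 1999 01:25:00 -0500\n"
    "Received: from dialup-42.example-isp.com ([198.51.100.42])\n"
    "\tby spool.example.net; Thu, 16 Dec 1999 01:24:55 -0500\n"
    "From: someone@example.net\n"
    "Subject: test\n"
    "\n"
    "body\n"
)

msg = email.message_from_string(raw)
# Received headers appear newest-first; the last one in the list is the
# hop closest to the sender's machine.
hops = msg.get_all("Received")
origin = hops[-1]
print(origin.split()[1])  # the host that handed the message in
```

> In this toy example the last hop names the sender's dial-up host -- which
> is exactly the AOL-style information the police used.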
>
> This essay describes a copy-protection scheme from several years back that
> was actually successful (in part because the game it protected was never
> all that popular).  There's a discussion of how software cracking works,
> and some interesting attempts to psych out what crackers don't like to do
> and force them to do a lot of it in order to crack the game.  It's
> security through obfuscation, of course, but the author is very clear that
> copy-protection is ultimately impossible and all you can do is discourage
> attackers who aren't determined enough.
>
> I know nothing about the Windows 2000 Encryption Pack, except what I read
> at this URL:
> http://www.microsoft.com/windows/professional/beta/downloads/default.asp
>
> An interesting article on simulating Web attacks:
> http://all.net/journal/ntb/simulate/simulate.html
>
> And someone's listing of the top ten computer hacks of all time:
> http://home.cnet.com/specialreports/0-6014-7-1420567.html?tag=st.cn.1f%20d2.tlpg.6014-7-1420567
>
> EPIC (Electronic Privacy Information Center), EFF (Electronic Frontier
> Foundation), and the ACLU have asked a federal appeals court to block
rules
> that give the FBI power to determine appropriate wiretapping capabilities
> for new communications systems.  The groups claim that the levels of
> surveillance the FBI wants exceed what it is entitled to under the law.
> http://www.epic.org/privacy/wiretap/calea/release_11_18_99.html
>
http://www.washingtonpost.com/wp-srv/WPlate/1999-11/18/155l-111899-idx.html
>
http://www.zdnet.com/zdnn/stories/news/0,4586,2397376,00.html?chkpt=zdnntop
>
> E-mail eavesdropping:  Online bookseller Alibris will plead guilty to
> charges that they intercepted the e-mail sent by Amazon.com to business
> partners.  This may be the first conviction of industrial e-mail espionage.
> http://www.computerworld.com/home/print.nsf/all/991129CF52
>
> Seymour Hersh writes about the NSA's failures in the Internet age:
> http://cryptome.org/nsa-hersh.htm
> An NPR report on the same topic (audio):
> http://www.npr.org/ramfiles/atc/19991129.atc.03.ram
>
> Opinions on UNIX and Windows NT security, and the differing philosophies
> of the two operating systems:
> http://www.zdnet.com/zdtv/cybercrime/story/0,3700,2382021,00.html
>
> Is buggy software inevitable?  It is, as long as market forces reward it.
> There is no liability for buggy software, so there is no economic
> incentive to create quality software.  In fact, there is an economic incentive to
> create the lowest quality the market will bear.  This _Business Week_
> article discusses the problem:
> http://www.businessweek.com/1999/99_49/b3658015.htm
>
> The DVD crypto break affects the release of new products:
> http://www.eet.com/story/OEG19991202S0046
> http://www.theregister.co.uk/991203-000006.html
>
> The Smart Card Security Users Group (SCSUG) is composed of Visa, AmEx,
> Europay, MasterCard, Mondex, JCB, and the National Information Assurance
> Partnership (NIAP = NIST + NSA).  They've written a Protection
> Profile, and have posted it for comment:
> http://csrc.nist.gov/cc/sc/sclist.htm
>
> PGP got a world-wide export license:
> http://www.nai.com/asp_set/about_nai/press/releases/pr_template.asp?PR=/PressMedia/12131999.asp&Sel=647
> http://www.infoworld.com/articles/en/xml/99/12/13/991213enpgp.xml
>
> And two smart card breaks to finish things off:
>
> Number 1.  A French engineer succeeded in factoring the 640-bit RSA key
> stored in the chip on the card (all French "CB" credit cards have had a
> chip since 1990).  He contacted the conglomerate (GIE) that makes these
> cards; now he's being sued by GIE for fraud and intrusion and risks seven
> years in prison, as well as a 5M-franc ($800K) fine.  GIE has also
> censored TV programs where he was to have been interviewed, and claims he
> is blackmailing them.  Meanwhile, they are not fixing the problem.  The
> weakness?  The payment terminal: another good illustration of a "weakest
> link in the chain" attack.
> http://www.pele.org/english/smartcard.htm
>
> Number 2.  German hackers have succeeded in cracking the Siemens digital
> signature chip, used in electronic payment and access control systems
> throughout Germany.  It seems that there was an undocumented test mode of
> the chip that allows someone to dump the card's memory.  Already the code
> has been disassembled, and some private keys have been compromised.
> http://www.theregister.co.uk/991201-000021.html
>
>
> ** *** ***** ******* *********** *************
>
>   New U.S. Crypto Export Regulations -- Draft
>
>
>
> On November 22, the White House released a draft of its new crypto export
> regulations.  These new regulations are part of the changes promised in
> September.  These regulations were due to be released on December 15, but
> have been delayed until January 15.
>
> The regulations do some of what's promised -- allow for export of 56-bit
> and 64-bit encryption products -- but fall far short of the promises made
> in September.
>
> I have three main objections:
>
> One:  These regulations affect end-user products only.  The primary uses
> of cryptography are not for end-user products.  They do not affect Internet
> routers, firewalls, VPNs, CAs, etc.  They do not affect software toolkits.
> These regulations do not affect technical assistance.
>
> Two:  While these regulations permit the export of open-source
> cryptography code, there are some nasty complications.  Near as I can
> tell, I can post crypto source on my Web page, but if a foreign company
> wants to use it I am obligated to make them get U.S. approval for the end
> product.  Not only is this ridiculous, it is completely unenforceable.
> (Although you can see the NSA salivating at the chance to get their hands
> on all of those foreign products.)
>
> Three:  These regulations are much too complicated.  Instead of simply
> lifting export restrictions, this proposal just adds to the confusion.
> Heavy reporting and review requirements have always served the interests
> of those trying to stop the spread of strong cryptography.  There are so
> many ifs, ands, and buts in these regulations that many will simply not
> bother.  There are enough ambiguities to keep the lawyers busy for years.
> This is not the simplified and streamlined export process that we have
> been promised.
>
> Rumor has it that the Administration is addressing these (and other)
> concerns in the final regulations, and that the month delay was to make
> sure they were addressed.  They are redoing the definition of
> "non-commercial" source code, trying to spell out the screening
> requirements (which they claim will be easy to comply with), and
> streamlining any reporting requirements.  If this is true, the final
> version of this could be quite good.  People I trust, who are closer to
> the process than I am, are "guardedly optimistic."  We'll see.
>
> Draft regulations:
> http://www.epic.org/crypto/export_controls/draft_regs_11_99.html
>
> News reports:
> http://www.washingtonpost.com/wp-srv/WPlate/1999-11/24/105l-112499-idx.html
> http://www.computerworld.com/home/news.nsf/all/9911243cryptdraft
> http://news.cnet.com/category/0-1005-200-1463231.html
> http://www.zdnet.com/zdnn/stories/news/0,4586,2399788,00.html?chkpt=zdnntop
> http://www.wired.com/news/politics/0,1283,32732,00.html
>
>
> ** *** ***** ******* *********** *************
>
>       Counterpane Internet Security News
>
>
>
> You may have noticed some changes around Counterpane.  Here's the news:
>
> Last summer I teamed up with seasoned start-up CEO Tom Rowley to start a
> new company: Counterpane Internet Security, Inc.  This company will
> address the critical need for higher level security services on the Internet.  My
> motto is: "The fundamental problems in computer security are no longer
> about technology; they're about applying the technology."
>
> We have raised funding, and are now putting the technical and business
> management teams in place.  We're keeping a low profile for now, but we're
> actively hiring.  See http://www.counterpane.com/jobs.html for details.
>
> My consulting company, Counterpane Systems, has become the research
> division and working laboratory of Counterpane Internet Security, Inc.
> Renamed Counterpane Labs, it will provide ongoing research and critical
> resources to the newly formed company.  Counterpane Labs will continue to
> engage in cryptography research, and to support the Twofish AES submission.
>
>
> Bruce Schneier's article on attack trees has been published in Dr. Dobb's
> Journal:
> http://www.ddj.com/articles/1999/9912/9912a/9912a.htm
> See also the presentation on the topic at:
> http://www.counterpane.com/attacktrees.pdf
> And the discussion on Slashdot:
> http://slashdot.org/article.pl?sid=99/12/02/232229&mode=thread&threshold=0
>
>
> ** *** ***** ******* *********** *************
>
>              The Doghouse:  Egg
>
>
>
> Egg, a UK banking and investment firm, sent customer credit card details
> out in unencrypted e-mails.  "We didn't think [sending credit card details
> in unsafe e-mails] was a security problem," a spokeswoman for Egg conceded
> today.  "We've now accepted that this was not best business practice."
>
> http://www.theregister.co.uk/991130-000015.html
>
>
> ** *** ***** ******* *********** *************
>
>       Fast Software Encryption 2000
>
>
>
> Fast Software Encryption is an annual workshop on cryptography.  The first
> Fast Software Encryption workshop was held in Cambridge in 1993, followed
> by Leuven in 1994, Cambridge in 1996, Haifa in 1997, Paris in 1998, and
> Rome in 1999.  The workshop concentrates on all aspects of traditional
> cryptographic algorithms, including the design and analysis of block
> ciphers, stream ciphers, and hash functions.
>
> The seventh Fast Software Encryption workshop, FSE 2000, will be held from
> 10-12 April 2000, in New York, at the Hilton New York and Towers.  It will
> be in conjunction with the 3rd AES Candidate Conference (same location,
> 13-14 April 2000).  We expect that most people will attend both FSE and AES.
>
> Come, experience the wonders of symmetric cryptography.  Watch the AES
> finalists battle it out in a war of cryptanalyses, comparisons, and vague
> innuendoes.  If you're a corporation, please help by sponsoring the event.
> Register by the end of the year and save some money.
>
> Fast Software Encryption Workshop:
> http://www.counterpane.com/fse.html
>
> Third AES Candidate Conference:
> http://csrc.nist.gov/encryption/aes/round2/conf3/aes3conf.htm
>
>
> ** *** ***** ******* *********** *************
>
>     European Cellular Encryption Algorithms
>
>
>
> There's been a lot of bad information about what kinds of encryption are
> out there, what's been broken, and how bad the situation really is.
> Here's a summary of what's really going on.
>
> GSM is the world's most widely used mobile telephony system (51% market
> share of all cellular phones, both analog and digital), with over 215
> million subscribers in America, Europe, Asia, Africa, and Australia.  In
> the US, GSM is employed in the "Digital PCS" networks of such
> telecommunications giants as Pacific Bell, Bell South, and Omnipoint.
>
> There are four cryptographic algorithms in the GSM standard, although not
> all the algorithms are necessarily implemented in every GSM system.  They
> are:
>
> A3, the authentication algorithm to prevent phone cloning
> A5/1, the stronger of the two voice-encryption algorithms
> A5/2, the weaker of the two voice-encryption algorithms
> A8, the voice-privacy key-generation algorithm
>
> (Remember, these voice-encryption algorithms only encrypt voice between
> the cellphone and the base station.  They do not encrypt voice within the
> phone network.  They do not encrypt end to end.  They only encrypt the
> over-the-air portion of the transmission.)
>
> These algorithms were developed in secret, and were never published.
> "Marc Briceno" (with the Smartcard Developer Association)
> reverse-engineered the algorithms, and then Ian Goldberg and David Wagner
> at U.C. Berkeley cryptanalyzed them.
>
> Most GSM providers use an algorithm called COMP128 for both A3 and A8.
> This algorithm is cryptographically weak, and it is not difficult to break
> the algorithm and clone GSM digital phones.
>
> The attack takes just 2^19 queries to the GSM smart-card chip, which takes
> roughly 8 hours over the air.  This attack can be performed on as many
> simultaneous phones in radio range as your rogue base station has channels.
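> A quick sanity check on those numbers, assuming the smart-card queries
> dominate the running time:

```python
queries = 2 ** 19            # chosen challenges sent to the SIM
seconds = 8 * 3600           # the quoted 8 hours, in seconds
rate = queries / seconds     # implied over-the-air query rate
print(round(rate, 1))        # roughly 18 queries per second
```

> About 18 challenge-response exchanges per second is entirely plausible
> for a rogue base station talking to a phone.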
>
> The Berkeley group published their COMP128 analysis in April 1998.  They
> also demonstrated that all A8 implementations they looked at, including
> the few that did not use COMP128, were deliberately weakened.  The
> algorithm takes a 64-bit key, but ten key bits were set to zero.  This
> means that the keys that secure the voice-privacy algorithms are weaker
> than the documentation indicates.
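> In concrete terms, fixing ten of the 64 key bits shrinks the effective
> keyspace from 2^64 to 2^54:

```python
full_bits = 64
zeroed_bits = 10
effective_bits = full_bits - zeroed_bits

# An exhaustive search is 2**10 = 1024 times cheaper than the
# nominal 64-bit key length suggests.
speedup = 2 ** zeroed_bits
print(effective_bits, speedup)  # 54 1024
```

> A factor of 1024 is the difference between an attack taking three years
> and an attack taking a day.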
>
> They published and analyzed A5/2 in August 1999.  As the weaker of the two
> voice-encryption algorithms, it proved to be very weak.  It can be broken
> in real-time without any trouble; the work factor is around 2^16.
> Supposedly this algorithm was developed with "help" from the NSA, so these
> weaknesses are not surprising.
>
> The Berkeley group published A5/1 in May 1999.  The first attack was by
> Jovan Golic, which gives the algorithm a work factor of 2^40.  This means
> that it can be broken in nearly real-time using specialized hardware.
> Currently the best attack is by Biryukov and Shamir.  Earlier this month
> they showed that they can find the A5/1 key in less than a second on a
> single PC with 128 MB RAM and two 73 GB hard disks, by analyzing the
> output of the A5/1 algorithm in the first two minutes of the conversation.
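> To put these work factors in perspective, here is a rough conversion to
> wall-clock time.  The assumed rate of 10^8 operations per second is my
> own round figure for commodity hardware of the era, not a measured
> benchmark:

```python
def seconds_for(work_bits, ops_per_second):
    # Time to perform 2**work_bits basic operations at the given rate.
    return 2 ** work_bits / ops_per_second

OPS = 10 ** 8   # assumed throughput; illustration only

print(seconds_for(16, OPS))          # A5/2: well under a millisecond
print(seconds_for(40, OPS) / 3600)   # A5/1 (Golic): a few hours
```

> So 2^16 is trivially real-time, and 2^40 is within reach of a patient
> attacker even without specialized hardware.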
>
> All GSM providers and equipment vendors are part of the GSM Association.
> The algorithms were designed and analyzed by the secretive "SAGE" group
> (which is really part of ETSI).  We don't know who the people are or what
> their resumes look like.  What we do know is that the SAGE security
> analyses of the ciphers are online at ETSI's homepage in PDF format.  Read
> them; they're entertaining.  A5/1 is purported to be a modified French naval
> cipher.  This is mentioned in the leaked Racal document.
>
> What's most interesting about these algorithms is how robustly lousy they
> are.  Both voice-encryption algorithms are flawed, but not obviously.  The
> attacks on both A5/1 and A5/2 make use of subtle structures of the
> algorithm, and result in the ability to decrypt voice traffic in real time
> on average computer equipment.  At the same time, the output of the A8
> algorithm that provides key material for A5/1 and A5/2 has been
> artificially weakened by setting ten key bits to zero.  And also, the
> COMP128 algorithm that provides the keying material that is eventually
> weakened and fed into the weakened algorithms is, itself, weak.
>
> And remember, this encryption only encrypts the over-the-air portion of
> the transmission.  Any legal access required by law enforcement is unaffected;
> they can always get a warrant and listen at the base station.  The only
> reason to weaken this system is for *illegal* access.  Only wiretaps
> lacking a court authorization need over-the-air intercepts.
>
> The industry reaction to this has been predictably clueless.  One GSM
> spokesman claimed that it is impossible to intercept GSM signals off the
> air, so the encryption breaks are irrelevant.  Notwithstanding the fact
> that GSM interception equipment was once sold openly -- now it's illegal --
> certainly the *phone* can receive signals off the air.  Estimated cost for
> a high-quality interception station is well under $10K.
>
> GSM analysis:
> http://www.scard.org/gsm/
> http://www.jya.com/crack-a5.htm
>
> GSM Association Web site:
> http://www.gsmworld.com
>
> News reports:
> http://wired.lycos.com/news/politics/0,1283,32900,00.html
> http://www.nytimes.com/library/tech/99/12/biztech/articles/07code.html
>
>
> ** *** ***** ******* *********** *************
>
>            Comments from Readers
>
>
>
> From: [EMAIL PROTECTED] (WJCarpenter)
> Subject: Electronic voting, replying to Greg Weiss
>
>  > Are e-votes more prone to voter coercion?
>  >
>  > I used to agree with you on this.  But when talking with someone
>  > about absentee balloting this last week, it seems to me this
>  > problem is equally present in today's non-virtual scenario.  How?
>  > Well, absentee ballots enable voter coercion in the privacy of
>  > non-public polling places.  E-votes are not particularly more
>  > subvertible than absentee ballot votes at least from the voter
>  > coercion threat.
>
>  > Now with absentee ballots, there is one further protection.  One
>  > can apparently still vote in person at the polling place, and their
>  > polling-place vote takes precedence over their absentee ballot.
>
> Hmmm.  I had the opportunity to describe the coercion problem to a
> non-technical person recently, and the absentee ballot parallel was
> immediately obvious.  Equally obvious were the critical differences.
>
> First, it is probably true that only a small percentage of voters use
> absentee ballots (beats me, an ambitious person could easily find out; my
> guess is that 15-20% is a big number).  So, even if the absentee ballot
> system is completely corrupted by coercion, its effects are limited.  Sure,
> absentee ballots decide some elections, but those are close elections to
> begin with.
>
> There is a dis-incentive to use absentee ballots because you must commit
> your vote several days in advance of the election.  My intuition tells me
> that for most common cases people make up their minds at the last minute,
> perhaps even in the voting booth, and they are subconsciously aware of
> this.  It seems likely to me that more people who truly need an absentee
> ballot (because they will be out of town or whatever) will forgo voting
> altogether.
>
> Electronic voting would presumably be made more convenient, even more
> convenient than traditional voting booth voting (no standing in line, no
> finding a parking place, no finding someone to watch your toddler for
> you).  It is this convenience that should make it much more popular than
> absentee ballots have ever been.  One could probably look at the case of electronic
> filing of tax returns (where you have to actually pay a fee) for how fast
> something like this could catch on.  Electronic voting should be even more
> popular.
>
> Second, the forced delay in the absentee ballot process should be missing
> from electronic voting.  Electronic voting doesn't carry the logistical
> burden of paper absentee ballots, and so it could be done exactly on
> election day.  The success rate of a coercion scheme is probably related to
> how long you would have to control someone to keep them from going to the
> voting booth.  (This doesn't mean that electronic voting wouldn't come with
> an artificial delay if one or more dominating political parties saw an
> advantage in that.)
>
>
> From: Dave Sill <[EMAIL PROTECTED]>
> Subject: "Why Computers are Insecure"
>
> Regarding your "Why Computers are Insecure" piece, I think you're almost
> completely wrong.
>
> Yes, designing and implementing secure software is very hard, but it's not
> as hard as you make it sound.
>
> Proving security is, of course, impractical for any reasonably complex
> system, but, then, so is proving correctness.  Does the inability to prove
> that software does the right thing mean we can never build software that
> works? Of course not.
>
> We're in the midst of a software quality crisis, and security problems are
> just one symptom.
>
> The problem is simply that users don't put a premium on reliability or
> security.  Users want features above all else, and they're willing to
> accept a wide range of bugs as long as a product has the desired features.
> Until reliability and security are features that users demand, vendors
> won't go to the expense of providing them.
>
> We've got to get up, go to our windows, and shout "I'm as mad as hell, and
> I'm not going to take it anymore!"  We've simply got to stop using poorly
> designed and implemented software.
>
> Yes, "virtually all software is developed using a 'try-and-fix'
> methodology" -- but that's not the only software development methodology
> available.  Software can be engineered for reliability and security just
> like it can be engineered to implement certain capabilities.
>
> And, yes, Windows 2000 will have many more bugs than any software system in
> history.  But that's due more to Microsoft's poor design and engineering
> than it is to the mind-boggling complexity of the system.
>
>
> From: [EMAIL PROTECTED]
> Subject: "Why Computers are Insecure"
>
> > Almost every week the computer press covers another security flaw:
> > When will it get better? ...  I don't believe it ever will....
> > Security engineering is different from any other type of engineering. ...
> > In many ways this is similar to safety engineering. ...
> > The only reasonable way to "test" security is to perform security
> > reviews. ...
> > Satan's computer is hard to test.
>
> I believe you're missing the real problem here.
>
> I was a verification engineer for two years, testing the software in the
> Boeing 777 fly by wire computer.  I've worked on "Satan's computer" as you
> put it.  We played "devil's advocate" continuously looking for flaws in the
> design or flaws in the code that might lead to a bug.  As a benchmark of
> thoroughness, one module consisted of 30 pages of B-size "schematics"
> which showed the arithmetic flow and design for the module.  I can't
> remember the exact number of lines of code, but I seem to recall it was
> roughly 20 pages of solid code.  I spent three months reviewing that one
> module.
>
> Here's the part I think you're missing though.  Our group was self driven
> to do their job.  Boeing paid us to do our job, sure.  And Boeing could be
> liable if the plane crashed, absolutely.  The FAA gave us the requirements
> for testing software, yes.  But at the heart of it all, I think we were
> clearly driven by a simple concept:  We could all see the consequences if
> we failed our task.
>
> People were putting their lives in our hands.  Our software literally keeps
> the plane in the air.  If we didn't do our job, people could die.  It was a
> universally clear-cut mission.  It was something everyone on the team could
> identify with.
>
> There is not a universally clear consequence to bad encryption systems.
> Companies that produce these systems have no clear-cut consequence that the
> engineers "in the trenches" can identify with.  They get paid, either way.
> They have never been held liable for poorly implemented encryption systems.
>
>
> From: Greg Guerin <[EMAIL PROTECTED]>
> Subject: Security engineering comparison
>
> I really liked the feature article in Nov 99 Crypto-Gram.  The analogy to
> safety engineering was excellent.  It left me with a nagging feeling I'd
> recently read something about safety engineering, but I couldn't pin it
> down.  The answer recently clicked into place while filing magazine
> back-issues.
>
> There is an article entitled "Safety Critical Embedded Systems" in the Oct
> 1999 issue of "Embedded Systems Programming":
> <http://www.embedded.com/mag.shtml>
>
> Unfortunately, this particular article isn't on-line, but reprints or
> back-issues can be ordered.
>
> Anyway, the article was a clear concise overview of safety engineering,
> with an emphasis on embedded systems.  I won't try to summarize it, because
> I'd just end up repeating the whole article.  But I will list the safety
> guidelines at the end of the article:
>  * All safety-related systems have hard real-time deadlines.
>  * Safety always requires some level of redundancy.
>  * Whenever possible, separate and isolate the safety-critical aspects of
> the system.
>  * Safety is a system issue, not a software issue.
>  * The key to a safe design is predictability.
>  * Err on the side of simplicity.
>  * Good design practices are required.
>  * Good design practices are not enough.
>  * Plan for all your assumptions to be violated.
>
> It's kind of eerie to realize that every one of these applies in full
> measure to security engineering, even the "hard real-time deadline."  In
> safety systems, it means that a fault must be detected quickly enough for
> it to be acted on in order to avoid an accident.  A fault-detector that
> triggers only after an accident has happened is worthless.  In security
> systems, not detecting a breach in a timely manner diminishes the
> usefulness of detection.  Security systems have the added difficulty of not
> always being able to detect a breach -- encryption algorithms usually can't
> tell if they've been cracked or not.
>
>
> From: "Nicholas C. Weaver" <[EMAIL PROTECTED]>
> Subject: DVD encryption, reason for multiple keys...
>
> The reason for the multiple key structure (session key for the DVD,
> encrypted separately by the 400 odd player keys) was so that if, say, a
> single key was made public, they could remove that key from future DVDs
> produced, essentially acting as a limited key rescission measure.  A good
> idea if their encryption algorithm itself wasn't incredibly dinky and
> highly vulnerable to a known plaintext attack.
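[The key table described above can be sketched in a few lines. This is a toy illustration of the revocation structure only, not the real CSS cipher: the "player keys" and the XOR-keystream stand-in cipher below are hypothetical, chosen just to show how omitting a leaked key's slot from future discs locks that player out.]

```python
import hashlib

def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Stand-in stream cipher: XOR with a hash-derived keystream.
    # NOT the real CSS algorithm -- only the key-table idea is shown.
    stream = hashlib.sha256(key).digest()[:len(plaintext)]
    return bytes(a ^ b for a, b in zip(plaintext, stream))

toy_decrypt = toy_encrypt  # XOR with the same keystream inverts itself

# Hypothetical licensed player keys (the real scheme had ~400 of them).
player_keys = {f"player_{i}": f"secret-key-{i}".encode() for i in range(5)}

disc_key = b"per-disc-key"

def press_disc(disc_key: bytes, licensed: dict, revoked=frozenset()):
    """Encrypt the disc key under every player key not yet revoked."""
    return {pid: toy_encrypt(pk, disc_key)
            for pid, pk in licensed.items() if pid not in revoked}

# Discs pressed after "player_3"'s key leaks simply omit its slot, so
# that player can no longer recover the disc key from new releases.
table = press_disc(disc_key, player_keys, revoked={"player_3"})
assert toy_decrypt(player_keys["player_0"], table["player_0"]) == disc_key
assert "player_3" not in table
```

[Older discs remain readable by the revoked player, of course -- rescission only protects titles pressed after the leak.]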
>
> Also, they probably did deliberately choose a 40-bit scheme, simply to
> avoid any potential export complications.  It would be bad to have a DVD
> player classed as a "munition," even if it is perfectly useless to actually
> encrypt real data.
>
> One other observation: The encryption never prevented organized, digital,
> DVD piracy, since that only requires the manufacturing of a bitwise copy of
> the DVD.  It only prevented the organized pirates from removing region
> encoding information.
>
> Similarly, the many keys are probably for region encoding.  Since software
> players were often set up (and I know my computer hardware player is) to
> specify a region with limited abilities to change it, the different keys
> probably represented the player acting as a different "region."
>
> Finally, the only reason why people bothered to crack the encryption at
> this time is because there were no players which worked under Linux.  If
> there were a Linux software DVD player, the encryption probably wouldn't
> have been publicly cracked for months or years, because there wouldn't have
> been an incentive for it.
>
>
> From: NBII <[EMAIL PROTECTED]>
> Subject: DVD encryption cracked
>
> A good article.
>
> In addition to your recommended links, I would suggest you include the
> following VERY well written treatise on Digital IP and Copyrights by J.P.
> Barlow:
>
>
> http://www.wired.com/wired/archive/2.03/economy.ideas.html?topic=&topic_set=
>
> I have yet to read a better overview of the problems inherent in the
> current presumptions about IP and how it "will work" in the coming economy.
>
> You'll note that, in 1994, he "predicted" what is essentially exactly the
> problem and the situation you describe.
>
>
> From: Roger Schlafly
> Subject:  Elliptic Curve Public Key Cryptography
>
> I'd go with elliptic curves if you need security for decades.  The elliptic
> curve DL problem seems to be much more intrinsically difficult than the RSA
> problem.  Elliptic curve systems also give better protection against
> Moore's Law.  If you accept the Lenstra-Verheul analysis, then you need to
> use 3000-bit keys with RSA, and almost no one is doing that.
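[The trade-off above can be illustrated with the rough key-size equivalences that have since become standard. The figures below are an assumption drawn from later published guidance (e.g. NIST's key-management recommendations); Lenstra-Verheul's exact numbers differ somewhat, but the trend is the same: RSA moduli must grow super-linearly with the security level, while elliptic-curve field sizes grow only linearly.]

```python
# Roughly equivalent key sizes in bits at several symmetric-security levels
# (per later NIST-style guidance; illustrative, not Lenstra-Verheul's exact
# table).  Note how the RSA column explodes while the EC column merely doubles.
EQUIVALENTS = [
    # (symmetric, RSA modulus, EC field)
    (80,    1024, 160),
    (112,   2048, 224),
    (128,   3072, 256),
    (192,   7680, 384),
    (256,  15360, 512),
]

def ec_bits_for_rsa(rsa_bits: int) -> int:
    """Smallest tabulated EC field size at least as strong as an RSA modulus."""
    for _sym, rsa, ec in EQUIVALENTS:
        if rsa >= rsa_bits:
            return ec
    raise ValueError("beyond the table")

# A ~3000-bit RSA key buys roughly the security of a 256-bit curve.
assert ec_bits_for_rsa(3000) == 256
```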
>
>
> ** *** ***** ******* *********** *************
>
> CRYPTO-GRAM is a free monthly newsletter providing summaries, analyses,
> insights, and commentaries on computer security and cryptography.
>
> To subscribe, visit http://www.counterpane.com/crypto-gram.html or send a
> blank message to [EMAIL PROTECTED]  To unsubscribe,
> visit http://www.counterpane.com/unsubform.html.  Back issues are available
> on http://www.counterpane.com.
>
> Please feel free to forward CRYPTO-GRAM to colleagues and friends who will
> find it valuable.  Permission is granted to reprint CRYPTO-GRAM, as long as
> it is reprinted in its entirety.
>
> CRYPTO-GRAM is written by Bruce Schneier.  Schneier is founder and CTO of
> Counterpane Internet Security Inc., the author of "Applied Cryptography,"
> and an inventor of the Blowfish, Twofish, and Yarrow algorithms.  He served
> on the board of the International Association for Cryptologic Research,
> EPIC, and VTW.  He is a frequent writer and lecturer on computer security
> and cryptography.
>
> Counterpane Internet Security, Inc. is a venture-funded company bringing
> innovative managed security solutions to the enterprise.
>
> http://www.counterpane.com/
>
> Copyright (c) 1999 by Bruce Schneier
>
