Re: encrypted tapes (was Re: Papers about Algorithm hiding ?)

2005-06-13 Thread Ben Laurie

[EMAIL PROTECTED] wrote:

[EMAIL PROTECTED] wrote:


| Oracle, for example, provides encryption functions, but the real problem
| is the key handling (how to make sure the DBA can't get the key, cannot
| call functions that decrypt the data, key not copied with the backup,
| etc.).
| There are several solutions for the key management, but the vendors should
| start offering them.

I would argue that the real problem is that encryption slows large
searches (or is perceived to slow large searches, anyway).

Adam



Yes, encrypting indexed columns, for example, is a problem.  But if you
limit yourself to encrypting sensitive information (I'm talking about
stuff like SINs, bank account numbers, data that serves as an index to
external databases and is sensitive with respect to identity theft),
this sensitive information should not be the basis of searches.
If it is not the basis of searches, there will be no performance
problems related to encrypting it.


If they are indexes to external databases, then they are the basis of
searches in those databases, by definition.



My terminology might have been misleading.  By indexes to external
databases, I don't mean that the application that uses the database
actually talks to the external databases (it doesn't use the info as a key
to those external databases).
Example:
   Cash_Ur_check is in the business of cashing checks.  To cash a check,
they ask you for sensitive information like your SIN, bank account number,
driver's licence number, etc.  They use the information to query
Equifax or the like to see if the person has a good credit rating; if
the rating is OK, they cash the check.  They keep all the information
in the database, because if the client comes back 2 months later, they
will send the same query to Equifax to see if the credit rating hasn't
changed.
This sensitive information consists of indexes to external databases (but
Cash_Ur_check doesn't directly connect to those other databases).
Cash_Ur_check doesn't need to use these data as indexes.  Cash_Ur_check
can use the person's first/middle/last name as an index, or assign some
random number to the person, or something else; they should not use the
SIN to identify a person.  They should not do searches on the SIN to find a
person given his SIN.
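This surrogate-key design can be sketched in a few lines.  The following is a minimal illustration only (Python and SQLite chosen arbitrarily; the table, column names, and the SHA-256 counter-mode XOR "cipher" are all hypothetical stand-ins, the latter standing in for a real cipher such as AES and absolutely not production crypto): sensitive columns are stored encrypted and never indexed, while all lookups go through a random surrogate key.

```python
import hashlib
import secrets
import sqlite3

def toy_encrypt(key: bytes, data: bytes) -> bytes:
    """XOR with a SHA-256 counter keystream.  Symmetric, so it also
    decrypts.  A stand-in for a real cipher -- NOT for production."""
    out = bytearray()
    for i in range(0, len(data), 32):
        block = hashlib.sha256(key + i.to_bytes(8, "big")).digest()
        out.extend(b ^ k for b, k in zip(data[i:i + 32], block))
    return bytes(out)

app_key = secrets.token_bytes(32)      # held by the application, not the DBA
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE client (client_id TEXT PRIMARY KEY, name TEXT, sin_enc BLOB)")

def add_client(name: str, sin: str) -> str:
    client_id = secrets.token_hex(8)   # random surrogate key: the only index
    db.execute("INSERT INTO client VALUES (?, ?, ?)",
               (client_id, name, toy_encrypt(app_key, sin.encode())))
    return client_id

# Searches use the surrogate key (or the name), never the SIN, so the
# encrypted column is never scanned and costs nothing at query time.
cid = add_client("Jane Doe", "123-456-789")
row = db.execute("SELECT sin_enc FROM client WHERE client_id = ?", (cid,)).fetchone()
recovered_sin = toy_encrypt(app_key, row[0]).decode()
```

The point is structural: because the SIN is never a search key, encrypting it adds no per-query cost.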


Sure, but Equifax should.


I have many other examples in mind, which I came across in the real world.



So my answer to people who have the perception you mentioned is that if
you want to encrypt sensitive information and that would cause performance
problems, then there are problems with your data architecture privacy-wise
(you should re-structure your data, use it differently, etc.).


Not a very satisfactory answer.



No, of course I would sit down with the client and the software developers,
examine their needs and constraints, and suggest how they can structure
their data in a better way.  I've done it several times over the years.
There is no universal answer, but in my experience there are often
good solutions.
Let me throw a precise question back at you, to see if you disagree with
what I said.  Are you saying that it is often inevitable to index
sensitive data?  That is, are you saying that you often have to use
sensitive data as the basis of searches?


Yes.

--
ApacheCon Europe   http://www.apachecon.com/

http://www.apache-ssl.org/ben.html   http://www.thebunker.net/

There is no limit to what a man can do or how far he can go if he
doesn't mind who gets the credit. - Robert Woodruff

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


Re: encrypted tapes (was Re: Papers about Algorithm hiding ?)

2005-06-13 Thread astiglic
Ben Laurie wrote
 [EMAIL PROTECTED] wrote:
 Example:
   Cash_Ur_check is in the business of cashing checks.  To cash a check,
 they ask you for sensitive information like your SIN, bank account number,
 driver's licence number, etc.  They use the information to query
 Equifax or the like to see if the person has a good credit rating; if
 the rating is OK, they cash the check.  They keep all the information
 in the database, because if the client comes back 2 months later, they
 will send the same query to Equifax to see if the credit rating hasn't
 changed.
 This sensitive information consists of indexes to external databases (but
 Cash_Ur_check doesn't directly connect to those other databases).
 Cash_Ur_check doesn't need to use these data as indexes.  Cash_Ur_check
 can use the person's first/middle/last name as an index, or assign some
 random number to the person, or something else; they should not use the
 SIN to identify a person.  They should not do searches on the SIN to find a
 person given his SIN.

 Sure, but Equifax should.

No, they shouldn't!  If you think they should, you are misinformed.  At
least in Canada, the Privacy Act protects the SIN; Equifax cannot demand
it.
See for example
http://www.privcom.gc.ca/fs-fi/02_05_d_02_e.asp
and
http://www.guardmycreditfile.org/index.php/content/view/244/139/
which says the following:
Even credit reporting companies can’t demand a SIN to generate a credit
report. Trans Union Canada and Equifax Canada both have the ability to
generate such reports without a SIN. If you ask these same companies to
generate a credit report in the United States, they both require a Social
Security Number.

And if Equifax Canada can generate reports without a SIN, I don't see why
Equifax in any other country couldn't.  Of course, they like to have the
SIN, since it makes things more convenient, but they don't really need it!
 That is the problem in most cases.

--Anton






Re: encrypted tapes (was Re: Papers about Algorithm hiding ?)

2005-06-13 Thread Ben Laurie

[EMAIL PROTECTED] wrote:

Ben Laurie wrote


[EMAIL PROTECTED] wrote:


Example:
   Cash_Ur_check is in the business of cashing checks.  To cash a check,
they ask you for sensitive information like your SIN, bank account number,
driver's licence number, etc.  They use the information to query
Equifax or the like to see if the person has a good credit rating; if
the rating is OK, they cash the check.  They keep all the information
in the database, because if the client comes back 2 months later, they
will send the same query to Equifax to see if the credit rating hasn't
changed.
This sensitive information consists of indexes to external databases (but
Cash_Ur_check doesn't directly connect to those other databases).
Cash_Ur_check doesn't need to use these data as indexes.  Cash_Ur_check
can use the person's first/middle/last name as an index, or assign some
random number to the person, or something else; they should not use the
SIN to identify a person.  They should not do searches on the SIN to find a
person given his SIN.


Sure, but Equifax should.



No, they shouldn't!  If you think they should, you are misinformed.  At
least in Canada, the Privacy Act protects the SIN; Equifax cannot demand
it.


I am just reading what you've written: "To cash a check, they ask you
for sensitive information like your SIN, bank account number, driver's
licence number, etc.  They use the information to query Equifax or the
like".


--
ApacheCon Europe   http://www.apachecon.com/

http://www.apache-ssl.org/ben.html   http://www.thebunker.net/

There is no limit to what a man can do or how far he can go if he
doesn't mind who gets the credit. - Robert Woodruff



Re: encrypted tapes (was Re: Papers about Algorithm hiding ?)

2005-06-13 Thread Adam Shostack
On Fri, Jun 10, 2005 at 01:11:45PM -0400, [EMAIL PROTECTED] wrote:
| Ben Laurie wrote

|  Sure, but Equifax should.
| 
| No, they shouldn't!  If you think they should, you are misinformed.  At
| least in Canada, the Privacy Act protects the SIN; Equifax cannot demand
| it.
| See for example
| http://www.privcom.gc.ca/fs-fi/02_05_d_02_e.asp
| and
| http://www.guardmycreditfile.org/index.php/content/view/244/139/
| which says the following:
| Even credit reporting companies can’t demand a SIN to generate a credit
| report. Trans Union Canada and Equifax Canada both have the ability to
| generate such reports without a SIN. If you ask these same companies to
| generate a credit report in the United States, they both require a Social
| Security Number.
| 
| And if Equifax Canada can generate reports without a SIN, I don't see why
| Equifax in any other country couldn't.  Of course, they like to have the
| SIN, since it makes things more convenient, but they don't really need it!
|  That is the problem in most cases.

Actually, there's a difference between theory and practice here.  When
I signed up for a mobile phone, they demanded a SIN, or would put me
on the sucker plan.  When I complained to the Quebec privacy
commissioner, they told me that that was OK.

There are so many examples of this sort of thing that I gave up
sending complaint letters.  Then you look at CIBC, and the lack of
fines... 

Adam



Re: encrypted tapes (was Re: Papers about Algorithm hiding ?)

2005-06-13 Thread astiglic
 [EMAIL PROTECTED] wrote:
 Ben Laurie wrote

[EMAIL PROTECTED] wrote:

Example:
    Cash_Ur_check is in the business of cashing checks.  To cash a check,
 they ask you for sensitive information like your SIN, bank account number,
 driver's licence number, etc.  They use the information to query
 Equifax or the like to see if the person has a good credit rating; if
 the rating is OK, they cash the check.  They keep all the information
 in the database, because if the client comes back 2 months later, they
 will send the same query to Equifax to see if the credit rating hasn't
 changed.
 This sensitive information consists of indexes to external databases (but
 Cash_Ur_check doesn't directly connect to those other databases).
 Cash_Ur_check doesn't need to use these data as indexes.  Cash_Ur_check
 can use the person's first/middle/last name as an index, or assign some
 random number to the person, or something else; they should not use the
 SIN to identify a person.  They should not do searches on the SIN to find a
 person given his SIN.

Sure, but Equifax should.


 No, they shouldn't!  If you think they should, you are misinformed.  At
 least in Canada, the Privacy Act protects the SIN; Equifax cannot demand
 it.

 I am just reading what you've written: "To cash a check, they ask you
 for sensitive information like your SIN, bank account number, driver's
 licence number, etc.  They use the information to query Equifax or the
 like".

They'll ask for it, but you don't have to give it.  They can collect it,
but they don't have to do searches on it.
It's the typical pattern: ask for the SIN, and if the user gives it, use it
(as in Adam Shostack's example with the cell phone), but if they don't,
then ask for 2 other pieces of identification.  In most cases, I don't
have to give my SIN, but almost everybody asks for it.

Equifax will always ask for the SIN but they don't have the right to
demand it.

http://www.piac.ca/newpage91.htm

Equifax suggests that to prevent these inaccuracies, consumers should
always give their full name and SIN number on application forms (this
facilitates updating of files and prevents confusion of two files).
However, this solution to the problem does not take into account that
consumers have a valid interest in protecting their privacy with respect
to their SIN.

The problem is with forms that make it look like you have to give your
SIN, when in fact the law says you don't have to.  Providing other
identification can be troublesome, so a lot of people just end up giving
their SIN.

--Anton




RE: encrypted tapes (was Re: Papers about Algorithm hiding ?)

2005-06-13 Thread Peter Gutmann
Jerrold Leichter [EMAIL PROTECTED] writes:

They also sold a full solution for encrypted Ethernet - KDC, encrypting
Ethernet adapters, associated software. None of this stuff went anywhere.
People just weren't interested.

That wasn't quite the case for the Ethernet encryption.  What happened there
was that they had a complete product ready to ship, and quite a bit of
interest, when it was killed by marketing.  The problem was that Ethernet at
the time wasn't the foregone conclusion it is now; it was just one of a
number of potential candidates for the foregone-conclusion role.  By shipping
an encrypting Ethernet adapter, marketing felt that DEC were saying that
standard Ethernet wasn't safe.  In contrast, token ring didn't have an
encryption adapter, so obviously token ring must be secure by default,
whereas Ethernet clearly wasn't.  As a result, the encryption adapter was
never shipped.

Strategy is not letting the enemy know you're out of bullets by continuing to
 fire.

Peter.



Re: encrypted tapes (was Re: Papers about Algorithm hiding ?)

2005-06-09 Thread Charles M. Hannum
On Wednesday 08 June 2005 21:20, [EMAIL PROTECTED] wrote:
 Yes, encrypting indexed columns, for example, is a problem.  But if you
 limit yourself to encrypting sensitive information (I'm talking about
 stuff like SINs, bank account numbers, data that serves as an index to
 external databases and is sensitive with respect to identity theft),
 this sensitive information should not be the basis of searches.
 If it is not the basis of searches, there will be no performance
 problems related to encrypting it.

I can name at least one obvious case where sensitive data -- namely credit 
card numbers -- is in fact something you want to search on: credit card 
billing companies like CCbill and iBill.  Without the ability to search by 
CC#, customers are pretty screwed.

That said, I will never buy the "only encrypt sensitive data" argument.  In my 
experience, you *always* end up leaking something that way.



Re: encrypted tapes (was Re: Papers about Algorithm hiding ?)

2005-06-09 Thread Jason Holt


On Wed, 8 Jun 2005, David Wagner wrote:
[...]

That said, I don't see how adding an extra login page to click on helps.
If the front page is unencrypted, then a spoofed version of that page
can send you to the wrong place.  Sure, if users were to check SSL
certificates extremely carefully, they might be able to detect the funny
business -- but we know that users don't do this in practice.

Dan Bernstein has been warning of this risk for many years.
http://cr.yp.to/djbdns/bugtraq/[EMAIL PROTECTED]
http://cr.yp.to/dnscache/bugtraq/[EMAIL PROTECTED]

As far as I can tell, if the front page is unencrypted, and if the
attacker can mount DNS cache poisoning, pharming, or other web spoofing
attacks -- then you're hosed.  Did I get something wrong?


Well, yes.  TLS guarantees that you're talking to the website listed in the 
location bar.  Knowing what domain you *wanted* is up to you, and Dan handles 
that by suggesting that perhaps you have a paper brochure from the bank which 
lists their domain.


So, it's fine to have http://amex.com link to https://amex.com (or 
whatever.com) for forms requesting anything sensitive as long as amex.com (or 
whatever.com) is what's printed in the brochure.  As Dan points out, 
examination of the certificate is generally pointless as long as it's signed 
by a trusted CA, since the attacker can get a perfectly valid cert for 
hackers-r-us.com anyway.  The big question is just whether the domain asking 
for your account info corresponds with the organization you trust with it.


Of course, brochures aren't exactly hard to spoof (cf. Verisign's fraudulent 
domain renewal postcards).  And then there are the dozens of CAs your browser 
accepts, the CA staff who issue microsoft.com certs to random passersby, 
international domain names that look identical to, er, national ones.  All 
those gotchas apply even in the correct implementation outlined by Dan.


-J



Re: encrypted tapes (was Re: Papers about Algorithm hiding ?)

2005-06-09 Thread Ben Laurie

[EMAIL PROTECTED] wrote:

| Oracle, for example, provides encryption functions, but the real problem
| is the key handling (how to make sure the DBA can't get the key, cannot
| call functions that decrypt the data, key not copied with the backup,
| etc.).
| There are several solutions for the key management, but the vendors should
| start offering them.

I would argue that the real problem is that encryption slows large
searches (or is perceived to slow large searches, anyway).

Adam



Yes, encrypting indexed columns, for example, is a problem.  But if you
limit yourself to encrypting sensitive information (I'm talking about
stuff like SINs, bank account numbers, data that serves as an index to
external databases and is sensitive with respect to identity theft),
this sensitive information should not be the basis of searches.
If it is not the basis of searches, there will be no performance
problems related to encrypting it.


If they are indexes to external databases, then they are the basis of 
searches in those databases, by definition.



So my answer to people who have the perception you mentioned is that if
you want to encrypt sensitive information and that would cause performance
problems, then there are problems with your data architecture privacy-wise
(you should re-structure your data, use it differently, etc.).


Not a very satisfactory answer.

--
http://www.apache-ssl.org/ben.html   http://www.thebunker.net/

There is no limit to what a man can do or how far he can go if he
doesn't mind who gets the credit. - Robert Woodruff



Re: encrypted tapes (was Re: Papers about Algorithm hiding ?)

2005-06-09 Thread lists

From: Charles M. Hannum [EMAIL PROTECTED]

 I can name at least one obvious case where sensitive data -- namely credit 
 card numbers -- is in fact something you want to search on: credit card 
 billing companies like CCbill and iBill.  Without the ability to search by 
 CC#, customers are pretty screwed.

Is there a good reason for not searching by the hash of a CC#?

http://www.wayner.org/books/td/

http://www.unixwiz.net/techtips/secure-cc.html
I think the author is planning further work on this site and
would be happy to receive constructive comments.



Re: encrypted tapes (was Re: Papers about Algorithm hiding ?)

2005-06-09 Thread Charles M. Hannum
On Thursday 09 June 2005 17:37, Charles M. Hannum wrote:
 If we assume that the last 4 digits have been exposed somewhere -- and they
 usually are -- then this gives you at most 38 bits -- i.e. 2^38 hashes to
 test -- to search (even a couple less if you know a priori which *brand* of
 card it is).  How long do you suppose this would take?

On reconsideration, given the presence of the check digit, I think you have at 
most 2^34 tests (or 2^32 if you know the brand of card).  And this assumes 
there aren't additional limitations on the card numbering scheme, which there 
always are.

I guess you could use a keyed hash.  Remember, though, you can't use random 
padding if this is going to be searchable with a database index, so the 
amount of entropy you're putting in is pretty limited.
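Both halves of that argument are easy to demonstrate.  Below is a hypothetical Python sketch (the card format and the shrunken 4-digit search space are illustrative only, standing in for the roughly 2^32-2^38 real space discussed above): an unkeyed hash over a small space is trivially enumerable, while an HMAC under a secret key is still a deterministic, indexable token but is useless to an attacker who steals only the token column.

```python
import hashlib
import hmac
import secrets

# Hypothetical card space: issuer prefix and last four digits known, so
# only the middle digits vary.  Shrunk to 10^4 so the demo runs instantly.
PREFIX, SUFFIX = "4532 01", " 1234"
def candidates():
    for mid in range(10_000):
        yield f"{PREFIX}{mid:04d}{SUFFIX}"

target_cc = f"{PREFIX}7319{SUFFIX}"

# 1. Plain hash: whoever steals the hash column just enumerates the space.
stolen = hashlib.sha256(target_cc.encode()).hexdigest()
cracked = next(c for c in candidates()
               if hashlib.sha256(c.encode()).hexdigest() == stolen)

# 2. Keyed hash: still deterministic (so it can be indexed and searched),
# but the same enumeration now requires the secret key as well.
key = secrets.token_bytes(32)
search_token = hmac.new(key, target_cc.encode(), hashlib.sha256).hexdigest()
```

The keyed token trades one problem for another, of course: now the HMAC key has to be managed, which is the key-handling problem discussed earlier in the thread.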




Re: encrypted tapes (was Re: Papers about Algorithm hiding ?)

2005-06-09 Thread astiglic
 On Wednesday 08 June 2005 21:20, [EMAIL PROTECTED] wrote:
 Yes, encrypting indexed columns, for example, is a problem.  But if you
 limit yourself to encrypting sensitive information (I'm talking about
 stuff like SINs, bank account numbers, data that serves as an index to
 external databases and is sensitive with respect to identity theft),
 this sensitive information should not be the basis of searches.
 If it is not the basis of searches, there will be no performance
 problems related to encrypting it.

 I can name at least one obvious case where sensitive data -- namely credit
 card numbers -- is in fact something you want to search on: credit card
 billing companies like CCbill and iBill.  Without the ability to search by
 CC#, customers are pretty screwed.

 That said, I will never buy the "only encrypt sensitive data" argument.  In my
 experience, you *always* end up leaking something that way.

There are exceptions, I grant you that, but my hypothesis is that in most
cases you can do without indexing on the sensitive data you have.

Encrypting everything in your database, I say, will never work.  If
you do that, then you will have performance trouble, and nobody wants
that.  You can do stuff like encrypting everything at the OS level, but that
doesn't help protect database backups and it doesn't prevent your DBA from
looking at data he's not supposed to see.  If you encrypt everything at the
DBMS level or at the application (client or middleware) level, then you
cannot encrypt indexed data.

--Anton





Re: encrypted tapes (was Re: Papers about Algorithm hiding ?)

2005-06-09 Thread astiglic
 [EMAIL PROTECTED] wrote:
| Oracle, for example, provides encryption functions, but the real problem
| is the key handling (how to make sure the DBA can't get the key, cannot
| call functions that decrypt the data, key not copied with the backup,
| etc.).
| There are several solutions for the key management, but the vendors should
| start offering them.

I would argue that the real problem is that encryption slows large
searches (or is perceived to slow large searches, anyway).

Adam


 Yes, encrypting indexed columns, for example, is a problem.  But if you
 limit yourself to encrypting sensitive information (I'm talking about
 stuff like SINs, bank account numbers, data that serves as an index to
 external databases and is sensitive with respect to identity theft),
 this sensitive information should not be the basis of searches.
 If it is not the basis of searches, there will be no performance
 problems related to encrypting it.

 If they are indexes to external databases, then they are the basis of
 searches in those databases, by definition.

My terminology might have been misleading.  By indexes to external
databases, I don't mean that the application that uses the database
actually talks to the external databases (it doesn't use the info as a key
to those external databases).


Re: encrypted tapes (was Re: Papers about Algorithm hiding ?)

2005-06-09 Thread astiglic
 [EMAIL PROTECTED] writes:

I saw a lot of requirements by security auditors that looked pretty
 silly.

 "Must use 128-bit RSA encryption" has to be the all-time favourite.

 One I saw recently was a requirement for using X9.17 key management... in
 SSL.

 Peter.

One of my favourites was that PINs had to be hashed (these were PINs
for authentication in a proprietary application/system).  The justification
(given by the auditor) was that people who had access to the database
should not be able to see the PINs in the clear.  These were 4-digit PINs, so
the developers just SHA-1'ed the PINs.  Later on, the developers had to
export the PINs into another application that had its own way to protect
the PINs, so they launched a brute-force attack on all of the PINs; of
course this was easy because the input space was very small and the hash
function did not involve any secret key, no salt, no iterations...  Talk
about protection!
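The attack the developers ran is worth seeing in miniature (a sketch in Python; SHA-1 as in the story, with an invented example PIN): with no key, salt, or iteration count, every possible 4-digit PIN can be pre-hashed into a lookup table and each stored hash reversed instantly.

```python
import hashlib

# Pre-hash all 10,000 possible 4-digit PINs: milliseconds of work.
table = {hashlib.sha1(f"{pin:04d}".encode()).hexdigest(): f"{pin:04d}"
         for pin in range(10_000)}

stored = hashlib.sha1(b"4921").hexdigest()  # what the database held
recovered = table[stored]                   # instant reversal: "4921"
```

A salt would force the table to be rebuilt per entry, and a secret key (an HMAC) would block the attack entirely for anyone without the key, which is exactly what the scheme above lacked.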

--Anton





Re: encrypted tapes (was Re: Papers about Algorithm hiding ?)

2005-06-08 Thread Ben Laurie

Perry E. Metzger wrote:
Have a look, for example, at 


http://www.americanexpress.com/

which encourages users to type in their credentials, in the clear,
into a form that came from lord knows where and sends the information
lord knows where. Spoof the site, and who would notice?

Every company should be telling its users never to type in their
credentials on a web page downloaded in the clear, but American
Express and lots of other companies train their users to get raped,
and why do they do it? Not because they made some high level decision
to screw their users. Not because they can't afford to do things
right. It happens because some idiot web designer thought it was a
nice look, and their security people are too ignorant or too powerless
to stop it, that's why.


Why is it bad for the page to be downloaded in the clear? What matters is
that the destination is encrypted, surely?


Which, as it happens, it is on the above site.

--
http://www.apache-ssl.org/ben.html   http://www.thebunker.net/

There is no limit to what a man can do or how far he can go if he
doesn't mind who gets the credit. - Robert Woodruff



RE: encrypted tapes (was Re: Papers about Algorithm hiding ?)

2005-06-08 Thread Ken Buchanan
Steven M. Bellovin wrote:
 The bigger issue, though, is more subtle: keeping track of the keys
 is non-trivial.  These need to be backed up, too, and kept separate
 from (but synchronized with) the tapes.  Worse yet, they need to be
 kept secure.  That may mean storing the keys with a different
 escrow company.  A loss of either piece, the tape or the key, renders
 the backup useless.  

This is correct.  It is not that nobody ever thought of encrypting tapes, it is 
that there has been no uptake on the idea because the management overhead costs 
outweighed the perceived benefit.  The big vendors didn't bother offering it 
because they didn't think they could make money, and the start-ups who have 
been trying to fill the gap found the market to be small.

Now it is becoming clear that the perceived benefit has been underestimated.

There are a number of small companies making products that can encrypt data in 
a storage infrastructure, including tape backups (full disclosure: I work for 
one of those companies).  The solutions all involve appliances priced in the 
tens of thousands.  The costs come not from encryption (how much does an FPGA 
cost these days?), but from solving the problems you listed, plus some others 
you didn't.

Now that the benefit of storage encryption is clearer, tape vendors 
(StorageTek, HP, IBM, etc) are almost certainly looking at adding encryption 
capability into their offerings.

There is an IEEE working group developing interoperability standards for 
storage encryption, including tape:
http://www.siswg.org

And in case anyone is really interested in this subject, Network Computing 
magazine did a round-up of all the storage infrastructure security solutions 
currently on the market:
http://www.networkcomputing.com/showitem.jhtml?docid=1607f2


Ken



Re: encrypted tapes (was Re: Papers about Algorithm hiding ?)

2005-06-08 Thread astiglic
Perry wrote:
 In case you think the answer is regulation, by the way, let me note
 that most of the regulatory pressure I've seen on security policy
 results in people finding extremely well documented ways to do exactly
 what the regulators ask, to no actual effect. This is generally
 because the regulators are almost uniformly as dumb or dumber than the
 people they regulate.

One thing that irritates me is that most security audits (that verify
compliance with regulations) are done by accountants.  No disrespect for
accountants here, they are smart people, but most of them lack the
security knowledge needed to really help with the security posture of a
company, and often they don't work with a security expert.  I saw a lot of
requirements by security auditors that looked pretty silly.
I believe a mix of accountants and security experts should be used for
security audits.

--Anton




Re: encrypted tapes (was Re: Papers about Algorithm hiding ?)

2005-06-08 Thread Adam Shostack
On Wed, Jun 08, 2005 at 01:33:45PM -0400, [EMAIL PROTECTED] wrote:
| 
| Ken Buchanan wrote:
|  There are a number of small companies making products that can encrypt
|  data in a storage infrastructure, including tape backups (full disclosure:
|  I work for one of those companies).  The solutions all involve appliances
|  priced in the tens of thousands.  The costs come not from encryption (how
|  much does an FPGA cost these days?), but from solving the problems you
|  listed, plus some others you didn't.
| 
|  Now that the benefit of storage encryption is clearer, tape vendors
|  (StorageTek, HP, IBM, etc) are almost certainly looking at adding
|  encryption capability into their offerings.
| 
| Another area where I predict vendors will (should) offer built-in
| solutions is database encryption.  A lot of laws require need-to-know
| based access control, and with DBAs being able to see all entries that is
| a problem.  Also backups of db data can be a risk.
| Oracle, for example, provides encryption functions, but the real problem
| is the key handling (how to make sure the DBA can't get the key, cannot
| call functions that decrypt the data, key not copied with the backup,
| etc.).
| There are several solutions for the key management, but the vendors should
| start offering them.

I would argue that the real problem is that encryption slows large
searches (or is perceived to slow large searches, anyway).

Adam
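The key-handling requirements quoted above (the DBA can't get the key; the key isn't copied with the backup) are commonly met with envelope encryption.  A minimal sketch, assuming Python and using a SHA-256 counter-mode XOR purely as a standard-library stand-in for a real cipher such as AES: each backup is encrypted under a fresh data key, and only a wrapped copy of that key travels with the backup, so the tape alone is useless without the separately escrowed master key.

```python
import hashlib
import secrets

def toy_cipher(key: bytes, data: bytes) -> bytes:
    """XOR with a SHA-256 counter keystream; symmetric, so it decrypts
    too.  A stand-in for a real cipher -- NOT for production."""
    out = bytearray()
    for i in range(0, len(data), 32):
        ks = hashlib.sha256(key + i.to_bytes(8, "big")).digest()
        out.extend(b ^ k for b, k in zip(data[i:i + 32], ks))
    return bytes(out)

master_key = secrets.token_bytes(32)  # kept in an HSM / separate escrow,
                                      # never written to the tape
data_key = secrets.token_bytes(32)    # fresh key for this backup set

dump = b"client table dump ..."
tape = {
    "ciphertext": toy_cipher(data_key, dump),
    "wrapped_key": toy_cipher(master_key, data_key),  # key travels wrapped
}

# Restore path: unwrapping requires the master key, which is stored
# separately from the tape -- losing either piece makes the other useless.
restored = toy_cipher(toy_cipher(master_key, tape["wrapped_key"]),
                      tape["ciphertext"])
```

This separation is exactly what makes the management overhead real: the master key now needs its own backup, rotation, and escrow arrangements, as discussed elsewhere in the thread.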



Re: encrypted tapes (was Re: Papers about Algorithm hiding ?)

2005-06-08 Thread Dan Kaminsky

2) The cost in question is so small as to be unmeasurable.

  

Yes, because key management is easy or free.

Also, reliability of encrypted backups is problematic:  CBC modes render
a single fault destructive to the entire dataset.  Counter mode is
sufficiently new that it's not supported by existing code.

--Dan




Re: encrypted tapes (was Re: Papers about Algorithm hiding ?)

2005-06-08 Thread astiglic
 | Oracle, for example, provides encryption functions, but the real problem
 | is the key handling (how to make sure the DBA can't get the key, cannot
 | call functions that decrypt the data, key not copied with the backup,
 | etc.).
 | There are several solutions for the key management, but the vendors should
 | start offering them.

 I would argue that the real problem is that encryption slows large
 searches (or is perceived to slow large searches, anyway).

 Adam

Yes, encrypting indexed columns, for example, is a problem.  But if you
limit yourself to encrypting sensitive information (I'm talking about
stuff like SINs, bank account numbers, data that serves as an index to
external databases and is sensitive with respect to identity theft),
this sensitive information should not be the basis of searches.
If it is not the basis of searches, there will be no performance
problems related to encrypting it.
So my answer to people that have the perception you mentioned is that if
you want to encrypt sensitive information and that would cause performance
problems, then there are problems with your data architecture privacy wise
(you should re-structure your data, use it differently, etc.).
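Anton's point can be illustrated with a small sketch: encrypt the sensitive
column, and if an occasional exact-match lookup is still needed, keep a keyed
HMAC "blind index" beside it so the plaintext never has to be searchable.
This is a hypothetical toy, not anything Oracle ships: the XOR stream cipher
below is NOT secure, and the in-memory keys stand in for real key management.

```python
import hashlib, hmac, os

ENC_KEY = os.urandom(32)   # hypothetical data-encryption key (held outside the DB)
MAC_KEY = os.urandom(32)   # hypothetical blind-index key

def toy_encrypt(key: bytes, plaintext: bytes, nonce: bytes) -> bytes:
    # Toy stream cipher: keystream = SHA-256(key || nonce || counter).
    # Illustration only -- a real system would use a vetted cipher.
    out = bytearray()
    counter = 0
    while len(out) < len(plaintext):
        block = hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(p ^ k for p, k in zip(plaintext, out))

def blind_index(value: bytes) -> str:
    # Deterministic keyed digest: allows exact-match lookup without
    # making the plaintext itself the basis of searches.
    return hmac.new(MAC_KEY, value, hashlib.sha256).hexdigest()

# "Store" a record: the SIN column is encrypted, the index column is an HMAC.
sin = b"123-456-789"
nonce = os.urandom(16)
row = {"sin_enc": toy_encrypt(ENC_KEY, sin, nonce), "nonce": nonce,
       "sin_idx": blind_index(sin)}

# Lookup by SIN hits sin_idx, never the plaintext; decryption is symmetric.
assert row["sin_idx"] == blind_index(b"123-456-789")
assert toy_encrypt(ENC_KEY, row["sin_enc"], row["nonce"]) == sin
```

Since the blind index only supports equality, this only works for columns that
are never the basis of range scans or wildcard searches, which is exactly the
restriction argued for above.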

--Anton






encrypted tapes (was Re: Papers about Algorithm hiding ?)

2005-06-08 Thread David Wagner
Ben Laurie writes:
Why is it bad for the page to be downloaded clear? What matters is the 
destination is encrypted, surely?

Because the page you downloaded in the clear contains the https: URL
in the POST method.  How do you know that this is the right URL?  If
you got the page in the clear, you don't.  An attacker who can provide
a spoofed page (by DNS cache poisoning, pharming, MITM attacks, or
any other method) could substitute a POST URL that sends your sensitive
data to hackers-r-us.com.
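Wagner's point can be made concrete with a hypothetical checker, using only
Python's stdlib HTML parser, that flags any form whose action would submit
over cleartext HTTP. The page and URL here are invented; and of course, if
the attacker rewrote the page you fetched, they rewrote the action too,
which is exactly the problem.

```python
from html.parser import HTMLParser

class FormActionAuditor(HTMLParser):
    """Collect form actions that would submit over something other than https."""
    def __init__(self):
        super().__init__()
        self.insecure_actions = []

    def handle_starttag(self, tag, attrs):
        if tag == "form":
            action = dict(attrs).get("action", "")
            if not action.lower().startswith("https://"):
                self.insecure_actions.append(action)

# A page fetched in the clear: the attacker could have rewritten this URL.
page = ('<html><body><form method="post" action="http://hackers-r-us.example/login">'
        '<input name="pin"></form></body></html>')
auditor = FormActionAuditor()
auditor.feed(page)
print(auditor.insecure_actions)  # ['http://hackers-r-us.example/login']
```

A browser-side check like this can catch sloppy sites, but it cannot
authenticate the page itself; only fetching the page over SSL does that.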

That said, I don't see how adding an extra login page to click on helps.
If the front page is unencrypted, then a spoofed version of that page
can send you to the wrong place.  Sure, if users were to check SSL
certificates extremely carefully, they might be able to detect the funny
business -- but we know that users don't do this in practice.

Dan Bernstein has been warning of this risk for many years.
http://cr.yp.to/djbdns/bugtraq/[EMAIL PROTECTED]
http://cr.yp.to/dnscache/bugtraq/[EMAIL PROTECTED]

As far as I can tell, if the front page is unencrypted, and if the
attacker can mount DNS cache poisoning, pharming, or other web spoofing
attacks -- then you're hosed.  Did I get something wrong?



Re: Papers about Algorithm hiding ?

2005-06-07 Thread John Kelsey
From: Ian G [EMAIL PROTECTED]
Sent: Jun 7, 2005 7:43 AM
To: John Kelsey [EMAIL PROTECTED]
Cc: Steve Furlong [EMAIL PROTECTED], cryptography@metzdowd.com
Subject: Re: Papers about Algorithm hiding ?

[My comment was that better crypto would never have prevented the
Choicepoint data leakage. --JMK]

Sure it would.  The reason they are not using the tools is because
they are too hard to use.  If the tools were so easy to use that it
was harder to not use them, then they'd be used.  Consider Citigroup
posted today by Bob.  They didn't encrypt the tapes because the tools
don't work easily enough for them.

So, this argument might make sense for some small business, but
Citigroup uses a *lot* of advanced technology in lots of areas, right?
I agree crypto programs could be made simpler, but this is really not
rocket science.  Here's my guess: encrypting the data would have
required that someone make a policy decision that the data be
encrypted, and would have required some coordination with the credit
agency that was receiving the tapes.  After that, there would have
been some implementation costs, but not all *that* many costs.
Someone has to think through key management for the tapes, and
that's potentially a pain, but it's not intractable.  Is this really
more complicated than, say, maintaining security on their publicly
accessible servers, or on their internal network?

... 

The other way of looking at Choicepoint - change the incentives - is
a disaster.  It will make for a compliance trap.  Compliance *may*
protect the data or it may have completely the opposite effect, the
situation with 'unintended consequences' in such a case is likely to
be completely unpredictable.  The only thing we can guarantee is that
costs will go up.

Well, Choicepoint is a bit different, right?  I mean, as I understand
it, the big disclosure happened because they sold people's data to
criminals, but they were in the business of selling people's data.
They just intended to sell it only to people of good intention, as far
as I can tell.  (Perhaps they should have demanded X.509 certificates
from the businesses buying the data and checked the evil bit.)  I
just can't see how cryptography could have helped prevent that attack,
other than by making the data that Choicepoint depends on harder to
get in the first place.

It's much cheaper and much more secure to simply
improve the tools.

But this does no good whatsoever if there's not some reason for the
people holding the data to use those tools.  Everyone with a network
presence and any kind of high profile does, in fact, use moderately
complicated computer security tools like routers, firewalls, VPNs,
virus scanners, and spyware detectors.  Everyone has to deal with
keeping their boxes up to date on patches.  However imperfectly, it
seems like Citigroup and Choicepoint and the rest can actually do
those things.  So when you excuse their failures to secure customer
data with "the tools aren't there," this sounds absolutely implausible
to me.

I'm not crazy about a HIPAA-style mandate for encryption and shredders
either, but we have this basic problem:

a.  It's basically easy to buy or find some amount of data about many
people.

b.  It's basically easy to use that amount of data to get credit in
their name.

I suspect a better solution than trying to regulate data brokers is to
make it more expensive to give credit to Alice under Bob's name.  The
thing that imposes the cost on me isn't when someone finds my SSN,
it's when someone takes out a bunch of loans which I'm then expected
to pay back.  Then it becomes my problem to resolve the disputes
created by the lender's desire to extend credit at minimal cost.  (The
lender also loses money, of course.  But much of the cost is shifted
to the identity theft victim.)  


--John Kelsey



Re: Papers about Algorithm hiding ?

2005-06-07 Thread Ian G
On Tuesday 07 June 2005 14:52, John Kelsey wrote:
 From: Ian G [EMAIL PROTECTED]
 Sent: Jun 7, 2005 7:43 AM
 To: John Kelsey [EMAIL PROTECTED]
 Cc: Steve Furlong [EMAIL PROTECTED], cryptography@metzdowd.com
 Subject: Re: Papers about Algorithm hiding ?

 [My comment was that better crypto would never have prevented the
 Choicepoint data leakage. --JMK]

OK, yes, you are right, we are talking about two
different things.

The difficulty here is that there is what we might call
the Choicepoint syndrome, and then there are the
specific facts of the actual Choicepoint heist.
When I say Choicepoint I mean the former, and the
great long list of similar failures as posted last week.
I.e., it is a syndrome that might be characterised as
"companies are not protecting data" or, in other words,
"the threat is on the node, not the wire".

Whereas in the specific Choicepoint heist, there is
the precise issue that they are selling their data to
someone.  That's much more complex, and crypto
won't easily change that.


 Sure it would.  The reason they are not using the tools is because
 they are too hard to use.  If the tools were so easy to use that it
 was harder to not use them, then they'd be used.  Consider Citigroup
 posted today by Bob.  They didn't encrypt the tapes because the tools
 don't work easily enough for them.

 So, this argument might make sense for some small business, but
 Citigroup uses a *lot* of advanced technology in lots of areas, right?
 I agree crypto programs could be made simpler, but this is really not
 rocket science.  Here's my guess: encrypting the data would have
 required that someone make a policy decision that the data be
 encrypted, and would have required some coordination with the credit
 agency that was receiving the tapes.  After that, there would have
 been some implementation costs, but not all *that* many costs.
 Someone has to think through key management for the tapes, and
 that's potentially a pain, but it's not intractable.  Is this really
 more complicated than, say, maintaining security on their publicly
 accessible servers, or on their internal network?

No it's not rocket science - it's economic science.
It makes no difference whether the business is
small or large - it is simply a question of costs.  If
it costs money to do it then it has to deliver a
reward.

In the case of the backup tapes there was no reward
to be enjoyed.  So they could never justify encrypting
them if it were to cost any money.  Now, in an unusual
exception to the rule that laws cause costs without
delivering useful rewards, the California law SB
changed all that by adding a new cost:  disclosure.
(Considering that banks have probably each lost a set of
backups every year since whenever, it's not the cost of the
tapes or the potential for ID theft that we care about...)

Now consider what happens when we change the
cost structure of crypto such that it is easier to do it
than not.  This is a *hypothetical* discussion of course.

Take tar(1) and change it such that every archive is
created as an encrypted archive to many public keys.
Remove the mode where it puts the data in the clear.
Then encrypt to a big set of public keys such that
anyone who can remotely want the data can decrypt
it (this covers the biggest headache which is when
you want the data it is no longer readable).

So, now it becomes trivial to make an encrypted
backup.  In fact, it is harder to make an unencrypted
backup.  What are companies going to do?  Encrypt,
of course.  Because it costs to do anything else.
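The encrypt-to-many-keys idea Ian sketches is ordinary hybrid encryption:
generate one random data key per archive, then wrap that key once per
recipient, so any single recipient can decrypt later. A toy stdlib sketch
of that structure follows; the XOR key wrap and repeated-hash keystream are
stand-ins for real public-key wrapping and a real cipher (roughly what
encrypting a tarball to multiple GPG recipients would do), and all the
recipient names are invented.

```python
import hashlib, os

def wrap(recipient_key: bytes, data_key: bytes) -> bytes:
    # Toy key wrap: XOR against a hash of the recipient's key.
    # Stands in for RSA/ECIES wrapping in a real tool.
    pad = hashlib.sha256(recipient_key).digest()
    return bytes(a ^ b for a, b in zip(data_key, pad))

unwrap = wrap  # XOR wrapping is symmetric

def make_archive(payload: bytes, recipients: dict) -> dict:
    data_key = os.urandom(32)
    # Toy keystream (repeats one hash block -- insecure, illustration only).
    keystream = hashlib.sha256(data_key).digest() * (len(payload) // 32 + 1)
    ciphertext = bytes(p ^ k for p, k in zip(payload, keystream))
    # One wrapped copy of the data key per recipient: anyone who could
    # remotely want the data can decrypt it later.
    return {"ct": ciphertext,
            "wrapped": {name: wrap(k, data_key) for name, k in recipients.items()}}

recipients = {"ops": os.urandom(32), "audit": os.urandom(32), "dr-site": os.urandom(32)}
archive = make_archive(b"payroll backup 2005-06-08", recipients)

# Any single recipient recovers the data key and then the payload.
dk = unwrap(recipients["audit"], archive["wrapped"]["audit"])
ks = hashlib.sha256(dk).digest() * 2
assert bytes(c ^ k for c, k in zip(archive["ct"], ks)) == b"payroll backup 2005-06-08"
```

The "encrypt to a big set of public keys" step is just the dictionary of
wrapped keys: adding a recipient costs one more small wrap, not a second
copy of the archive.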



 The other way of looking at Choicepoint - change the incentives - is
 a disaster.  It will make for a compliance trap.  Compliance *may*
 protect the data or it may have completely the opposite effect, the
 situation with 'unintended consequences' in such a case is likely to
 be completely unpredictable.  The only thing we can guarantee is that
 costs will go up.

 Well, Choicepoint is a bit different, right?  I mean, as I understand
 it the big disclosure happened because they sold peoples' data to
 criminals, but they were in the business of selling peoples' data.
 They just intended to sell it only to people of good intention, as far
 as I can tell.  (Perhaps they should have demanded X.509 certificates
 from the businesses buying the data and checked the evil bit.)  I
 just can't see how cryptography could have helped prevent that attack,
 other than by making the data that Choicepoint depends on harder to
 get in the first place.

Yes, you are right, I was thinking Choicepoint syndrome
here.  In order to address Choicepoint-actual with crypto
we'd have to look at Rights systems: nyms, caps and Brands,
or address it at the business level.

 It's much cheaper and much more secure to simply
 improve the tools.

 But this does no good whatsoever if there's not some reason for the
 people holding the data to use those tools.

Yes, that's why I'm saying that the tools should actually
make

Re: Papers about Algorithm hiding ?

2005-06-07 Thread Adam Shostack
On Tue, Jun 07, 2005 at 05:41:12PM +0100, Ian G wrote:

| 
| The difficulty here is that there is what we might call
| the Choicepoint syndrome and then there is the
| specific facts about the actual Choicepoint heist.
| When I say Choicepoint I mean the former, and the
| great long list of similar failures as posted last week.

Poor form there.
| No it's not rocket science - it's economic science.
| It makes no difference in whether the business is
| small or large - it is simply a question of costs.  If
| it costs money to do it then it has to deliver a
| reward.
| 
| In the case of the backup tapes there was no reward
| to be enjoyed.  So they could never justify encrypting
| them if it were to cost any money.  Now, in an unusual

Actually, that's not true.  Over 10 years ago, I wrote a small script
that took data very much like this, encrypted it, verified the
output looked like PGP encrypted data, and copied it to a public ftp
site so that a partner could pick it up.

That saved a lot over tape, and reduced manual steps which introduced
errors.
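Adam's "verified the output looked like PGP encrypted data" step is a useful
safety catch: a quick entropy estimate distinguishes ciphertext from plaintext
that was copied out by mistake. A hypothetical version of that check (the
threshold of 7.5 bits/byte is an assumption, not anything from his script):

```python
import math, os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Bits per byte: near 8.0 for ciphertext, much lower for text."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def looks_encrypted(data: bytes, threshold: float = 7.5) -> bool:
    # Guard against accidentally shipping plaintext to the public drop site.
    return shannon_entropy(data) > threshold

assert looks_encrypted(os.urandom(4096))                  # ciphertext-like
assert not looks_encrypted(b"Name,SIN,Account\n" * 256)   # CSV plaintext
```

It's a heuristic (compressed data also scores high), but as a pre-upload
sanity check it fails safe: better to block a good upload than publish
plaintext.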

Adam



Re: encrypted tapes (was Re: Papers about Algorithm hiding ?)

2005-06-07 Thread Steven M. Bellovin
In message [EMAIL PROTECTED], Perry E. Metzger writes:


The truth is, the likely reason no one encrypted the data on the tapes
in transit was because no one thought to do it, or they were too lazy
to bother to make even the simplest effort, or both.

I don't completely agree.  While I suspect that laziness or lack of thought
are the primary problems, there are some real costs.  The minor one is 
compression: most modern tape drives compress the data before writing, 
and you can't compress encrypted data.  That means they'd need to 
compress in software before writing to the tape; that chews up CPU time 
that they may not have to spare on the machines in question.  (Remember 
that we're talking about massive amounts of data here.)
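Bellovin's compression point is easy to demonstrate: ciphertext is
incompressible, so compression must happen in software before encryption,
and the order matters. A sketch using stdlib zlib, with os.urandom standing
in for ciphertext (good ciphertext is indistinguishable from random bytes):

```python
import os, zlib

# Highly redundant "backup data".
backup = b"ACCT=0001 NAME=SMITH BALANCE=100.00\n" * 2000

compressed = zlib.compress(backup)
assert len(compressed) < len(backup) // 10   # plaintext compresses well

# Encrypting first destroys the redundancy the tape drive relies on:
# random bytes model ciphertext, and zlib can no longer shrink them.
ciphertext = os.urandom(len(backup))
assert len(zlib.compress(ciphertext)) >= len(ciphertext)

# So the pipeline must be compress -> encrypt, never encrypt -> compress,
# and the compression now costs host CPU instead of tape-drive hardware.
```

That host-side CPU cost, multiplied over the massive datasets in question,
is the "minor" cost Bellovin describes.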

The bigger issue, though, is more subtle: keeping track of the keys is 
non-trivial.  These need to be backed up, too, and kept separate from 
(but synchronized with) the tapes.  Worse yet, they need to be kept 
secure.  That may mean storing the keys with a different escrow 
company.  A loss of either piece, the tape or the key, renders the 
backup useless.  

Backups are not very reliable to start with.  Too few companies do 
regular checks on the adequacy or quality of their backups.  Most 
companies feel they can't afford lowering the reliability even further.

...

The only thing that will fix this is having enough people get so badly
burned that CEOs start taking heads when people do dumb things.  I
imagine it can't be too many more years before that becomes the case.

Bingo.  Especially the CEO's head -- or the CFO's head, or the general 
counsel's -- for some of the mistakes we've seen.  But there's no one 
cause.  For those who subscribe to the Wall Street Journal online, see
http://online.wsj.com/documents/info-idtheft0504.html?mod=technology_main_promo_left
for a chart of recent failures to protect identity data.  Of the 10 
failures for which a cause is listed, though, 4 were loss of tapes in 
transit.  (One was a shipment of tapes to a credit bureau.)  2 involved 
hacking, one was an inside job, one was a stolen laptop, and 2 were 
fraudulent use of logins and passwords.

--Steven M. Bellovin, http://www.cs.columbia.edu/~smb





Re: encrypted tapes (was Re: Papers about Algorithm hiding ?)

2005-06-07 Thread Daniel Carosone
On Tue, Jun 07, 2005 at 07:48:22PM -0400, Perry E. Metzger wrote:
 It happens because some idiot web designer thought it was a nice
 look, and their security people are too ignorant or too powerless to
 stop it, that's why.
 
 It has nothing to do with cost. The largest non-bank card issuer in
 the world can pay for the fifteen minutes of time it would take to fix
 it by putting the login on a separate SSL protected page. It has
 nothing to do with ease of use or tools that default safe. The
 problem is that they don't know there is anything to fix at a level
 of the firm that is capable of taking the decision to fix it.

It may well be that, rather than being the fault of the web designer, an
explicit business requirement was presented to do this, because of
perceived ease of use for the customer...  and then nobody knew any
better, so the idea took hold before it could be killed.

At least, that's how it's happened in several places I've been
involved, and it took more than a little effort to make them
understand why it was a bad idea.  Thankfully, they were paying for my
time and advice, and *also* had the sense to listen to it when given -
the two don't always go together - but in this sense security did cost
them something.

The customer ease-of-use argument is quite easy to see - and also
quite easy to provide, if they want to, by putting the normal company
homepage that contains the form on SSL too.  I've had conversations
about the cost of doing that with knowledgeable web designers (largely
centering on image caching concerns), and really, it isn't quite free,
even if the costs come from unreasonable and annoying places. Those
costs can have returns, even non-risk ones like being better able to
track users' browsing patterns and site navigation, too - but just by
virtue of having to have the conversation, it's no longer free,
especially if people bring organisational politics to the table.

 Security these days is usually bad not because good security is
 expensive, or because it is particularly hard. 

These are the things that make the difference between middling-fair
and somewhat-decent security (let alone good security, which requires
many other things to be Done Right, more operational than technical).

The irony is that bad IT security (just like any other bad IT) is
expensive, often much more expensive - even without considering the
potential costs involved if the risks are realised.

 It is bad because even people at giant multinational corporations
 with enough budget to spare are too dumb to implement it.

Worse than that: people at the giant multinational corporations who
provide the outsourced IT services to those other corporations - and
who sell their services based on 'economies of scale' and 'industry
expertise' and 'best practice', and who are thus charged with the
responsibility of knowing better - are too dumb to implement it.

And then they're too dumb to implement it at other customer sites even
when that one rare customer comes along who knows better (or at least
knows to ask for independent outside help), and has fought hard to
convince their outsource provider of the need and economic sense of
not being dumb on their behalf.  Most organisations just let them get
away with it, and think they're getting a good deal, while the very
basis on which that deal was sold is defeated.

If it weren't for all the other people that get hurt by the shrapnel,
it would be very hard to continue trying to help people who seem to
like walking around with bullet holes in their shoes.

--
Dan.




Re: encrypted tapes (was Re: Papers about Algorithm hiding ?)

2005-06-07 Thread Mark Allen Earnest

Steven M. Bellovin wrote:
  The bigger issue, though, is more subtle: keeping track of the keys is
non-trivial.  These need to be backed up, too, and kept separate from 
(but synchronized with) the tapes.  Worse yet, they need to be kept 
secure.  That may mean storing the keys with a different escrow 
company.  A loss of either piece, the tape or the key, renders the 
backup useless.  


Basically, expensive or not, security is very hard to get right. When 
you look at Choicepoint, Bank of America, and Citigroup (not to mention 
universities and smaller businesses) they have little to no incentive to 
keep your personal data secure. YOU bear the cost of data compromise, 
not them. The worst they get is some bad publicity and only if it 
affects CA residents; otherwise it can be kept quiet.  The threat of bad 
publicity does not mean much when your compromise due to bad security 
will be forgotten next week as the media switches to the next one.


As it stands today, the cost/benefit analysis easily directs them away 
from taking strong measures to protect customers' financial data. Doing 
so is time consuming, opens up potential for problems, and gets them 
next to nothing in return.


--

Mark Allen Earnest

Lead Systems Programmer
Emerging Technologies
The Pennsylvania State University

Lt Commander
Centre County Sheriff's Office Search and Rescue

KB3LYB




Re: Papers about Algorithm hiding ?

2005-06-06 Thread John Kelsey
From: Ian G [EMAIL PROTECTED]
Sent: Jun 4, 2005 6:43 AM
To: Steve Furlong [EMAIL PROTECTED]
Cc: cryptography@metzdowd.com
Subject: Re: Papers about Algorithm hiding ?

GPG is an application that could be delivered by default
in all free OSs.  BSD is more or less installed automatically
with SSH installed.  Linux machines that are set up are
also generally set up with SSH.

I think you need one more step here to get the protective coloration
effect you'd like, where encrypted files aren't automatic evidence of
wrongdoing: During installation, generate 50 or so random passwords
with too much entropy to feasibly guess (easy to do when no user need
ever remember them), and encrypt some reasonable-length files full of
binary zeros with them.  The number of randomly-generated files needs
to be randomized, naturally, and probably should follow some kind of
distribution with a big tail to the right, so that it's not that
uncommon for a random install to put several hundred encrypted files
on the drive.  The value of this is that an attacker now sees
encrypted files on every machine, most of which nobody on Earth can
decrypt.  If this is normal, then it's not evidence.  (There are
probably a bunch of issues here with putting plausible tracks in the
logs, datestamps on the files, etc.  But it seems like something like
this could work.)
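Kelsey's decoy scheme is simple to prototype. The sketch below is a
hypothetical illustration, not his exact proposal: it draws a long-tailed
count (lognormal with invented parameters), writes that many files of random
bytes, and "throws the keys away" by never storing any; plausible timestamps
and log entries are left out.

```python
import os, random, tempfile

def plant_decoys(directory: str) -> list:
    """Create a long-tailed random number of undecryptable decoy files."""
    # Lognormal gives a big right tail: usually ~20 files, sometimes hundreds.
    count = max(1, int(random.lognormvariate(3.0, 0.9)))
    paths = []
    for i in range(count):
        size = random.randint(1024, 65536)
        path = os.path.join(directory, f"backup-{i:04d}.gpg")  # hypothetical name
        with open(path, "wb") as f:
            # Random bytes model ciphertext under a discarded random key:
            # nobody on Earth can decrypt these, which is the point.
            f.write(os.urandom(size))
        paths.append(path)
    return paths

with tempfile.TemporaryDirectory() as d:
    decoys = plant_decoys(d)
    print(f"planted {len(decoys)} decoy files")
```

Real random bytes aren't valid OpenPGP packets, so a serious version would
wrap them in a real encryption run under a discarded passphrase; the
statistical effect -- encrypted files on every machine -- is the same.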

...
Certainly using another app is fine.  What would be more
relevant to the direct issue is that it becomes routine to
encrypt and to have encryption installed.  See the recent
threads on where all the data is being lost - user data is
being lost simply because the companies don't protect
it.  Why aren't they protecting it?  Because there are no
easy tools that are built in to automatically and easily
protect it.

Huh?  There have been effective tools for protecting data from
disclosure for a long time, though it's not clear what good they'd do
for a company whose whole business was just selling access to that
data for a fee.  I'll bet the Choicepoints of the world are pretty
careful protecting, say, their payroll and HR records from disclosure.
It's just *your* data they don't mind giving out to random criminals.
No amount of crypto could have helped this.


--John Kelsey



Re: Papers about Algorithm hiding ?

2005-06-06 Thread Bill Stewart

At 01:14 PM 6/3/2005, [EMAIL PROTECTED] wrote:

I think we are already in a state where practically everybody that has a
computer has crypto available, and it's not difficult to use it!


Of course they have it -
the problem is having crypto in a way that's not suspicious,
and suspicious is highly dependent on your threat model.
For instance, Microsoft Word has crypto -
it's lousy crypto, which isn't directly relevant here,
but it's a utility that people view as normal,
while PGP is inherently suspicious-looking.
No reason that OpenOffice couldn't have crypto that's actually reasonable 
quality.
The "rename the binaries" strategy is probably more reliable than 
CipherSaber etc.









Re: Papers about Algorithm hiding ?

2005-06-04 Thread Ian G
On Thursday 02 June 2005 13:50, Steve Furlong wrote:
 On 5/31/05, Ian G [EMAIL PROTECTED] wrote:
  I don't agree with your conclusion that hiding algorithms
  is a requirement.  I think there is a much better direction:
  spread more algorithms.  If everyone is using crypto then
  how can that be relevant to the case?

 This is so, in the ideal. But if everyone would only... never seems
 to work out in practice. Better to rely on what you can on your own or
 with a small group.

The number of people who are involved is actually quite
small if you think it through.  It's more a shift in attitude that
is the barrier, not a large number of people who have to
be sold.

GPG is an application that could be delivered by default
in all free OSs.  BSD is more or less installed automatically
with SSH installed.  Linux machines that are set up are
also generally set up with SSH.

From there it isn't a large step conceptually to install GPG
in the base installs.  Start with the BSDs (because they
understand security) and Linux (because they understand
cool).

It's also not a large step to add a special hook into SSH
and browsers to add a simple file encryption utility.  Just
like OpenPGP's secret key mode.  It doesn't have to be
good, it just has to be there.  A lot of machines have OpenSSL
in them (this is how we get easy access to SHA1).  Can we
add a simple file encrypt to that?

Once all the Unixen have these, the next step is to encourage
a little usage...  All you need to do is have one person that
you communicate with like your brother or sister for the fun
of doing some crypto chat, and it now becomes a regular
*non-relevant* issue.  All we need to do is to encrypt and
protect one file and encryption becomes easy.

 In response to Hadmut's question, for instance, I'd hide the crypto
 app by renaming the executable. This wouldn't work for a complex app
 like PGP Suite but would suffice for a simple app. Rename the
 encrypted files as well and you're fairly safe. (I've consulted with
 firms that do disk drive analysis. From what I've seen, unless the
 application name or the data file extensions are in a known list, they
 won't be seen. But my work has been in the realm of civil suits,
 contract disputes, SEC claims, and the like; the investigators might
 be more thorough when trying to nail someone for kiddie porn.)

Right.  If they find any evidence of information hiding
other than a boring OpenPGP install that is as common
as crazy frog mp3s, then that's what I'd call highly relevant
evidence.  That would make matters worse for the particular
case at hand.

Information hiding is real sexy.  I wouldn't recommend it
for anyone who isn't really sure of their situation, and is
willing to understand that if he gets caught with it, he's
dead.

 Or use another app which by the way has crypto. Winzip apparently has
 some implementation flaws
 (http://www.cse.ucsd.edu/users/tkohno/papers/WinZip/ ) but a quick
 google doesn't show anything but brute force and dictionary attacks
 against WinRar.

Certainly using another app is fine.  What would be more
relevant to the direct issue is that it becomes routine to
encrypt and to have encryption installed.  See the recent
threads on where all the data is being lost - user data is
being lost simply because the companies don't protect
it.  Why aren't they protecting it?  Because there are no
easy tools that are built in to automatically and easily
protect it.

The picture here is becoming overwhelmingly clear - in order
to protect users we should be employing as much crypto as
we can openly, opportunistically, and easily.  Anything that
holds back from users protecting their data is a bad, and
anything that moves them forward in protecting their data
is a good.

iang
-- 
Advances in Financial Cryptography:
   https://www.financialcryptography.com/mt/archives/000458.html



Re: Papers about Algorithm hiding ?

2005-06-03 Thread Steve Furlong
On 6/3/05, [EMAIL PROTECTED] [EMAIL PROTECTED] wrote:
 Another alternative is the CipherSaber type of thing, where you could just
 implement your crypto-code on the fly, as needed.

Yes, I could, and have. Presumably you could. Ben Laurie probably
could blindfolded with both hands tied behind his back. But Alice
Philanderer, Bob Pedophile, Charlie Terrorist, and Generic Joe User
can't. Your alternative is more practical than if everybody would
xxx (sorry, Ian) but still not good enough. If only techies are able
to protect themselves from the JBTs, then merely being a techie will
be grounds for suspicion. (As well as throwing our non-programming 
brethren to the wolves.)

The only realistic solutions are those which allow the concerned but
non-technical user to take measures to protect himself against the
perceived threat, without requiring major changes to human nature or
to society.

As it happens, I have really good test cases to refine my solutions:
my extended family is a bunch of mountain hicks with internet access.
They're not especially educated and certainly not technically adept,
and are concerned about the gummint grabbing their computers or
snooping their traffic. Once I've got an acceptable suite of tools and
a training package put together, I'll post it somewhere. (Don't hold
your collective breath; making a living takes most of my time.)


Regards,
SRF

-- 
There are no bad teachers, only defective children.



Re: Papers about Algorithm hiding ?

2005-06-02 Thread Steve Furlong
On 5/31/05, Ian G [EMAIL PROTECTED] wrote:
 I don't agree with your conclusion that hiding algorithms
 is a requirement.  I think there is a much better direction:
 spread more algorithms.  If everyone is using crypto then
 how can that be relevant to the case?

This is so, in the ideal. But if everyone would only... never seems
to work out in practice. Better to rely on what you can on your own or
with a small group.

In response to Hadmut's question, for instance, I'd hide the crypto
app by renaming the executable. This wouldn't work for a complex app
like PGP Suite but would suffice for a simple app. Rename the
encrypted files as well and you're fairly safe. (I've consulted with
firms that do disk drive analysis. From what I've seen, unless the
application name or the data file extensions are in a known list, they
won't be seen. But my work has been in the realm of civil suits,
contract disputes, SEC claims, and the like; the investigators might
be more thorough when trying to nail someone for kiddie porn.)

Or use another app which by the way has crypto. Winzip apparently has
some implementation flaws
(http://www.cse.ucsd.edu/users/tkohno/papers/WinZip/ ) but a quick
google doesn't show anything but brute force and dictionary attacks
against WinRar.

-- 
There are no bad teachers, only defective children.



RE: Papers about Algorithm hiding ?

2005-05-31 Thread Scott Guthery
Isn't this what Rivest's Chaffing and Winnowing is all about?

http://theory.lcs.mit.edu/~rivest/chaffing.txt
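For reference, the scheme Rivest describes works roughly like this: the
sender MACs each (serial, bit) packet; chaff packets carry bogus MACs; the
receiver "winnows" by keeping only packets whose MAC verifies. A minimal
sketch of that idea (toy parameters, one bit per packet, and an invented
key -- an illustration of the mechanism, not Rivest's exact wire format):

```python
import hashlib, hmac, os, random

KEY = b"shared authentication key"   # hypothetical MAC key shared by both ends

def mac(serial: int, bit: int) -> bytes:
    msg = f"{serial}:{bit}".encode()
    return hmac.new(KEY, msg, hashlib.sha256).digest()

def chaff_and_send(bits):
    """Emit (serial, bit, mac) wheat packets interleaved with bogus chaff."""
    packets = []
    for serial, bit in enumerate(bits):
        packets.append((serial, bit, mac(serial, bit)))                       # wheat
        packets.append((serial, 1 - bit, hashlib.sha256(os.urandom(16)).digest()))  # chaff
    random.shuffle(packets)
    return packets

def winnow(packets):
    """Keep only packets whose MAC verifies, then reassemble by serial."""
    wheat = [(s, b) for s, b, m in packets if hmac.compare_digest(m, mac(s, b))]
    return [b for s, b in sorted(wheat)]

message = [1, 0, 1, 1, 0, 0, 1]
assert winnow(chaff_and_send(message)) == message
```

The relevance to algorithm hiding is that no encryption algorithm appears
anywhere: the sender only authenticates, yet an eavesdropper without the MAC
key cannot tell wheat from chaff.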

Cheers, Scott

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Hadmut Danisch
Sent: Thursday, May 26, 2005 5:51 PM
To: cryptography@metzdowd.com
Subject: Papers about Algorithm hiding ?

Hi,

you most probably have heard about the court case where the presence of
encryption software on a computer was viewed as evidence of criminal
intent.

http://www.lawlibrary.state.mn.us/archive/ctappub/0505/opa040381-0503.htm
http://news.com.com/Minnesota+court+takes+dim+view+of+encryption/2100-1030_3-5718978.html



Plenty of research has been done about information hiding.
But this special court case requires algorithm hiding as a kind of
response. Do you know where to look for papers about this subject?

What about designing an algorithm good for encryption which someone can
not prove to be an encryption algorithm?


regards
Hadmut







RE: Papers about Algorithm hiding ?

2005-05-31 Thread Valery Pryamikov
 -Original Message-
 Hadmut Danisch wrote:
 
 ...
 Plenty of research has been done about information hiding.
 But this special court case requires algorithm hiding as a kind of
 response. Do you know where to look for papers about this subject?
 ...

Here is the list that you can start with:

[1] Boaz Barak, Oded Goldreich, Russell Impagliazzo, Steven Rudich, Amit
Sahai, Salil Vadhan, Ke Yang. On the (im)possibility of obfuscating
programs. In Proceedings of CRYPTO 2001.
[2] Benjamin Lynn, Manoj Prabhakaran, Amit Sahai. Positive Results and
Techniques for Obfuscation. In Proceedings of Eurocrypt 2004. 
[3] Hoeteck Wee. On Obfuscating Point Functions. Computer Science
Division University of California, Berkeley. Jan 2005
[4] Christian Collberg, Clark Thomborson. Watermarking, tamper-proofing
and obfuscation - tools for software protection. IEEE Transactions on
software engineering, vol.28, No.8, August 2002.
[5] Christian Collberg, Clark Thomborson. Watermarking, tamper-proofing
and obfuscation - tools for software protection. Technical Report
TR00-03, The Department of Computer Science, University of Arizona,
February 2000.
[6] Christian Collberg, Clark Thomborson, Douglas Low. Breaking
Abstractions and Unstructuring Data Structures. IEEE International
Conference on Computer Languages, May 1998. 
[7] Christian Collberg, Clark Thomborson, Douglas Low. Manufacturing
Cheap, Resilient, and Stealthy Opaque Constructs. Principles of
Programming Languages 1998, POPL'98, January 1998.
[8] Christian Collberg, Clark Thomborson, Douglas Low. A Taxonomy of
Obfuscating Transformations. Technical Report 148, Department of
Computer Science, The University of Auckland. July 1997. 
[9] Chenxi Wang, Jonathan Hill, John Knight, Jack Davidson. Protection
of Software-based Survivability Mechanisms. International Conference of
Dependable Systems and Networks. July 2001. 
[10] Chenxi Wang. A Security Architecture for Survivability Mechanisms.
PhD Dissertation, Department of Computer Science, University of
Virginia, October 2000.
[11] Chenxi Wang, Jonathan Hill, John Knight, Jack Davidson. Software
Tamper Resistance: Obstructing Static Analysis of Programs. Technical
Report CS-2000-12, Department of Computer Science, University of
Virginia. May 2000.  
[12] Cullen Linn, Saumya Debray. Obfuscation of Executable Code to
Improve Resistance to Static Disassembly. ACM Conference on Computer and
Communications Security (CCS 2003), October 2003, pp. 290-299
[13] Fritz Hohl. A Framework to Protect Mobile Agents by Using Reference
States. Proceedings of the 20th International Conference on Distributed
Computing Systems (ICDCS 2000), pp. 410-417, 2000.
[14] Gregory Wroblewski. General Method of Program Code Obfuscation. PhD
Dissertation, Wroclaw. 2002.
[15] S. Chow, H. Johnson, P.C. van Oorschot, P. Eisen. A White-Box DES
Implementation for DRM Applications. In Proceedings of ACM CCS-9
Workshop on DRM. 2002.
[16] Matthias Jacob, Dan Boneh, Edward Felten. Attacking an Obfuscated
Cipher by Injecting Faults. In Proceedings of ACM CCS-9 Workshop on DRM.
2002.
[17] Yuval Ishai, Amit Sahai, David Wagner. Private Circuits: Securing
Hardware against Probing Attacks. In Proceedings of CRYPTO 2003.
[18] Oliver Kömmerling, Markus Kuhn. Design Principles for
Tamper-Resistant Smartcard Processors. USENIX Workshop on Smartcard
Technology Proceedings, Chicago, Illinois, USA, May 10-11, 1999.
[19] Ross Anderson, Markus Kuhn. Tamper Resistance - a Cautionary Note.
The Second USENIX Workshop on Electronic Commerce Proceedings, Oakland,
California, November 18-21, 1996, pp. 1-11.
[20] Gael Hachez. A Comparative Study of Software Protection Tools
Suited for E-Commerce with Contributions to Software Watermarking and
Smart Cards.
PhD thesis, Faculté des Sciences Appliquées, Laboratoire de
Microélectronique, Université Catholique de Louvain. March 2003.
[21] T.-C. Lin, M. Hsiang Hsu, F.-Y. Kuo, and P.-C. Sun. An Intention
Model-based Study of Software Piracy. In 32nd Annual Hawaii
International Conference on System Sciences (HICSS-32). IEEE Computer
Society, January 1999.
[22] Amit Sethi. Digital Rights Management and Code Obfuscation. Thesis
in fulfillment of degree of Master of Mathematics. University of
Waterloo, Ontario, Canada. 2003.
[23] M. Kunin. Why Do Software Manufacturers Tolerate Piracy in
Transition and Less Developed Countries? A theoretical model. Discussion
Paper Series 2001-75, JEL classification: O34; L20. Center for Economic
Research, Charles University and the Academy of Sciences, Czech
Republic, October 2001.
[24] Christian Collberg. SandMark Algorithms. January 28, 2003
[25] Christian Collberg, Ginger Myles, Michael Stepp. Cheating Cheating
Detectors. Department Computer Science University of Arizona. Technical
Report TR04-05. March 3, 2004
[26] James R. Gosler. Software Protection: Myth or Reality? Sandia
National Laboratory. Advances in Cryptology - CRYPTO'85. 1985.  
[27] Kelly Heffner, Christian Collberg. The Obfuscation Executive.

Re: Papers about Algorithm hiding ?

2005-05-31 Thread Jozef Vyskoc
HD What about designing an algorithm good for encryption which someone
HD can not prove to be an encryption algorithm?

Hmmm, but to do that one needs a good definition of 'encryption
algorithm', and perhaps also of some other apparently fundamental terms.
But we have none, I am afraid ... at least it seems hard to give a precise
definition even of cryptography itself (the old one, relating it to
'secure communication over an insecure channel', seems not to be
consistent with quantum crypto...)

Regards,

Jozef




Re: Papers about Algorithm hiding ?

2005-05-31 Thread Jerrold Leichter
| Hi,
| 
| you most probably have heard about the court case where the presence
| of encryption software on a computer was viewed as evidence of
| criminal intent.
| 
| http://www.lawlibrary.state.mn.us/archive/ctappub/0505/opa040381-0503.htm
| 
http://news.com.com/Minnesota+court+takes+dim+view+of+encryption/2100-1030_3-5718978.html
| 
| 
| 
| Plenty of research has been done about information hiding.
| But this special court case requires algorithm hiding as a kind of
| response. Do you know where to look for papers about this subject?
There was a paper published on this a while back (Barak et al., On the
(Im)possibility of Obfuscating Programs, CRYPTO 2001 - [1] in Valery's list).
The question was posed as essentially:  Can one produce a one-way trapdoor
compiler?  That is, just as an encryption algorithm takes plaintext and
converts it to ciphertext, with the property that the inverse transformation
is computationally intractable without the key, we want something that takes
an algorithm and a key and produces a different but equivalent algorithm,
such that answering some set of questions (like, perhaps, whether the output
really *is* equivalent to the input) without the key is computationally
intractable.  The result of the paper was that no such compiler can exist.
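
[The impossibility result is about general-purpose obfuscation; narrow
special cases do survive, e.g. point functions - see [3] in Valery's list.
A minimal illustrative sketch, not from the thread: a password check can
be published as a salted hash, so the code reveals nothing about the
accepted input beyond black-box access.]

```python
# Sketch of point-function "obfuscation" via a salted hash (cf. Wee's
# point-function result). All names here are illustrative.
import hashlib
import hmac
import os

def obfuscate_point_function(secret: str):
    """Return a checker that accepts exactly one string, without
    storing that string in the clear."""
    salt = os.urandom(16)
    digest = hashlib.sha256(salt + secret.encode()).digest()
    def check(candidate: str) -> bool:
        return hmac.compare_digest(
            digest, hashlib.sha256(salt + candidate.encode()).digest())
    return check

check = obfuscate_point_function("hunter2")
assert check("hunter2")
assert not check("guess")
```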
 
| What about designing an algorithm good for encryption which someone
| can not prove to be an encryption algorithm?
Can you prove *any* algorithm is an encryption algorithm?  If so, I think
there are some big prizes coming your way.

On a more general note:  This court case is being blown out of proportion. 
Screwdrivers, hammers, and a variety of other implements are burglar's 
tools.  If you are caught in certain circumstances carrying burglar's tools, 
not only will they be introduced into evidence against you, but the fact you 
have them will be a criminal violation in and of itself.  The way the law 
deals with all kinds of ambiguities like this is to look at intent:  If I 
carry a screwdriver to repair a broken door on my own house, it's not a 
burglar's tool.  If I carry it to break the lock on my neighbor's house, it 
is.  Determining intent is up to a jury (or judge or judge's panel, depending 
on the legal system and the defendant's choices).  It's outside the realm of 
mathematics, proof in the mathematical sense, or much other than human 
judgement.  If an expert witness testifies something is an encryption 
algorithm, and the jury believes him more than the defense's expert witness 
who testifies it's a digital controller for a secret ice cream maker ... 
that's what it is.  If the jury further believes that encryption algorithm was 
used in the furtherance of a crime ... the defendant is in trouble.

-- Jerry




Re: Papers about Algorithm hiding ?

2005-05-31 Thread Ian G
On Thursday 26 May 2005 22:51, Hadmut Danisch wrote:
 Hi,

 you most probably have heard about the court case where the presence
 of encryption software on a computer was viewed as evidence of
 criminal intent.

 http://www.lawlibrary.state.mn.us/archive/ctappub/0505/opa040381-0503.htm
 http://news.com.com/Minnesota+court+takes+dim+view+of+encryption/2100-1030_3-5718978.html



 Plenty of research has been done about information hiding.
 But this special court case requires algorithm hiding as a kind of
 response. Do you know where to look for papers about this subject?

 What about designing an algorithm good for encryption which someone
 can not prove to be an encryption algorithm?

I don't agree with your conclusion that hiding algorithms
is a requirement.  I think there is a much better direction:
spread more algorithms.  If everyone is using crypto then
how can that be relevant to the case?

I would suggest that the best way to overcome this
flawed view of cryptography by the judges is to have
operating systems ship with GPG installed by
default.  Some of the better ones already install SSH
by default.

(In fact the thrust of the argument was flawed, as the
user's PC almost certainly had a browser with SSL
installed.  Since HTTPS can be used to access webmail
privately, and as we have seen this was an Al Qaeda
means of secret communication, treating the presence
of one more crypto tool as relevant is a stretch.)
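
[To make the ubiquity point concrete - an illustrative sketch, not from
the thread: every stock install already carries strong crypto, e.g.
Python's bundled ssl module, linked against the system's OpenSSL.]

```python
# Crypto already ships on stock systems: Python's bundled ssl module
# does TLS with no extra software installed, just as a browser's HTTPS
# support does. The hostname below is illustrative only.
import socket
import ssl

ctx = ssl.create_default_context()  # sane defaults: cert checking on
print(ssl.OPENSSL_VERSION)          # the crypto library every install carries

# Connecting (commented out so the sketch runs offline):
# with ctx.wrap_socket(socket.socket(), server_hostname="mail.example.com") as s:
#     s.connect(("mail.example.com", 443))
#     print(s.version())
```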

iang
-- 
Advances in Financial Cryptography:
   https://www.financialcryptography.com/mt/archives/000458.html
