Re: Surveillance, secrecy, and ebay

2008-07-27 Thread David G. Koontz
Sherri Davidoff wrote:
> Matt Blaze wrote:
>> Once sensitive or personal data is captured, it stays around forever,
>> and the longer it does, the more likely it is that it will end up
>> somewhere unexpected.
> 
> Great point, and a fundamental lesson-of-the-moment for the security
> industry. To take it one step further: The amount of sensitive
> information an organization stores is roughly proportional to the number
> of data leaks it initiates. We already know that information "wants" to
> be free, and if you keep information around, sooner or later, it's going
> to leak out. (There's probably some mathematical way to describe this
> relationship.)
> 
> Rather than expecting companies to keep data totally secure and then
> send apologetic letters when it gets lost, perhaps we should start
> taxing companies in proportion to the amount of sensitive information
> they store, and use that tax to assist victims of identity theft. This
> would have the double benefit of giving companies immediate incentive to
> reduce the amount of information they store, and would also provide
> appropriate public funding for incident recovery.
> 
> Sherri
> 
> 

Encryption that resists cryptanalytic techniques for a period on the
order of the useful lifetime of the 'secrets' being protected is a
perfectly valid way to secure private data.  Following the Privacy Act
of 1974, this led to the release of the Data Encryption Standard,
detailing the Data Encryption Algorithm commonly known as DES,
published in 1977 as FIPS PUB 46.

The U.S. government immediately started granting itself waivers from
the use of encryption for at-rest storage of data, waivers that are
only being overcome today.  During the same era, the nation's security
agencies exhibited a strong desire to prevent the dissemination of
security technology for private and business use, since it foils the
gathering of economic intelligence and provides strong encryption to
foreign military and security concerns.  I'm of the opinion that DES
didn't provide much advantage to 'adversaries' of the U.S. government,
but its spread was effectively limited to the banking industry for a
considerable length of time.

During its lifetime the cost of breaking DES has fallen steadily, to
the point that a recent low-cost implementation could attack a DES
system in between 5 and 32 hours using $1,000 worth of commercial FPGA
hardware[1], and a full brute-force attack can yield a key in 7.8 days
at a cost of $10,000[2].  Note that this has driven changes to the
approved algorithms, with a resulting increase in resistance to
brute-force attacks from a dramatically larger key space.  We now
worry about the near-mythical quantum computer's ability to break any
current encryption scheme.

While Matt was relating the inadvertent disclosure of information
relating to a criminal investigation, you'd think that could fall
under the aegis of the court system, perhaps by tinkering with the
rules of evidence.  After all, encrypted storage is an effective means
of preventing unauthorized access to, duplication of, and alteration
of evidence.  Bar associations would appear a logical place to exert
influence to protect client data and attorney-client privilege.

We also see the Department of Defense requiring encrypted at-rest
storage of data, a requirement becoming universal over time.  You'd
have to wonder, if the requirement were extended to the rest of the
U.S. government, just how long it would take to protect the data.
Couldn't be more than a decade.

With state and local governments, you run into unfunded mandates.  It
helps that they already have a duty under various privacy laws to
protect data, as do private companies.  Perhaps the problem is not
that we need more laws, but that the laws we have aren't being adhered
to?

Is the resistance to data protection today predicated on cost?  We see
secure disk products where, with costs amortized across volume, a
couple of kilobytes of code, a slightly faster processor or one with a
security co-processor, the software interface controls, and finally
certification should add a burden of a couple of dollars, yet the
products are sold at a premium, all the market can bear.

What's not apparent is the cost of data loss, other than bad press.
We find interesting cases, such as in aviation security, where
Professor Mueller finds that the cost per life saved by the
Transportation Security Administration is some 15 times higher than by
other protective means[3], indicating we have an enormous white
elephant there.  How do we prevent the inadvertent replication of that
waste in another large area of government-mandated security?

Balancing the apparent lack of adherence to current privacy laws and the
potential cost of a bureaucracy dedicated to measuring quanta of privacy
data, regulating the balance of taxes owed, offsets by encryption, tracking
the acquisition of privacy data, it's proper and a

Re: cleartext SSH, Truecrypt, etc passwords in memory

2008-07-27 Thread Peter Gutmann

Sherri Davidoff <[EMAIL PROTECTED]> writes:

For this paper, I specifically examined the case where memory was dumped
while the applications were still active. The snapshots were taken up to
45 minutes after the passwords were entered. (See Appendix A for the
full testing procedure.)  Given that users keep applications such as
SSH, Truecrypt, email, etc open for a significant percentage of time
that they use their systems, I do think it's important for applications
to zero sensitive data immediately after it is used rather than waiting
until the process is closed.


I think it'd be good to distinguish between cases where keeping
cryptovariables around is a bug and where it's by design.  For
example, SSL caches the shared secret information for later use in
session resumption, so finding a copy of that in memory while an SSL
client or server is running isn't a bug.  Finding it after it's exited
is.  Even then though, some apps include daemons that cache
credentials and whatnot for ongoing use by the app (e.g. the assorted
'xyz-agent' helpers for things like various SSH clients or GPG), so
finding the information in memory when the app has exited but the
caching daemon hasn't isn't necessarily a bug.


As a next step, it would be great to follow the same procedure, but
image all of memory after the applications have been closed.


That'd be the interesting one, because keys left lying around in
memory afterwards are definitely a sign of a problem (but be careful
about the caching issue mentioned above).

Peter.

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


Re: cleartext SSH, Truecrypt, etc passwords in memory

2008-07-27 Thread Sherri Davidoff
Peter Gutmann wrote:
> So was this a case of "recover data from an active app's memory image"
> (not surprising) or "recover data after the app has exited"
> (surprising, at least for the crypto apps)?

For this paper, I specifically examined the case where memory was dumped
while the applications were still active. The snapshots were taken up to
45 minutes after the passwords were entered. (See Appendix A for the
full testing procedure.)  Given that users keep applications such as
SSH, Truecrypt, email, etc open for a significant percentage of time
that they use their systems, I do think it's important for applications
to zero sensitive data immediately after it is used rather than waiting
until the process is closed. Also, as you point out, there were
passwords such as SSH and root which were retained outside of the
application's memory.

I also did some preliminary experiments to test whether passwords
remained in memory after the applications were closed. However, I
decided to wait until the Princeton/EFF/Wind River folks released their
memory dumper code before analyzing this in detail. As described in the
paper, there are now annoying limitations on access to /dev/mem in
Linux, so I thought it would be best to approach this particular
question by getting a full memory image using cold boot techniques.

As a next step, it would be great to follow the same procedure, but
image all of memory after the applications have been closed. Using Jake
Appelbaum and co's newly released memory imaging tools would probably be
an easy way to get full memory dumps from any OS:

http://citp.princeton.edu/memory/code/

Based on your feedback, I've updated section 2 and the abstract to clarify:

http://philosecurity.org/pubs/davidoff-clearmem-linux.pdf

Thanks for your comments,

Sherri


-- 
http://philosecurity.org



Re: cleartext SSH, Truecrypt, etc passwords in memory

2008-07-27 Thread Peter Gutmann

Sherri Davidoff <[EMAIL PROTECTED]> writes:


Hello all. During the past few months, I've been poking around Linux
memory and consistently finding cleartext login, SSH, email, IM,
Truecrypt and root passwords. I've just finished a paper which includes
detailed location and context information for each password. Given the
recent buzz about cold boot memory dumping, it seems the risk associated
with cleartext passwords in memory has increased.


What the abstract doesn't make at all clear is that the process used
seems to have been (from section 2 of the paper):

Start application;
Enter password;
Take snapshot of running application's memory;

(although some passwords were apparently found in non-application-specific
memory, see section 3.7 of the paper).

In other words what's apparently being demonstrated for most of the apps
isn't an ability to recover keys still hanging around in memory at some
arbitrary later point but to recover keys from the active process memory
image.  The reason why I keep using "apparently" is that paragraphs 2 and
3 of section 2 don't make at all clear whether the application is still
active or not, although "after all programs had been launched process
memory was captured live" seems to imply it was a snapshot of a running
process.  Since many crypto applications zeroise keys after they've
been used, it seems a bit surprising that it'd be possible to recover
key data after the app has exited, as the paper implies.

So was this a case of "recover data from an active app's memory image"
(not surprising) or "recover data after the app has exited" (surprising,
at least for the crypto apps)?

Peter.



Re: Surveillance, secrecy, and ebay

2008-07-27 Thread Sherri Davidoff
Matt Blaze wrote:
> Once sensitive or personal data is captured, it stays around forever,
> and the longer it does, the more likely it is that it will end up
> somewhere unexpected.

Great point, and a fundamental lesson-of-the-moment for the security
industry. To take it one step further: The amount of sensitive
information an organization stores is roughly proportional to the number
of data leaks it initiates. We already know that information "wants" to
be free, and if you keep information around, sooner or later, it's going
to leak out. (There's probably some mathematical way to describe this
relationship.)

Rather than expecting companies to keep data totally secure and then
send apologetic letters when it gets lost, perhaps we should start
taxing companies in proportion to the amount of sensitive information
they store, and use that tax to assist victims of identity theft. This
would have the double benefit of giving companies immediate incentive to
reduce the amount of information they store, and would also provide
appropriate public funding for incident recovery.

Sherri


-- 
http://philosecurity.org

