This is a bit long, so here's the TL;DR: If you just want to stop thieves, use tokens and biometrics for convenience. If you want to protect privacy, use a password. If you want to protect against both, start with a password and add a token; if you want more security, add a biometric on top of that.
If you have something encrypted with a modern symmetric algorithm (e.g.
AES/Rijndael, Twofish, Serpent), then you don't need to worry about it being
decrypted without your 128-bit key.
If you are expecting quantum computers to become cheap in the near future, or
you're worried about Starfleet being willing to consume the rest-mass energy of
a galaxy to decrypt your file, use a 256-bit key.
If you're really worried about a state willing to spend a few million USD on a
side-channel attack (which goes around the cryptography, so your cipher and key
don't matter much) against your one file or device, then keep your data
somewhere other than a computer or phone and far away from anything resembling
a network.
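For the curious, here's a minimal sketch of what "encrypted with a modern
symmetric algorithm and a 128-bit key" looks like in practice. It assumes
Python's third-party 'cryptography' package (my choice for illustration; any
well-reviewed AES library would do the same job):

  import os
  from cryptography.hazmat.primitives.ciphers.aead import AESGCM

  key = AESGCM.generate_key(bit_length=128)  # the 128-bit symmetric key
  nonce = os.urandom(12)                     # unique per message, not secret

  ciphertext = AESGCM(key).encrypt(nonce, b"my private papers", None)
  plaintext = AESGCM(key).decrypt(nonce, ciphertext, None)  # raises if tampered

Everything below about brute force and key sizes is about that one small key.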
----------------------------------------------------------------------------------------------------
First, as to the biometric issue and state compulsion:
1) The Virginia decision is pretty unfortunate: it basically says that
fingerprints are like keys (which are not, of course, testimonial). This
follows existing caselaw, however, and actually makes sense if you think
about it the way a lawyer would.
Imagine you have private papers, sealed in a vault in your home. The state
cannot force you to tell them the combination (knowledge), but they can demand
the key (if there is one) or cut open the safe if necessary. All that the state
needs is a search warrant and the time/money to break open the safe, because
it's the 4th amendment, which acknowledges your property rights, that applies
in that case.
Your phone is considered just another sealed vault where you keep your
private papers. You're expected to protect them from thieves, but the state
can, and will, compel you to produce the keys to unlock it if they can.
2) Using a biometric or token without an accompanying password is fine for
security, but bad for privacy. All "good" digital security, *when privacy is
the goal*, requires a password (or equivalent, "something you know").
That locked-in-your-mind characteristic is the only thing that's protected
by the rights acknowledged in the 5th amendment.
A search warrant is the only requirement for something locked by a key
(token, "something you have"), and the state may use any reasonable means to
get in without the proper key (subject to warrant requirements).
It seems clear, to me, from recent cases that biometrics ("something you
are") are going to be treated the same way as a physical key by the courts, at
best.
3) The more concerning issue is some recent cases where courts did not require
a search warrant to break into and examine the contents of a smartphone. That
issue is far more worrisome, and it's an area where there is a very valid
basis to oppose (through the political process) the overreach of the state.
4) The use of biometrics on phones is predicated on the assumption that you
have no privacy interest in your phone; the security of the phone is against
theft, not against snoops or law enforcement.
If you are trying to protect privacy, in addition to property (and intend to
use biometrics), then after you've proved knowledge and provided a physical
key/token, the final challenge is a unique identifier you cannot change
(yourself) and, ideally, cannot replicate.
In the case of a phone, the passcode is knowledge, the phone is a token, and
your biometric would be the final layer of authentication (a toy sketch of
that layering follows at the end of this section).
If you are concerned about privacy on your phone (not just theft), then it
currently falls on you to require a passcode/pin, instead of using the more
convenient token or biometric options.
If you think about it, we've used a combination of token and biometric for
quite some time (driver's license with photo or other state-issued document
with photo) for government authentication, so anything you don't want the
government to have should use something other than those two factors (possibly
in combination with those two factors as added protection against thieves).
I may not like that the state intends to compel production of a biometric, but
it doesn't, as far as I can tell, significantly conflict with long-standing
jurisprudence either.
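To make the layering in item 4 concrete, here's a toy Python sketch of
requiring all three factors before unlocking. The names, stored secrets, and
threshold are entirely made up for illustration; no real phone unlock code
looks like this:

  import hashlib, hmac

  STORED_HASH = hashlib.sha256(b"correct horse battery staple").hexdigest()
  DEVICE_SECRET = b"secret-burned-into-this-phone"

  def unlock(passcode, token_secret, fingerprint_score):
      # Something you know: the passcode, checked against a stored hash.
      knows = hmac.compare_digest(
          hashlib.sha256(passcode.encode()).hexdigest(), STORED_HASH)
      # Something you have: a secret held only by the physical device/token.
      has = hmac.compare_digest(token_secret, DEVICE_SECRET)
      # Something you are: a fuzzy biometric match above some threshold.
      is_you = fingerprint_score >= 0.95
      return knows and has and is_you

Only the first factor is something the 5th amendment arguably protects; the
other two can be compelled or taken.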
----------------------------------------------------------------------------------------------------
Second, as to encryption. If you're afraid of being targeted by a state with
massive resources, note the following regarding data encryption:
1) Full disk encryption (any bulk encryption really, including the actual data
transfer in TLS) uses a symmetric key, not an RSA key (which is only used to
authenticate and encrypt the symmetric key, when used at all).
2) The most secure (general bulk data) encryption is actually symmetric-key
(e.g. AES/Rijndael, Serpent, or Twofish); public-key encryption (e.g. RSA or
Elliptic-Curve) is really only useful (due to key sizes, computational
complexity, cipher design limitations, etc.) in the key-exchange portion of a
remote communication, or in some types of authentication (e.g. token). A
sketch of this key-wrapping pattern follows item 4 below.
3) For modern symmetric key ciphers you need only 128 bits for the key, 256 if
you're truly paranoid.
4) Rijndael, Serpent, and Twofish are unbroken and considered extremely secure.
The full ciphers are not subject to any known attack better than brute-force.
There are known attacks against reduced-security versions of the algorithms,
with Serpent having the largest security margin (difference between attackable
versions and the full cipher) among them.
In all cases, even the best known attacks against *weakened implementations*
require more computing power than the entire planet can provide within the
foreseeable future.
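Here's the rough sketch of the key-wrapping pattern from items 1 and 2, again
using Python's 'cryptography' package (assumed purely for illustration; this
is not any particular tool's actual code). The bulk data is encrypted with a
random symmetric key, and RSA only protects that small key:

  import os
  from cryptography.hazmat.primitives import hashes
  from cryptography.hazmat.primitives.asymmetric import rsa, padding
  from cryptography.hazmat.primitives.ciphers.aead import AESGCM

  # Bulk encryption: a random 128-bit symmetric key does the heavy lifting.
  bulk_key = AESGCM.generate_key(bit_length=128)
  nonce = os.urandom(12)
  ciphertext = AESGCM(bulk_key).encrypt(nonce, b"gigabytes of data", None)

  # Public-key crypto only wraps the small symmetric key.
  private_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)
  oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                      algorithm=hashes.SHA256(), label=None)
  wrapped_key = private_key.public_key().encrypt(bulk_key, oaep)

  # The holder of the private key unwraps the bulk key, then decrypts.
  recovered_key = private_key.decrypt(wrapped_key, oaep)
  plaintext = AESGCM(recovered_key).decrypt(nonce, ciphertext, None)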
Basically, decrypting data protected by a well-studied block cipher (such as
Twofish or Serpent), an open-source implementation, and a decent key (20-30
well-chosen characters) is currently beyond the capabilities of even massive
state actors (such as China, Russia, or the USA), or even the entire planet
working in concert.
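As for "a decent key (20-30 well-chosen characters)": the passphrase isn't
the key itself; it gets stretched into one by a key-derivation function. A
minimal sketch using PBKDF2 (just one common KDF, chosen here for
illustration; real disk-encryption tools each pick their own):

  import os
  from cryptography.hazmat.primitives import hashes
  from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

  passphrase = b"correct horse battery staple anchovy"  # ~30 chosen characters
  salt = os.urandom(16)            # not secret; stored next to the ciphertext

  kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=16,
                   salt=salt, iterations=600_000)
  key = kdf.derive(passphrase)     # 16 bytes: the 128-bit symmetric key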
Side-channel attacks (such as those used to break ATMs, iPhones, or SSLv3) are
the current preferred means, as they're actually possible, but incredibly
expensive (in most cases, but not for SSLv3; seriously, don't use SSLv3 for
anything). Unless your data is worth the millions required to accomplish a
side-channel attack against your system, you probably don't need to worry about
these.
No key or cipher will protect you from side-channel attacks, as side-channel
attacks (by definition) bypass the cipher and attack the processing
environment. Some newer (and less studied) ciphers like Threefish may limit the
usefulness of some side-channel attacks, but even that is very limited added
security.
If you're concerned about side-channel attacks (e.g. your data is worth more
than 1 million USD to a state or other large organization wanting to read it),
then you'd best keep that data off of anything using an SSD, CPU, or magnetic
disk, and away from any device connected to a network.
----------------------------------------------------------------------------------------------------
Some basic numbers, for the curious:
Brute-force of a 128-bit key requires a median of 2^127 trials, which, at 1
microjoule/trial (far below current processors' power cost), would require
several hundred billion times the total energy consumption (in all forms) of
the world in 2010. Even assuming there existed incomprehensible numbers of
CPUs available to run the trials, it's beyond current human capability to
power them all.
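The back-of-envelope arithmetic, taking world energy consumption in 2010 as
roughly 5 x 10^20 joules (my approximation):

  trials = 2**127               # median brute-force work for a 128-bit key
  joules_per_trial = 1e-6       # far below what real hardware manages
  world_2010_joules = 5e20      # approx. total world energy consumption, 2010

  total_joules = trials * joules_per_trial      # ~1.7e32 J
  print(total_joules / world_2010_joules)       # ~3e11: hundreds of billions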
Brute-force of a 256 bit key would require an amount of energy exceeding the
estimated theoretical rest-mass energy (including dark matter and dark energy)
of the observable universe.
Quantum computing can reduce the effective key size by half (Grover's
algorithm), so if you expect quantum computers to become useful for this
purpose in the next decade or so (unlikely), then you might prefer a 256-bit
key. Even at 64 effective bits, however, it would take millions of quantum
CPU-years to complete the process for a single encrypted file.
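The Grover arithmetic, with a purely assumed rate of one million Grover
iterations per second per quantum processor (nobody knows the real figure,
and with error correction it's likely far lower):

  effective_trials = 2**(128 // 2)   # Grover cuts 128 bits to ~64 bits of work
  iterations_per_second = 1e6        # assumed, and probably optimistic
  seconds_per_year = 3.15e7

  device_years = effective_trials / iterations_per_second / seconds_per_year
  print(device_years)                # ~6e5 quantum-processor-years per key

That lands in the same ballpark as the millions of CPU-years above, and is
almost certainly an underestimate of the real cost.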
The best side-channel attacks I know of are against the CPU (SSD is the next
target, followed by magnetic disk and network connections); such an attack
requires, on average, 3 weeks of time using extremely specialized equipment
and highly skilled engineers. The process involves tens of thousands of
iterations and only works against relatively short keys (such as a PIN).
The best side-channel attacks I know of against actively defended systems
(systems that self-destruct in some way when an attack is detected, such as by
deleting all data after some number of failed logins) involve painstaking work
with an AFM (atomic force microscope) and some extremely careful interaction
with the active system
cooled to extremely low temperatures. These are also limited to relatively
short keys (such as a PIN) and take considerable time.
On 08/02/2016 02:08 PM, Mike Bushroe wrote:
> This is scary. I would hope that as caselaw becomes more extensive and
> complete they split this into two parts. I have no qualms about allowing
> police to compel finger prints of any degree of fidelity. It is already
> standard practice to photograph and finger print every arrested person, so
> this is little change from decades worth of standard practice. However, I
> think they should split this when it comes to the step of fabricating a
> fake finger (or compelling the defendant to use their own finger) to unlock
> any secure data store (phone, computer, network file server(that is a scary
> hole in the security system itself!), safe, etc. If they have gone to the
> effort of securing access with a finger print then the information inside
> is clearly not 'public domain'. It should be as secure as their own
> testimony. It should be illegal for them to 'force' entry with a fake
> finger, and they should not be able to use any data inside, nor any further
> leads developed from data learned by that process. However, I am not
> confident that the world will be that reasonable.
>
> A second weakness in my argument is that on TV, they routinely hack into
> someone's encrypted files and disks. If they can actually use anything that
> they can hack out of your computer, then faking a finger print to gain
> access is just a partially physical method of doing the same encryption
> hacking.
>
> If this turns out to be the case, or becomes the case, then there would be
> no safe way to store *any* information digitally. And since having a
> passphrase to unlock a large encryption key is no more secure than the
> passphrase and encryption key storage program, I begin to wonder about
> encrypting entire disks with a 2048 bit key that IS the passphrase! Now we
> just need to learn how to do passphrases with 2048 bits of significant
> data. ugh! Upper and lowercase letters, digits, punctuation only give about
> 6 1/2 bits per character. That would need a 315 character pass phrase to
> remember and type in each time to get maximum security. And don't even
> *think* about writing it down somewhere! :)
>
> Mike
>
>
>> "The Smartphone versus the Fifth Amendment," Berkeley Technology Law
>> Journal, 21 Dec 2014[3]
>>
>>> in the aftermath of Virginia v. Baust, many smartphone users may soon
>>> reconsider their reliance on fingerprint ID technology.
>>>
>>
>> In October [2014], a Virginia trial judge ruled [in Virginia v. Baust]
>>> that unlike a passcode, the production of one's fingerprint is not
>>> "testimonial communication", and therefore, the Fifth Amendment privilege
>>> against self-incrimination cannot be invoked. Rather, the government may
>>> properly compel the production of a smartphone user's fingerprint to unlock
>>> the user's device. This force compulsion would ostensibly extend to any
>>> applications within a device that can be opened via fingerprint.
>>>
>>
>> However,
>>
>> As a trial court, the ruling in Virginia v. Baust is not mandatory law.
>>> However, as with any early caselaw in a novel and undeveloped area of the
>>> law, this opinion will likely be cited as a persuasive authority.
