Imagine you’re at a border crossing, and the guard asks you to hand over all of your electronics for screening. The guard then asks that you unlock your devices and provide passwords and decryption keys. Right now, he’s asking nicely, but he happens to be carrying an unpleasant-looking rubber hose.
This isn’t a hypothetical situation. The Freedom of the Press Foundation published an open letter to camera manufacturers requesting that they provide “encryption” by default. The thing is, what they want isn’t just encryption; it’s deniability, which is a subtly different thing.
I’m convinced that there’s a sociotechnical blind spot in how current technology handles access to personal devices. We, in the infosec community, need to start focusing more on allowing users the flexibility to handle situations of duress rather than just access control. Deniability and duress codes can go a long way in helping us get there.
Recent legal developments have highlighted the need for deniability and duress codes in particular. Most notably, a recent precedent-setting court case in Minnesota held that police can compel a suspect to unlock a phone with a fingerprint.
Orin Kerr has a great in-depth analysis of this decision here, but the gist is that the courts have decided that fingerprints don’t count as “testimonial” evidence, and therefore aren’t protected under the Fifth Amendment.
There’s an interesting wrinkle to the case in that the defendant willingly told the police which finger would have unlocked the phone. Admittedly, the court could simply demand that the defendant provide all of his fingerprints and try each of them in turn. Taken to an extreme, this is not too different from arguing that the police have a right to try to crack a password on a device they’ve obtained legally; it just happens that the characters of the password are physical objects.
The good news is that other decisions have held that passwords are constitutionally protected. In the esoterically-named “In re Grand Jury Subpoena Duces Tecum,” the court ruled that compelling a defendant to decrypt a hard drive would be testimonial, and therefore protected under the Fifth Amendment.
However, the bad news is that hand-typed passwords are increasingly seen as a thing of the past; hardware tokens and biometric sensors are considered far more usable, and will likely be employed more and more in the future.
As mentioned earlier, a key observation from these court cases is that the police can compel you to hand over a fingerprint, but cannot order you to tell them which finger unlocks the device, since that would be tantamount to ordering you to provide a passcode.
In the short term, Apple and Google can take steps to alleviate this threat by adding duress codes into their access control mechanisms. For instance, scanning anything but your right index finger might force a password-only lock. Scanning a pinky (or some other fingerprint or combination of fingerprints) might cause the phone to factory reset, or unlock and trigger deletion of a specified portion of user data. Adding this functionality might take a few weeks of coding and months of UX research, but it could easily render the issue moot.
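To make the idea concrete, here is a minimal sketch in Python of how a per-finger duress policy might be dispatched. This is purely illustrative and is not Apple’s or Google’s actual API; every name in it (ENROLLED_POLICY, handle_fingerprint, and the action labels) is a hypothetical placeholder.

```python
from enum import Enum, auto


class DuressAction(Enum):
    UNLOCK = auto()          # normal unlock
    PASSWORD_ONLY = auto()   # fall back to a hand-typed password
    WIPE_SENSITIVE = auto()  # unlock, but quietly delete a chosen subset of data
    FACTORY_RESET = auto()   # wipe the whole device

# The user enrolls each finger with a policy of their choosing.
ENROLLED_POLICY = {
    "right_index": DuressAction.UNLOCK,
    "right_middle": DuressAction.PASSWORD_ONLY,
    "left_pinky": DuressAction.WIPE_SENSITIVE,
}


def handle_fingerprint(finger_id: str) -> DuressAction:
    """Map a matched fingerprint to the action the user pre-configured.

    Any finger that isn't enrolled (or any failed match) degrades to a
    password-only lock rather than revealing which fingers are "special".
    """
    return ENROLLED_POLICY.get(finger_id, DuressAction.PASSWORD_ONLY)


if __name__ == "__main__":
    # Under coercion, the user offers a pinky; the device deletes the
    # sensitive data instead of unlocking normally.
    print(handle_fingerprint("left_pinky"))  # DuressAction.WIPE_SENSITIVE
```

The important design point is that the duress path should be indistinguishable, from the coercer’s perspective, from an ordinary failed or successful unlock.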
In the long term, we need to rethink deniability as a set of strategies for helping users evade coercion in general. Just as important, all devices must have some sort of deniability baked in, full stop. Adding deniable systems to a device only after its owner has been targeted provides little protection to at-risk populations like journalists. If deniability isn’t baked into the operating system, the mere presence of out-of-the-ordinary software (which may or may not have undeniable tells of its own) would likely be a red flag and invite liberal use of the rubber hose.
– Mike Specter, PhD candidate in computer science at MIT, with thanks to Danny Weitzner (principal research scientist), Jonathan Frankle (also a PhD candidate at MIT), and the rest of the Internet Policy Research Initiative