A public statement from the US military’s Southern Command:
We do not force-feed observant Muslims during daylight hours during Ramadan.
That creaking you hear must be the sound of the moral arc of the universe bending toward justice…
Back when I was a graduate student, in the late 1980s and early ’90s, there was a lot of discussion among those interested in cryptography and computing (as I was, if only peripherally) about the status of cryptographic algorithms as “weapons” subject to export controls. The idea seemed bizarre to those of us who thought of algorithms as things you prove theorems about, and of computer code as something you write. It seemed as absurd as declaring a book to be a weapon. Sure, you might metaphorically call Das Kapital a weapon, or the Declaration of Independence, but neither is really a weapon, and a country would be much more likely to think about banning the import of a book than its export. The author of PGP, Phil Zimmermann, was then being threatened with prosecution, and had the code published as a book to make the analogy more explicit.
So I used to defend free access to cryptography because I thought it was ridiculous to consider codes to be weapons. I now think that was naïve. But if codes are weapons, does that itself provide a justification for a right of free access (in the US)? Maybe it isn’t freedom of speech or of the press (the 1st Amendment), but if cryptography is a weapon, are the use and manufacture of cryptographic algorithms and software protected in the US by the 2nd Amendment? Certainly the main arguments made for a right to firearms (sport, self-defence, and a bulwark against tyranny) all apply to cryptography as well. Are there current US laws or government practices restricting the people’s free access to cryptography that would be called into question if cryptography counted as “arms” in the sense of the 2nd Amendment?
This is connected to a question I have wondered about occasionally: why didn’t strong cryptography happen? Back then, I (and many others) assumed that essentially unbreakable cryptography would become easy and default, causing trouble for snoops and law enforcement. But in fact, most of our data and communications are still quite insecure. Is this because of legal constraints, general lack of interest, or something else? The software is available, but it’s sufficiently inconvenient that most people don’t use it. And while it wouldn’t actually be difficult to encrypt all my email (say) with PGP, I’d feel awkward asking people to do it, since no one else is doing it.
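To make the “inconvenient but not difficult” point concrete, here is a minimal sketch of what PGP-encrypting a message looks like in practice, using the third-party python-gnupg wrapper around an installed copy of GnuPG. The recipient address is a hypothetical example, and the sketch assumes the recipient’s public key has already been imported into the local keyring.

```python
# A minimal sketch of PGP encryption via GnuPG, using the third-party
# python-gnupg wrapper (pip install python-gnupg). Assumes GnuPG is
# installed and the recipient's public key is already in the keyring.
# The recipient address below is a hypothetical example.
import gnupg

gpg = gnupg.GPG()  # use the default GnuPG home directory and keyring

message = "Meet me at noon."

# Encrypt to the recipient's public key; only their private key can decrypt.
encrypted = gpg.encrypt(message, "alice@example.org")

if encrypted.ok:
    print(str(encrypted))  # ASCII-armored ciphertext, safe to paste into email
else:
    print("encryption failed:", encrypted.status)
```

The awkwardness, of course, is mostly on the other end: the recipient needs a key pair of their own and has to decrypt by hand, which is exactly the small persistent friction that keeps adoption low.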
It seems as though the philosophy of the Clipper chip (the government’s proposal to offer convenient encryption with the keys held in escrow by the government) has prevailed: some people really need some sort of cryptography for legitimate purposes, and if you make a barely adequate tool for the purpose conveniently available, you’ll discourage people from making the small extra effort to obtain really strong cryptography.