Computer culture and gun culture, ctd.

Since I’ve been interested in the history and political significance of cryptography (I discussed the connection between computers and the 2nd amendment here), I read the book This Machine Kills Secrets by journalist Andy Greenberg, a fascinating, if somewhat brief and barely technical, history of underground cryptography in the internet age. Among other things, I learned that, whereas I had thought of gun culture and computer culture as analogous but non-intersecting, there was in fact considerable overlap:

One adjunct group, called the Cypherpunks Shooting Club, even organized trips to rifle ranges to teach each other to shoot .22s and semiautomatic weapons, the final resort should the government ever come after their electronic and physical freedoms. (Tim May, an avid gun enthusiast himself, didn’t attend. “I don’t give free lessons, especially not to clueless software people,” he says.)

In the mid-1990s, Jim Bell, a cypherpunk insider, proposed “Assassination Politics”, essentially a scheme for combining strong cryptography with a sort of stock market for murder contracts. The goal was anarchy:

If only one person in a thousand was willing to pay $1 to see some government slimeball dead, that would be, in effect, a $250,000 bounty on his head[…] Chances are good that nobody above the level of county commissioner would even risk staying in office.

Just how would this change politics in America? It would take far less time to answer, “What would remain the same?” No longer would we be electing people who will turn around and tax us to death, regulate us to death, or for that matter send hired thugs to kill us when we oppose their wishes.
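Bell’s bounty arithmetic is easy to check. A minimal sketch, assuming a US population of roughly 250 million (the mid-1990s figure his numbers imply; the population figure is not stated in the quote itself):

```python
# Back-of-the-envelope check of Bell's bounty arithmetic.
# Assumption (not in the original text): a US population of
# roughly 250 million, the mid-1990s figure his numbers imply.
population = 250_000_000
fraction_willing = 1 / 1000   # "one person in a thousand"
pledge = 1                    # dollars per contributor

bounty = population * fraction_willing * pledge
print(f"${bounty:,.0f}")  # → $250,000
```

The scheme’s leverage comes entirely from aggregation: each contribution is trivially small, but the pool scales linearly with population.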

This all sounds like the sort of rant you hear these days from the extreme gun nuts. So maybe the analogy is not that far-fetched.

And, come to think of it, now that concrete schemes are afoot to turn weapons manufacture into a software problem via 3D printing, even the technical differences between guns and codes are dissipating.

The right to bear codes

Back when I was a graduate student, in the late 1980s and early 90s, there was a lot of discussion, among those interested in cryptography and computing (which I was, only peripherally), of the status of cryptographic algorithms as “weapons”, subject to export controls. The idea seemed bizarre to those of us who thought of algorithms as things you prove theorems about, and of computer code as something you write. It seemed as absurd as declaring a book to be a weapon. Sure, you might metaphorically call Das Kapital a weapon, or the Declaration of Independence, but it’s not really a weapon, and a country is much more likely to think about banning imports of books than banning exports. The author of PGP was then being threatened with prosecution, and had the code published as a book to make the analogy more explicit.

So, I used to defend free access to cryptography because I thought it was ridiculous to consider codes to be weapons. I now think that was naïve. But if codes are weapons, does that provide a justification for a right of free access (in the US)? Maybe it’s not freedom of speech or the press — 1st amendment — but if cryptography is a weapon, is the use and manufacture of cryptographic algorithms and software protected in the US by the 2nd amendment? Certainly the main arguments made for a right to firearms — sport, self-defence, and bulwark against tyranny — are all applicable to cryptography as well. Are there current US laws or government practices that restrict the people’s free access to cryptography that would be called into question if cryptography were “arms” in the sense of the 2nd amendment?

This is connected to a question I have wondered about occasionally: Why didn’t strong cryptography happen? That is, back then I (and many others) assumed that essentially unbreakable cryptography would become easy and default, causing trouble for snoops and law enforcement. But in fact, most of our data and communications are still pretty insecure. Is this because of legal constraints, or general disinterest, or something else? The software is available, but it’s sufficiently inconvenient that most people don’t use it. And while it wouldn’t actually be difficult to encrypt all my email (say) with PGP, I’d feel awkward asking people to do it, since no one else is doing it.
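As an aside on what “essentially unbreakable” means here: the one-time pad (not PGP, which uses public-key cryptography, but the standard textbook example) is provably unbreakable when the key is truly random, as long as the message, and never reused. A minimal sketch in Python’s standard library, purely for illustration:

```python
import secrets

def otp_encrypt(message: bytes) -> tuple[bytes, bytes]:
    """One-time pad: XOR the message with a random key of equal length.

    With a truly random, never-reused key of the same length as the
    message, the ciphertext is information-theoretically secure --
    the textbook sense in which encryption can be "unbreakable".
    """
    key = secrets.token_bytes(len(message))
    ciphertext = bytes(m ^ k for m, k in zip(message, key))
    return ciphertext, key

def otp_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so decryption is the same operation.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

ct, key = otp_encrypt(b"meet at the usual place")
assert otp_decrypt(ct, key) == b"meet at the usual place"
```

The catch, of course, is distributing keys as long as the messages themselves, which is exactly the problem public-key systems like PGP were invented to solve. The difficulty was never the mathematics; it was, and remains, the logistics and the asking.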

It seems as though the philosophy of the Clipper chip has prevailed: some people really need some sort of cryptography for legitimate purposes, and if you make a barely adequate tool for the purpose conveniently available, you’ll keep people from making the small extra effort to obtain really strong cryptography.
