Laws of the Internet, continued

It seems to be the day for oracular pronouncements about the Net. An engineer I work with told me about an intermittent network connectivity problem he had experienced yesterday. Sometimes he could get on the network and sometimes he couldn’t. The cause? A bad network cable! He said, “Normally with a network problem like this it’s either on or off, not somewhere in the middle.”

I responded without thinking, “Yeah, every now and then we need to be reminded that we live in a very shallow digital layer on an analog world.”

That just might be my first law of the Internet.

Spafford’s axioms of Usenet, generalized

In looking for a source for the “https = armored truck between two cardboard boxes” analogy referenced in my previous post, I came across a list of other famous analogies by its author, Gene “Spaf” Spafford. Many of the ones cited need some context, but #7, which I reproduce below in its entirety, is completely understandable to any Internet veteran of a certain age:

Usenet is like a herd of performing elephants with diarrhea: massive, difficult to redirect, awe-inspiring, entertaining, and a source of mind-boggling amounts of excrement when you least expect it.

The comment, posted prior to Spafford’s withdrawal from recreational Usenet use, sits alongside his three axioms of Usenet (Usenet is not the real world, and usually does not resemble it; ability to type on a computer keyboard is no guarantee of sanity, intelligence, or common sense; and Sturgeon’s Law applies to Usenet). I think the quote above, and Spafford’s axioms, deserve elevation to broader consideration. They are certainly directly applicable to blogs, MySpace, Facebook, and just about every other online expression of individuality. They may apply to Wikipedia too, and certainly do if you take into account the deletions and random vandalism all too visible from the Recent Changes page. They may even generally apply to humanity itself, as formulated below:

  1. Humanity is not (all of) the real world, and human models of the real world usually do not resemble it.
  2. Humanity is no guarantee of sanity, intelligence, or common sense.
  3. Sturgeon’s Law applies to humanity.
  4. Humanity is like a herd of performing elephants with diarrhea: massive, difficult to redirect, awe-inspiring, entertaining, and a source of mind-boggling amounts of excrement when you least expect it.

To which I can only say: True. True.

Ripples from SOURCE: Boston: how much security is optimal?

I wasn’t able to attend this week’s SOURCE: Boston conference, which my company is cosponsoring, but reading about some of the talks and looking at some of the papers that are coming out of it has been fascinating. A few points:

If you think protecting digital systems is hard, what about analog systems like the telephone?

The number of potential points of compromise is staggering… Once the X-rays of telephone equipment and close-ups of modified circuit boards came out (notice that there’s supposed to be a diode there, but someone replaced it with a capacitor…) we were headed into real spy vs. spy territory. Tracking down covert channels requires identifying, mapping, and physically and electronically testing every conductor out of an area. Even the conduit and grounds can be used to carry signal, and they have to be checked.

We don’t normally think about telephone security as an issue (although given the shenanigans that the FBI has been up to, with retroactive blanket wiretapping warrants, almost 200,000 National Security Letters authorizing warrantless wiretapping in the four-year period from 2003 to 2006, and collection of data that they are specifically disallowed from collecting, maybe we should). Why? Because there’s an implicit cost-benefit calculation at play: given the size of the attack surface, or the vulnerable parts of the infrastructure, the cost of absolute security is staggering.

But very few people bother to follow that thought to the logical conclusion, which is that the optimum number of security violations is greater than zero. I’m not recommending hacks, mind, but if you use a cost-benefit approach to analyze security spending, you are constantly trading the cost of protection vs. the cost of attacks. If you spend so much on security that there are no breaches, you have spent more than warranted by the cost of the attack. Dan Geer makes this argument neatly, in graphical form, in the opening of his article “The Evolution of Security” in ACM Queue. The whole article is worth digesting and mulling over. He points out that as our networked world gets more complex, we start to replicate design patterns found in nature (central nervous systems, primitive immune systems, hive behavior) and perhaps we ought to look to those natural models to understand how to create more effective security responses.
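The cost-benefit argument can be made concrete with a little arithmetic. The sketch below is purely illustrative (the numbers and the exponential relationship between spending and breach rate are my own assumptions, not anything from Geer’s article): total cost is protection spend plus expected breach losses, and the minimum lands at a spend level where the expected number of breaches is still greater than zero.

```python
import math

def total_cost(spend, breach_cost=100.0, base_rate=1.0, decay=0.05):
    """Total cost = protection spend + expected breach losses.

    Assumes (illustratively) that expected breaches fall off
    exponentially as security spending rises.
    """
    expected_breaches = base_rate * math.exp(-decay * spend)
    return spend + breach_cost * expected_breaches

# Brute-force the spend level that minimizes total cost.
best_spend = min(range(0, 201), key=total_cost)

# At the optimum, the expected breach count is still positive:
# spending enough to drive it to zero would cost more than the
# breaches themselves.
residual_breaches = math.exp(-0.05 * best_spend)
```

With these made-up numbers the minimum sits around a spend of 32, where roughly 0.2 breaches are still expected; spending nothing (total cost 100) and spending everything are both worse than tolerating that residual risk.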

Getting back to SOURCE: Boston, Geer’s keynote there amplified some of his points from the Evolution paper and addressed other uncomfortable thoughts. Such as:

  1. The model for security used to be “I’m OK, you’re OK, the network is compromised,” which leads to the widespread use of encryption. But SSL and other network encryption technology has been famously likened (by Gene Spafford) to hiring an armored car to deliver rolls of pennies from someone living in a cardboard box to someone living on a park bench. Meaning: in a world of malware and botnets, maybe the model ought to be: “I’m OK, I think, but you’re not.”
  2. Epidemiologically, as malware and botnets become more prevalent, they will become less virulent. One of the L0pht team has said (as cited by Geer) that computers might be better off in botnets than in the wild, because the botmaster will want to keep them from being infected by other malware. (This is the gang membership theory of the inner city writ large.) Geer likens this to the evolution of beneficial parasites and symbiotes.
  3. So if botnets are here to stay and we need to assume everyone is compromised, why shouldn’t bots become a part of doing business? Why shouldn’t ETrade 0wnz0r my computer when I make a trade, if only to ensure that no one else can listen in? Suddenly the Sony BMG rootkit begins to make more sense, in a sick sort of way.

Geer closed his talk by bringing back the question of how much security we want. If the cost of absolute security is absolute surveillance, of having one’s computer routinely 0wnz0r3d by one’s chosen e-commerce sites, then perhaps we need to be prepared to tolerate a little insecurity. Maybe the digital equivalent of telephone equipment boxes “secured” with a single hex bolt makes sense after all.