Ripples from SOURCE: Boston: how much security is optimal?

I wasn’t able to attend this week’s SOURCE: Boston conference, which my company is cosponsoring, but reading about some of the talks and looking at some of the papers that are coming out of it has been fascinating. A few points:

If you think protecting digital systems is hard, what about analog systems like the telephone?

The number of potential points of compromise is staggering… Once the X-rays of telephone equipment and close-ups of modified circuit boards came out (notice that there’s supposed to be a diode there, but someone replaced it with a capacitor…) we were headed into real spy vs. spy territory. Tracking down covert channels requires identifying, mapping, and physically and electronically testing every conductor out of an area. Even the conduit and grounds can be used to carry signal, and they have to be checked.

We don’t normally think about telephone security as an issue (although given the shenanigans that the FBI has been up to, with retroactive blanket wiretapping warrants, almost 200,000 National Security Letters authorizing warrantless wiretapping in a four-year period from 2003 to 2006, and collection of data that they are specifically disallowed from collecting, maybe we should). Why? Because there’s an implicit cost-benefit calculation at play: given the size of the attack surface, the vulnerable parts of the infrastructure, the cost of absolute security is staggering.

But very few people bother to follow that thought to its logical conclusion, which is that the optimum number of security violations is greater than zero. I’m not recommending hacks, mind, but if you use a cost-benefit approach to analyze security spending, you are constantly trading the cost of protection against the cost of attacks. If you spend so much on security that there are no breaches at all, you have spent more than the attacks themselves would have cost. Dan Geer makes this argument neatly, in graphical form, in the opening of his article “The Evolution of Security” in ACM Queue. The whole article is worth digesting and mulling over. He points out that as our networked world gets more complex, we start to replicate design patterns found in nature (central nervous systems, primitive immune systems, hive behavior), and perhaps we ought to look to those natural models to understand how to create more effective security responses.
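To make the tradeoff concrete, here’s a toy sketch in Python. The numbers and the exponential loss curve are entirely my own invention for illustration, not Geer’s figures; the only point is the shape of the argument: total cost bottoms out at a spending level where the expected loss from breaches is still well above zero.

```python
import math

# Toy sketch of the cost-benefit argument (invented numbers, not Geer's curves):
# total cost = what you spend on defenses + the expected loss from the breaches
# those defenses fail to prevent.

def expected_loss(spend, base_loss=1_000_000, efficacy=1e-5):
    """Expected annual breach loss, assumed (for illustration) to decay
    exponentially as protection spending rises."""
    return base_loss * math.exp(-efficacy * spend)

def total_cost(spend):
    """The defenses plus the breaches they fail to stop."""
    return spend + expected_loss(spend)

# Sweep candidate security budgets and pick the cheapest total.
best = min(range(0, 2_000_001, 1_000), key=total_cost)
print(f"optimal spend ~ ${best:,}; residual expected loss ~ ${expected_loss(best):,.0f}")
# At the optimum the residual loss is well above zero: buying your way all the
# way down to zero breaches would cost more than the breaches themselves.
```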

Getting back to SOURCE: Boston, Geer’s keynote there amplified some of his points from the Evolution paper and raised other uncomfortable thoughts. Such as:

  1. The model for security used to be “I’m OK, you’re OK, the network is compromised,” which leads to the widespread use of encryption. But SSL and other network encryption technology has been famously likened (by Gene Spafford) to hiring an armored car to deliver rolls of pennies from someone living in a cardboard box to someone living on a park bench. Meaning: in a world of malware and botnets, maybe the model ought to be: “I’m OK, I think, but you’re not.”
  2. Epidemiologically, as malware and botnets become more prevalent, they will become less virulent. One of the L0pht team has said (as cited by Geer) that computers might be better off in botnets than in the wild, because the botmaster will want to keep them from being infected by other malware. (This is the gang membership theory of the inner city writ large.) Geer likens this to the evolution of beneficial parasites and symbiotes.
  3. So if botnets are here to stay and we need to assume everyone is compromised, why shouldn’t bots become a part of doing business? Why shouldn’t ETrade 0wnz0r my computer when I make a trade, if only to ensure that no one else can listen in? Suddenly the Sony BMG rootkit begins to make more sense, in a sick sort of way.

Geer closed his talk by bringing back the question of how much security we want. If the cost of absolute security is absolute surveillance, of having one’s computer routinely 0wnz0r3d by one’s chosen e-commerce sites, then perhaps we need to be prepared to tolerate a little insecurity. Maybe the digital equivalent of telephone equipment boxes “secured” with a single hex bolt makes sense after all.