Two views of cybersecurity cost and return

Two different reports came out in the last 24 hours about the costs and investments required for cybersecurity. The first, a paper from the RAND Corporation’s Sasha Romanosky, claims that breaches on average have only a modest financial impact on organizations—but also notes that the real costs are mostly not borne directly by the corporation:

while the potential for greater harm and losses appears to be increasing in time, evidence suggests that the actual financial impact to firms is considerably lower than expected. And so, if consumers are indeed mostly satisfied with firm responses from data breaches, and the costs from these events are relatively small, then firms may indeed lack a strong incentive to increase their investment in data security and privacy protection. If so, then voluntary adoption of the NIST Cybersecurity framework may prove very difficult and require additional motivation.

Bruce Schneier interprets this as meaning that there is a market failure requiring government intervention. That’s certainly one way to view it.

Another perspective: it’s a good idea to lower the cost of defending against breaches. That’s what’s suggested by the second piece, a study funded by my employer, Veracode, and conducted by Wakefield Research, titled “Bug Bounty Programs Are Not a Quick-Fix.” The research found that 83% of respondents had released software without testing for or fixing software vulnerabilities; that 36% use bug bounty programs; and that 93% believe most flaws found in bug bounty programs could have been found and fixed by developer training or by testing in the development phase, which 59% believe would be more cost-effective.

From Dakar with love

Doom and Gloom from the Tomb: Duke Ellington Orchestra – Festival Mondial des Arts Nègres, Théâtre National Daniel Sorano, Dakar, Senegal, April 9, 1966. I’m so ambivalent about this. I mean, on the one hand, yes, every bootleg or live broadcast recording of a long-dead jazz artist makes it that much harder for live, working jazz artists to sell albums and earn coin. On the other: DUKE ELLINGTON. WITH PAUL GONSALVES, HARRY CARNEY, and JOHNNY FREAKIN’ HODGES. LIVE IN DAKAR.

iOS 10 Music App: second take

I’ve been living with iOS 10 for about a week now, or long enough to have gotten up the learning curve imposed by some of the UI changes. (This is starting to be my general rule of thumb: any UI change, even a change for the better, can be jarring and disruptive the first time you encounter it, and the benefits take a while to perceive.) The first week I tweeted a series of questions about the new Music app, most of which I’ve since managed to resolve. But there’s one very important question left unanswered, about how iOS 10 Music handles smart playlists synced from iTunes.

Relocation of Shuffle/Repeat controls: Now that I’m used to the change, I actually like Apple’s relocation of the Shuffle and Repeat controls to the newly created “swipe up” pane, which also displays the “Up Next” queue. Placing these controls, which are used infrequently during a normal playback session, where they can’t be hit accidentally counts as a UX improvement in my book.

Relocated lyrics: Given the rights issues around song lyrics, I was always a little surprised that Apple provided a way not only to add them to your own tracks but also to view them in iOS. When I first experimented with iOS 10 Music, I thought this had been removed. Good news: they’re still there, just moved to a new option on the … menu (or on the swipe-up pane). This is somewhat less elegant than the relocation of the Shuffle and Repeat controls, because the Lyrics option appears only if the file actually has lyrics; I had to search through a bunch of songs before I found one where the button showed up, so I could verify that the feature still works.

Playlists syncing as empty: I have a few smart playlists that appear to sync but arrive empty on the iPhone. Fortunately there’s a workaround: plug in the phone, uncheck the affected playlists, sync, then re-check the playlists and sync again.

Disappearance of star ratings: I’m less OK with this change. iOS 9 introduced “love” as a rating option alongside star ratings. I didn’t use it because “love” isn’t granular enough when you’re managing a library of 40,000 tracks. There’s a big difference between “desert island disc” and “yeah, that track’s OK and I might put it on the right mix tape.” But it looks like star ratings are disappearing, even though they’re still in iTunes (and accessible via Siri). Not cool.


Making a hash of Brahms

We’re in the middle of a rehearsal run for the BSO’s upcoming performances of the Brahms Requiem with Andris Nelsons, Thomas Hampson, and Camilla Tilling. On the one hand, it’s a work we’ve performed quite a bit in the eleven years I’ve been in the Tanglewood Festival Chorus, starting with our 2008 performances under James Levine (later released on CD), then at Tanglewood the following summer, then a few years later with Christoph von Dohnányi, and then again just two years ago with Bramwell Tovey. One would think it would be old hat by now.

But there is no such thing as a routine performance of this work. The emotional load alone is enough to make it an incredible experience each time, and the technical aspects of singing the work (as I’ve written previously) both demand and reward close preparation and work.

This time is especially interesting, as we are in the midst of what will hopefully be the second and final transitional season between the forty-plus-year reign of founding TFC conductor John Oliver and the selection of his successor. We are working this go-round (as we did for the Adams Transmigration) with Lidiya Yankovskaya, who has herself been a member of the TFC and has worked closely with John.

This time she’s working closely with us on diction (of course), but also on the production of a rich, supported piano/pianissimo sound and on overall blend. Her tool for working on blend is a simple one: the 130 or so of us have been sitting “hashed” for the last several rehearsals. Each individual sits near someone singing one of the other voice parts. There are others on your voice part nearby, but not right next to you. The effect is immediate: you have to listen harder to hear the others on your part; you immediately find the places where you need to own and improve your individual performance; and you quickly learn to adjust so that your performance complements the other vocal parts next to you. We sounded better in places last night than we have in quite a while.

Apparently John’s chorus used to perform like this all the time; I can only imagine a conductor of Seiji Ozawa’s musicianship directing such an arrangement. I wish we could do it more often.

Unexpected side benefits of the Apple Watch


I’ve got a new health benefit from the Apple Watch to tout: it deflects canine teeth.

Our old neighbor hosted a garden party on Saturday, and one of the guests brought their dog, a rescue who is devoted to the family. Unfortunately, it wasn’t enjoying the chaos (eight kids, almost all under the age of 10) and was a bit jumpy. I thought it was trying to sniff me, so I put a hand out, and next thing you know it was trying to bite me. Not a play bite either.

I retrieved my now-tattered sleeve from its jaws, talked to the owner to make sure the dog had its vaccinations, and cleaned up. When I looked at my arm, I realized I had gotten lucky: the Milanese Loop band of my watch had slightly deformed under the pressure of the dog’s teeth. If not for the watch, I would have been far worse off than the slight scrape I got.

Apple Watch: like chainmail for your wrist.

On the airing of security grievances

I had a great day yesterday at DevOpsDays NYC. I gave a talk, but I also learned a lot from the other speakers and from the conversations. The format of DevOpsDays is half traditional conference with speakers, half “unconference” with open proposals of discussion topics and voting to establish which topics go where. They call it Open Space, and it’s a very effective way to let attendees explore the conversations they really want to have.

I proposed an Open Space topic on the “airing of grievances” around information security. What emerged was really interesting.

Attendees talked about companies that confused compliance with security, with disastrous results (hint: just because your auditor counts you as compliant because you have a WAF with rules doesn’t mean those rules are actually protecting you from attack).

We talked about advances in declarative security, in which you specify a policy for which ports should be open and closed via tools like InSpec.

We talked about the pains of trying to integrate legacy appsec tools into continuous integration pipelines (which happened to be the subject of my talk). I heard about people trying to integrate on-premises static analysis tools into their Jenkins toolchains, where the application under test caused the scanner to exhaust all the memory on the machine. About on-premises dynamic scanners that run for eight hours. About the challenges of determining whether an attack has successfully made it past a web application firewall.

And then Ben Zvan said (and I paraphrase), “We have a man-in-the-middle firewall (proxy) between our desktop network and the Internet that screws with security certificates, so I can’t use services that rely on certs for secure communication.”

And the floodgates opened. I talked about the secure mail gateway, intended to prevent phishing, that pre-fetches links in emails and thereby breaks one-time-use links intended for secure signup to new services. We talked about endpoint protection tools that can’t keep up with the macOS update schedule, forcing the user to choose between taking the OS update and breaking the endpoint protection tool, or skipping the update and remaining exposed to a dangerous newly announced vulnerability.
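The link-prefetching failure mode is easy to see in miniature. Here’s a toy sketch in Python (the token store and names are hypothetical, not any vendor’s implementation) of a one-time signup link being consumed by the gateway’s “click” before the human ever gets there:

```python
# Toy model of one-time signup links (illustrative names only).
issued_tokens = {"abc123": "alice@example.com"}  # token -> pending signup

def visit_signup_link(token):
    """Consume a one-time signup token; any second use fails."""
    email = issued_tokens.pop(token, None)  # first visit removes the token
    if email is None:
        return "link expired or already used"
    return f"account activated for {email}"

# The mail gateway's prefetcher "clicks" the link first...
print(visit_signup_link("abc123"))  # account activated for alice@example.com
# ...so the real user's click, moments later, fails.
print(visit_signup_link("abc123"))  # link expired or already used
```

From the service’s point of view both requests look identical, which is exactly why the gateway’s well-intentioned prefetch silently locks the user out.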

The conclusion we reached is that it’s a dangerous irony that security tools actively stomp on security features, but it’s also the new reality. The complexity of the information security tool stack increases every year, with more and more vendors entering the space and CISOs forced to become system integrators, figuring out which tools conflict with which.

The lesson is clear: If security solution providers are serious about security, they need to build for reduced complexity.

Cocktail Friday: the false origin of the Martini

The Knickerbocker Hotel in Times Square, early 20th century and 2015

I can’t escape cocktails, and cocktail history. Even when I’m traveling for work, they find me. So it is that I find myself staying in a hotel in New York that was once one of the epicenters of pre-Prohibition cocktail culture.

The Knickerbocker Hotel was completed by John Jacob Astor IV, after a development project on land he owned failed. Opening in 1906, it was a destination for after-theatre dining, with decor by Maxfield Parrish (whose Old King Cole mural created for the hotel bar is now at the St. Regis in the King Cole Bar). The reputation of the hotel was largely built on its food and drink, and its social connections; Astor was a bon vivant who was fleeing negative press surrounding the pregnancy of his second (18-year-old) wife when he died in the sinking of the Titanic. (He is said to have remarked, “I asked for ice in my drink, but this is ridiculous.”)

The hotel bartender, one Martini di Arma di Taggia, is said to have created the Martini in 1911 by mixing dry gin and vermouth; the drink supposedly caught on when it was favored by John D. Rockefeller. Unfortunately for picturesque history, that tale is almost certainly false: John D. Rockefeller was a teetotaler, and the Martini existed well before 1911.

The most likely actual origin for the Martini is in the drink called the Martinez, supposedly invented either in Martinez, California or in San Francisco for a miner who had struck it lucky; it was first documented in 1887. By 1888, the drink first called the Martinez was already being called the Martini. Though the version in Harry Johnson’s New & Improved Illustrated Bartender’s Manual uses red vermouth rather than dry, and adds Boker’s bitters (a little like modern Angostura), gum syrup and an optional dash of curaçao or absinthe, it’s still gin and vermouth at its roots. The first version using dry gin that I’ve found is the 1909 Dry Martini (II) in Applegreen’s Bar Book—still two years prior to the Knickerbocker’s claim.

Whatever the truth of its connection to the Martini, the hotel today contributes to modern cocktail culture with the St. Cloud rooftop bar. I hope to gather impressions there sometime.

I don’t claim to have anything definitive on “how to make the best martini,” but if you want to try its precursor, here’s the Highball recipe card. Enjoy!


Travel day

Not much blogging energy today. I’ve been meeting with a lot of software developers, both in a strictly job-related context and in preparing for a conference I’ll be at the next few days.

I can’t emphasize enough how lovely it is to be at a dinner table where someone observes that we have different words for some animals as meat (pork, beef) than we do as animals (pig, cow), and asks why that is, and then someone who’s not me gives the answer: the cuisine words were brought by the Normans and come from what is now French, while the animal words come from the Anglo-Saxon. These are my people.

The myth of fingerprints

InfoWorld (Chris Wysopal): Election system hacks: we’re focused on the wrong things. Chris (who cofounded my company Veracode) says that we should stop worrying about attribution:

Most of the headlines about these stories were quick to blame the Russians by name, but few mentioned the “SQL injection” vulnerability. And that’s a problem. Training the spotlight on the “foreign actors” is misguided and, frankly, unproductive. There is a lot of talk about the IP addresses related to the hacks pointing to certain foreign entities. But there is no solid evidence to make this link—attribution is hard and an IP address is not enough to go on.

The story here should be that there was a simple to find and fix vulnerability in a state government election website. Rather than figuring out who’s accountable for the breach, we should be worrying about who is accountable for putting public data at risk. Ultimately, it doesn’t matter who hacked the system because that doesn’t make the vulnerabilities any harder to exploit or the system any safer. The headlines should question why taxpayer money went into building a vulnerable system that shouldn’t have been approved for release in the first place.
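The “simple to find and fix” characterization is fair. Here’s a minimal sketch in Python using `sqlite3`, with a hypothetical voter-lookup table (the table and column names are mine, not from any real election system), showing the vulnerable pattern and its one-line fix:

```python
import sqlite3

# Hypothetical voter-lookup table; illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE voters (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO voters (name) VALUES ('Alice'), ('Bob')")

def find_voter_vulnerable(name):
    # BAD: attacker-controlled input is spliced into the SQL string,
    # so name = "x' OR '1'='1" rewrites the query and returns every row.
    return conn.execute(
        "SELECT id, name FROM voters WHERE name = '%s'" % name
    ).fetchall()

def find_voter_safe(name):
    # GOOD: a parameterized query treats the input strictly as data.
    return conn.execute(
        "SELECT id, name FROM voters WHERE name = ?", (name,)
    ).fetchall()

payload = "x' OR '1'='1"
print(find_voter_vulnerable(payload))  # dumps the whole table
print(find_voter_safe(payload))        # returns nothing
```

The fix is literally swapping string interpolation for a bound parameter; this is the kind of flaw that basic developer training or a pre-release scan catches.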

I couldn’t agree more. In an otherwise mediocre webinar I delivered in June of 2015 on the OPM breach, I said the following:

After a breach there are a lot of questions the public, boards and other stakeholders ask. How did this happen? Could it have been prevented? What went wrong? And possibly the most focused on – who did this?

It is no surprise that there is such a strong focus on “who”. The media has sensationalized stories about Anonymous and their motives as well as the motives of cyber gangs both domestic and foreign. So, instead of asking the important questions of how can this be prevented, we focus on who the perpetrators may be and why they are stealing data.

It’s not so much about attribution (and retribution)…

…it’s about accepting that attacks can come at any time, from anywhere, and your responsibility is to be prepared to protect against them. If your whole game plan is about retribution rather than protecting records, you might as well just let everyone download the records for free.

So maybe we should stop worrying about which government is responsible for potential election hacking, and start hardening our systems against it. Now. After all, there’s no doubt about it: it’s the myth of fingerprints, but I’ve seen them all, and man, they’re all the same.

It’s not nice to fool Mother Apple

Daring Fireball: Dropbox’s MacOS Security Hack. Gruber rounds up a bunch of links on Dropbox’s bad security practices in its Mac client. Basically, as documented by Phil Stokes, Dropbox asks for your admin password, injects itself into the list of applications that can “control your computer” in the Security & Privacy preference pane, and reinjects itself if it’s removed from the list. Thankfully Apple has closed the loophole that allowed this to happen.

The conclusions I take from this:

  1. Dropbox really wanted to ensure that it could take some action that required Accessibility access
  2. Their product manager didn’t trust users to grant the right authorizations and didn’t want to give them the ability to remove the permissions
  3. Their engineering staff either didn’t push back or got rolled over
  4. Their security staff either wasn’t consulted or didn’t think that this was dangerous—surely no one would ever find a vulnerability in the Dropbox Mac Client and use it to run unauthorized code? Oh wait.

Their PMs respond: the Accessibility permissions were necessary to integrate with other third-party applications, and Apple’s APIs didn’t grant the right level of access.

As they say: Developing