When Things Go Wrong: Response and Recovery

Originally posted on the Truss blog

When building systems with threats in mind, it's not enough to just plan, and not enough to just raise the cost of a bad thing happening; we still have to know what we'll do when the bad thing happens despite our best efforts.

Truss modernizes government and scales industry through digital infrastructure. Information that is sensitive to individuals and to the welfare of the organization flows through the pipes we set up. Whether it's hospital records, the moving locations of military families, or financial data, Truss takes the best care possible in setting up infrastructure that mitigates the likelihood of a breach. We also have a plan for such a breach, in case it happens anyway.

At RightsCon, I moderated the panel The Rules of Cyberwarfare: Connecting a Tradition of Just War, Attribution, and Modern Cyberoffensives with Tarah Wheeler, Tom Cross, and Ari Schwartz. The question was this: if "cyber"* is a fifth arena of war (the existing domains being land, air, sea, and space), what is a just response to a cyberattack that follows the international expectation of de-escalation?

The panel and audience agreed that we don't condone the responses happening now: the assumption of "hack back" against other state adversaries, and the use of the CFAA against people who might otherwise entertain the thought of being patriotic hackers. The Computer Fraud and Abuse Act (CFAA) is the law that makes breaking those terms of service too long and dense to read fully a federal crime. That's right: logging into your partner's bank account after their death in order to pay the house's electrical bill is a federal crime, and it's the same law used in most of the "hacker" cases you read about in the news. It's also the number one reason the infosec professionals I know and love refuse to work with the government. The ACDC bill, which would allow "hacking back," carves an exception out of the CFAA so that you can attack a computer that's attacking you. Except, of course, it's not that clear-cut.

Which brings us to the questions many folks in the audience had. What about attribution (the ability to know who is taking the action)? This is hard in digital space because it's easy to attack something from behind someone else's IP address. What about asymmetry (an imbalance between those in conflict with one another)? Is it okay for one country to attack another in cyberspace when that other country is just beginning to get online? These are hard problems, but we can't wait until they are solved to have conversations about responses. If you're having a hard time moving on without those hard problems being "solved enough" first, you're not alone: the audience also had a deeply difficult time with it.

But what would be acceptable? If there were a breach of military moving data, do you think it would be responded to differently than the malicious changing of medical records? Do you think who the adversary is would matter? Does the immediate or the potential future impact on those involved matter more? Where is the line between war and espionage? We ended the panel with a comparison to disaster response, so attendees would have a framing to continue the discussion.

Disaster response also focuses on preparedness (stockpiling water for the next Bay Area earthquake), response (digging our neighbors out of the rubble), and mitigation (enforcing building codes which make collapse in an earthquake less likely). We are terrible at recovery. When it’s time to rebuild, the money, attention, and volunteers have dried up. Huge swathes of Far Rockaway (2012) and New Orleans (2005) are still a wreck from hurricanes.

The same is true for online attacks, whether doxxing (the nonconsensual revealing of personal information) or DDoSing (a distributed denial of service attack is when many computers all pester your computer for a response, not allowing it to say anything). We spend so much attention on battening down the password hatches and doing incident response that most of us don't think about what it looks like to be whole again after an attack that may happen anyway. And so much of infosec and government work is about trying to prevent the Bad Thing from ever happening. Plan A is to make a perfect system. But we must own up to Plan A rarely being the plan that works out. Don't your contracts also have release clauses in them? Planning for the worst case isn't inviting calamity; it's being pragmatic.

One of our engineers recently said, "I would rather throw away some work than have to be under a too-tight deadline later." This was said as Plan A seemed less and less likely due to bureaucracy and too many moving parts, while our government protective cover kept procrastinating on Plans B and C. Why? I see the cost of exploring options in government as high, with extremely limited resources to work with. That leads to all sorts of fragility, and resources wasted putting out fires when Plan A doesn't work exactly as planned. A balance of ownership, accountability, and flexibility would have helped alleviate this difficult situation. Additionally, setting aside resources for recovering from inevitable failure helps the entire system be more robust.

While Truss doesn’t specialize in the actual recovery (there are firms and insurance providers who do focus on response and recovery plans), we know it’s a necessary part of a complete plan, and you should, too. Good luck out there today, and remember to keep in mind what you’ll do if it doesn’t all work out.

* Note that I have a deep visceral reaction to the word “cyber,” but it’s the word that has been thoroughly adopted in this discipline and so it has been used here for the sake of readability. The confusing image for this post is the inoculation to having to use the word.

A Week of User Rights and the Tools to Support Them

What a crazy week. This has mostly been typed on a plane back to Boston, where I get to be for a few days before heading to DC and then home to visit. When I landed in SF a week ago, I had plans to see friends, to lounge a bit, to enjoy a city I love so much. Instead, and just as happily, it was a whirlwind of jam-packed braining and action around user rights and the methods and tools to support those rights.

Countersurveillance DiscoTechs

With the Codesign Studio I TA at the Media Lab, we put on a series of Discovery Technology (DiscoTech) workshops. The ones in Bangalore, Ramallah, Mexico City, Boston, and San Francisco were all inspiring. You can see more about the projects, art, and progress over on our hackpad. Some examples were stories from Venezuelan activists, face painting to deter facial recognition (so hard!), long-running surveillance of poor communities in America, and spoofing DNA.

And seriously. Take a few minutes to go through the partner pages for this. Need a bit of morning outrage? Think everything’s going pretty ok in the world? Nope!

UI/UX for Crypto Tools Hackathon

The second usability hackathon with OpenITP went incredibly well, and doubled as the San Francisco DiscoTech as well as being its own event.

It was utterly luxurious to work with a large team of people. Ciprian had logistics covered, solidly. Gus knew the projects inside and out. Bex had notes and tone in beautiful competence. Anytime I would think "oh, we should…" it was already in progress. All I had to do was herd the kittens, which was absurdly fun.

RightsCon

It was really nice to be completely surrounded by the people I usually see when we all jam into the one or two sessions at any tech or policy event that involves both. But that overlap was the whole conference, so we were able to dive in much deeper, see more nuance, and see next steps. I learned about funder motives, about the initiatives that backed tech in atrocity prevention/detection/accountability, and about the many, many tools used to amplify the voices of marginalized people. I drew a lot, and I hugged even more.

VizNotes

Full set on bl00viz.

Typed Notes!

I typed notes for two interactive sessions for the sake of formatting. One was a review of the UI/UX hackathon the weekend before; the other was stories from the field and suggestions for how to be better trainers. Those can be found over on the Civic blog.

Responsible Data Forum

Thursday I trekked out to Oakland to participate in Engine Room's Responsible Data Forum, as hosted by the inspirational Aspiration Tech. Again, I was spoiled by being surrounded by an impressively diverse set of people interested in the same fulcrum of concern and change. We skeletoned out plans for checklists before collecting data and for workflows that include project death, and illustrated how data moves through a company. We talked hosting, coercion-resistant design, and informed consent. We also talked about context-based privacy in disasters. I'm looking forward to seeing what comes of the day.