Forbidden Research liveblog: Against the law: countering lawful abuses of digital surveillance

With bunnie huang (author of Hacking the Xbox: An Introduction to Reverse Engineering) and Edward Snowden. Liveblog by Sam Klein, Erhardt Graeff, and me.

Introduction and overview from Snowden

This is my first time giving an academic talk, and I think it’s the first time a US exile is presenting research at a US academic institution. One of the great things about Cory’s talk is the point that we don’t talk enough about how laws are a weak guarantee of outcomes: theft, murder, etc. still happen.

I’m Edward Snowden, director of the Freedom of the Press Foundation. Some years ago I told the truth about a matter of public importance. Some years ago a warrant was issued for my arrest. I’m no longer allowed to travel freely. I’d like to thank MIT for organizing this conference and for the opportunity to speak to everyone in the room today. For the journalists in the audience, that’s not a small thing; MIT deserves credit for living up to that commitment to knowledge. No one is perfect, everyone makes mistakes, but that is quite a risk. This may be the first time an American exile has been able to present research at an American university. That’s [already] enough reason to have this talk at a forbidden research conference.

The guiding theme of many of the talks today is that law is no substitute for conscience. Our investigation covers lawful abuse. What is that? It seems it might be a contradiction in terms. When I talked to someone on Twitter, they immediately said ‘lawful abuse – it’s not a contradiction!’ But if you think about it for a moment it might seem more clear. The legality of a thing is, after all, quite distinct from the morality of it. I claim no special expertise in any of this, but having worked for both the NSA and the CIA, I know a bit about lawful abuses. After all, mass surveillance was thought to be constitutional… yet after more than a decade, the courts found otherwise. A lawful abuse, I would define as “an immoral or unethical activity protected under a shell of law”.

What about things that are more recent? Mass surveillance is closest to my own experience, but let’s set that aside. What about torture? The Bush administration decided that this could be indefinitely [legalized]. What about internment? Extrajudicial killing, far from any war zone, often by drones? The [targets] may be criminals, or armed combatants – in many cases, but not all. The fact that these things are changing, often in secret, without anyone’s consent, should be concerning.

Such abuses aren’t limited strictly to national security. We don’t want this to be about politics between doves and hawks.
Segregation.
Slavery.
Genocide.
These have all been perpetrated under frameworks that said they were lawful as long as you abided by the regulations.

Lawful abuses might be more difficult to spot:

  • A restriction on who and how you can love someone,
  • An intentional tax loophole, or
  • Discrimination.

Lawful abuse: so we’ve defined the term. [Willow is thinking about an anarchist zine about D&D called “Lawful Ain’t Good” and how there are only 8 (not 9) alignments!]

Combined with legal frameworks, our daily activities produce an endless wealth of records which can be, and are being, used to harm individuals, including those who have themselves done no wrong. If you have a phone in your pocket that’s turned on, a long-lived record of your movements has been created. As a result of how the network functions, your devices are constantly shouting into the air, via radio signals, a unique identity that validates you to the phone company. This is not only saved by the phone company, but can be observed as it travels by independent, even more dangerous third parties.

Due to the proliferation of an ancient third-party-doctrine-style interpretation of law, even the most predatory and unethical data collection regimes are [usually] entirely legal. So if you have a device, you have a dossier. They may not be reading or using it, but it’s out there.

Why should we care? Even if there are these comprehensive records of your private activities – where you are, who you went with, how long you were there, who you met with, what you purchased, any electronic activity records – why does it matter?
I can think of 1,070 reasons why it matters. According to figures from the Committee to Protect Journalists, more than 1,070 journalists or media workers have been killed or gone missing since January 2005. This might not be as intuitive as you expect… we’ve had a number of wars going on, and those could be combat deaths. But murder is a more common cause of death, and politics was a more common newsbeat [to be targeted] than war correspondence.

Why is this? Because one good journalist in the right place and time can change history. They can move the needle in the context of an election. They can influence the outcome of a war. This makes journalists a target, and increasingly the tools of their trade are being used against them: technology is beginning to betray us not just as individuals but as classes of workers, including those putting a lot on the line in the public interest – especially those who rely on communication as part of their daily work.

And journalists are being targeted specifically based on those communications. A single mistake can have a lot of impact; it can result in detention. For example, David Miranda (related to reporting on Snowden) had his materials seized by the British government, after they intercepted his communications about plans to travel.

It can also result in far worse than that. In Syria, Assad began surveilling the city of Homs, to the extent that all foreign journalists were forced to flee. The government stopped accrediting journalists, and they were being beaten, harassed, disappeared. Only a few remained, including a few who specifically headed there to document the abuses being visited upon the population.

Typically in such circumstances, a journalist wouldn’t file reports until after they had left the conflict area, to avoid reprisals. But what happens when you can’t wait? When there are things a government is arguing aren’t happening, but are happening? At the time they denied they were targeting civilians; the victims were claimed to be enemy combatants. These lawful abuses happen in many places. You say: surely this isn’t lawful! By international law you are right; by any interpretation of the Universal Declaration of Human Rights, it’s not lawful. But domestic laws are a hell of a thing… China, Russia, North Korea, Syria have courts. They have lawyers and general counsels, who create policy and frameworks to justify whatever it is the institutions of power want to do.

In Homs, the Syrian government was lying in a way that affected international relations: they justified the offensive, but there was a reporter there [Marie Colvin] who infiltrated the city. She crawled in through a tunnel in the dark, climbing stone walls, not speaking to avoid being fired upon. She said this [the government’s claim] was not the case. She filed live reports despite worries about reprisal. She spoke four times to news agencies on a single day. [Quote from Colvin’s report – “there are only civilian houses here.”] The building she was in was later precisely targeted, and she was killed.

This might sound like just another war story. But the next day, the makeshift media center she was working out of was repeatedly and precisely shelled. She died, as did a French journalist. The photographer she worked with was wounded. It wasn’t until a while afterward that we found, based on intelligence collection, that the Syrian Army had given the order to target journalists. How did they discover her? How did they know where to aim? According to reporting this week, her family has filed a suit against the Syrian government, claiming the radio frequencies of her communications were intercepted by the army (using direction-finding capabilities). Then they walked artillery fire toward the makeshift media center. They had a spotter somewhere in the city helping. By the time the second shell hit, they knew they were in trouble… She was caught by a shell and killed.

There’s a question here among policy officials: was this legal? How do we remediate these threats when they happen? When do policies fail? The Syrian government’s argument is that the event was misunderstood – that these were terrorists, or that the strikes were lawful.

But does it matter, if it was lawful or not [by national law]? [Perhaps we should ask:] Was it moral? Can we put safeguards in place for future journalists? What about journalists who have to meet with a source in a denied area? They don’t want their phone to be shouting indications of their movements.

This is the area of our research.

We also wanted to investigate: can we use the devices that are so frequently used against us as a canary to detect these new efforts to monitor us? (e.g., malware attacks that compromise the phone)

For example, there was an Argentine prosecutor [Alberto Nisman] who was killed. They discovered malware on his phone. It did not match the OS, so it was not responsible in that case, but clearly an attempt had been made to compromise his devices and use them against him. The same attack was used on other lawyers and journalists in Latin America.

If we can start using our devices as a canary to know when phones have been compromised, and can get that to a targeted class of individuals—journalists or human rights workers—so they know when their devices are acting in unexpected ways, we can affect the risk calculation of the offending actors. The NSA is very nervous about getting caught red-handed. They don’t want to be known to target these groups, journalists and lawyers. They have only done this rarely; it’s not their meat and potatoes [but it has happened].

But if we can find out when it happens, and can create a clear record of activities, we can start to change the risk calculation. In the cases so far, impunity was the most frequent outcome. Perhaps we can start affecting the cost of carrying out lawful abuses of digital surveillance.

Let’s go to the technical side and talk about what we’ve done. [to bunnie]

Great #forbiddenML talk: @Snowden & @bunniestudios on hacking phones to detect hacked phones https://t.co/KnuQncm66z pic.twitter.com/h21qx55p9S

— Erhardt Graeff (@erhardt) July 21, 2016

bunnie tells us about the technical parts

There are a lot of smart people working to turn phones into cyber fortresses. But smartphones are a large, complicated attack surface, and trying to secure them against a state-level adversary is difficult. Can you trust the gatekeeper and the UI? If you want to ensure your phone isn’t sending signals, you can turn on airplane mode – but if you read up on airplane mode after iOS 8, it doesn’t turn off GPS. GPS is constantly on, without any indicator on the phone. You can also turn Bluetooth or WiFi back on in airplane mode… but the little icon still makes you think you’re not sending or receiving signals. Can we have a CCTV on our own phone? The technical goal is to be sure the cellular modem, WiFi, GPS, etc. are actually off. Turn the phone over and look on the back, and you have a surface that’s simpler, with only two notable features: the antennae – a choke point for everything going in or out.

Technique: “Direct introspection”
Principles:

  1. Open source and inspectable – you don’t have to trust us.
  2. Partitioned execution environment for introspection (in case the phone was compromised, don’t ask it to evaluate itself).
  3. Proper operation should be field-verifiable.
  4. Hard to trigger false positives (like walking by a strong WiFi emitter).
  5. Hard to trigger false negatives – a vendor can put holes in a wall that you once thought was intact.
  6. Be undetectable: avoid leaving a signature that’s easy to profile (revealing that you’re introspecting).
  7. Intuitive interface 🙂 – you shouldn’t have to be a crypto person to use it.
  8. The final solution should be usable every day, and not hard to use while traveling in and out of protected areas.

With that in mind, I went to Shenzhen and started buying a bunch of bits and bobs. Are there any viable signals to introspect? We found signals strongly correlated with activation of the radios – even firmware updates would have a hard time bypassing that. Candidate wires/signals: configuring antenna switches, configuring power amps, baseband to comms, WLAN to comms, resetting the PCI bus, Bluetooth to comms, GPS quality sync.
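To make the idea concrete, here is a minimal sketch, assuming a hypothetical read_line() hardware interface and made-up signal names, of what a direct-introspection monitor could do with candidate signals like these: a separate device polls each line and raises a purely local alarm if any radio-related signal goes active while the phone is supposed to be radio-silent. This is an illustration of the approach described in the talk, not the project’s actual code.

```python
# Minimal sketch of a direct-introspection monitor (illustrative only).
# A separate, simple device -- not the phone's own OS -- polls bus/control
# lines that are strongly correlated with radio activity and alerts locally
# if any of them go active while the phone should be silent.

import time

# Candidate signals named in the talk; the identifiers here are made up.
SIGNALS = [
    "antenna_switch_ctl",
    "power_amp_enable",
    "baseband_comms",
    "wlan_comms",
    "bluetooth_comms",
    "gps_sync",
]


def read_line(name: str) -> bool:
    """Stand-in for sampling one monitored line on real hardware.

    The real introspection engine would read a test point via GPIO/ADC;
    this stub just reports 'inactive' so the sketch runs on its own.
    """
    return False


def introspect_once() -> list[str]:
    """Return the radio-related lines that are unexpectedly active."""
    return [name for name in SIGNALS if read_line(name)]


def monitor(poll_interval_s: float = 0.5) -> None:
    """Poll forever; warn if the phone transmits while it should be silent."""
    while True:
        active = introspect_once()
        if active:
            # A real device would drive a purely local alarm (LED/buzzer),
            # never a network message, consistent with staying undetectable.
            print("ALERT: radio activity detected on:", ", ".join(active))
        time.sleep(poll_interval_s)


if __name__ == "__main__":
    monitor()
```

The only point of the sketch is the division of labor: the phone is never asked to report on itself, matching principle 2 above, and the alert stays local, matching principle 6.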

Next steps:

  • Develop hardware. Build circuit to monitor signals. “Battery case” add-on to existing iPhone 6
  • Extend technique. Other makes and models of phones. Filesystem and OS integrity using disk introspection.

Closing

See more: https://goo.gl/y0Fslu and pubpub.org/pub/direct-radio-introspection

This was my first academic collaboration; having bunnie as your first collaborator is amazing. He is one of those individuals whose competence gives people impostor syndrome. So, I’ll do my best. Thank you very much.

Liveblog for RightsCon: USABILITY AND DIGITAL RIGHTS TOOLS: PROBLEMS AND SOLUTIONS FROM THE FIELD

Liveblog from the Usability panel at RightsCon. It covered the countersurveillance hackathon we had just wrapped up, an SF event headed by OpenITP and also linked to the recent Countersurveillance DiscoTechs. I typed this live and published right after the sessions for the day – please forgive typos etc.

Usability is a huge issue. It doesn’t matter how much we train people if the software isn’t usable.

OpenITP has done two events around crypto and usability now – DC in January, and SF in March. OpenITP tools were brought in to be worked on. It’s not hard to do a hackathon on usability: you can get people in to improve documentation, user testing can happen in a day, wireframes can be made. We’ll do a report-back on the challenges faced, what people are doing usability-wise, and what they’re doing to correct those things.

Barbara with Benetech
Martus – about 10 years old, end-to-end encryption for folks in the field doing human rights work. It is built on old software and design patterns. We were super security-focused, but maybe did it wrong. One example: protecting against keylogging by having users click with the mouse to enter a password, and randomizing the screen every time to avoid pattern recognition. But people don’t use that.


You can get really detached from what the users want/will use. Even if we had protected the password, the rest of it would have been key logged. Usability is a security feature. The people who need security the most are usually those who understand it the least. And human rights documentarians are targets. Usability can protect them.
We used the DREAD model to make specific threat models and user stories, as a way to bring engineers and designers together. Out of that, one of the ways we made the conversations easier was by creating a shared language. DREAD is broad, so we clearly defined what we meant by things in simple language.
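For readers unfamiliar with it, DREAD rates each threat on Damage, Reproducibility, Exploitability, Affected users, and Discoverability. As a purely illustrative sketch (not Benetech’s actual rubric), a shared, plain-language scoring sheet might look like this:

```python
# Illustrative DREAD-style scoring sheet (hypothetical, not Benetech's rubric).
# Each category gets a 1-3 rating in language the whole team has agreed on;
# the total gives a rough, shared way to compare threats.

DREAD_CATEGORIES = [
    "Damage",           # how bad is it if the attack succeeds?
    "Reproducibility",  # how reliably can the attack be repeated?
    "Exploitability",   # how much skill or effort does the attack take?
    "Affected users",   # how many people are exposed?
    "Discoverability",  # how easy is the weakness to find?
]


def dread_score(ratings: dict[str, int]) -> int:
    """Sum the per-category ratings into a single comparison number."""
    return sum(ratings[category] for category in DREAD_CATEGORIES)


# Hypothetical threat from a human rights documentation context:
keylogger = {
    "Damage": 3,
    "Reproducibility": 2,
    "Exploitability": 2,
    "Affected users": 3,
    "Discoverability": 1,
}

print("Keylogger threat score:", dread_score(keylogger))  # prints 11
```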

Bex
Codesign facilitator and community organizer at the Center for Civic Media, MIT Media Lab. Shared values, common language. We design with the users, but we also see everyone as a designer. When everyone makes decisions together about how technology works and what it does, it’s more likely to work for people.
Discovering Technology (DiscoTech) – getting together to share about technology and learn. Very hands-on: at the Boston event, we used face paint to thwart facial recognition software, and a surveillance camera walking tour as a way of calling attention to the social implications. In Bangalore, they did DNA spoofing. So, as far as UX is about how people relate to tools, the DiscoTechs speak to that, and to how tech is used to build power.

Projects
Commotion Project
Our goal at Commotion is to make it really easy for anyone to make their own wireless network. There’s a lot of complicated technology under the hood. We want to make the technology interface as simple as possible, and on the other side we have a construction kit: a visual documentation set covering everything from installing to engaging your community.
As much as we can explain it to ourselves and our colleagues, that is not our audience. This is a hard problem, spanning everything from community organizing to wireless interference. A number of volunteers approached us – which is inspiring; you are so valuable, because we don’t know where the mistakes are or what doesn’t make sense. We had a number of folks go through the documentation and gave them a big red pen: they circled things, noted what doesn’t make sense, etc. They also went over our website – where do you find it, if it exists at all? Someone went through and gave us a long feedback form. Some folks also translated.

Serval Project
People come in and say: ourselves, our family, and our friends at home need this. Make the user interface accessible to them. Take a situation that is incredibly difficult – say, disaster response: you don’t know who is alive or where things are. And we’re saying “hey! here is new technology to learn so you can organize!” while your brain isn’t in a place to learn that. We are painfully aware of that. It does the job but it’s not the most elegant. The codesign process has been great in getting us toward that.
A group from Venezuela, where the government is either turning off the net or making it go very slowly, and controlling the media. Where do you get toilet paper, how do you communicate with people? This is a direct need – they’re talking about their families.
Ok, so if this is for Venezuela, can we get this to work there? They asked for a Twitter without the connectivity – like Storyful.

Guardian Project – InformaCam
Collecting digital evidence in encrypted and verifiable media, by journalists and migrant farm workers. We went through all these use cases to identify the data visualization needs of these different groups. InformaCam collects a bunch of metadata, which is useful for lots of people (so it’s also sensitive and needs to be protected). The range of uses means that collecting and displaying the data needs to be useful to the folks collecting the data. We got a bunch of good suggestions about how to improve the interface by making it more configurable on the backend, so that when admins come into the system they can customize the display for the users, to aggregate records based on either location or the submitter. People also raised a lot of suggestions around additional features, and how to improve the system when there’s low or no internet. Messaging systems other folks worked on.

Guardian Project – Bazaar
Extend the Fdroid market for peer-to-peer sharing, so that if there is an app I want to share with friends, I can do it when bandwidth is super low or down. We went in with a prototype around sharing apps between phones. Bluetooth, NFC, SD card, etc. – how do we distill all these different means into a flow that non-technical users can handle? “Do both of our phones have Bluetooth?”
We worked on distilling the best method to learn those things. We also came to understand how some of the terminology we were using didn’t translate. Repositories, packages… these words mean nothing to people who just have a few apps they want to share. “Share” and “Swap” – above the technical mumbo jumbo.

Small World News – StoryMaker
Anyone can learn to make a better story out of multimedia. We’re building a secure camera that can also guide people at the point of production. A guiding principle is to put the trainer in the phone. People pay the most attention when they’re doing what they want to accomplish. This weekend we had an initial exercise of value-based design with outsiders – what is the value of StoryMaker? You should feel more capable of doing what they do: an implicit understanding that people are capable, and convincing others of that. What are different ways to guide them through the production process – pay attention to sound, you’re shooting vertically, etc. We started building out the UI with things like the level of notification based on triggers (a tiny icon if you’re tilted, an animated arrow if you need to rotate 90°). As we work on the variables, it will become magic. Oh, I am a great storyteller! You don’t have to worry about craft.

Questions!
Having people in the room that weren’t on the tech teams was super useful.
User stories, field experience, etc. started you thinking about how to make your tool more useful. What can we do to make that more possible?

Most Users are Abroad – How do you Engage with Them?
Skype screenshare etc for instant feedback?
OpenITP is looking into this. Operational Security – if we do remote testing, it’s really video heavy. If you’re directly interacting with activists you might put them at risk. Work more with trainers.
Oktavia was suggesting a (non-F/OSS) tool to mark up a page with things like “this button is placed strangely” or “I don’t know what that means.”
Shuttleworth Foundation has an F/OSS tool for this! Annotate
Community outreach – groups should be able to do this without us, which requires intervention at a different stage. Carl Vogel on open source projects: it’s not just about the code being open, but the developers being open to questions as well. Having an open IRC channel, but always looking for more ways to show openness.
Training as a multi-week process. Not just fly-in, fly-out. What format do we need for the trainings?

Open Source vs Free/Libre
Free Software was about software respecting the rights of the people using it. Open Source doesn’t do the same. Think about free, don’t get distracted by open source.
In the field, people aren’t aware of whether it’s open source or free. They use what is usable – Facebook and Google. We have to take the bandwidth and financing to make our tools usable, too.

Goal is to not use Google Play
App stores are censored, closed webs.
It’s not about being fun or designing for addictiveness. But Google and Apple have realized you can identify what you need in an app store to be successful; how can we riff on that and make use of the space already carved into the brains around us? That’s the interface side of things.
Fdroid is just one way of doing this.

Analytics, Tracking Where Trouble is Happening
No one knows how to track their web logs any more.
Having tools that aren’t cloud or invasive. Guidelines for data collection on your users that are safe? Does that exist? When we track users or deploy elsewhere, is there a list of what to not do?
Benetech has gone through a process of collecting even less data. I get where you’re coming from, but if you get 5 people in a room to talk about a project, it’ll get you 80% understanding. Tracking gives you specifics but not real usability understanding.
We printed things out and people went around with post-it notes, which basically gave us user analytics without logging. People don’t print out paper. Just accept that you’ll do that, and have every tiny step involved.
No data without explicit permission. We do everything distributed – we don’t have a way to get anything back.
When you have specific steps of opting in, not just a block of ToS. Difference in what permissions you’re giving on different options.
Hearing from users is not anecdotal, it’s data.
Go places, talk to people.
As a trainer, I’m a filter. The organizations are filters for other people. Anecdotal is good. It can be enough for the 200 members of the organization.

Localizing Our Software
How can we do that without compromising the privacy of the users?
This is where codesign comes in.

Marketing
Lots of people build software, but who is marketing it?
In funding, it’s difficult to support marketing. Usability is a sort of marketing. When people like it, they want to tell other people. Usability is easier to finance.
People will push it through their community if they like it.

Example of User-Centered Design that has Been Widely Adopted?
The Firefox browser is F/OSS developed by its users.
Cryptocat. Started with user in mind, had great success for that.
Guardian Project. We’ve seen InformaCam iterate through and seen a lot of improvements.

Idea for a Tool, Gets Developed, then Users Asked to Evaluate
This is a part of the funding process. How do you come up with the tools that you build? Do you go out and ask people what they need? The ideas should be codesigned, not just the build, with stopgapping after.
Worked for a human rights org called Witness before Guardian. In working with people there, found out that people needed their faces blurred out of photos. Brought this to a hackathon, cobbled together a prototype. Should strive for that more.
Working with migrant farm workers where we made something and it didn’t work, wasn’t understood, drained battery. So we scrapped, rebuilt… but the guts were the same.
Sometimes groups that are already organized will come to us and ask us to engage in that process with them. We’re collaborating from the beginning. Guardian is partnering with the MA ACLU and MIT to reshape something that already exists. You can write for funding together, or the community groups will have written for funding.
Commotion writes into grants that the community interaction is first. Digital Stewards – we teach them, they teach us how to do it better. Every network tells a story – take cutouts of parts of a network, let them build the network they want.

Impression of Tech in Western World
In the DRC, trying to make Medicat, merging medical evidence and the documentation of medical evidence with the abilities of camera culture, toward the end goal of prosecution. It was piloted in the field SUPER early, and is being revamped based on what we learned. It has to be reframed: environmental factors, how limited the resources are.

User’s Rights / Non User’s Rights
How do we protect the data from our platforms for use in other places? In SF, it’s not about documentation, it’s about follow-through. Institutionalized problems. What are we going to do with keeping records?