Parameters of Social Interaction

What does equality look like? How do we know if we are getting there?

This is the question I asked to open my talk at SHA 2017. It is also the question I carried with me as I walked into CtK.Campfire. Both aimed to look at how to mitigate the polarization of human interaction in a digital age: the talk looked at the infrastructure of human interaction, and the retreat embodied some of the best ideals toward action. I’ve written two blog posts – one about each event – as the two were temporally and intellectually adjacent. You can find the post about CtK.Campfire here.

The talk at SHA2017 (the Dutch hacker camp) was called “Weaponized Social.” WeapSoc is a project in which Meredith and I invested heavily through 2014 and 2015. She has gone on to write for Status451 on an extension of the topic area. I’ve continued to frame bits of my work in this context but have generally not kept up. It’s some of the most intellectually stimulating and emotionally draining work I’ve ever done, and that includes disaster response in the field.

A background assumption of this talk is that, as we push the boundaries of “acceptable behavior” into closer alignment with human rights, the effects of violence become less and less apparent to an observer of any single instance.

Violence is defined by the World Health Organization as “the intentional use of physical force or power, threatened or actual, against oneself, another person, or against a group or community, which either results in or has a high likelihood of resulting in injury, death, psychological harm, maldevelopment, or deprivation”, although the group acknowledges that the inclusion of “the use of power” in its definition expands on the conventional understanding of the word.

Example: seeing one person hit a non-consenting person is (pretty) easily defined as violence. Seeing one person say “your a dumb bitch” online to another non-consenting person isn’t as easily defined as violence (it’s often instead categorized as “conflict”). We have to zoom out – to see that the receiver is no longer able to be online at all, due to thousands of similar messages – in order to see it as the violence (in the form of deprivation of opportunity, or psychological harm) it is. Here’s just one example:

[screenshot of one such message, from the original post]

I don’t want to limit what this person says, but I also have a right not to experience him saying it, if it detracts from my ability to be online. As the saying goes, “your right to swing your fist ends where my nose begins.” How can we bridge this sort of contention at scale?

To zoom out like this, and to take action at a systemic level, we luckily have Lessig’s four forces for social change: law, social norms, markets, and architecture/code. The infosec crowd in the audience at SHA is largely skeptical of law (the EFF excepted) and of social norms (“don’t tell me how to act”), and I’m skeptical of markets being able to solve problems of inequality. That leaves us with architecture/code.

In the talk, I asked this question:

“Do we want to take a scientific approach to equality, where we tweak our infrastructure in explicit ways to see if it changes how people are interacting?”

We, as the creators and maintainers of online spaces, have a responsibility to strive toward equality in the ways available to us. How can we do this without surveillance and control of speech? We change the architecture of the spaces. The crew of Weaponized Social (namely, TQ at the SF event in May 2015) started to lay out the different parameters of social interaction: how many people one account can be connected to, how far a message can travel (through timeouts or limits on re-broadcasts), or whether an element of serendipity is introduced. These are toggles which can be changed, sliders which can be moved.
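As a minimal sketch of what those toggles might look like in practice – my own illustration, with hypothetical names, not code from the WeapSoc project – a platform could expose them as an explicit, per-instance configuration:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class InteractionParameters:
    """Hypothetical per-instance toggles for an online social space.

    None means "no limit" - i.e., the slider is pushed all the way open.
    """
    max_connections: Optional[int] = 150       # how many accounts one account can connect to
    message_ttl_hours: Optional[float] = 48.0  # timeout: how long a message stays visible
    max_rebroadcasts: Optional[int] = 3        # how far a message can travel via re-shares
    serendipity_rate: float = 0.05             # fraction of a feed drawn from outside one's network

# Two instances of the same software, two very different social environments:
tight_knit = InteractionParameters(max_connections=50, max_rebroadcasts=1)
wide_open = InteractionParameters(max_connections=None, message_ttl_hours=None,
                                  max_rebroadcasts=None, serendipity_rate=0.0)
```

The point is not these particular numbers, but that the limits become explicit, inspectable values rather than invisible defaults.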

If we change these things, we can see how (and if) architecture changes the way we interact. The social sciences point to us being deeply (though not solely) affected by our environments. By changing the architecture of online spaces, we could see how it changes the way we interact – who feels safe to speak, as shown by who takes part in the act of speaking. We can then make better choices about our individual instances and realities based on those results. We now have one more set of tools by which to examine whether we are progressing toward equality, without impinging on the individual right to speak. I hope you make use of these tools.
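To gesture at what that scientific approach could look like – this is my own sketch, not something from the talk, and the counts below are invented – you could change one parameter on one instance, hold everything else fixed, and compare a participation metric such as the share of accounts that post at all:

```python
import math

def two_proportion_z(posted_a: int, total_a: int, posted_b: int, total_b: int) -> float:
    """Z-statistic comparing posting rates between two instance configurations."""
    p_a, p_b = posted_a / total_a, posted_b / total_b
    pooled = (posted_a + posted_b) / (total_a + total_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    return (p_a - p_b) / se

# Hypothetical counts: instance A capped re-broadcasts, instance B did not.
z = two_proportion_z(posted_a=412, total_a=1000, posted_b=355, total_b=1000)
print(f"difference in participation rates, z = {z:.2f}")  # |z| > 1.96 ~ significant at 5%
```

A real study would need far more care (self-selection between instances, time effects, consent), but the shape is the same: tweak one toggle, measure who speaks.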

Technology as a Means to Equality

Originally posted on Geeks Without Bounds

I had the honor recently of speaking at the Médecins Sans Frontières (MSF / Doctors Without Borders) Canada Annual General Assembly (AGA). While MSF is an international organization, each location has a very large group of people who work on decisions and policy for their specific group for the year – usually at the AGA. These are three days of talks, debates, and dinners. The international group defines a focus for the discussions, but it’s up to each pod how they act around that focus. This year, the focus was how MSF is using (or not using) technology. While most of the talks were internal, for the bit of time I was there the topics ranged from telemedicine to social media in conflict zones. They asked that I come speak about technology and disaster/humanitarian response.

The gist of the talk I gave (15-minute video follows) is that technology is a means to more equality in the world – a way to be inclusive. That there are many people in the world who want to use their technical skills to help groups like MSF, and we absolutely need them at things like hackathons. That there are many people with voices and connections to the globe now, and that groups like MSF have a responsibility to listen to them directly. And that technology, when done in codesign, will be aligned with actual needs, and is an ongoing relationship, not a one-off delivery.

Again, almost all of the discussion happened behind closed doors, but I recorded my laptop screen and voice while I gave my own presentation.

It seemed to go pretty well. We’re keeping the conversation going, and I’m excited for more points of connection. You can follow the prezi at your own pace here, and see the full #vizthink for the panel here.

Some other highlights:
The other exceptional panelists and I advocated for F/OSS, especially in light of security and inclusion. MSF is rightfully anxious about infiltration, about ways to be transparent, and about usability. Ivan and I re-emphasized open source communities – that people are committed to examining (and re-examining) code for backdoors and optimizations; that open source has been around for decades; that most technology is built upon it; and that it’s a way of performing mutual aid between countries and cultures.

Someone asked in Q+A about using things like Facebook and Twitter in the field – whether that use could cause problems: locations or images suddenly not being as private as you thought, with kidnappings and killings resulting; or things simply getting hacked by governments or insurgents. My response was that MSF, with all its weight and influence in the world, has a duty to insist upon things like Coercion-Resistant design – to insist that these companies treat their customer bases humanely.