Can we make Security Empirical, and why might we want to?
There's a paper I recently came across by Mohammad Tahaei and colleagues called Privacy Champions in Software Teams: Understanding Their Motivations, Strategies, and Challenges. In it, they interview 12 folks on software teams who try to promote user privacy across their teams and organizations. It's a small group, and there are limitations with small-group research; findings from studies like this cannot and should not be taken as census representations of what an entire industry is doing. But I love a deep dive into a single person's dilemmas at a specific intersection of role and passion – this is a great use of qualitative methods like semi-structured interviews. Such case studies help us refine our questions for larger future studies, and they also let us sit deeply with one other person's experience, which has its own benefits for our thinking.
The paper describes these folks as "Champions." I was struck by the identity label contained in that. You are a Champion for privacy, rather than it just being something you occasionally engage in. The psychological fuel these folks carry around with them is quite clear in the quotes. They care about people and better practice:
It got me thinking quite a bit about the burdens that technical people carry when they feel like they need to represent an entire value to their local team, group, or community, and when that area comes loaded with a lot of responsibility, ethics, and consequences. Champion sounds nice, but you don't have to become a Champion if everyone is already doing the thing you want. So taking on the mantle of Champion means taking on conflict. If you're lucky and good at it you'll persuade people, but either way, it probably means committing to carry chronic stress.
Compounding this, specialization is something that our technology workplaces demand from us. But specializing means we keep going deeper and deeper into a context that other people don't share. So specialization also makes us lonely. This journey can be subtle, and creep up on us:
The experiences of folks who find themselves fighting on a technical battleground – privacy, security, infrastructure, or even developer experience – have a lot of common themes. There's a psychological fortitude that goes into becoming a Champion, a clarity of seeing the consequences that other people don't like to see. Most people systematically underestimate the problems, failures, fragility, and errors around them. In this series of studies, researchers tested people's estimation of failure rates across more than thirty domains and found a consistent "failure gap." Champions may just be the people who have learned to overcome the typical failure gap in a particular domain. They're like our sociocognitive field scouts, with sharper prediction skills for the disasters most of us can't tolerate imagining for long.
I'm going to really sound like a psychologist for a second here, but I think this is a beautiful, fascinating set of skills to think about. I respect it. I want to learn more about it. I speculate (I'd love to research this) that Champions have cultivated a high degree of distress tolerance. At the same time, I worry about the brittleness inherent in this lone-hero model. Scouts only work as long as the group recognizes them for what they are. When our technology communities have bad norms around the behaviors we expect on these tough topics, it might even feel like being an expert in them makes you an outsider, a constant diverging voice from the majority. Speaking out on principle is powerful, but also isolating. Highly conscientious people are more likely to take a stand, and more likely to feel the distress of wondering if they're the only ones who really care. If our groups don't have warm, supportive cultures, our communities fail to repair, affirm, and appreciate our moments of courage. Where does all that awareness of potential failures go, if it's only kept inside our heads?
I'm sure none of this surprises anyone who has found themselves in a Champion position. Culture change always involves power struggles, and often, identity struggles. Knowing this doesn't necessarily make it easier, but if you're a Champion, maybe it can validate that you're not alone, and that you have other people out there who can respect, understand, and support what you're doing. If you want to be a better colleague to someone you see Championing, consider offering what Gregg Muragishi and colleagues call microinclusions: small, concrete gestures that convey that someone's contributions are valued and that we see them.
Microinclusions are especially impactful in moments of ambiguity, when someone is wondering things like "am I the only one who cares about this? Do they think I'm competent? Do they think someone like me belongs?" Muragishi describes the internal psychological question someone (like a Champion) might be asking like this: "Will others receive [me] in ways that allow [me] to contribute to shared work goals?"
Across tech we struggle with some bad group norms about affirming people's contributions and bolstering our collective courage to face scary possibilities like big failures. Titus Winters recently described fear in tech as an overarching challenge for us to understand and move through. It takes courage to be the first person who speaks up against group norms and says "I appreciated that you tried that" if you're worried that'll come off as mushy feelings stuff. But its effect is potent. In general, we likely underestimate the number of times dissenters in our group need to hear our support, we overestimate how explicitly we're voicing that support, and we often fail to see how much they've done for us.
And being received and welcomed for their contributions is one important way that our Champions might get to triage and release all that failure knowledge. If you can't find a way to fix the whole thing, at least make it just a little bit more shared. As a supporter, you don't have to practice your microinclusions in the hardest setting first. You could reach out for a coffee with your local Champion, or send quiet feedback about them to their supervisor, before you stand up at an all-hands. One of the biggest complaints our Champion colleagues make is that when things go right, people don't notice. Small acts of noticing things done well and going right might go further than you think.
Supporting Champions and dissenters is also absolutely key to becoming a creative, effective group that makes smarter decisions as a collective than we would as individuals. As Jeremy Winget wrote recently:
Speaking of groups, I've started an experiment recently where I host a free, open "Developer Science Office Hour" on a topic of interest. I'm trying to angle these toward topics that I think need more light. Like Privacy Champions, Security Champions are under pressure, and deserve our help and attention. That was my motivation behind the Developer Science Office Hour I hosted with Dr. Ariana Mirian. In her own words, Dr. Ariana works at the intersection of people * security * compassion * data – and ties it all together into her field of Empirical Security:
Dr. Ariana's work strongly represents the mix of strategies I've seen move software teams forward. Looking at the incredibly complex landscape of security practices, Ariana didn't stay stuck in the lab, lecturing people about the cutting edge. Instead, she dove into real-world collaborations with a large IT organization, helping them take advantage of their data and suggesting meaningful changes to how they invested their limited resources. Packaging that work into shared, open publications has allowed the teams she's worked with to reference credible, peer-reviewed science to back their changes, adding to the systematic evidence collection for the field rather than keeping insight trapped inside a single team or org. Ariana sees using data (thoughtfully) as a way to shift the narrative on what our organizations should do and to provide scaffolds for the Champions working out there in the field.
We also talked about:
- The power of control groups for measuring efficacy
- What it takes to run practical experiments that let us test the real impact of different phishing lures
- How we can shift our spend away from ineffective security trainings that put the onus on users (which Ariana's research has shown to have disappointing impact) and toward more effective structural strategies
- The economics of phishing and hacking, including Ariana's Hack for Hire project
- How security professionals can lean into skills of advocacy and storytelling
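To see why a control group matters so much for measuring efficacy, here's a minimal sketch with entirely made-up numbers (they are not data from Ariana's research): compare the phishing-lure click rate of a trained group against an untrained control with a two-proportion z-test. Without the control group's baseline, a click rate on its own tells you nothing about whether the training helped.

```python
# Hypothetical illustration of a control-group comparison.
# All counts are invented for the sketch, not drawn from any real study.
from math import sqrt, erf

def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
    """Two-sided two-proportion z-test: did group A click at a different rate than group B?"""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)            # pooled rate under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # standard error of the difference
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))   # two-sided p from the normal CDF
    return p_a, p_b, z, p_value

# Control: 80 of 1000 untrained employees clicked the lure; trained: 62 of 1000.
p_ctrl, p_trained, z, p = two_proportion_z(80, 1000, 62, 1000)
print(f"control={p_ctrl:.1%} trained={p_trained:.1%} z={z:.2f} p={p:.3f}")
```

With these invented numbers, an apparent drop from 8.0% to 6.2% is not statistically distinguishable from chance (p ≈ 0.12), which is exactly the kind of sobering result a control group and a simple test can surface before a team declares victory.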
One reason to care about using our data as technical practitioners? It can give you stronger ammunition for advocacy, shifting some of the stressful burden away from your personal opinion and toward the mounting shared evidence in the field, beyond your local context. I am in no way suggesting that data alone magically changes people's minds. But for Champions, I believe that better evidence might create a scaffold for their work with unanticipated benefits: reducing exhaustion, saving them from always reinventing the wheel, and helping them feel just a little less alone than they thought they were.
You can watch our whole Developer Science Office Hour on Empirical Security here:
Empirical Security Now! with Dr. Ariana Mirian
P.S. The paid tier of this newsletter has allowed me to spend the time to plan and run these Developer Science Office Hours as an open educational event for the community. It is deeply meaningful to me that you are supporting this work of caring about and investing in the people of tech thriving. I've already heard from several people that they're trying to put a few new things into practice in their workplaces after an Office Hour. We are doing it, team. We just held one on thinking about research design, so stay tuned for that!