The Tech Coalition had the opportunity to interview John Starr, Vice President of Industry Sector Strategy and GM of Safer at THORN, and learn what keeps him inspired, as the trust and safety community - industry, NGOs, governments, and academia - doubles down on keeping children safe.
TC: One of the things that make you unique, John, is your journey across the field of child protection. Walk us through your path a bit.
JS: I started in the NGO space. I was an analyst at the National Center for Missing and Exploited Children (NCMEC), where I quickly became aware of the growing scale and severity of the problem.
I then moved to government as an analyst at the FBI doing similar work, and around that same time, the internet just exploded. And that really drove me to want to be part of the solution for industry - the feeling that industry was so far upstream and that there could be a lot of good we could do. So I went to Twitter, and I was lucky enough to be asked to help build both the team and the effort around trying to stop the spread of CSAM. I was at Twitter for six and a half years, and when I left I was responsible for Safety and Integrity policies for the service.
In terms of the transition from Twitter to THORN, it was one that felt natural. There’s this spirit of “where can I have the most impact in this space?” at THORN, and I was drawn to their superpower of building technology. I’ve seen the power of technology in combating the spread of CSAM, and I was excited about the strategy and tools that THORN was putting together. How do we use technology to make platforms safer? How can we help ensure that access to technology is no longer a blocker? For me, that was the pull.
TC: At the Tech Coalition, we are continuously inspired by the unique sense of collaboration within industry and across NGOs; how do you experience that in your work?
JS: I think it’s deeper than just the collaboration between organizations. It’s really the love and respect for those who choose to do this work - whether you’re in the law enforcement community or in the industry space. There’s a lot of noise around this issue, but the people who do the work have a lot of respect and love for each other. That foundation lends itself to a natural desire to collaborate as much as possible.
There are a lot of collaboration models out there around unique policy areas or certain functions. One example of that is hash-sharing - the ability to share hashes at scale. But an example that’s maybe not as clear is just taking care of other people who do this work: going out and having coffee, calling someone up. More tactically, it can be really helping each other out - “hey, how have you handled this problem?”, “how do you think about that?”, or “hey, what tool do you guys use for this?” The spirit of collaboration is there because we’re not competing over users; we’re instead aligning on something that is shared - the desire to solve this issue.
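To make the hash-sharing mechanism concrete, here is a minimal sketch in Python. It is an illustration, not any platform’s actual pipeline: the hash list and loader are hypothetical, and a plain cryptographic hash is used for simplicity, whereas production programs typically rely on perceptual hashes (e.g., PhotoDNA or PDQ) so that re-encoded or slightly altered copies still match.

```python
import hashlib

def load_shared_hash_list() -> set[str]:
    """Hypothetical stand-in for a hash list distributed through a
    sharing program; a real list would come from a trusted clearinghouse."""
    return {
        # Placeholder entry: the SHA-256 digest of the empty byte string,
        # standing in for the hash of a known file.
        "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
    }

KNOWN_HASHES = load_shared_hash_list()

def matches_known_content(file_bytes: bytes) -> bool:
    """Return True if an upload's hash appears in the shared list."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_HASHES

# Usage: an exact copy of a listed file matches; anything else does not.
print(matches_known_content(b""))       # True (digest is in the list)
print(matches_known_content(b"other"))  # False
```

The design point this buys is that platforms can block known material without ever exchanging the material itself - only its fingerprints.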
There’s a global and multifaceted community of people who are working on this issue. That type of community is hard to duplicate, and is a big reason why we’re starting to chip away at this problem.
TC: In your experience, how does transparency impact and support child protection work?
JS: When we’re talking about transparency reporting, it’s important to talk about its origin. Historically, tech companies provided “transparency reports” that detailed requests for user data, content removal, or other controversial asks from government authorities. Those reporting efforts have been extremely important for highlighting potential abuse and building trust with users around how their data will or won’t be used. It’s in that spirit that we see the new era of transparency reports emerge, though they are wholly different in content.
The spirit of transparency reporting from industry is accountability and trust. It’s important for companies to be more transparent not only with the people who use their service, but with the world, as the world becomes more curious about content moderation and about what happens and how.
In terms of CSAM and child protection, the trend we’ve seen is a shift from transparency related to government accountability and platforms’ role in it, to how the platforms show up themselves - what actions they take, and how many. This can obviously signal to users - as well as regulators, academics, and non-profits - what kind of principles you have as a company or service, and what you’re doing to back them up, which can help build trust.
TC: You talked a bit about trends already. What other trends are you seeing, or what would you like to see?
JS: There is an incredible opportunity to explore and learn more about the groups of people who are trading and interacting on platforms. A lot of academics would really benefit from that, as would platforms. There’s an interesting thread to pull there in terms of a trend: being transparent not only about your output (what you’re doing and how you’re doing it), but also, when you find a complex network of people doing this really awful thing, asking what we can learn about it as a collective community.
TC: How do we advance transparency?
JS: Transparency in “framework” is really important; transparency in practice is better. How we get there is courage. We get there by helping ensure that platforms continue to push the needle. It is about how we incentivize companies to be more courageous as they continue to be more transparent. Our role is to continue to understand what incentivizes them at the moment, and how we can tap into that.
TC: Why do you choose to do this challenging but important work?
JS: I choose to do this work because I’ve seen it evolve. It started in the space of driving impact, and then, as I have added a few gray hairs, it has also become about caring for the people who want to drive impact - ensuring that they have what they need to do things that we couldn’t do ten years ago. We also want to ensure that there are no blockers to accessing technology. If they need tech, organizations such as THORN are there to help them; if they need help and insight, we know the community for them to plug into. For me it is about caring for and helping the people who choose to say yes to this work.
TC: What exciting things does THORN have coming up?
JS: We have a unique lane in this space. We are a tech-forward organization - that is how we solve problems. We are going to work with any platform with a content upload button that needs help, because our goal is to end CSAM on the web. I am most excited about the things we are doing to eliminate the gaps - to end the “I’d like to do that, John, but I do not have the resources, expertise, knowledge, or technology” conversation. So what THORN is doing with Safer to ensure that platforms all over the world can have as much protection as some of the largest platforms, right when they start, is incredibly important. Ensuring that we continue to push the needle on how we develop new detection techniques and capabilities, and really work with the ecosystem to perfect them over time, is also really important. For me, it is about ensuring that “not enough resources or technology” is no longer a viable response.