Background
In March 2022, the Tech Coalition launched the Video Hash Interoperability Project (VHIP) to help the tech industry find and remove known CSAM videos faster than ever.
Implemented in collaboration with NCMEC, Meta, and Google, this initiative was designed to address a long-standing gap: the lack of standardized support for the many modern video hash types.
The challenge
Hashing – the process of creating digital “fingerprints” of known CSAM – is a powerful tool to detect re-uploads of the same footage. But not every company uses the same type of video hash. If Company A uses one format and Company B uses another, databases that store hashes of known CSAM cannot automatically translate between formats. That may slow detection and risk leaving abusive content online.
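For readers who want a concrete picture of the problem, the short Python sketch below is purely illustrative – the format names, values, and function are hypothetical placeholders, not any partner's actual system. It shows why a hash can only be matched against database entries stored in the same format.

```python
# Illustrative sketch only: formats and values are hypothetical placeholders.

# Hashes of known videos, stored per format. The same video produces
# unrelated values under different formats, so the entries are not
# interchangeable and cannot be converted into one another.
known_hashes = {
    "format_a": {"a-3f91c2", "a-77b0de"},  # contributed by a platform using format A
    "format_b": set(),                     # empty until the videos are rehashed
}

def matches_known_video(upload_hash: str, hash_format: str) -> bool:
    """Check an upload's hash against the database for its own format.

    A hash computed in format B can never match entries stored only in
    format A; the original videos must be rehashed into format B first.
    """
    return upload_hash in known_hashes.get(hash_format, set())
```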
The breakthrough
VHIP has helped to change that. Together with our partners, the Tech Coalition built a bridge that makes different hash formats compatible with NCMEC’s database.
First, NCMEC and Thorn rehashed more than 220,000 known CSAM videos into formats compatible with Google’s and Meta’s systems. Those new hashes were shared with Google and Meta, allowing them to complement their existing efforts and more quickly detect known CSAM that had not yet been found on their platforms.
This system has supplemented video hashing processes across Google and Meta products, helping them – and other participating companies – match against NCMEC’s full database of known CSAM videos and remove content more quickly.
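As a rough illustration of that rehash-and-share step, the sketch below rehashes a catalog of items into every supported format and returns one hash list per format. It is a toy under stated assumptions: the format names are made up, and cryptographic hashes stand in for the perceptual video hashes real systems use.

```python
import hashlib

# Hypothetical per-format hashers. Real video hashes are perceptual
# (robust to re-encoding and edits), not cryptographic like these placeholders.
HASHERS = {
    "format_a": lambda data: hashlib.sha256(data).hexdigest(),
    "format_b": lambda data: hashlib.md5(data).hexdigest(),
}

def rehash_catalog(videos):
    """Rehash every known video into every supported hash format.

    Returns one list of hashes per format, ready to be shared with the
    platform that matches on that format.
    """
    return {
        fmt: [hasher(video) for video in videos]
        for fmt, hasher in HASHERS.items()
    }

# Example: two placeholder "videos" yield a shareable hash list per format.
shared = rehash_catalog([b"video-bytes-1", b"video-bytes-2"])
```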
Impact to date
VHIP hashed over 435,000 videos in 2025 alone. That figure includes a backlog of videos from previous years and brings the total number of videos hashed since the project began to more than 784,000.
Each hash represents a video reported to NCMEC that depicts the sexual abuse of a child – and each one helps to protect victims and prevent further child abuse.
What’s next
Our Video Hash Interoperability Project hasn’t just strengthened today’s systems – it opens the door to future innovation. Tech Coalition members can adopt compatible video hashing algorithms with confidence that they will be able to use NCMEC-provided hashes of known CSAM videos to improve their own detection efforts. VHIP could also be expanded in the future to include additional video hash formats.
That’s interoperability in action and impact at scale. We are planning updates to VHIP, with information about next steps coming soon. Together, we can continue detecting, removing, and reporting CSAM – and make the online world safer for children.

