This episode contains difficult content about child sexual abuse material (CSAM) that is likely to be particularly sensitive.

In 2019, nearly 30 million images and over 40 million videos relating to child sexual exploitation were released online… “The most surprising part is that it’s not hidden, it’s on everyday tools that you and I use, it’s on Google and Facebook and Microsoft, they are hiding in plain sight.”

Today’s podcast is about a particularly difficult and sensitive topic that no one really wants to talk about, or even acknowledge exists. But child sexual abuse material (CSAM) sadly does exist, and my guest today plays a fundamental role in helping content moderation teams and law enforcement protect children online more effectively.

Chris Wexler is a co-founder and the CEO of Krunam, a technology company that builds tools to help stop the inhumanity of child sexual abuse, while freeing up police time to focus on investigating and catching the perpetrators.

The company’s technology is 10x more effective than PhotoDNA (the current industry-leading technology) and can, for the first time at scale, find previously unknown CSAM in images, video and, very soon, live streaming.

“In real-world testing, our classifier outperformed human classification because, after 10 minutes of doing this, you just get exhausted,” Chris explains. “It’s a brutal, brutal way to live and work. We believe [our technology] is more humane, not only for the victims of these crimes but [for] the workers that are fighting these crimes, [who] do not have to spend so much time with it.”

We also talk about: 

  • The story behind Krunam
  • What businesses can do to help stop the spread of CSAM
  • Social enterprises and ESG as the future of business

You can follow Chris on Twitter @ChrisWexler.