Content Warning: In this show, Frank and Andy interview Chris Wexler of Krunam.co to discuss using AI to classify images containing child sexual content. The work is important. The discussion is not suitable for all ears. Please consider your surroundings before listening. Frank and I recommend headphones for this one.
When we record shows, both Frank and I are in “interview mode.” We listen for terms that may need explaining – like “grooming,” mentioned around 31:30. Does that mean we are not paying attention to the discussion? I cannot answer for Frank. For me, I am listening differently: for unfamiliar terms and for topics that need expanding. Post-edit, I listen again as, well, a listener.
This is an impactful show.
The subject of Chris’ work is heinous. To call the work itself “admirable” is an understatement.
I listened twice: once to consume the material, and a second time to process the excellent work of Krunam.co in identifying CSAM. I am impressed by what they do. I am also impressed by how they do it. The business model is innovative.
Human trafficking needs to end.
CSAM (Child Sexual Abuse Material) needs to go.
Thank you, Chris, Ben, Scott, and the entire team at Krunam.co. God bless you.