New DataDriven Episode – Chris Wexler on Using AI to Protect the Vulnerable

Content Warning: In this show, Frank and Andy interview Chris Wexler to discuss using AI to classify images containing child sexual abuse material. The work is important, but the discussion is not suitable for all ears. Please consider your surroundings before listening. Frank and I recommend headphones for this episode.

When we record shows, both Frank and I are in “interview mode.” We listen for terms that may need explaining – like “grooming,” mentioned around 31:30. Does this mean we are not paying attention to the discussion? I cannot answer for Frank. For me, it means I am listening differently – for unfamiliar terms and for topics that may need expanding. Post-edit, I listen again as, well, a listener.

This is an impactful show.

The topic of Chris’s work is heinous. To call the work itself “admirable” is an understatement.

I listened twice: once to consume the material, and a second time to process the excellent work being done in identifying CSAM. I am impressed by what the team does, and also by how they do it. The business model is innovative.

Human trafficking needs to end.
CSAM (Child Sexual Abuse Material) needs to go.

Thank you, Chris, Ben, Scott, and the entire team. God bless you.

Andy Leonard

Christian, husband, dad, grandpa, Data Philosopher, Data Engineer, SSIS and Biml guy. I was cloud before cloud was cool. :{>

