Dear Citizens Working To Prevent Child Sexual Abuse,
Currently, in Massachusetts, it is legal to create deepfake child sexual abuse material (CSAM). Using AI, or any other digital editing technology, predators can produce realistic visual depictions of child sexual abuse. AI-generated CSAM may be wholly computer-generated, or it can incorporate images of real, often identifiable, children taken from social media or other sources.
Although no physical abuse occurs during the material’s creation, the psychological and emotional harm to the children (whether identifiable or not) and their families is severe and real. The material is also used for sextortion and grooming.
Join Enough Abuse’s campaign to challenge those who would sexually abuse our children online. Ask your legislators to support S.1174/S.2633 – An Act protecting minors from the creation of computer-generated child sexual abuse visual materials.
To date, 45 states have criminalized computer- or AI-generated CSAM. Massachusetts remains one of only five states that have yet to pass this crucial child protection law!
These bills were introduced by Senator Paul Mark and Senator Michael Moore, and co-drafted by Enough Abuse® with input from parents of children whose images were sexually exploited and from national policy experts at the National Center for Missing and Exploited Children.
Interested in learning more? Visit our Citizens to Prevent Child Sexual Abuse webpage for links to bills related to child sexual abuse prevention, a summary of their key provisions, and an explanation of why passage of these bills is crucial for our children.