August 20, 2021: ENOUGH IS ENOUGH STATEMENT
Newly Announced Safety Measures to Protect Children on Social Media and Technology Platforms are a Step in the Right Direction
Enough Is Enough® applauds recent actions taken by technology giants, including Apple, TikTok and Instagram, to initiate safety and privacy measures for children and adolescents who use their platforms. Specifically, Apple plans to match photos on iPhones, and those uploaded to iCloud accounts, against a database of child sexual abuse images, and to alert authorities as needed. Teens under 18 using TikTok will now have direct messaging disabled by default, and new Instagram accounts created by kids under 16 will default to private, blocking some adults from interacting with teens on the platform.
In addition, we’re pleased to learn of the decision by subscription-based content provider OnlyFans to remove sexually exploitative videos from its platform.
“Because predators prey where kids play, it is encouraging that Big Tech is becoming part of the solution, and not part of the problem, by proactively and voluntarily implementing child safety and privacy features,” said Donna Rice Hughes, President and CEO of Enough Is Enough®. “Actions speak louder than words. Setting safety defaults to private not only protects kids, it empowers parents. These long-overdue protections to prevent exploitation represent a step in the right direction. However, there’s much more work to be done. Accordingly, Enough Is Enough® and our NGO partners will continue to advocate for additional actions to build upon these recent developments.”
It’s no secret that tech conglomerates, which have historically done little to make their platforms safe places for youth to interact and engage, are now responding to mounting pressure to prioritize children’s safety.
The sexual exploitation of children is a public health issue that impacts everyone. The data is clear: exploitation escalated dramatically during the COVID-19 pandemic and is now at an all-time high. Reports of child sexual abuse material (CSAM) images and videos reached a record level in 2020, up 28% from the year before, and online enticement reports to the National Center for Missing and Exploited Children increased 98%.
Those at the helm of technology platforms must never allow their technology to be used by sex predators, traffickers and pornographers as a means to exploit our most vulnerable, nor should they be able to cling to Section 230 of the Communications Decency Act (CDA) to avoid responsibility and claim immunity. A historic ruling announced Thursday, which allowed a class action suit brought by two survivors against Twitter to move forward, represents a step closer to justice for survivors who suffered due to Twitter’s failure to remove harmful CSAM from its platform; Twitter had previously claimed CDA immunity.
We are grateful to the National Center on Sexual Exploitation Law Center, The Haba Law Firm, and The Matiasic Firm for pushing this case forward.
No single victim of exploitation or single NGO can take on these Goliath-sized battles against Big Porn and Big Trafficking enterprises alone. The progress made thus far represents the collaborative efforts of numerous NGOs, coalitions, child safety advocates, business leaders including payment processors, and lawmakers who have joined forces to stand up against these giant conglomerates, which consistently put profit over child safety and human dignity.
“The tide is finally beginning to turn,” said Hughes. “The process of protecting the safety of children on the internet has been a marathon, not a sprint, filled with many milestones and setbacks over the years. It took William Wilberforce a lifetime to abolish slavery. Our adversaries can be assured that those who are defending the innocence of children online will never back down.”