Cupertino: Child safety advocates disrupt Apple developers conference

CUPERTINO — Around 35 protesters gathered at Apple headquarters Monday morning during the company’s annual global developers conference, demanding the tech giant add a system to detect and remove child sexual abuse content on iCloud — a venture Apple previously abandoned over concerns about user privacy.

iCloud is a storage service that allows users to store and sync data across their devices, keeping information such as photos, files, backups and passwords secure. The protesters, mostly child safety experts and advocates, say the service allows perpetrators and abusers to confidently store and share child exploitation materials without getting caught by authorities.

Apple spent years attempting to design a system that would identify and remove such content on iCloud. The company ultimately scrapped the idea in late 2023 in response to concerns from digital rights groups that a scanning system would compromise the privacy and security of all iCloud users.

Shortly after, the child advocacy group Heat Initiative began organizing a campaign to demand that the company move forward with detecting and reporting such materials. According to The Intercept, Heat is backed by “dark-money donors,” and the group has declined to comment on its funding sources in the past. The initiative, along with child safety groups Wired Human and the Brave Movement, organized Monday’s protest.

Monday’s protest coincided with the first day of Apple’s annual Worldwide Developers Conference, the event where the company announces new features for its software. Sarah Gardner, CEO of the Heat Initiative, said Apple is leaving children’s safety behind in its conversations about new technologies and needs to focus on protecting them.

“We don’t want to be here, but we feel like we have to,” Gardner said. “This is what it’s going to take to get people’s attention and get Apple to focus more on protecting children on their platform.”

As company officials and stakeholders passed through the Apple Park Visitor Center, child safety experts and advocates called out: “Build a future where children are protected.” Some spoke about their personal experiences with sexual abuse and called for more child safety measures to be put in place.

“We’re not asking for much,” activist Sochil Martin said as the protesters’ chants echoed in the background. “Apple has everything in their hands to do it.”

Their concerns also come as national leaders urge the passage of child safety bills, including the Kids Online Safety Act, which would establish guidelines to protect minors on social media platforms such as TikTok and Facebook.

Apple declined to comment on the protest, and instead provided this news organization with a 2023 letter exchange between Gardner and Erik Neuenschwander, Apple’s director of user privacy and child safety, that addressed the company’s reasoning for scrapping the system.

Neuenschwander said implementing such a system would compromise the security and privacy of users and would open the door “for bulk surveillance and could create a desire to search other encrypted messaging systems across content types (such as images, videos, text, or audio) and content categories.”

Apple has already introduced features designed to help keep children safe. In December 2021, it added a setting that can warn children when they receive or attempt to send content containing nudity in Messages, AirDrop, FaceTime video messages and other apps.

But protester Christine Almadjian said those features are not enough to protect children or hold predators accountable for possessing sexual abuse material. Almadjian, who is part of the national End Online Sexual Exploitation and Abuse of Children Coalition, said Apple needs to keep finding ways to identify and flag such content.

“We’re trying to engage in a dialog with Apple to implement these changes,” she said Monday. “They don’t feel like these are necessary actions.”
