Apple is facing a lawsuit over its decision not to implement a system designed to scan iCloud photos for child sexual abuse material (CSAM). As reported by The New York Times, the lawsuit alleges that Apple's inaction forces victims to relive their trauma.
Although the company had previously announced child-safety enhancements, the suit contends that it did not follow through on those initiatives or take steps to detect and limit CSAM.
In 2021, Apple proposed a system that would use digital signatures from the National Center for Missing and Exploited Children and other organizations to detect known CSAM in users' iCloud photo libraries. The plan was ultimately scrapped after security and privacy advocates warned that it could create a backdoor for government surveillance.
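To illustrate the general idea behind such a system, the sketch below shows the simplest form of known-hash matching: fingerprint each image and check it against a database of fingerprints supplied by a clearinghouse. This is a minimal, hypothetical illustration only; Apple's shelved design relied on an on-device perceptual hash (NeuralHash), blinded databases, and a match threshold, none of which are modeled here, and all file paths and function names in the code are assumptions.

```python
# Illustrative sketch only: generic known-hash matching, NOT Apple's actual
# NeuralHash / private-set-intersection design. All names are hypothetical.
import hashlib
from pathlib import Path

# Hypothetical database of fingerprints of known material supplied by a
# clearinghouse such as NCMEC (plain SHA-256 digests here for simplicity;
# real systems use perceptual hashes that survive resizing and re-encoding).
KNOWN_HASHES: set[str] = set()


def fingerprint(path: Path) -> str:
    """Compute a fingerprint for an image file."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def scan_library(library_dir: Path) -> list[Path]:
    """Return files whose fingerprints appear in the known-hash database."""
    return [p for p in library_dir.rglob("*.jpg") if fingerprint(p) in KNOWN_HASHES]


if __name__ == "__main__":
    matches = scan_library(Path("./photo_library"))
    print(f"{len(matches)} file(s) matched the known-hash list")
```

In Apple's withdrawn proposal, matching was to happen on the device against a blinded version of the hash database, with results revealed to Apple only after a threshold number of matches; critics argued that the same mechanism could be repurposed for broader surveillance, which is why the plan was dropped.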
The lawsuit was filed by a 27-year-old woman who alleges that a relative abused her as an infant and shared images of the abuse online. Filed on Saturday in Northern California, it seeks more than $1.2 billion in damages on behalf of a potential group of 2,680 victims who may be eligible for compensation.
Attorney James Marsh, who is associated with the case, disclosed these figures. The action follows a similar lawsuit filed in August by a nine-year-old girl and her guardian, which accused Apple of failing to address CSAM on iCloud. In response to the current suit, an Apple spokesperson told The Times that the company is "urgently and actively innovating to combat these crimes without compromising the security and privacy of users."
Apple spokesperson Fred Sainz also highlighted features such as Communication Safety, which warns children about explicit content, as evidence of the company's commitment to developing protections against CSAM.
The legal challenge also comes amid accusations from the UK's National Society for the Prevention of Cruelty to Children (NSPCC) that Apple has been underreporting CSAM on its platforms.