Apple sued for allegedly allowing child sexual abuse material to thrive on iCloud

Apple is facing a lawsuit over its decision not to implement a system that would have scanned iCloud photos for child sexual abuse material (CSAM). As The New York Times reports, the lawsuit alleges that Apple’s failure to act forces victims to relive their trauma.

Although the company had previously announced designs aimed at safeguarding children, the suit contends that it failed to implement them or to take other measures to detect and limit CSAM.

In 2021, Apple proposed a system that would use digital signatures from the National Center for Missing and Exploited Children and other organizations to detect known CSAM in users’ iCloud libraries. The plan was ultimately scrapped after security and privacy advocates warned that it could serve as a backdoor for government surveillance.
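
For context on how such signature matching works in principle, here is a minimal sketch in Python. It is a simplified, hypothetical illustration, not Apple’s actual design: the KNOWN_SIGNATURES set is a placeholder, and exact SHA-256 digests stand in for the real mechanism. Apple’s proposal instead relied on a perceptual hash, NeuralHash, combined with cryptographic private set intersection so that matching could occur partly on-device without revealing non-matching photos.

```python
import hashlib
from pathlib import Path

# Hypothetical database of known-CSAM signatures, of the kind NCMEC
# distributes to service providers. Placeholder only; real entries
# would be perceptual hashes, not plain SHA-256 digests.
KNOWN_SIGNATURES: set[str] = set()


def file_signature(path: Path) -> str:
    """Return a SHA-256 digest of the file's bytes."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def scan_library(photo_dir: Path) -> list[Path]:
    """Flag photos whose signature appears in the known-signature set."""
    return [
        photo
        for photo in photo_dir.glob("*.jpg")
        if file_signature(photo) in KNOWN_SIGNATURES
    ]
```

An exact digest matches only byte-identical files, which is why production systems use perceptual hashes that tolerate resizing and re-encoding; the sketch above conveys the matching idea, not the robustness or the privacy machinery of Apple’s proposal.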

The lawsuit was brought by a 27-year-old woman who alleges that a relative abused her as an infant and circulated images of the abuse online. Filed on Saturday in Northern California, it seeks more than $1.2 billion in damages on behalf of a potential group of 2,680 victims.

Attorney James Marsh, who is involved in the case, provided those figures. The action follows a similar lawsuit filed in August by a nine-year-old girl and her guardian, which accused Apple of failing to address CSAM on iCloud. Responding to the current suit, an Apple spokesperson told The Times that the company is “urgently and actively innovating to combat these crimes without compromising the security and privacy of users.”

Additionally, Apple representative Fred Sainz highlighted features such as Communication Safety, which alerts children to explicit content, emphasizing the company’s dedication to developing protections against CSAM.

This legal challenge arises in the context of accusations from the UK’s National Society for the Prevention of Cruelty to Children (NSPCC), which claimed that Apple has been underreporting CSAM on its platforms.
