West Virginia sues Apple for alleged child sexual abuse material failures
February 19, 2026 | therecord.media

West Virginia sued Apple Thursday for allegedly allowing iCloud to be used to disseminate and store child sexual abuse content.

The lawsuit is being brought under the state’s consumer protection law and is the first of its type to be filed by a government agency against Apple, according to a press release from the office of West Virginia Attorney General JB McCuskey.

Citing a February 2020 message from Apple’s head of fraud saying that the company is the "greatest platform for distributing child porn," the lawsuit alleged that despite its awareness of the problem, the company did nothing to stop it.

Apple announced plans to begin using child sexual abuse material (CSAM) detection tools in 2021, the lawsuit said, but abandoned the effort after a backlash from privacy advocates.

At the time, the Electronic Frontier Foundation and other digital freedoms organizations decried the move, the lawsuit said.

“Initially, Apple defended its CSAM-detecting methods ‘as designed with user privacy in mind,’” the lawsuit says. “But even with this caveat, Apple was not prepared for the backlash that followed, not from the general public but from a vocal minority of purported privacy advocates.”

The lawsuit also pointed to Apple’s habit of advertising itself as a company committed to privacy, citing multiple ad campaigns, including a billboard it sponsored in New York in 2019.

“Your iPhone knows a lot about you,” the billboard said. “But we don’t.”

Apple’s website states that privacy is a “fundamental human right.” “It’s also one of our core values. Which is why we design our products and services to protect it,” it says.

The company’s decision to abandon its plan to implement CSAM detection tools has led to an explosion of such content and caused the National Center for Missing and Exploited Children (NCMEC) to call Apple’s record on child protection “rotten,” the lawsuit alleged.

After Apple chose not to pursue CSAM detection, the child protection organization called the move “one of the greatest tragedies for survivors of child sex abuse, for families who have lost children due to that trauma, and for overall efforts to end this crime.”

An Apple spokesperson did not immediately respond to a request for comment.

iCloud is designed to “make image- and video-based content easier to locate, view, share, and retain across devices and applications,” the suit said. 

“For users who traffic in CSAM, such functionality reduces friction associated with maintaining large collections of illicit material, enables repeated access and redistribution without manual file handling, and allows such material to remain available and organized over long periods of time — thereby contributing to the ongoing circulation and safeguarding of CSAM within Apple’s ecosystem,” the lawsuit said.

Federal law requires U.S.-based tech firms to report CSAM they find to NCMEC. In 2023, Apple made 267 reports, compared to the 1.47 million filed by Google and 30.6 million filed by Meta, according to the lawsuit.

Because Apple maintains end-to-end control over its hardware, software, and cloud infrastructure, it cannot say it is ignorant of CSAM distribution, according to the lawsuit.

“Apple's failure to deploy available detection technology is not a passive oversight — it is a choice,” the press release said. “Apple designed, built, and profited from the very system it allowed to be weaponized against children.”

The state is seeking statutory and punitive damages as well as a court order directing Apple to implement strong CSAM detection tools.


Suzanne Smalley

is a reporter covering digital privacy, surveillance technologies and cybersecurity policy for The Record. She was previously a cybersecurity reporter at CyberScoop. Earlier in her career Suzanne covered the Boston Police Department for the Boston Globe and two presidential campaign cycles for Newsweek. She lives in Washington with her husband and three children.


Source: https://therecord.media/apple-csam-west-virginia-lawsuit