{"id":677504,"date":"2024-12-09T08:44:18","date_gmt":"2024-12-09T03:14:18","guid":{"rendered":"https:\/\/www.digit.in\/?p=677504"},"modified":"2024-12-09T08:44:31","modified_gmt":"2024-12-09T03:14:31","slug":"apple-sued-for-not-implementing-csam-detection-in-icloud","status":"publish","type":"post","link":"https:\/\/www.digit.in\/news\/general\/apple-sued-for-not-implementing-csam-detection-in-icloud.html","title":{"rendered":"Apple sued for not implementing CSAM detection in iCloud\u00a0"},"content":{"rendered":"\n<p>Apple is facing a lawsuit for its decision not to introduce a system that would have scanned iCloud photos for child sexual abuse material (CSAM). The lawsuit claims that by not taking stronger action to stop the spread of CSAM, Apple is forcing victims to relive their traumatic experiences.<\/p>\n\n\n\n<p>The lawsuit accuses Apple of making a public promise with \u201ca widely touted improved design aimed at protecting children,\u201d but failing to take action by implementing &#8220;those designs or take any measures to detect and limit\u201d this harmful content.<\/p>\n\n\n\n<p>Also read: <a href=\"https:\/\/www.digit.in\/news\/general\/apple-accused-of-monitoring-employees-devices-and-stifling-free-speech.html\" target=\"_blank\" rel=\"noopener\" title=\"\">Apple accused of monitoring employees\u2019 devices and stifling free speech<\/a><\/p>\n\n\n\n<p>Apple initially announced in 2021 that it would create a system to scan iCloud photos using digital signatures from the National Center for Missing and Exploited Children and other organisations. This would have helped detect known CSAM in users&#8217; iCloud accounts. However, the company reportedly dropped these plans after concerns were raised by security and privacy groups. 
They warned that implementing such a system could enable government surveillance.&nbsp;&nbsp;<\/p>\n\n\n\n<p>The current lawsuit comes from a 27-year-old woman who is suing Apple under a pseudonym, reports The New York Times (via TechCrunch). She said that her relative molested her as an infant and shared abusive images of her online. Despite efforts to stop this, she continues to receive notices from law enforcement almost daily about individuals being arrested for possessing these same images.&nbsp;&nbsp;<\/p>\n\n\n\n<p>James Marsh, the attorney involved with the case, has stated that as many as 2,680 victims could potentially qualify for compensation if the lawsuit is successful.&nbsp;&nbsp;<\/p>\n\n\n\n<p>Also read: <a href=\"https:\/\/www.digit.in\/news\/general\/apples-icloud-practices-spark-378bn-legal-action-heres-why.html\" target=\"_blank\" rel=\"noopener\" title=\"\">Apple\u2019s iCloud practices spark $3.78bn legal action, here\u2019s why<\/a><\/p>\n\n\n\n<p>In a statement to The Times, a company spokesperson said Apple is \u201curgently and actively innovating to combat these crimes without compromising the security and privacy of all our users.\u201d&nbsp;&nbsp;<\/p>\n\n\n\n<p>This isn\u2019t the first legal action against Apple related to CSAM detection. In August, a 9-year-old girl and her guardian also sued the company, accusing it of failing to tackle the spread of CSAM on iCloud.&nbsp;&nbsp;<\/p>\n\n\n\n<p>The outcome of this lawsuit could have major implications for Apple and how tech companies balance privacy and child protection efforts.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Apple is facing a lawsuit over its decision not to introduce a system that would have scanned iCloud photos for child sexual abuse material (CSAM). The lawsuit claims that by not taking stronger action to stop the spread of CSAM, Apple is forcing victims to relive their traumatic experiences. 
The lawsuit accuses Apple of making [&hellip;]<\/p>\n","protected":false},"author":2342,"featured_media":649029,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_gspb_post_css":"","footnotes":""},"categories":[186989],"tags":[190792,246307,247992,247990,241557,211393,246308,247991],"contenttype":[186],"digitlang":[165350],"dealstore":[],"offerexpiration":[],"acf":[],"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/www.digit.in\/wp-json\/wp\/v2\/posts\/677504"}],"collection":[{"href":"https:\/\/www.digit.in\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.digit.in\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.digit.in\/wp-json\/wp\/v2\/users\/2342"}],"replies":[{"embeddable":true,"href":"https:\/\/www.digit.in\/wp-json\/wp\/v2\/comments?post=677504"}],"version-history":[{"count":3,"href":"https:\/\/www.digit.in\/wp-json\/wp\/v2\/posts\/677504\/revisions"}],"predecessor-version":[{"id":677507,"href":"https:\/\/www.digit.in\/wp-json\/wp\/v2\/posts\/677504\/revisions\/677507"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.digit.in\/wp-json\/wp\/v2\/media\/649029"}],"wp:attachment":[{"href":"https:\/\/www.digit.in\/wp-json\/wp\/v2\/media?parent=677504"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.digit.in\/wp-json\/wp\/v2\/categories?post=677504"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.digit.in\/wp-json\/wp\/v2\/tags?post=677504"},{"taxonomy":"contenttype","embeddable":true,"href":"https:\/\/www.digit.in\/wp-json\/wp\/v2\/contenttype?post=677504"},{"taxonomy":"digitlang","embeddable":true,"href":"https:\/\/www.digit.in\/wp-json\/wp\/v2\/digitlang?post=677504"},{"taxonomy":"dealstore","embeddable":true,"href":"https:\/\/www.digit.in\/wp-json\/wp\/v2\/dealstore?post=677504"},{"taxonomy":"offerexpiration","embeddable":true,"href":"https:\/\/www.digit.in\/wp-json\/wp\/v2\/offerexpiration?post=
677504"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}