Apple Rebuts Privacy Concerns Over CSAM Photo Scanning


Embroiled in a controversial public debate over privacy, Apple has shared an FAQ to address concerns over its decision to scan iCloud photos.


Apple outlined a set of upcoming child safety features last week, but following widespread privacy concerns and an online uproar, the company has shared a detailed FAQ that aims to answer key questions and clear up misconceptions. Apple's new child-safety measures involve scanning iCloud photos to detect Child Sexual Abuse Material (CSAM) and notifying parents when children receive explicit photos in Messages. The intentions, at least on the surface, are noble, but multiple security experts and privacy advocates have raised alarms about the long-term ramifications of these features, going as far as to question whether they invalidate Apple's privacy claims and create a backdoor.

The Electronic Frontier Foundation (EFF) called the proposals a narrowly scoped backdoor and warned that the scope of iCloud photo scanning could plausibly be widened beyond CSAM images under pressure from authorities. Experts have also questioned the implementation, since the CSAM hashes will be baked into iOS and iPadOS themselves, meaning the system could be enabled in any country beyond the United States, including places where oversight varies, laws differ, and government demands may be harder to resist.

Apple has tried to answer some of those concerns in its FAQ document. Starting with communication safety in Messages, Apple says the feature is not the same as CSAM detection and that the company cannot access what is shared over its messaging platform. More importantly, Apple says the new child safety tool in Messages won't break end-to-end encryption: it only adds a layer of intervention when explicit images are sent to a device used by a minor, and the communications, the notifications sent to parents, and the image evaluation results are never accessible to Apple. Addressing concerns that the notification feature could endanger children in abusive homes, Apple says it is making it easier for victims to find support via Siri and Search.
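Based on Apple's description, the entire evaluation happens on the child's device. The sketch below is a hypothetical Swift illustration of that flow, not Apple's actual API: every type and function name is invented, and the classifier is a stub standing in for the on-device machine learning model.

```swift
import Foundation

// Hypothetical sketch of the on-device flow Apple describes for Messages.
// Names and types are illustrative; nothing here is a real Apple API.

struct IncomingImage {
    let data: Data
}

enum Verdict {
    case safe
    case sexuallyExplicit
}

// Stand-in for an on-device ML classifier; in the described feature this
// runs entirely on the child's device.
func classifyOnDevice(_ image: IncomingImage) -> Verdict {
    // Placeholder verdict for the sketch.
    return .safe
}

func display(_ image: IncomingImage) { /* show the image normally */ }
func blurAndWarn(_ image: IncomingImage) { /* blur it and warn the child */ }
func notifyParentLocally() { /* notification stays within the family group */ }

func handleIncoming(_ image: IncomingImage,
                    recipientIsMinor: Bool,
                    parentNotificationsEnabled: Bool) {
    // Everything below runs locally; no image data or verdict is sent to Apple.
    guard recipientIsMinor,
          classifyOnDevice(image) == .sexuallyExplicit else {
        display(image)
        return
    }
    blurAndWarn(image)
    if parentNotificationsEnabled {
        notifyParentLocally()
    }
}
```

The design point the FAQ stresses is visible in the control flow: the verdict, the blurring, and the parental notification are all produced on the device, so there is no step at which message content or evaluation results reach Apple's servers.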


Apple In Damage Control Mode, But Concerns Remain

Coming to the more contentious topic of photo scanning, Apple says it will not scan photos stored in a phone's local storage, and that disabling iCloud Photos turns off CSAM detection on that device entirely. On the legal aspect of reporting accounts to authorities, Apple notes that possessing CSAM images is illegal in the United States, and it is therefore obligated to report such incidents to the appropriate law enforcement agencies. Addressing concerns that it could be pressured into expanding the scope of its image-scanning system, the company says it will refuse any such demand from any government.

Regarding the issue of misdetection by an automated image-scanning system, Apple says the odds of incorrectly flagging a given account are about one in a trillion per year. The company also claims it cannot alter the on-device image hashes to scan for anything beyond known CSAM content, and that it won't report an account to authorities before a human review, minimizing the chance of error. However, concerns remain. The CSAM detection technology itself serves as a template: a similar hash-based scanning system could be built by an independent or state-sponsored actor, and an overreaching government could then use it to silence dissent, or worse. Apple has promised not to budge, but in key markets, including those where its footprint is expanding rapidly, a stark choice between user privacy and profits could put the company in a difficult position.
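For readers curious how hash matching gated by a review threshold might look in code, here is a minimal, hypothetical Swift sketch. It is not Apple's implementation: the real system reportedly uses a perceptual NeuralHash, blinded hash databases, and cryptographic threshold secret sharing, while everything below, including the SHA-256 stand-in and the threshold value, is illustrative only.

```swift
import Foundation
import CryptoKit

// Illustrative stand-in for the hash database Apple says is baked into
// the OS; here it is simply an empty set.
func loadKnownHashDigests() -> Set<String> {
    return []
}

let knownHashes = loadKnownHashDigests()

// Illustrative threshold: an account only becomes reviewable after this
// many matches, which is what drives the very low claimed error rate.
let reviewThreshold = 30

// Placeholder digest. A real system would use a perceptual hash that
// survives resizing and re-encoding; SHA-256 is used here only to show
// the matching flow.
func digest(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Count how many photos queued for iCloud upload match known hashes.
func matchCount(for photos: [Data]) -> Int {
    photos.reduce(0) { total, photo in
        total + (knownHashes.contains(digest(of: photo)) ? 1 : 0)
    }
}

// Escalation requires crossing the threshold, and even then Apple says a
// human reviewer confirms the matches before any report is filed.
func shouldEscalateToHumanReview(photos: [Data]) -> Bool {
    matchCount(for: photos) >= reviewThreshold
}
```

Even in this toy version, the structure shows why the claimed error rate hinges on more than the hash function: a single false match does nothing on its own, because escalation requires many independent matches followed by a human review.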

