Apple is reportedly ditching a controversial plan to scan users’ photos stored in iCloud for child sexual abuse material, or CSAM, amid an ongoing privacy push.
These safety tools, announced in August 2021, were meant to flag illicit content while preserving user privacy. But the plan drew widespread criticism from digital rights groups, who argued that the surveillance capability was ripe for abuse.
Apple put the plans on pause a month later. Now, more than a year after its announcement, the company has no plans to move forward with the CSAM-detection tool.
The company says it is developing new features that better balance user privacy and child protection. These controls will allow parents to limit their child's contacts, restrict content and screen time, and provide an App Store carefully curated for kids.
Apple says the best way to prevent online exploitation of children is to interrupt it before it happens. The company pointed to new features it rolled out in December 2021 that enabled this process.
Communication Safety in Messages, for instance, warns users when sensitive photos are sent or received, alongside expanded guidance in Siri, Spotlight, and Safari search.
The company is working on updates to Communication Safety in Messages that would cover nudity in videos, along with other child safety protections. Apple says it is also working with child safety professionals to make reporting incidents to law enforcement more seamless.
The company announced Wednesday it will now offer full end-to-end encryption for nearly all the data its users store in its global cloud-based storage system, making it more difficult for hackers, spies and law enforcement agencies to access sensitive user information.