Last week, Apple previewed a number of updates meant to beef up child safety features on its devices. Among them: a new technology that can scan the photos on users’ devices to detect child sexual abuse material (CSAM). Though the change was praised by some lawmakers and child safety advocates, it prompted immediate pushback from many security and privacy experts, who say the update amounts to Apple walking back its commitment to putting user privacy above all else.

Apple has disputed that characterization, saying that its approach balances privacy with the need to do more to protect children by preventing some of the most abhorrent content from spreading more widely.

What did Apple announce?

Apple announced three separate updates, all of which fall under the umbrella of “child safety.” The most significant — and the one that’s gotten the bulk of the attention — is a feature that will scan iCloud Photos for known CSAM. The feature compares a user’s photos against a database of previously identified material; if a certain number of matches are detected, a review process is triggered. If the images are verified by human reviewers, Apple will suspend that iCloud account and report it to the National Center for Missing and Exploited Children (NCMEC).

Apple also previewed new “communication safety” features for the Messages app. That update enables the Messages app to detect when sexually explicit photos are sent or... Continue reading at 'Engadget'
[ Engadget | 2021-08-12 18:30:29 UTC ]
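At a high level, the mechanism the article describes is a threshold scheme: each photo is reduced to a hash and compared against a database of hashes of previously identified material, and only when the number of matches crosses a threshold is the account flagged for human review. Below is a minimal sketch of that flow in Python. The hash function, database contents, and threshold value here are all hypothetical stand-ins; Apple's actual system uses a proprietary perceptual hash and on-device cryptographic matching techniques that this sketch does not model.

```python
import hashlib
from typing import Iterable

# Hypothetical stand-in for a perceptual image hash. Apple's real system
# does NOT use a cryptographic digest like this; a perceptual hash is
# designed to match visually similar images, not exact bytes.
def image_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(photos: Iterable[bytes], known_hashes: set[str]) -> int:
    """Count how many photos match the database of known hashes."""
    return sum(1 for photo in photos if image_hash(photo) in known_hashes)

def should_trigger_review(photos: Iterable[bytes],
                          known_hashes: set[str],
                          threshold: int = 10) -> bool:
    # The threshold value here is invented for illustration. Only when
    # the number of matches reaches it is the account flagged for human
    # review; below it, nothing is reported.
    return count_matches(photos, known_hashes) >= threshold
```

The design point the article hints at is the threshold itself: a single match is not enough to trigger review, which is meant to limit the impact of an occasional false-positive hash collision before any human ever looks at an account.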
Tech publisher Wired, which found that more than 20 percent of its daily readers use ad-blocking software, thinks it's found a way to recoup some of that lost advertising revenue: start charging users for blocking ads. This morning, Wired began telling readers who use ad blockers it will... Continue reading at AdWeek
[ AdWeek | 2016-02-09 00:00:00 UTC ]