Here's the gist: the new child safety features have faced immense pushback from privacy advocates and geopolitical activists, and Apple has largely dealt with it by saying the system is being misunderstood. It's a complicated, confusing, and contentious system, and I've already made a 43-minute video on exactly how it all works and a 20-minute video on how I think Apple could make it work better, or at least a whole lot less contentiously.

Starting this fall in the U.S., Communication Safety will blur explicit images in the Messages app for children, and can alert parents for children under 13. Separately, CSAM detection will hash-match photos against known child exploitation material using a hybrid client-server process on upload to iCloud Photo Library.
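To make the hash-matching idea concrete, here's a toy Python sketch of the client-side step. This is an illustrative simplification, not Apple's actual protocol: the real system uses a perceptual "NeuralHash", a blinded on-device database, and threshold secret sharing so the server learns nothing until an account crosses a match threshold (roughly 30, per Apple's technical summary). The hash function, database, and helper names below are all stand-ins.

```python
import hashlib

MATCH_THRESHOLD = 30  # approximate review threshold described by Apple

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash; SHA-256 would NOT survive
    # resizing or re-encoding the way a perceptual hash must.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical placeholder database of hashes of known material.
known_hashes = {image_hash(b"known example image")}

def on_upload(image_bytes: bytes, match_count: int) -> tuple[int, bool]:
    """Toy client-side check, run only when a photo is uploaded.

    Returns the updated match count and whether the account has
    crossed the review threshold.
    """
    if image_hash(image_bytes) in known_hashes:
        match_count += 1
    return match_count, match_count >= MATCH_THRESHOLD

count = 0
count, flagged = on_upload(b"an ordinary photo", count)
print(count, flagged)   # 0 False
count, flagged = on_upload(b"known example image", count)
print(count, flagged)   # 1 False (a single match stays below the threshold)
```

The threshold is the key design choice here: matching happens per photo on the client, but nothing is actionable until enough matches accumulate, which is why Apple describes it as a hybrid client-server process rather than pure on-device scanning.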