Scanning iCloud Photos for Child Sexual Abuse ➝

A great collection of thoughts on Apple’s recent CSAM-related announcements, put together by Michael Tsai. The common theme I’ve seen among the reactions overall is that the privacy-conscious are concerned about what this could become, while Apple and some others are defending what it is currently.

If the system stays as it is today, I don’t think many people would protest — child sexual abuse material is objectively wrong. But there are very real concerns about how the system could be abused: by authoritarian governments, by nefarious actors within Apple, or by groups or individuals targeting others by tricking them into adding an image to their iCloud Photo Library.

I don’t really know what could be done about any of those things beyond simply taking Apple’s word for it. But as time goes on, I’m finding it more and more difficult to trust that Apple is always going to make the right decisions.

➝ Source: mjtsai.com
