Apple Tackles Child Sex Abuse Imagery: Slippery Slope or Necessary Intervention?
Apple recently announced a new set of features aimed at combating Child Sexual Abuse Material (CSAM), including the ability to scan photos on a user's phone and messages in iMessage. Since the announcement, the company has repeatedly clarified the safeguards that are in place, but privacy advocates warn of the potential for abuse and "mission creep."
The exchange of CSAM imagery through Electronic Service Providers (ESPs) such as Facebook, Instagram, Dropbox, and Google Drive has reached epidemic proportions, and a New York Times report on the issue illustrated how confounding and disturbing the problem has become.
In this episode of Vision Slightly Blurred, Sarah and Allen discuss the implications for photographers and react to the various arguments made by privacy advocates as well as abuse experts.