Apple plans to take a step against child sexual abuse by scanning US iPhones for known abuse imagery. Child protection groups have applauded the move, but, according to ABC News, it is raising concern among some security researchers that the system could be misused, including by governments looking to surveil their citizens.
The tool designed to detect known images of child sexual abuse is called “neuralMatch,” and it will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children will be notified. Apple confirmed in a blog post that the new scanning technology will “evolve and expand” over time.
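To make the matching step concrete, here is a minimal sketch of how on-device matching against a database of known image fingerprints can work. This is a generic perceptual-hash illustration, not Apple's actual NeuralHash algorithm; the hash values, file name, and threshold-free exact match are all assumptions made for the example.

```python
from PIL import Image

KNOWN_HASHES = {
    # Hypothetical 64-bit perceptual hashes standing in for the database
    # of known abuse imagery; the real values come from NCMEC, not from
    # anything published in this article.
    0x9F3B6C21A4D0E157,
}

def average_hash(path: str) -> int:
    """Compute a simple 64-bit average hash of an image.

    This is a generic perceptual hash used only to illustrate the
    matching step; Apple's system uses its own NeuralHash scheme.
    """
    img = Image.open(path).convert("L").resize((8, 8))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel >= avg else 0)
    return bits

def should_flag_for_review(path: str) -> bool:
    """Return True if the image's hash matches a known hash.

    In the system the article describes, a match only queues the image
    for human review; it does not automatically report the user.
    """
    return average_hash(path) in KNOWN_HASHES

if __name__ == "__main__":
    # Hypothetical file name for illustration.
    print(should_flag_for_review("photo_to_upload.jpg"))
```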
Matthew Green, a top cryptography researcher at Johns Hopkins University, warned that the system could be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child pornography. That could fool Apple’s algorithm and alert law enforcement. “Researchers have been able to do this pretty easily,” he said of the ability to trick such systems.
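A toy illustration of why Green's concern is plausible: perceptual-hash systems usually accept near-matches so that resizing or recompression does not defeat detection, and that same tolerance is what an attacker would target when crafting a benign-looking image whose hash lands close to a known one. The threshold value below is made up for the sketch and does not reflect any published parameter of Apple's system.

```python
def hamming_distance(h1: int, h2: int) -> int:
    """Number of differing bits between two 64-bit hashes."""
    return bin(h1 ^ h2).count("1")

def is_match(h1: int, h2: int, threshold: int = 4) -> bool:
    # Tolerating a few flipped bits keeps detection robust to ordinary
    # edits, but it also widens the target an adversary must hit to
    # trigger a false flag on an innocuous-looking image.
    return hamming_distance(h1, h2) <= threshold
```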
The question is, do parents need to be worried if they take innocent photos of their naked child bathing or something similar? What if an innocent naked or half-naked photo of a child gets flagged? These are questions Apple will need to address. The system seems well intentioned, but it does raise privacy concerns.