Apple plans to install special software on iPhones in the US to look for evidence of child abuse, the Financial Times reports, as cited by Reuters. Security researchers warn that the move raises serious concerns about surveillance of people's personal devices.
The company presented the proposed system, known as "neuralMatch", to academics in the US earlier this week, according to two security experts who were briefed on the virtual meeting. The plans could be made public more widely in the coming days, they said.
The automated system would proactively alert a team of human reviewers whenever it believes it has detected illegal, pornographic images. If the material is confirmed, the reviewers would then notify the police for a thorough investigation.
Initially, the program would be launched only in the US.
Apple declined to comment when contacted by the Financial Times.
How does the system work?
Apple's neuralMatch algorithm would continuously scan photos stored on US users' iPhones that have also been uploaded to its iCloud backup system.
The images, converted into a string through a process known as "hashing", would be compared against a database of known child sexual abuse images. The system was trained on approximately 200,000 such images collected by the US non-profit National Center for Missing and Exploited Children.
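As a rough illustration of the matching step described above, the sketch below reduces each photo to a fixed-length hash string and checks it against a set of known hashes. It is only a simplified approximation: the function names and the example hash set are hypothetical, and a plain SHA-256 digest stands in for the perceptual hash such a system would reportedly use, which also matches visually similar copies rather than only byte-identical files.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for the database of hashes derived from known
# abuse images (the article attributes the real database to the
# National Center for Missing and Exploited Children).
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_hash(path: Path) -> str:
    """Reduce an image file to a fixed-length string.

    A simple SHA-256 digest is used here for illustration only; a perceptual
    hash would tolerate resizing, recompression and similar edits.
    """
    return hashlib.sha256(path.read_bytes()).hexdigest()

def is_flagged(path: Path) -> bool:
    """Return True if the photo's hash appears in the known-image database."""
    return image_hash(path) in KNOWN_HASHES
```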
According to the Financial Times' sources, every photo uploaded to iCloud in the US would receive a "security voucher" certifying whether or not its content is suspect.
Once a certain number of photos are flagged as suspect, Apple would allow all of the suspect images to be decrypted. If the suspicions are then confirmed, the competent authorities would be alerted.
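The threshold step can be pictured as a simple counter over those per-photo vouchers: nothing is escalated until enough of them are marked suspect. The sketch below is a minimal illustration under that assumption; the function name and the threshold value are hypothetical, since the real figure has not been disclosed.

```python
# Illustrative threshold only; the actual value used by such a system is not public.
FLAG_THRESHOLD = 10

def should_escalate(vouchers: list[bool]) -> bool:
    """Return True once enough per-photo vouchers are marked suspect,
    i.e. the point at which the article says the suspect images would be
    decrypted and reviewed."""
    flagged = sum(1 for suspicious in vouchers if suspicious)
    return flagged >= FLAG_THRESHOLD

# Example: an account with three suspect vouchers stays below the threshold.
print(should_escalate([True, False, True, True, False]))  # False
```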