Apple on Thursday announced new features that will check photos on iPhones and iPads in the United States before they are uploaded to its iCloud storage service, to ensure the uploads do not match known images of child sexual abuse.
"At Apple, our goal is to create technology that empowers people and enriches their lives — while helping them stay safe. We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM)," Apple said in a statement.
"This program is ambitious, and protecting children is an important responsibility," it said. "Our efforts will evolve and expand over time."
How does it work?
New technology will allow the software powering Apple mobile devices to match photos on a user's phone against a database of digital fingerprints ("hashes") of known CSAM images provided by child safety organizations, then flag the images as they are uploaded to Apple's online iCloud storage, according to the company.
Apple said its messaging app will use on-device machine learning to warn about sensitive content without making private communications readable by the company. The matching tool, which has been referred to as "neuralMatch," will detect known images of child sexual abuse without decrypting people's messages. If it finds a match, the image will be reviewed by a human, who can notify law enforcement if necessary.
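The matching step described above can be illustrated with a short sketch. This is not Apple's implementation: the real system uses a perceptual hash (so that resized or lightly edited copies of an image still match), while the sketch below uses a plain SHA-256 digest purely to stay self-contained; the blocklist contents, threshold, and function names are all hypothetical.

```python
import hashlib

# Hypothetical blocklist of fingerprints of known CSAM images, standing in
# for the database supplied by child safety organizations. A real system
# would use perceptual hashes, not SHA-256, so near-duplicates still match.
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"example-known-bad-image-bytes").hexdigest(),
}

THRESHOLD = 1  # hypothetical: flag for review only after this many matches


def scan_before_upload(image_blobs):
    """Count images whose fingerprint appears on the blocklist."""
    matches = 0
    for blob in image_blobs:
        fingerprint = hashlib.sha256(blob).hexdigest()
        if fingerprint in KNOWN_BAD_HASHES:
            matches += 1
    return matches


uploads = [b"ordinary-photo", b"example-known-bad-image-bytes"]
if scan_before_upload(uploads) >= THRESHOLD:
    # Per Apple's description, a matched image goes to a human reviewer;
    # the algorithm alone does not trigger a report to law enforcement.
    print("flag for human review")
```

The key point the sketch captures is that the comparison is against fingerprints of specific known images, not a general scan for any explicit content.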
Communication safety in Messages:
The new image-monitoring feature is part of a series of tools heading to Apple mobile devices, according to the company.
Apple's texting app, Messages, will use machine learning to recognize and warn children and their parents when receiving or sending sexually explicit photos, the company said in the statement.
“When receiving this type of content, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo. As an additional precaution, the child can also be told that, to make sure they are safe, their parents will get a message if they do view it. Similar protections are available if a child attempts to send sexually explicit photos. The child will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it," Apple said.
Messages will use on-device machine learning to analyze attached images and determine whether they are sexually explicit, according to Apple.
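The decision flow Apple describes for children's accounts can be sketched as follows. The classifier here is a stub, and every name, score, and threshold is a hypothetical stand-in; Apple's actual model and policies are not public. What the sketch shows is that every step runs on the device, so the content never becomes readable by Apple.

```python
def classify_sensitivity(image_bytes: bytes) -> float:
    """Stub for an on-device model returning a 0..1 'explicit' score.
    Apple's real classifier is a machine-learning model, not a keyword check."""
    return 0.9 if b"explicit" in image_bytes else 0.1


def handle_incoming_photo(image_bytes: bytes,
                          is_child_account: bool,
                          parental_alerts_on: bool) -> list[str]:
    """Return the on-device actions for an incoming Messages photo."""
    actions = []
    if is_child_account and classify_sensitivity(image_bytes) > 0.5:
        actions.append("blur photo")
        actions.append("warn child and offer resources")
        if parental_alerts_on:
            actions.append("notify parents if child chooses to view")
    # Nothing is sent to Apple's servers in this flow.
    return actions


print(handle_incoming_photo(b"explicit-content", True, True))
```

An adult account, or a photo scoring below the threshold, falls through with no actions taken, which matches Apple's statement that the warnings apply to children's accounts.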
What do researchers have to say?
But researchers say the tool could be put to other purposes such as government surveillance of dissidents or protesters.
Matthew Green of Johns Hopkins University, a top cryptography researcher, warned that it could be used to frame innocent people by sending them harmless but maliciously crafted images designed to register as matches for child sexual abuse material, fooling Apple's algorithm and alerting law enforcement.
"This is a thing that you can do," Green told the Associated Press.
"Researchers have been able to do this pretty easily." Tech companies including Microsoft, Google, Facebook and others have for years been sharing "hash lists" of known images of child sexual abuse. Apple has also been scanning user files stored in its iCloud service, which is not as securely encrypted as its messages, for such images.
The company has been under pressure from governments and law enforcement to allow for surveillance of encrypted data. Coming up with the new measures required Apple to strike a delicate balance between cracking down on the exploitation of children and keeping its high-profile commitment to protecting the privacy of its users.
Apple believes it pulled off that feat with technology that it developed in consultation with several prominent cryptographers, including Stanford University professor Dan Boneh, a leading researcher in applied cryptography.
Apple was one of the first major companies to embrace "end-to-end" encryption, in which messages are scrambled so that only their senders and recipients can read them. Law enforcement, however, has long pressed for access to that information in order to investigate crimes such as terrorism or child sexual exploitation.
"Apple's expanded protection for children is a game changer," John Clark, the president and CEO of the National Center for Missing & Exploited Children, said in a statement. "With so many people using Apple products, these new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material." Julia Cordua, the CEO of Thorn, said that Apple's technology balances "the need for privacy with digital safety for children." Thorn, a nonprofit founded by Demi Moore and Ashton Kutcher, uses technology to help protect children from sexual abuse by identifying victims and working with tech platforms.
(With inputs from Associated Press)