Protochrome

On Recent Developments at Apple

Fairly recently, Apple announced, then backed down on, the introduction of features into iCloud and iOS meant to discover photos of child abuse so that their creators and distributors could be brought to justice. You can read more about this from The Verge and Wired.


The most likely implementation of this system would scan images on devices and in iCloud, checking whether any image's hash matches the hash of an image known to depict child abuse (based on a set of records maintained by a separate third party), and would notify the relevant authority once certain conditions were satisfied.
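As a minimal sketch of that hash-set matching idea (not Apple's actual pipeline; the hash set, threshold, and reporting step below are all hypothetical, and an ordinary cryptographic hash stands in for the perceptual hash a real system would use):

```python
import hashlib
from pathlib import Path

# Hypothetical set of hex digests supplied by a third party; a real
# deployment would hold perceptual hashes in a database, not SHA-256.
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

MATCH_THRESHOLD = 3  # hypothetical: only report after several matches


def hash_image(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's raw bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def scan_library(paths: list[Path]) -> list[Path]:
    """Return the files whose hashes appear in the known set."""
    return [p for p in paths if hash_image(p) in KNOWN_HASHES]


if __name__ == "__main__":
    matches = scan_library(list(Path("photos").glob("*.jpg")))
    if len(matches) >= MATCH_THRESHOLD:
        print(f"{len(matches)} matches; a real system would notify an authority here")
```

One caveat worth noting: an exact cryptographic hash like SHA-256 only matches byte-identical files. Apple's proposal used a perceptual hash (NeuralHash) precisely so that resized or re-encoded copies would still match, which also makes the matching fuzzier and harder to audit.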


Though this would potentially be a useful tool to help curtail the spread and creation of abusive photos, the underlying system presents significant concerns. One such concern regards the set of images designated as abusive. If someone were to insert non-abusive images into this set, innocent people could start getting flagged, leading to arrests and other invasions of privacy. That alone is bad, but if many benign images relevant to a particular group were included, the result could be systematized harassment of that group under the guise of looking for abusive material. For example, groups could be targeted through images depicting certain religious iconography, images detailing information relevant to labor organization, or non-abusive pornographic images depicting acts stigmatized by some (say, homosexual or interracial pornography). Another concern is more direct censorship: the very same technology could be used to identify those who hold or spread photographic proof of police brutality or other abuses of power.


If this technology is built and deployed, bad actors may attempt to alter the list of images to their advantage, or, more straightforwardly, a representative of some state may explicitly demand that Apple run the scanning software against a particular set of images.


The implementation and deployment of such software presents a significant avenue for harassment and invasions of privacy.



So, what has this to do with Protochrome? Fortunately, very little, but some of the underlying cryptographic elements are comparable: both use hashing to identify photos as members of a set. Protochrome is, in function and by design, more respectful of privacy. Protochrome does not and cannot make any claim about the content of the photographs. Beyond this, the system is designed so that a photo enters the identified set only with the explicit intent of the photographer. Protochrome also stores no identity information and does not require the original photographer to be the one who verifies their image: if the photographer does not wish to keep their copy, they may delete it, and anyone else with an exact copy can still verify its authenticity without the photo ever being linked to its photographer.
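Protochrome's internals are not spelled out in this post, so the sketch below only illustrates the verification property described above, with hypothetical register and verify functions and an in-memory registry standing in for whatever storage the real system uses:

```python
import hashlib

# Hypothetical registry of photo hashes. Only the property matters here:
# verification needs nothing beyond an exact copy of the image, and the
# registry holds no identity information about the photographer.
_registry: set[str] = set()


def register(image_bytes: bytes) -> str:
    """Called only at the photographer's explicit request; stores a hash
    of the photo, never the photo itself or any identity information."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    _registry.add(digest)
    return digest


def verify(image_bytes: bytes) -> bool:
    """Anyone holding an exact copy can check its authenticity; the check
    reveals nothing about who registered the photo."""
    return hashlib.sha256(image_bytes).hexdigest() in _registry


if __name__ == "__main__":
    photo = b"...raw image bytes..."
    register(photo)
    assert verify(photo)             # any exact copy verifies
    assert not verify(photo + b"x")  # any alteration fails
```

Because only the hash is stored, the registry can attest that a photo was registered without being able to reconstruct the image, identify the photographer, or say anything about the photo's content.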


Even though Protochrome appears to be in the clear for now, the incident at Apple underscores the need for continued vigilance in protecting privacy rights. For this reason, Protochrome will be held under tighter lock and key, so to speak: it will not be open-sourced, and if any abuses of the system are found, it will most likely be shut down and disestablished. Though Protochrome represents an interesting and potentially important piece of technology, it need not continue to exist if it means harming people or infringing on their rights.


Elijah Cohen (eli173 at protonmail.com)