In August 2021, [[Apple Inc]] announced a [[Child Sexual Abuse Material]] (CSAM) detection system which it calls [[NeuralHash]]. The company's technical summary describes the approach as follows: "Instead of scanning images [on corporate] [[iCloud]] [servers], the system performs on-device matching using a database of known CSAM image hashes provided by [the [[National Center for Missing and Exploited Children]]] (NCMEC) and other child-safety organizations. Apple further transforms this database into an unreadable set of hashes, which is securely stored on users’ devices."<ref name="apcsam">{{cite news |title=CSAM Detection - Technical Summary |url=https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf |publisher=Apple Inc |date=August 2021}}</ref>
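The on-device matching step can be illustrated with a short sketch. The example below uses the open-source Python <code>imagehash</code> library's pHash as a generic stand-in for Apple's proprietary NeuralHash; the hash values, file path, and distance threshold are illustrative placeholders, not values from Apple's system.
<syntaxhighlight lang="python">
# Sketch of on-device matching against a set of known perceptual hashes.
# pHash (from the open-source "imagehash" library) stands in for NeuralHash;
# the hash values, file path and threshold below are placeholders.
from PIL import Image
import imagehash

# Hashes of known images, precomputed elsewhere and distributed to the device.
known_hashes = {
    imagehash.hex_to_hash("d1d1b1a5e5c4a4b0"),
    imagehash.hex_to_hash("8f373714acfcf4d0"),
}

def matches_known_image(path, max_distance=4):
    """Return True if the image's hash is within max_distance
    (Hamming distance) of any hash in the known set."""
    h = imagehash.phash(Image.open(path))
    return any(h - known <= max_distance for known in known_hashes)

print(matches_known_image("photo.jpg"))
</syntaxhighlight>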
In an essay entitled "The Problem With Perceptual Hashes", Oliver Kuederle describes a collision produced by commercial [[neural net]]-based perceptual hashing software of the same general type as NeuralHash: a photographic portrait of a woman (Adobe Stock #221271979) is reduced by the algorithm to the same hash as a photograph of a piece of abstract art (from the "deposit photos" database). Both sample images come from commercial stock-photo databases. Kuederle is concerned by such collisions: "These cases will be manually reviewed. That is, according to Apple, an Apple employee will then look at your (flagged) pictures... Perceptual hashes are messy. When such algorithms are used to detect criminal activities, especially at Apple scale, many innocent people can potentially face serious problems... Needless to say, I’m quite worried about this."<ref name="rafok">{{cite news |last1=Kuederle |first1=Oliver |title=The Problem With Perceptual Hashes |url=https://rentafounder.com/the-problem-with-perceptual-hashes/ |access-date=23 May 2022 |publisher=rentafounder.com}}</ref>
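The kind of collision Kuederle describes can occur with any perceptual hash: two visually unrelated images are treated as a match whenever their hashes fall within the matching threshold. The sketch below again uses pHash as a generic example; the file names are placeholders, not the specific stock photographs from the essay.
<syntaxhighlight lang="python">
# Two unrelated images count as a "match" whenever the Hamming distance
# between their perceptual hashes is at or below the matching threshold.
# File names are placeholders, not the stock photos from Kuederle's essay.
from PIL import Image
import imagehash

portrait = imagehash.phash(Image.open("portrait.jpg"))
abstract = imagehash.phash(Image.open("abstract_art.jpg"))

distance = portrait - abstract  # Hamming distance between the 64-bit hashes
print(f"Hamming distance: {distance}")
if distance == 0:
    print("Collision: two different images reduce to the identical hash.")
</syntaxhighlight>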
==Gallery==