Apple's child abuse detection software may be vulnerable to attack

Technology 20 August 2021

By Matthew Sparkes


Apple has plans to detect images of child sexual abuse on some of its devices

Yuichiro Chino/Getty Images

Apple’s soon-to-be-launched algorithm to detect images of child sexual abuse on iPhones and iPads may incorrectly flag people as being in possession of illegal images, warn researchers.

NeuralHash will be launched in the US with an update to iOS and iPadOS later this year. The tool will compare a hash – a unique string of characters created by an algorithm – of every image uploaded to the cloud with a database of hashes for known images …
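The article only outlines the comparison step, but the general idea behind this kind of hash matching can be illustrated with a simple perceptual hash. The sketch below is not Apple's NeuralHash: the average-hash function, the example entry in `known_hashes` and the distance threshold are all hypothetical stand-ins, included only to show how an image fingerprint might be checked against a database of hashes.

```python
from PIL import Image  # Pillow, assumed installed

def average_hash(path, hash_size=8):
    # Downscale to hash_size x hash_size, convert to greyscale, then
    # threshold each pixel against the mean to build a 64-bit fingerprint.
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = "".join("1" if p > mean else "0" for p in pixels)
    return int(bits, 2)

def hamming_distance(a, b):
    # Number of differing bits between two hashes.
    return bin(a ^ b).count("1")

# Hypothetical database of hashes of known images (placeholder value).
known_hashes = {0x8F373714ACFCF4D0}

def matches_known_image(path, threshold=5):
    # Flag the image if its hash is within the threshold of any known hash.
    h = average_hash(path)
    return any(hamming_distance(h, k) <= threshold for k in known_hashes)
```

Because perceptual hashes are designed to tolerate small changes such as resizing or recompression, two visually different images can in principle land within the match threshold of one another, which is the kind of incorrect flagging the researchers warn about.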
