Over a year ago, Apple announced plans to scan for child sexual abuse material (CSAM) with the iOS 15.2 release. The technology is inevitable despite its imperfections and the silence surrounding it. And then, the ...
AI-generated content is more explicit, extreme and complex than other types of child pornography that have been seen in the ...
The European Union’s home affairs commissioner, Ylva Johansson, has confirmed the Commission is investigating whether it broke recently updated digital governance rules when her department ran ...
Cloudflare’s legal chief on CSAM frameworks, trusted reporters, and the limits of enforcement models
Can the CSAM enforcement model work beyond its original scope? Cloudflare’s legal chief explains why it’s not that simple.
On Friday, Sens. Marsha Blackburn (R-Tenn.) and Richard Blumenthal (D-Conn.) sent co-written letters to Amazon, Google, Integral Ad Science, DoubleVerify, the MRC and TAG notifying the companies that ...
Earlier this year, Apple announced a new system designed to catch potential CSAM (Child Sexual Abuse Material) by scanning iPhone users’ photos. After an immediate uproar, Apple delayed the system until ...
Apple is being sued by victims of child sexual abuse over its failure to follow through with plans to scan iCloud for child sexual abuse materials (CSAM), The New York Times reports. In 2021, Apple ...
The European Commission has again been urged to more fully disclose its dealings with private technology companies and other stakeholders, in relation to a controversial piece of tech policy that ...