Google searches Gmail and Drive for cartoons depicting child sexual abuse

One of the issues tech companies face is the sharing of child sexual abuse material (CSAM) on their platforms. It is an ever-evolving problem, with user privacy and the protection of minors in constant tension.

A recent search warrant revealed that Google also scans for cartoons depicting CSAM, in addition to previously known CSAM imagery.

The search warrant in question dates from late 2020 and was filed in Kansas. We are not naming the subject of the warrant because the state never filed charges.

Google’s systems found “digital art or cartoons of children engaging in sexually explicit behavior or engaging in sex” in the artist’s Google Drive account. This was likely an automated analysis: the scanning system computes “hashes,” compact digital fingerprints of a file, and compares them against hashes created from already known CSAM.

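To illustrate the general idea of hash matching, rather than Google’s actual pipeline, here is a minimal Python sketch: it computes a cryptographic hash of a file and checks it against a set of hashes derived from known material. In practice, providers rely on perceptual hashes (such as Microsoft’s PhotoDNA or Google’s CSAI Match) that survive resizing and re-encoding, whereas an exact hash like SHA-256 only matches byte-identical copies. The KNOWN_HASHES values below are placeholders.

```python
import hashlib
from pathlib import Path

# Placeholder set standing in for hashes of already-known illegal images,
# such as those supplied by a clearinghouse. These values are not real.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of_file(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_match(path: Path) -> bool:
    """Return True if the file's hash appears in the known-hash set."""
    return sha256_of_file(path) in KNOWN_HASHES
```

The design trade-off is that exact hashing never produces false positives on modified images because it simply cannot match them, while perceptual hashing catches altered copies at the cost of a small false-positive rate, which is part of why human review and the questions below matter.
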
Like Apple, other big tech companies regularly scan email for CSAM. Email services are easier to tackle because they are not end-to-end encrypted by design.

The tricky part comes when these tech companies have to search encrypted services, such as iCloud Photos, for illegal content.

How do they balance user privacy with the need to protect children? How do they secure the tools so that they cannot be misused? What happens in the event of false positives?

These are all questions that may be too big for any one business to answer. One possible course of action would be for the National Center for Missing & Exploited Children (NCMEC) to form a consortium with the large technology companies, so that everyone works from the same playbook. Privacy advocates would also need a seat at the table to safeguard user interests.
