Starset Society Chinese Mirror Site

Critics Say Apple Built a ‘Backdoor’ Into Your iPhone With Its New Child Abuse Detection Tools

Apple’s plans to roll out new features aimed at combating Child Sexual Abuse Material (CSAM) on its platforms have caused no small amount of controversy.

The company is trying to pioneer a solution to a problem that, in recent years, has stymied law enforcement officials and technology companies alike: the large, ongoing crisis of CSAM proliferation on major internet platforms. As recently as 2018, tech firms reported the existence of as many as 45 million photos and videos that constituted child sex abuse material—a terrifyingly high number.

Yet while this crisis is very real, critics fear that Apple’s new features—which involve algorithmic scanning of users’ devices and messages—constitute a privacy violation and, more worryingly, could one day be repurposed to search for material other than CSAM. Such a shift could open the door to new forms of widespread surveillance and serve as a potential workaround for encrypted communications—one of privacy’s last, best hopes.

Read more at Gizmodo
Translation: 墨淞凌 @ STARSET_Mirror Translation Team
Review: CDN @ STARSET_Mirror Translation Team
