Facial Recognition Software: A Valuable Asset, or a Manifestation of Society's Biases?

In January 2020, Robert Williams, a resident of Farmington Hills, Michigan, was returning home after a seemingly normal day at work when he received a call from his wife: a police officer had phoned, asking him to turn himself in. Confused, Williams arrived at his home, only to be placed under arrest by the Detroit P.D. and taken to the Detroit Detention Center. Despite knowing he had not committed “felony larceny,” as the officers claimed, Williams contained his anger for fear of police retaliation against him as a Black man. He then spent the next 18 hours on the floor of the overcrowded detention center, despite his innocence, completely oblivious to whatever “incriminating evidence” had given the police grounds to arrest him.

Williams’ false arrest was the result of inaccurate facial recognition software, which misidentified him as the man in blurry surveillance footage of a Black man committing larceny at a watch store. After realizing their mistake, the officers let him go, nonchalantly claiming that “the computer must have gotten it wrong.” He was eventually released after spending almost 30 hours at the detention facility.

However, simply being released does not erase the fact that this costly mistake forced Williams to spend over a day in a detention center with no wrongdoing on his part. It does not erase the trauma of spending 30 hours in a filthy, overcrowded detention facility, or of being arrested in front of his own home by officers meant to protect citizens like him. It does not erase the image in the heads of Williams’ wife and children of seeing him handcuffed and guided into a cop car. It does not erase the fact that a blurry image of a Black man was enough evidence to obtain an arrest warrant; that the police believed it was enough. It does not erase the fact that this software, clearly proven to have large consequences when it errs, is still widely used in law enforcement.

Facial Recognition software is used for a variety of purposes—from the technology that unlocks your phone to the identification software used by law enforcement to identify individuals in surveillance footage. However, not only do most facial recognition systems have statistically significant inaccuracies, they also prove to be biased.

According to the Gender Shades Project, the MIT thesis of Joy Buolamwini, an evaluation of three popular facial recognition systems from Microsoft, IBM, and Face++ found that all three performed worse when identifying darker-skinned and female faces than lighter-skinned and male ones. In fact, in the IBM system, the error-rate difference between lighter-skinned men and darker-skinned women was a whopping 34.4%. Furthermore, as Buolamwini fed images of progressively darker-skinned women into these systems, the error rate steadily increased, to the point where the darkest-skinned women had identification error rates approaching 50% across all three companies tested.
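The kind of intersectional breakdown the Gender Shades study reports can be illustrated with a minimal sketch of a per-subgroup error-rate audit. The records below are entirely hypothetical placeholders, not the study's data; the point is only the shape of the computation — group classifications by skin tone and gender, then compare the best- and worst-served subgroups.

```python
# Minimal sketch of an intersectional error-rate audit, in the spirit of
# the Gender Shades methodology. All records below are hypothetical.
from collections import defaultdict

# Each record: (skin_tone, gender, was_misclassified)
results = [
    ("lighter", "male", False), ("lighter", "male", False),
    ("lighter", "female", False), ("lighter", "female", True),
    ("darker", "male", True), ("darker", "male", False),
    ("darker", "female", True), ("darker", "female", True),
]

totals = defaultdict(int)
errors = defaultdict(int)
for tone, gender, wrong in results:
    key = (tone, gender)
    totals[key] += 1
    errors[key] += wrong  # bool counts as 0 or 1

# Per-subgroup error rates
rates = {k: errors[k] / totals[k] for k in totals}
for (tone, gender), rate in sorted(rates.items()):
    print(f"{tone:7s} {gender:6s} error rate: {rate:.0%}")

# The headline metric is the gap between the best- and
# worst-served subgroups.
gap = max(rates.values()) - min(rates.values())
print(f"largest subgroup gap: {gap:.0%}")
```

On real benchmark data, the `gap` value is what corresponds to figures like the 34.4% disparity cited above for the IBM system.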

With such significant inaccuracy rates and clear racial bias, why is this software being used as evidence in such a high-stakes context? As unlucky as he seems, Robert Williams was sadly fortunate to have been held only a day; many innocent Black men in the future may not be as lucky. This gives Black people yet another reason to fear law enforcement, when not only the system but also the AI technology backing the police is working against them.

Following the Williams case, the Detroit PD’s response was not to eliminate facial recognition software, but rather to reduce its weight as evidence. But even if facial identification someday becomes neither incriminating on its own nor inaccurate, another question is raised: why are citizens’ faces being used against them by law enforcement in the first place? Is it ethical to infringe on people’s privacy and use their photos to arrest them, possibly wrongfully?

Despite the glaring negatives of facial recognition systems in law enforcement, there are still some positive uses for this innovative software. According to the NCIC Missing Person and Unidentified Person Statistics, 87,438 people were missing in the United States in 2019. Facial recognition software could help identify these people in public surveillance more efficiently and at a far larger scale.

With both benefits and very serious drawbacks, facial recognition systems might have a place in law enforcement both now and in the future, though how much weight that place should carry is still a topic to be debated. The software seems most useful where inaccuracies will not have extremely large negative consequences. What this means for the future of facial identification AI is not a clear-cut path, but it should be noted that there is, in fact, a path to the future for this software.

Sources

“I Was Wrongfully Arrested Because of Facial Recognition. Why Are Police Allowed to Use It?”, Robert Williams, Washington Post, 24 Jun. 2020

“‘The Computer Got It Wrong’: How Facial Recognition Led to False Arrest of Black Man”, Bobby Allyn, NPR (KQED), 24 Jun. 2020

“Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification”, Joy Buolamwini, MIT Media Lab, 2018

“The Growth of AI Adoption in Law Enforcement”, Kathleen Walch, Forbes, 26 Jul. 2019

Translated from: https://medium.com/swlh/facial-recognition-software-a-valuable-asset-or-a-manifestation-of-societys-biases-78b44d7a81a8
