Objective Algorithms Are a Myth

The protests across the U.S. and around the globe in the wake of the murder of George Floyd have raised awareness about structural inequalities. Though the specific focus has been on police brutality, scholars, activists, and artists are sounding the alarm on how systemic racism has been amplified in other areas like the tech industry, through communication and surveillance technology.

In Coded Bias, a documentary by Shalini Kantayya, the director follows MIT Media Lab researcher and Algorithmic Justice League founder Joy Buolamwini as she discovers one of the fundamental problems with facial recognition. While working on a facial recognition art project, Buolamwini realizes that the computer vision software has trouble tracking her face but works fine when she puts on a white mask. It is just the latest evidence of the type of bias that's baked into facial recognition and A.I. systems.

Along with Buolamwini, Kantayya interviews authors, researchers, and activists like Cathy O’Neil, Meredith Broussard, Safiya Noble, and Silkie Carlo, unraveling the problems of current technology like facial recognition or crime prediction software. These technologies often connect back to the dark historical practices of racialized surveillance, eugenics, or physiognomy.

The film, which was screened at Sundance, focuses a critical eye on the assumed “objectivity” of algorithms, which O’Neil defines as “using historical information to make a prediction about the future.” While algorithms are often understood as unbiased, objective, and amoral, they can reproduce the biases of the humans that create them. Broussard says that we imbue technology with “magical thinking,” which lauds its benefits but obscures its negative effects.
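
To make that definition concrete, here is a minimal, purely illustrative Python sketch with made-up data (a hypothetical loan-approval history, not anything from the film): the "algorithm" predicts a future outcome as the majority outcome among similar historical cases, so whatever skew exists in the history flows directly into the predictions.

```python
# Illustrative sketch of "using historical information to make a prediction
# about the future." All data below is invented for demonstration only.

from collections import Counter

# Hypothetical historical loan decisions: (neighborhood, approved?)
history = [
    ("north", True), ("north", True), ("north", False),
    ("south", False), ("south", False), ("south", True),
]

def predict(neighborhood: str) -> bool:
    """Predict the future from the past: majority outcome among similar cases."""
    outcomes = [approved for n, approved in history if n == neighborhood]
    return Counter(outcomes).most_common(1)[0][0]

# The prediction is only as fair as the history it was built from:
# if past decisions skewed against one group, so do the predictions.
print(predict("north"))  # True
print(predict("south"))  # False
```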

Coded Bias explains how algorithmic bias can have negative effects in the real world. The film depicts how a Houston school district used a secret algorithm in its “value added” teacher evaluation system, which classified even award-winning teachers as “bad teachers,” and how the facial recognition software police use often misidentifies Black suspects.

The film also shows how the technologies that are deeply embedded in our lives augment existing asymmetrical power dynamics. The algorithms that shape people’s lives are often hidden in a “black box” — built by large tech companies who use proprietary rights protections to block the public from knowing how their algorithms work and what data is being collected.

Kantayya talked to OneZero over the phone about how she learned about algorithmic bias, and how she hopes Coded Bias can empower citizens to understand and protect their rights.

Coded Bias will be released in select theaters this fall. This Q&A has been edited for length and clarity.

OneZero: How did you come to learn about algorithmic bias, and what inspired you to make a film about it?

Kantayya: A lot of my work is on disruptive technologies and how they make things more or less fair and equal. Issues around race, gender, and class are things I tend to think about, and I discovered the work of women like Joy Buolamwini, Cathy O’Neil, and Zeynep Tufekci. I became interested in the dark underbelly of big tech, and that sort of sent me down the rabbit hole.

What are some of the instances of algorithmic bias that are featured in the film?

There was an Amazon algorithm that wasn’t trying to be biased, but the algorithm picked up on past sexism in hiring practices and retention, and started to sort out any woman who had a women’s college or sport on her resume. So unknowingly, this A.I. discriminated against women in the hiring process. The central part of the film is that facial recognition doesn’t work as well on dark faces or on women, and yet those are the people who are most targeted by racial profiling. Just recently, a man in Detroit was held for 30 hours after being wrongly identified by facial recognition. So the examples are really everywhere.
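
The résumé-screening failure Kantayya describes can be illustrated with a deliberately simplified sketch (hypothetical data and a made-up scoring rule, not Amazon's actual system): if the historical hiring decisions a screener learns from favored men, a keyword such as "women's" ends up with a low score and résumés containing it get sorted out.

```python
# Simplified sketch: score each résumé keyword by the hire rate among past
# résumés containing it. All records below are invented for illustration.

from collections import defaultdict

# Hypothetical historical résumés (keyword sets) and past hiring decisions.
history = [
    ({"engineering", "chess", "captain"}, True),
    ({"engineering", "rugby"}, True),
    ({"engineering", "women's", "chess"}, False),
    ({"engineering", "women's", "soccer"}, False),
]

def word_scores(records):
    """Hire rate for each keyword, learned from the historical decisions."""
    hits, totals = defaultdict(int), defaultdict(int)
    for words, hired in records:
        for w in words:
            totals[w] += 1
            hits[w] += hired
    return {w: hits[w] / totals[w] for w in totals}

scores = word_scores(history)

def screen(resume_words):
    """Average per-word hire rate; unseen words are treated as neutral (0.5)."""
    return sum(scores.get(w, 0.5) for w in resume_words) / len(resume_words)

# Same skills, but the résumé mentioning "women's" is scored lower,
# because the historical decisions it learned from penalized that word.
print(round(screen({"engineering", "chess"}), 2))             # 0.5
print(round(screen({"engineering", "women's", "chess"}), 2))  # 0.33
```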

What do companies, people who are building this tech, need to do to weed out some of the bias that gets encoded in their algorithms?

The audience of my film isn’t actually technologists. The audience is actually citizens. Should companies be more responsible or inclusive in their hiring practices or who is in the room when these technologies are being built? Absolutely. But my focus really isn’t on the companies. To me, it isn’t about making a perfect algorithm, it’s about making a more humane society. The film is about empowering citizens to understand these tools so that we can have laws and protections against these practices; so that it won’t be up to tech companies to sell their technology to the FBI or police or other entities without an elected person being in the loop. Just the fact that corporations are doing this doesn’t make it better than when a government does it. I think citizens need to start demanding from Google and Apple that they do not replicate authoritarian surveillance technology.

What are some of the ways ordinary citizens can fight against these systems of surveillance?

I think we have a moment where big tech is listening. In the last month, IBM said it would basically get out of the facial recognition game — stop researching it, stop offering it, stop selling it. Amazon said it would press pause for one year on its sale to police, and Microsoft said it would stop selling to police as well. So I feel like because we have this movement for equality on the streets, we have a moment to actually pass meaningful legislation. And I think we need to push for legislators that will protect our data as a part of human rights and will protect us from these invasive technologies that violate our civil rights.

Translated from: https://onezero.medium.com/objective-algorithms-are-a-myth-22b2c3e3d702
