Let The Computers Take The Blame

For all of human history our innovation and ingenuity have been accelerating at a staggering pace. The stone age, which lasted us a good two and a half million years, eventually gave way to the bronze age, which lasted merely the next 2,000. Things, it’s very often said, have gone very much downhill since.

The iron age followed for the next eight centuries, making the industrial age possible for the next two. If, as is commonly suggested, our most recent years can be described as the silicon age, then the past half-decade has almost certainly given way to the age of algorithms.

Algorithms, as the word is used today, are perhaps the most useful modern invention we have ever come up with. They combine the worlds of technical know-how and linguistic hand-waving to create a word which often describes nothing at all while explaining away everything under the sun.

Deploying The Algorithm

Depending on the product, audience, or kind of attention you’re trying to grab, the algorithm can be rolled out to draw people in with curiosity or drive them away in fear.

Tech companies, app developers, and ‘cutting edge’ service providers most often use ‘the algorithm’ as a way to wake up customers with futuristic-sounding offerings. The word algorithm, used in a positive tone, generates intrigue and curiosity about the magic box operating behind the curtain. And that is a mere trifle of the word’s full potential.

Photo by Franki Chamaki on Unsplash

It is also, more usefully, a wonderful asset on which you can blame any number of costly errors, bad decisions, and misbehaviour without outing anyone at all as responsible. When it comes to repair, liability, or even blame, “the algorithm did it” is enough to end the conversation before it begins.

In recent weeks “the algorithm” has been put on public trial in the UK for a series of blunders and screw-ups affecting tens of thousands of school pupils with a single misjudged calculation.

Blame It On The Algorithm

In the absence of conventional exams in this unusual year, grades for pupils throughout the country were calculated from previous work, mock exam scores, and the best judgment of their teachers. Long after submission, the final marks were adjusted in bulk using an automated process to ‘fix’ them and bring 2020 results into closer alignment with previous years.
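
To make the mechanics concrete, here is a deliberately simplified sketch in Python of how such a standardisation step can work. This is not the examining bodies’ actual model, and every number here is invented: pupils are simply ranked by teacher-assessed grade, and the ranking is then forced into the school’s historical grade distribution.

```python
# A simplified, hypothetical "standardisation" step: rank pupils by their
# teacher-assessed grade, then squeeze the ranking into the grade
# distribution the school produced in previous years.

def standardise(teacher_grades, historical_distribution):
    """teacher_grades: one numeric grade per pupil.
    historical_distribution: {grade: share of pupils}, best grade first,
    shares summing to 1.0."""
    n = len(teacher_grades)
    # Indices of pupils, best teacher-assessed grade first (ties keep order).
    ranked = sorted(range(n), key=lambda i: teacher_grades[i], reverse=True)
    adjusted = [None] * n
    cursor = 0
    for grade, share in historical_distribution.items():
        quota = round(share * n)
        for i in ranked[cursor:cursor + quota]:
            adjusted[i] = grade
        cursor += quota
    for i in ranked[cursor:]:  # any rounding leftovers get the lowest grade
        adjusted[i] = min(historical_distribution)
    return adjusted

# A strong 2020 cohort in a historically average school is pulled down
# wholesale, regardless of what its teachers actually submitted:
print(standardise([9, 9, 8, 8, 7, 7, 6, 6, 5, 4],
                  {9: 0.1, 8: 0.1, 7: 0.2, 6: 0.3, 5: 0.3}))
# -> [9, 8, 7, 7, 6, 6, 6, 5, 5, 5]
```

Even in this toy version the shape of the complaint is visible: the second pupil’s teacher-assessed 9 becomes an 8 purely because the school ‘historically’ produced only one 9 per year.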

In the vast majority of cases, these adjustments lowered expected grades by significant margins. School leavers, university applicants, and continuing students faced inevitable mass disruption and disappointment as a result. Outcry and protests have been justifiably loud.

Precisely who students should blame for adjustments they claim to be unsound and unfair depends very much on where they live within the UK. In Scotland, where results were released earliest, the examining body was chosen as ‘bad cop’, taking the fall for a disastrous decision.

In England, it was ‘the algorithm’ which was blamed for making the change, apparently single-handedly. In Wales, where results are released last, they abandoned plans to make any adjustment at all.

The UK government initially backed the examining bodies’ algorithm, with the PM describing the system as “robust” despite the nation’s misgivings. Education secretary Gavin Williamson defended the system, telling reporters there would be “no change” to the government’s approach.

A few days and a weekend of reflection later, the official position had changed: the algorithm was gone and teachers were back in charge. “We now believe it is better to offer young people and parents certainty by moving to teacher-assessed grades,” Williamson said in a press conference on Monday.

The Scottish government made a similar reversal the previous week amidst a similar backlash.

What Exactly Is An Algorithm?

While the word invokes images of mathematical models, complex computer wizardry, and an automaton working tirelessly in the background, the reality behind ‘the algorithm’ is almost always a breathtakingly dull set of instructions and rules.

Whether you want the word to invoke futuristic science to promote your product or a remote automaton to take the blame, you want it to sound far more capable than it really is. This, in too many cases to count, is how a company’s website back end, spreadsheet, or off-the-shelf software becomes ‘the algorithm’.
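
As an illustration of just how dull those rules can be, here is a hypothetical loan-screening routine, with every rule and threshold invented for this example. A product marketed as ‘the algorithm’ often amounts to little more.

```python
# 'The algorithm', as shipped by many a firm: three hard-coded business
# rules that a clerk once applied by hand. Thresholds are invented here.

def screen_applicant(income: int, years_employed: int, prior_defaults: int) -> str:
    if prior_defaults > 0:
        return "refer to manual review"
    if income < 25_000 or years_employed < 2:
        return "decline"
    return "approve"

print(screen_applicant(income=30_000, years_employed=3, prior_defaults=0))
# -> approve. No magic box behind the curtain, just if-statements.
```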

Seventy years ago it would have been a set of instructions followed by hand, in a cubicle, by a white-collar employee, at a time when having a human to blame was an essential part of a firm’s ethos. In traditional models, an intern, clerk, or junior associate could be fired the same day to scapegoat a company’s misdeeds and errors. It’s another role that has fallen to the automated age.

Photo by Stephen Dawson on Unsplash

As a strategy, blaming an algorithm for its resulting errors is about as nonsensical as blaming fists for violence, pencils for bad art, or the internet for procrastination. Yet it persists.

Airlines blame it for booking the same seats two or three times over; hotels and car rental firms blame theirs for very much the same reason. Credit card companies, real estate agents, and service firms often blame it for automated decisions that result in refusal of service.

The crimes perpetrated by ‘the algorithm’ often get worse the harder you look. Many, as a result of unintended and inbuilt bias, make decisions which — if made by a human in the same position — would be cited as sexist, racist, or discriminatory toward minority applicants.

A recent study published in Science showed algorithmic bias in the healthcare industry was resulting in a racial disparity affecting the healthcare of millions of patients. [1]

Destruction By Numbers

Training on data which excludes entire sections of the population commonly causes decisions to be skewed to favour or discriminate against different groups of people. Apple’s recent venture into the credit card market met early disaster when it was discovered the company was offering 10–20 times more credit to men due to biases accidentally built into the system.
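
Here is a toy sketch of that mechanism, with all numbers invented and no claim that this is how Apple’s system actually worked: a model trained on a biased history never sees the protected attribute, yet reproduces the bias through a correlated proxy feature.

```python
import numpy as np

# Invented data: historical credit limits shortchanged group 1 at equal
# income. The model is trained WITHOUT the group column, but a proxy
# feature (say, years of individual credit history) correlates with it.
rng = np.random.default_rng(0)
n = 1000
group = rng.integers(0, 2, n)                 # hidden from the model
income = rng.normal(50_000, 8_000, n)
proxy = rng.normal(10.0, 2.0, n) - 4 * group  # correlates with group
past_limit = income * 0.4 * np.where(group == 1, 0.5, 1.0)  # biased history

# Ordinary least squares on income + proxy only; 'group' is never a feature.
A = np.column_stack([income, proxy, np.ones(n)])
coef, *_ = np.linalg.lstsq(A, past_limit, rcond=None)

# Two applicants with identical income, typical proxy values for each group:
print(np.dot(coef, [50_000, 10, 1]))  # typical group-0 applicant
print(np.dot(coef, [50_000, 6, 1]))   # typical group-1 applicant, same income
# The gap in offered limits survives, laundered through the proxy feature.
```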

COMPAS, a system designed and used in the United States to guide criminal sentencing, was found to have major issues which introduced a strong racial bias into its recommendations. [2]

Learning algorithms work by picking up on patterns calculated from training data and applying them to new sets of data. When the training data is wrong, the results on live data can range from minor mistakes to catastrophic decisions.

“The problem is that bias is another kind of pattern, and so these machine learning systems are also going to pick it up,” Emily M. Bender, a computational linguistics professor at the University of Washington, told Vice News. “And not only will these machine learning programs pick up bias in the training sets, they’ll amplify it.”
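
A minimal sketch of the amplification Bender describes, using an invented toy dataset: a model that simply predicts the most likely label turns a soft 60/40 statistical tilt in the training data into a hard 100/0 decision rule.

```python
from collections import Counter

# Invented history: a mild statistical tilt between a feature and a label.
training = ([("feature_present", "hire")] * 60 +
            [("feature_present", "reject")] * 40 +
            [("feature_absent", "hire")] * 40 +
            [("feature_absent", "reject")] * 60)

def most_likely_label(feature):
    """Predict whichever label was most common for this feature value."""
    counts = Counter(label for f, label in training if f == feature)
    return counts.most_common(1)[0][0]

print(most_likely_label("feature_present"))  # "hire", every single time
print(most_likely_label("feature_absent"))   # "reject", every single time
# A 60:40 tilt in the training data becomes a deterministic decision rule:
# the pattern has not just been picked up, it has been amplified.
```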

Despite their flaws, algorithms aren’t going anywhere fast. Having already made their way into decision making for housing, jobs, finance, accounts, loans, criminal sentencing, and government assistance programs, their use is only likely to increase from here. We should, however, begin to recognise them for the simple, dull calculators they are.

Don’t Blame The Tools

Blaming an algorithm for catastrophic mistakes and life-changing decisions shouldn’t be any more acceptable than shrugging your shoulders and saying “that’s life”.

Photo by Campaign Creators on Unsplash

Perhaps in non-technical fields it’s easy to forget that every piece of programming has a vast array of people behind it. Every algorithm in commercial use has, at a minimum, one engineer implementing an approved design for at least one client with defined goals and objectives in mind.

There’s no algorithm in existence that came about by sheer luck or magic. People made it. People who have responsibilities, ethics, and liability for the things they produce.

Errors do happen, whether in design, development, training, or testing. The responsibility to find and correct these errors long before they cause awful mistakes is, and should be, truly immense.

To hear those responsible say ‘the algorithm did it’, and to ask no follow-up questions of those who commissioned, designed, or used it, is as insulting as it is empty-headed. It means falling for the most basic and lazy hand-waving technobabble imaginable, in an information age which is supposed to democratise data for everyone to see.

[1] https://science.sciencemag.org/content/366/6464/447

[2] https://www.propublica.org/article/how-we-analyzed-the-compas-recidivism-algorithm

Translated from: https://medium.com/the-innovation/let-the-computers-take-the-blame-953a939bfdf9
