AI Can Help Fight Against Discrimination


Recently, a lot of articles have been published covering cases in which Artificial Intelligence discriminated against minority groups. The latest examples are the A-Levels algorithm in the UK, which made it more difficult for students from poor regions to get access to highly ranked universities, and the model that turned a photo of former US president Barack Obama white. Despite these major flaws in the field of AI, can AI still be used to fight discrimination?

Discrimination by AI

When AI algorithms became popular, they were often praised for their objectivity. Many researchers and practitioners believed that the algorithms could not be biased because they only use objective data to make decisions. However, a substantial number of scientific publications have since discussed how AI algorithms can amplify existing human biases. Nowadays, almost everyone in the AI community is aware that AI algorithms are not always free of bias.

Because it is known that algorithms can be biased, the scientific community has spent a lot of effort on trying to prevent algorithms from being biased. Can these same methods be used to analyze bias in existing systems?

Fairness in AI

The field that studies bias in AI algorithms is called fairness in AI. It is a very broad field that contains many techniques. These techniques can be split into three categories:

  1. Techniques to detect bias

  2. Techniques to prevent bias

  3. Techniques to correct bias

The only requirement for these techniques to work is that data is available. If we want to test whether credit applications in a certain city are handled in a biased way, we need data about the applicants, about the applications themselves, and about the decisions that were made.

Figure: the data needed for a credit application analysis (Image by author)
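As a concrete illustration of such a dataset, here is a minimal sketch of the kind of table a bias audit needs. The column names and values are invented for illustration, not taken from the article:

```python
# Hypothetical credit-application data: applicant attributes, application
# details, and the decision that was made -- everything a bias audit needs.
import pandas as pd

applications = pd.DataFrame({
    "applicant_gender": ["F", "M", "F", "M"],          # applicant data
    "applicant_income": [42_000, 55_000, 38_000, 61_000],
    "loan_amount":      [150_000, 160_000, 140_000, 170_000],  # application data
    "approved":         [0, 1, 0, 1],                  # the decision
})

# With this table we can start asking questions such as:
# does the approval rate differ by gender?
approval_by_gender = applications.groupby("applicant_gender")["approved"].mean()
print(approval_by_gender)
```

A raw difference in approval rates is not yet proof of bias; that is where the detection techniques below come in.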

Many companies have developed techniques that can be used to detect, prevent and correct bias.

IBM, Google, Microsoft (I have no relation to these companies)

Detecting bias

One powerful technique that can be used to detect bias is regression analysis. This technique identifies relationships between variables. If, for example, there is a relationship between the gender of the applicant and the rejection of credit applications, it can be concluded that the decisions are biased. A nice property of this technique is that it can take other information, such as income data, into account (these are called control variables). In that case, you can show that there is bias even after accounting for differences in income between individuals.

Preventing bias

The prevention of bias in algorithms is unfortunately very complex. At first, people believed that excluding certain information, such as gender, from the algorithm's input would prevent the corresponding bias (e.g. gender bias). The following example shows that this does not always work:

Two people, say Sarah (F) and Jonathan (M), work at the same company, in the same department, and do the same kind of job at the same level of the organization. Nonetheless, due to wage discrimination at their company, Sarah earns less than Jonathan. When both apply for a mortgage, they need to provide their income data; the bank cannot make a decision without it. Even if the bank does not consider the gender of the applicant, it will still discriminate against Sarah. Why? Because Sarah earns less precisely because she is female: her income acts as a proxy for her gender.

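A toy simulation makes the proxy effect concrete. All numbers here are assumptions chosen for illustration: men and women do the same jobs, but women's simulated salaries carry a 15% wage gap. The bank's rule never sees gender, yet its approval rates differ sharply by gender:

```python
# Gender-blind decisions can still be biased when income encodes gender.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
is_female = rng.integers(0, 2, n).astype(bool)

# Same job, same level, but a simulated 15% wage gap for women.
income = rng.normal(60_000, 5_000, n)
income[is_female] *= 0.85

# A "gender-blind" bank rule: approve every application above an income cutoff.
approved = income > 55_000

female_rate = approved[is_female].mean()
male_rate = approved[~is_female].mean()
print(f"approval rate: women {female_rate:.0%}, men {male_rate:.0%}")
```

Dropping the gender column removed nothing: the discrimination simply travels through the income variable.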
Luckily, there are techniques that can be used to prevent bias, but they are very technical, and discussing them is beyond the scope of this article. Even these techniques still have limitations: preventing bias is very difficult.

Correcting bias

What if we recognize that our model is biased and try to fix the bias afterwards? This approach is very promising because it means that existing AI algorithms can be fixed without going through the expensive process of retraining or redesigning them. There are quite a few techniques that can do exactly this. One such technique is equalized odds. Applied to the example above, it would ensure that creditworthy women are approved at the same rate as creditworthy men.

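The flavor of such a post-processing fix can be sketched on synthetic data. This is a hand-rolled simplification, not a library implementation, and it only equalizes true positive rates (the "equal opportunity" relaxation; full equalized odds also constrains false positive rates and may require randomized decisions). All scores and numbers below are invented:

```python
# Post-processing sketch: per-group thresholds that equalize true positive
# rates, so creditworthy applicants are approved equally often in each group.
import numpy as np

rng = np.random.default_rng(2)
n = 20_000
is_female = rng.integers(0, 2, n).astype(bool)
creditworthy = rng.random(n) < 0.5

# A biased model score: creditworthy people score higher,
# but women are systematically under-scored.
score = rng.normal(0.6, 0.1, n)
score[~creditworthy] -= 0.2
score[is_female] -= 0.05

def tpr(threshold, group):
    """True positive rate within a group at a given approval threshold."""
    mask = group & creditworthy
    return (score[mask] > threshold).mean()

target = 0.80  # desired true positive rate for every group
thresholds = {}
for name, group in [("female", is_female), ("male", ~is_female)]:
    grid = np.linspace(0.0, 1.0, 1001)
    rates = np.array([tpr(t, group) for t in grid])
    thresholds[name] = grid[np.argmin(np.abs(rates - target))]

print(thresholds)  # the female threshold comes out lower, offsetting the bias
```

The underlying model is untouched; only the decision rule on top of its scores changes, which is what makes this kind of correction cheap to deploy.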
What does this mean?

Even though a lot of articles highlighting bias in AI algorithms have been published recently, there are many interesting AI techniques that can help fight discrimination. Major companies have been developing their own tools to make sure that their algorithms are not biased, and lately awareness of the topic of fairness in AI has grown considerably. Even though there are still many challenges in the field, it is a promising area that can have a major positive impact on our society.

Interested in a more technical description? (I have no link with the author)

Translated from: https://medium.com/analytics-vidhya/ai-can-help-fight-against-discrimination-634ab6ca4ed7
