Examples of Observer Bias in Everyday Life: Gender Bias and Representation in Data and AI

This article looks at how observer bias shows up in everyday life, and extends the discussion to data and AI, revealing how gender bias can lurk within these technologies.


In light of the #MeToo movement, and with the growing push for transparency on equal pay and equal opportunity, the world of Tech has had to come to terms with its concerning lack of female representation. It is no secret that women make up a woeful proportion of employees in the tech workforce: Statista estimates that fewer than 1 in 4 actual tech roles are held by women. The figures are equally bad, if not worse, for Data Science. According to a report by Harnham, in America only 18% of data scientists are women, and 11% of data teams have no women at all. However, the lack of gender representation (along with other forms of representation, such as racial representation), specifically in Data and AI, has ramifications outside of the workplace: it can inhibit the pursuit of gender equality in society. Data and AI are doubly impacted by a lack of female representation because of the gender data gap that exists in much of our data, as highlighted by Caroline Criado-Perez. Representation of women in the data, and in the design of data solutions, should be essential in any business process.


Bias in Data

All Data Science and AI projects start with data, and the fairness of an AI model is limited by the fairness of the data you feed it. Unfortunately, too often the data reflects the bias that exists in our society. This bias can appear in multiple forms. One form is the under-representation of women in the data. Another is data that has not been appropriately sex-disaggregated; that is, women are assumed to follow the same distributions and patterns as men.

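A minimal sketch can make the second form of bias concrete. The numbers below are entirely made up for illustration: in a sample where men outnumber women 4 to 1, a pooled average sits close to the male average and hides a very different female outcome, which only becomes visible once the data is disaggregated by sex.

```python
# Hypothetical illustration of why sex-disaggregated data matters.
# All response values below are invented; they do not come from any real trial.
male_responses = [0.90, 0.88, 0.92, 0.91, 0.89, 0.93, 0.90, 0.87]
female_responses = [0.60, 0.63]  # under-represented group: 2 of 10 participants

# Pooling the data treats everyone as following one distribution.
pooled = male_responses + female_responses
pooled_mean = sum(pooled) / len(pooled)

# Disaggregating by sex reveals the gap the pooled figure hides.
male_mean = sum(male_responses) / len(male_responses)
female_mean = sum(female_responses) / len(female_responses)

print(f"Pooled mean response:  {pooled_mean:.3f}")   # ~0.843, close to the male mean
print(f"Male mean response:    {male_mean:.3f}")     # 0.900
print(f"Female mean response:  {female_mean:.3f}")   # 0.615, far from the pooled figure
```

Because men dominate the sample, the pooled mean is effectively a male statistic; any conclusion drawn from it misrepresents the female group. Disaggregation is the minimum step before deciding whether one model or threshold can serve both groups.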

One area of research where this issue is particularly troublesome is medical research. Prior to 1993, when the FDA and NIH mandated the inclusion of women in clinical trials, many medical trials featured no women because of their childbearing potential and because monthly hormonal fluctuations complicated the collection of consistent data. A 2010 study found that single-sex studies of male mammals in the field of neuroscience outnumbered those of females 5.5 to 1. As a result, in order to account for women in medical trials, results from male-dominated trials are often extrapolated and women are simply treated as scaled-down men, as explained by Caroline Criado-Perez, author of Invisible Women. This evidently has profound impacts on the health of women.
