I Got a Job at an Amazon Warehouse Without Talking to a Single Human


A few weeks ago, I had just completed an application to work in a warehouse for Amazon. I had watched a video and completed a quiz showing that I knew how to stow items: heavy goes on the bottom, light goes on top. About 20 minutes later, Amazon emailed to say I had the job, at the shift I wanted. The email said to come into the warehouse recruiting office in Baltimore to take a photo for my ID and to have my official documents, like my Social Security number and passport, ready to be scanned.

I was conflicted. It was the easiest and most streamlined hiring process I’d ever gone through, and I was happier to have the job than to not have the job. At the same time, I got a job at one of the world’s biggest companies without ever speaking to a single human being.

A few days later, I went through with the drug test and checked some boxes for the background check, and I was set to start work at Amazon in just over two weeks. Applying to work at Amazon was so easy, but it made me take the notion of automation taking over the world seriously for the first time.

When I brought my documents to the warehouse, I interacted with two human beings. I spent less than five minutes in the recruiting office of the warehouse where I would eventually work.

“That’s it?” I asked.

“That’s it,” the guy said.

Even though I got the job, and quickly, I have mixed feelings about automating the job application process. It’s unsettling to get a job so quickly and without talking to a single person. Did they really not care about any human aspects of my application? Did they care about quality and my ability to actually do the work? And could they learn any of that by meeting me?

Completing the orientation process involved an hour of online training videos covering everything from proper attire to safety, plus a few quizzes. As I write this, I've just completed my first 40-hour week, and I've enjoyed my job as a warehouse picker, where I pull orders off the shelves and put them out for delivery. The biggest lesson so far is that if you buy clothes from Amazon or any other online retailer, wash them before you wear them, because a lot of people have to touch those clothes before they get to you.

Automating the hiring process, as Amazon has done for its hourly warehouse workers, saves a lot of time, especially for large companies (Amazon topped 935,000 employees as of April 30). Some argue that automating the hiring process also eliminates bias. Writing in the Harvard Business Review, Frida Polli says that without automation many applicants are never even seen, but A.I. can assess the entire pipeline of candidates. In addition, with the right amount of auditing, it can eliminate unconscious human biases.

However, according to Cornell researchers Manish Raghavan and Solon Barocas, writing for the Brookings Institution, automating the hiring process as an anti-bias intervention can sometimes exacerbate bias. In a paper on the Social Science Research Network (SSRN), law professor Ifeoma Ajunwa notes that the tools prospective employees might be required to use in an automated hiring process can be more restrictive than filling out an application and handing it to a hiring manager, especially for low-wage and hourly jobs.

Amazon, in fact, has already discovered bias in its own automated hiring tools. In 2018, the company's machine-learning specialists discovered that their recruiting engine favored men. The computer models had been trained to vet applicants by observing patterns in résumés submitted to the company over a 10-year period, and most of those résumés came from men. As a result, the tool learned to penalize résumés that included the word "women." Women who went to all-women's colleges were put at a disadvantage. Engineers' attempts to fix the problem were unsuccessful.

Amazon’s tool was biased because it had been trained using an overwhelmingly male sample pool of résumés.

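The mechanics of that failure are easy to reproduce in miniature. The sketch below is a toy example, not Amazon's system: the résumés, the hire/reject labels, and the scikit-learn model are invented stand-ins, but they show how a classifier trained on historically skewed outcomes ends up assigning a negative weight to a gendered token.

```python
# A toy illustration (not Amazon's actual system) of how a résumé screener
# trained on historically skewed hiring decisions can learn to penalize a
# word like "women's". All résumés and labels below are made up.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Historical data: mostly male hires, so "women's" co-occurs with rejections.
resumes = [
    "captain of men's chess club, python developer",          # hired
    "men's rugby team, experience in logistics software",     # hired
    "java developer, men's debate society",                   # hired
    "captain of women's chess club, python developer",        # rejected
    "women's rugby team, experience in logistics software",   # rejected
]
hired = [1, 1, 1, 0, 0]

vec = CountVectorizer()
X = vec.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# The model has learned a negative weight for the token "women":
weights = dict(zip(vec.get_feature_names_out(), model.coef_[0]))
print(f"weight for 'women': {weights['women']:+.2f}")  # negative
print(f"weight for 'men':   {weights['men']:+.2f}")    # positive
```

The word itself is never labeled as a protected attribute; the model simply notices that it correlates with past rejections, which is exactly the pattern Amazon's engineers struggled to remove.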
Ajunwa also notes that online hiring algorithms can be restrictive for those applying to white-collar jobs. Goldman Sachs, for instance, embraced automated interviewing in 2016 as an initiative to hire a more diverse workforce. But in a 2019 New York Times opinion article, Ajunwa argued that too much automation creates a closed-loop system with no accountability or transparency.

Advertisements created by algorithms encourage certain people to send in their résumés. After the résumés have undergone automated culling, a lucky few are hired and then subjected to automated evaluation, the results of which are looped back to establish criteria for future job advertisements and selections.

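As a toy illustration of that loop (nothing here models any vendor's real system; the single "profile score," the band width, and all the counts are invented), a few lines of simulation show how targeting people who resemble the current workforce, then feeding the hires back into the targeting criteria, steadily shrinks who even sees the ad.

```python
# Toy simulation of the closed loop: ad targeting -> automated culling ->
# hiring -> results fed back into the next round's targeting criteria.
import random
import statistics

random.seed(0)

employees = [random.gauss(60, 5) for _ in range(20)]        # current workforce "profile"
population = [random.gauss(50, 15) for _ in range(10_000)]  # broader applicant pool

for round_num in range(1, 6):
    center = statistics.mean(employees)
    band = 2 * statistics.stdev(employees)
    # Ads are shown only to people who already resemble the current workforce.
    reached = [p for p in population if abs(p - center) <= band]
    # Automated culling hires the 10 applicants closest to the current profile.
    hires = sorted(reached, key=lambda p: abs(p - center))[:10]
    # Hires and their evaluations feed the profile used for the next round.
    employees.extend(hires)
    print(f"round {round_num}: ads reached {len(reached):>4} of {len(population)}, "
          f"profile band ±{band:.1f}")
```

Each round the workforce profile gets narrower, the targeting band shrinks, and a growing share of the applicant pool is never shown the job at all, with no point in the loop where anyone has to account for the exclusion.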
She cites two examples of automated hiring discrimination. In one, a 2017 lawsuit opened by then-Illinois Attorney General Lisa Madigan found that automated hiring platforms discriminated against older applicants through the tools themselves: drop-down menus for the years a person attended college didn't go back far enough to accommodate all workers, even though people are staying in the workforce longer. In the other, a 2016 class-action lawsuit eventually pressured Facebook to curb its paid ad platform to comply with anti-discrimination laws. The Lookalike Audiences feature allowed employers to show job advertisements only to Facebook users who "looked like" their current employees. So if a company had only white employees, only white Facebook users would see the ads; if the company had only women employees, Facebook would selectively target women users.

Ajunwa argues that since employers have the discretion to choose a "cultural fit," they can further discriminate. They could, for instance, use pre-employment personality assessments, which the Equal Employment Opportunity Commission (EEOC) found in 2018 were likely creating a pattern of discrimination against racial or ethnic minorities. The EEOC ruling found that Best Buy's personality tests, used as part of its automated hiring process to predict how workers would perform, violated Title VII of the Civil Rights Act, a finding that forced Best Buy to stop using the assessments. In addition, prioritizing a "lack of gaps in employment" hurts women who have to take leave from the workplace to take care of children.

Automation is not slowing down anytime soon, and especially not in the hiring process, but safeguards are needed to prevent exacerbating workplace discrimination.

According to attorneys Mark Girouard and Maciej Ceglowski, regulations by the EEOC hold companies accountable for their hiring decisions and tools, and even require that companies keep the underlying data in case of a bias claim. So companies can be liable for bias even if they don't know why an algorithm chose one group over another. In the SSRN paper, Ajunwa also argues for putting more of the burden on the employer when plaintiffs bring discrimination suits. So if a plaintiff uses a hiring platform with a problematic design feature, like a personality test, the employer would have to provide statistical proof from audits in court to show that its hiring platform is not unlawfully discriminating against certain groups of candidates.

In a separate paper on SSRN, Ajunwa suggests requiring employers to conduct internal and external audits. These audits would make sure that no applicants are disproportionately excluded.

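Neither paper spells out which statistics such an audit should report, but one widely used check, the EEOC's "four-fifths rule," flags adverse impact when any group's selection rate falls below 80 percent of the highest group's rate. Here is a minimal sketch of that check, with invented group names and counts.

```python
# Minimal adverse-impact check in the spirit of the EEOC four-fifths rule.
# Group labels and counts below are invented for illustration.
def selection_rates(outcomes):
    """outcomes maps group -> (applicants, hires); returns group -> hire rate."""
    return {g: hires / applicants for g, (applicants, hires) in outcomes.items()}

def four_fifths_flags(outcomes, threshold=0.8):
    rates = selection_rates(outcomes)
    best = max(rates.values())
    # Flag any group whose selection rate is below 80% of the best group's rate.
    return {g: rate / best < threshold for g, rate in rates.items()}

audit = {
    "group_a": (200, 60),   # 30% selected
    "group_b": (180, 36),   # 20% selected -> 0.20 / 0.30 = 0.67 < 0.8, flagged
}
print(selection_rates(audit))
print(four_fifths_flags(audit))
```

A real audit would of course go further, but even this simple ratio is the kind of statistical evidence Ajunwa argues employers should have to produce.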
The Occupational Safety and Health Administration (OSHA) already recommends audits to ensure safe working conditions for employees, and it has a safety certification system that gives marks to employers who meet those standards. Ajunwa argues for a similar audit and certification system for automated hiring platforms, along with union collective bargaining that can work with employers to determine which criteria are actually necessary for judging job fit and to protect applicant data.

Right now, Covid-19 has drastically accelerated the use of A.I. in the hiring process, with more recruiters moving job interviews and other recruiting interactions online. In an effort to curb face-to-face interactions during the pandemic, many more companies are relying on A.I. and videoconferencing platforms like Zoom.

But we still need humans to ensure that the hiring process is fair and equitable.

I can only assume that I look pretty good to an automated A.I. hiring platform. I'm a young, 23-year-old man with a bachelor's degree, no criminal record, and no disabilities. The automated hiring process worked in my favor, getting me a job at Amazon in less than 20 minutes. I had a great, fast hiring experience and was able to work on the warehouse floor very soon after filling out my application. But how can we ensure that everyone else has an equal opportunity?

Translated from: https://onezero.medium.com/i-got-a-job-at-an-amazon-warehouse-without-talking-to-a-single-human-c22beeeb53d6
