
Good Code is a weekly podcast about ethics in our digital world. We look at ways in which our increasingly digital societies could go terribly wrong, and speak with those trying to prevent that. Each week, host Chine Labbé engages with a different expert on the ethical dilemmas raised by our ever-more pervasive digital technologies. Good Code is a dynamic collaboration between the Digital Life Initiative at Cornell Tech and journalist Chine Labbé.

Follow @goodcodepodcast on Twitter, Facebook, and Instagram.

On this episode:

Workers have long been quantified. The 19th century saw the introduction of the punch card to verify time spent on the job. Taylorism later introduced the measurement of a worker’s productivity, task by task.

But new technologies are taking this process to a degree previously unseen in history. Algorithms and other kinds of automated tools are now widely used at the very beginning of the hiring process, reducing job applicants to a set of data points.

Then, on the job, all sorts of things are measured, including some that are not directly related to the work itself. Ifeoma Ajunwa is an Assistant Professor of Labor & Employment Law at Cornell University. She wrote a whole book about quantifying workers, and she shares some of her findings with us.

You can listen to this episode on iTunes, Spotify, SoundCloud, Stitcher, Google Play, TuneIn, YouTube, and on all of your favorite podcast platforms.

We talked about:

  • In this episode, Ifeoma Ajunwa mentions her dissertation on the reentry of the formerly incarcerated into society. This work is what first introduced her to hiring algorithms, and their potential for excluding entire groups of job applicants.
  • Ajunwa also talks about a secret hiring algorithm that Amazon had been working on before the company realized it was biased against women. The tech giant tried editing its algorithm to make it bias-free, but eventually decided to kill it instead, as Reuters reported last October. “A new team in Edinburgh has been formed to give automated employment screening another try, this time with a focus on diversity,” the article says.
  • Ajunwa also gives a great example of how bias can be perpetuated by algorithms in the hiring process: the Story of Jareds, as she calls it. In this *true* story, an employment attorney auditing a resume screening tool found that the two factors the algorithm weighted most heavily in predicting who would be a good fit for a job were being named Jared and having played high school lacrosse. The tool was replicating bias by looking for proxies for cultural fit, as Ajunwa explains, and in this case, favoring white males. Read about this example in this Quartz article.
  • In this episode, we also briefly talk about HireVue, a company that combines AI and video interviews to assess job candidates. The company says on its website that its goal is to reduce bias and foster diversity. But Ajunwa wonders how it actually evaluates body language and facial expressions, and whether its criteria could be unintentionally replicating bias. One thing Ajunwa points out is that facial expressions mean different things in different cultures. Americans, for instance, tend to smile a lot.
  • In this episode, we also talk about Tengai, a Swedish job interview robot that was designed specifically to prevent bias. Read about it on BBC.com.
  • Ifeoma Ajunwa also mentions an article she co-wrote in 2016 called “Limitless Worker Surveillance.” In this article, she looks at productivity apps and worker wellness programs and how “legal constraints (in the US) are insufficient and may leave American workers at the mercy of 24/7 employer monitoring.”
  • In this episode, we also discuss a wristband for which Amazon won two patents, and which would allow the company to track the movements of its warehouse employees and nudge them with vibrations.
  • Ajunwa also mentions the ways in which UPS has used data to monitor its drivers and to increase productivity. Read about it in Wired.

Read More:

  • Is Slack making us more productive? Read about it on Vox.
  • Is work surveillance efficient? This article in The Atlantic argues that “the more bosses try to keep track of their workers, the more precious time employees waste trying to evade them.”
  • Upturn, the non-profit co-founded by David Robinson, our guest in Episode 10, published a report on hiring algorithms and bias last year. Check it out here.