Embedded EthiCS™ @ Harvard: Bringing ethical reasoning into the computer science curriculum

Advanced Computer Vision (CS 283) – Fall 2021

Module Topic: Facial Recognition: when, where, who
Module Author: J. L. A. Donohue

Course Level: Graduate
AY: 2021-2022

Course Description: “Vision as an ill-posed inverse problem: image formation, two-dimensional signal processing; feature analysis; image segmentation; color, texture, and shading; multiple-view geometry; object and scene recognition; and applications.”

Semesters Taught: Fall 2021

Tags

  • reasonable rejection (phil)
  • contractualism (phil)
  • facial recognition (CS)
  • computer vision (CS)
  • social good (phil)
  • justice (phil)
  • consequences (phil)
  • candidate rule (phil)

Module Overview

This module focuses on the use of facial recognition in society and examines when that use might or might not be justified. It begins with an investigation of the Gender Shades research project, introduces the contractualist concepts of reasonable rejection and candidate rules, and surveys a variety of use cases for facial recognition software, asking students to apply the philosophical concepts to each case to determine whether that particular use might be justified.

    Connection to Course Technical Material

This topic was chosen in part because facial recognition hadn’t yet been the focus of another Embedded EthiCS module, in part because the professor was interested in the topic, and in part because the topic was timely. (Several companies have recently announced that they will not share their facial recognition software with law enforcement until legislative action is taken; others have discontinued development of facial recognition software entirely.) Other possible topics include privacy (are there special privacy concerns with systems that involve biometric markers, as some computer vision systems do?) and unjust uses of facial recognition software, such as its use by China to identify and persecute ethnic minorities.

This course focuses on computer vision, including facial recognition software, so the module connected directly to the technical material of the course. Students already knew from the course material that the data on which facial recognition algorithms are trained can affect their accuracy. We discussed how that inaccuracy can take on special significance depending on the use case of the software, widespread societal bias, and structural injustice.
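To make the technical premise concrete: the central finding of Gender Shades is that a model can report a respectable aggregate accuracy while its error rate varies sharply across demographic subgroups. The sketch below is not part of the module; it uses invented subgroup labels and toy data purely to show how a disaggregated evaluation exposes a gap that a single accuracy number hides.

```python
# A minimal, self-contained sketch of disaggregated evaluation in the style of
# Gender Shades. All subgroup labels and records below are invented toy data,
# not results from the paper or the module.
from collections import defaultdict

# Each record: (subgroup, ground-truth label, model prediction).
records = [
    ("lighter_male",   1, 1), ("lighter_male",   0, 0), ("lighter_male",   1, 1),
    ("lighter_female", 1, 1), ("lighter_female", 0, 0), ("lighter_female", 1, 0),
    ("darker_male",    1, 1), ("darker_male",    0, 1), ("darker_male",    0, 0),
    ("darker_female",  1, 0), ("darker_female",  0, 0), ("darker_female",  1, 0),
]

totals, correct = defaultdict(int), defaultdict(int)
for group, truth, pred in records:
    totals[group] += 1
    correct[group] += int(truth == pred)

# Aggregate accuracy looks like a single, middling number...
overall = sum(correct.values()) / sum(totals.values())
print(f"overall accuracy: {overall:.2f}")

# ...but the per-subgroup breakdown reveals who bears the errors.
accs = {group: correct[group] / totals[group] for group in totals}
for group in sorted(accs):
    print(f"{group:>15}: accuracy {accs[group]:.2f} (n={totals[group]})")

# One simple disparity measure: the gap between best- and worst-served groups.
print(f"max subgroup gap: {max(accs.values()) - min(accs.values()):.2f}")
```

An audit in this style is what gives the first module goal below its force: the problem is not merely that the model is sometimes wrong, but that it is wrong for some groups of people far more often than for others.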

Goals

    Module Goals

By the end of the module, students will be able to:

  1. Explain why systematic inaccuracy in facial recognition systems along identity lines is a special sort of problem, above and beyond “simple” inaccuracy.
  2. Explain the philosophical concept of reasonable rejection.
  3. Analyze use cases of facial recognition software in terms of who might have a reasonable rejection of that system, and thus whether such use should be permitted.

    Key Philosophical Questions

These questions are incorporated into the scaffolded handout that walks students through step-by-step evaluation of different use cases of facial recognition technology.

  1. What constitutes a reasonable rejection of a particular use of a facial recognition system?
  2. How do we determine if a rejection is reasonable?
  3. How do we create the right candidate rule in order to evaluate a particular use of a facial recognition system?
  4. What features of a particular use of facial recognition are relevant to its moral evaluation?

Materials

    Key Philosophical Concepts

Crafting an appropriate candidate rule to capture a case is important and difficult: time spent in class getting the rule right helps structure the discussion that follows.

  • contractualism
  • reasonable rejection
  • candidate rule
  • relevant features

    Assigned Readings

“Gender Shades” relates directly to the technical content of the course and helps to illustrate some (but importantly not all) of the ethical issues associated with the use of facial recognition technology.
Scanlon’s “Contractualism and Utilitarianism” is a fairly accessible introduction to contractualism as a moral theory.

  • Buolamwini, Joy, and Timnit Gebru. 2018. “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification.” In Proceedings of the 1st Conference on Fairness, Accountability and Transparency, 77–91. PMLR.
  • Scanlon, Thomas. 1982. “Contractualism and Utilitarianism.” In Utilitarianism and Beyond, edited by Amartya Sen and Bernard Williams, 1st edition, 103–28. Cambridge: Cambridge University Press. (Selections)

Implementation

    Class Agenda

This module was taught primarily as an interactive lecture, which worked well for the small class size (about 12 students). For a larger class, it might work better to lean more on the small-group discussion components and a bit less on the lecture.

  1. Welcome & Introductions
  2. Introductory Discussion of Gender Shades
  3. Brief Introduction to Contractualism
  4. Small Group Discussion: Australia’s COVID-19 Response
  5. Debrief
  6. Small Group Discussion 2: Monitoring Problematic Gamblers

    Sample Class Activity

This module leaned heavily on small-group discussions, and the discussions were very productive. A handout with scaffolded questions supported those discussions, helping students apply the new philosophical tools they had just been introduced to.

In small groups, students worked through a scaffolded series of questions about whether a particular use of facial recognition software was subject to reasonable rejection. They were also asked to consider whether conditions could be placed on that use that would address the reasonable rejection.

    Module Assignment

The assignment asks students to do individually what they did in small groups in class. It also offers them the opportunity to evaluate a use case of their choosing.

Choose one use case of facial recognition software. It can be a case we considered in class, one from the news, or a hypothetical case you want to consider. In about 250 words, argue that the use should be permitted by your company, not permitted, or permitted with conditions. Be sure to explain briefly what the use case is and defend your position. If you think it should be permitted with conditions, explain those conditions and why you chose them. Consider at least one possible rejection and explain why you take it to be reasonable or unreasonable.

Lessons Learned

Student response to the module was overwhelmingly positive. Though some found the difficulty and ambiguity of evaluating reasonable rejections frustrating, they appreciated that the instructor took their frustration seriously and offered some tools for thinking it through.
Pedagogical insights:

  • Handouts helped to facilitate productive small group discussion.
  • Students were quite interested in discussing philosophical difficulties facing contractualism. How much time to spend on this issue may vary by instructor preference.

Except where otherwise noted, content on this site is licensed under a Creative Commons Attribution 4.0 International License.

Embedded EthiCS is a trademark of the President and Fellows of Harvard College.