Embedded EthiCS™ @ Harvard: Bringing ethical reasoning into the computer science curriculum

Abstraction and Design in Computation (CS 51) – Spring 2021


Module Topic: Moral Responsibility and Social Networks
Module Author: Samuel Dishaw

Course Level: Undergraduate
AY: 2020-2021

Course Description: “Fundamental concepts in the design of computer programs, emphasizing the crucial role of abstraction. The goal of the course is to give students insight into the difference between programming and programming well. To emphasize the differing approaches to expressing programming solutions, you will learn to program in a variety of paradigms — including functional, imperative, and object-oriented. Important ideas from software engineering and models of computation will inform these different views of programming.” (Course description)

Semesters Taught: Spring 2021, Spring 2022, Spring 2023

Tags

  • networks (CS)
  • fake news (CS)
  • moral responsibility (phil)
  • intervening agents (phil)
  • free speech (phil)

    Module Overview

This module builds upon a previous iteration by Ronni Gura Sadovsky. The main changes are the addition of the notion of intervening agents and the discussion of the Oversight Board, which had not yet been created when the module was first developed.

The module introduces the notions of moral responsibility and of intervening agents, and then asks what user behavior, if any, Facebook is responsible for. Finally, the module considers Facebook’s recently created Oversight Board, and whether the creation of that board makes a difference to Facebook’s own responsibility. The central example is the then-pending decision from the Oversight Board on whether to uphold Donald Trump’s suspension from Facebook.

    Connection to Course Technical Material

The topic was chosen for two main reasons. The first was to critically examine the idea that programmers can absolve themselves of any responsibility simply by being ‘neutral’ and letting users behave towards one another however they decide. The second was that the question of what responsibility social networks bear for bad outcomes had become, in the wake of the 2019 congressional hearings on the tech industry and the 2020 US elections, a very salient issue in the public sphere.

The course discusses issues in computer programming and how programming can be done well. The module complements the course by looking at the broader impacts of programming. The concept of moral responsibility is used to illustrate how far-reaching these impacts can be, and how much of the responsibility for bad consequences rests with the initial designers of a program.

Goals

Module Goals

By the end of the module, students will be able to:

  1. Consider some key notions in the philosophy of moral responsibility (causation, intervening agents, omissions).
  2. Apply these notions to different cases where Facebook is thought to bear some responsibility for bad outcomes resulting from the use of its platform.
  3. Understand Facebook’s recently created independent Oversight Board, and consider whether Facebook bears any responsibility for the decisions of the Oversight Board and for the bad outcomes that result from those decisions.

    Key Philosophical Questions

Questions (2) and (3) are each paired with a case study. The first case concerns discriminatory housing ads on Facebook: on at least some previous versions of the housing ad form that owners post on Facebook, users had the option of excluding particular groups of individuals from seeing the ad on the basis of race, gender, or religion. The second case, paired with question (3), concerns vaccine misinformation on Facebook, focusing on the case of Robert Kennedy Jr. Here, students are asked to consider whether Facebook bore any responsibility for the fact that, as a result of Kennedy Jr.’s posts, fewer people would get a vaccine once one became available than otherwise would have.

  1. Are we only responsible for those outcomes that are the direct result of our own actions?
  2. Can we be responsible for bad outcomes that are the direct result of someone else’s action, but which we played a role in enabling?
  3. Can we be responsible for bad outcomes that we merely allowed to happen?

Materials

    Key Philosophical Concepts

The notion of causation is used in the formulation of a first-pass principle of moral responsibility, which says that one is responsible for some bad outcome just in case one caused it.
An intervening agent is someone who acts ‘in between’ another agent and a bad outcome. Users on Facebook are intervening agents relative to Facebook. When Facebook users act wrongly, they perform an action that they could not have performed were it not for something that the programmers of Facebook previously did. For that reason, the question of whether Facebook is responsible for bad outcomes that result from the use of its platform needs to address whether one can be responsible for bad outcomes that are mediated by intervening agents.
The notion of omission was introduced to draw a distinction between two potentially different types of cases: one in which Facebook introduces a parameter in their design that foreseeably leads to a bad outcome (e.g. including discriminatory options in the housing ad form), and one in which Facebook merely allows users to post their opinions (e.g. vaccine misinformation).
The notion of foreseeability was not emphasized as much as the others, but it came up naturally in discussion, and some of the principles of moral responsibility that we considered had a foreseeability condition (e.g., we are responsible for the bad outcomes caused by the actions of other people whom we enabled, but only if their acting in that way was foreseeable).

  • Responsibility
  • Causation
  • Intervening Agents
  • Omission
  • Foreseeability

    Assigned Readings

The Zimmerman piece provides insight into the notion of intervening agency as well as into how philosophers think about moral responsibility more generally.

Like most articles published in law journals, Douek (2019) is very long. But even if one prefers not to assign it for that reason, it contains helpful information for the instructor, both regarding the workings of Facebook’s Oversight Board and regarding worries about the extent to which it really is independent from Facebook itself.

  • Zimmerman, M. J. (1985). “Intervening Agents and Moral Responsibility”, Philosophical Quarterly.
  • Douek, E. (2019). “Facebook’s ‘Oversight Board’: Move Fast with Stable Infrastructure and Humility”, North Carolina Journal of Law and Technology.

Implementation

Class Agenda

  1. Responsibility as Causation
  2. Intervening Agents: Housing Discrimination
  3. Omissions: Vaccine Misinformation
  4. Facebook’s Oversight Board

    Sample Class Activity

Having introduced the problem of discriminatory housing ads on Facebook, students discussed who is responsible for the discriminatory ads, in particular whether it is just (A) the users who posted them, or (B) both Facebook and the users who posted them.

Most students find that (B) is more plausible than (A). This sets up the TA to revise the initial principle of responsibility (viz., that we are only responsible for what we cause), since it is not clear that Facebook caused anyone in particular to post a discriminatory ad. The revised principle adds a second sufficient condition for moral responsibility, namely that someone is responsible for a bad outcome if they “do something that foreseeably would lead to that bad outcome”.

    Module Assignment

The essays are peer-evaluated. Each student receives three essays from other students, and is asked to paraphrase the main thesis of each essay and to grade it against a provided rubric. Students thus learn not only to express their views using argument, but also to evaluate the arguments of others and respond to them in a helpful way (feedback on essays is also peer-graded).

Do you agree or disagree with the following claim?

If the Oversight Board upholds, or reverses, Facebook’s original decision to suspend Trump’s account, Facebook will bear no moral responsibility for the consequences of that decision.

In defending your position, be sure to use at least one notion discussed in this module. For instance, you may want to consider whether the Oversight Board is an intervening agent relative to Facebook, and whether the Oversight Board’s impact on the platform is something that Facebook allows to happen.

    Lessons Learned

Given the richness of this case, an alternative module could focus exclusively on the Oversight Board and discuss not only issues of moral responsibility related to it, but also the broader governance question of whether this sort of independent board is the best way for Facebook to be regulated.
  • Students reported finding the notion of intervening agents useful for thinking about the moral responsibility of social networks.
  • The case of Facebook’s Oversight Board is very complex, and disanalogous in key respects to the two other cases that are the focus of this module (Facebook’s responsibility for discriminatory ads and for misinformation). In those cases, the question is what responsibility Facebook bears for the behavior of its users (and the consequences thereof). In the case of the Oversight Board, the question is what responsibility Facebook bears for the decisions (and consequences thereof) of a collective agent that Facebook itself created, whose mandate is to uphold Facebook’s stated values, and whose decisions Facebook has promised to treat as binding (though they are not binding in any other way, and in particular not legally).
