Embedded EthiCS™ @ Harvard: Bringing ethical reasoning into the computer science curriculum

Abstraction and Design in Computation (CS 51) – Spring 2022

Module Topic: Privacy vs. Perfection
Module Author: Ellie Lasater-Guttmann

Course Level: Undergraduate
AY: 2021-2022

Course Description: “Fundamental concepts in the design of computer programs, emphasizing the crucial role of abstraction. The goal of the course is to give students insight into the difference between programming and programming well. To emphasize the differing approaches to expressing programming solutions, you will learn to program in a variety of paradigms — including functional, imperative, and object-oriented. Important ideas from software engineering and models of computation will inform these different views of programming.” (Course description)

Semesters Taught: Spring 2021, Spring 2022, Spring 2023

Tags

  • Optimization [CS]
  • Perfection [CS]
  • UX design [CS]
  • Anonymity [CS/phil]
  • Aggregation [CS/phil]
  • Privacy [phil]
  • Liberty of action [phil]
  • Protection from power [phil]

Module Overview

The module examines three cases of user tracking to pinpoint how developers should balance optimizing the user experience or the backend against privacy concerns. The three cases progress from the least powerful for optimization and least privacy-violating to the most powerful and most concerning. Students consider whether anonymity and aggregation are useful tools for protecting user privacy.

    Connection to Course Technical Material

If possible, this module could be improved by connecting it more closely to the course's technical content. Even without that closer connection, the topic engaged students, who reported walking away with a clearer understanding of the ways privacy is violated online.

The course as a whole is devoted to writing beautiful, elegant, and useful code. This module was designed to bring additional components into that discussion of perfect software. Do violations of user privacy make software less perfect?

Goals

Module Goals

By the end of the module, students will be able to:

  1. Become familiar with the ways in which user behavior is being tracked online
  2. Understand the ethical values associated with anonymity and privacy
  3. Design a framework that will aid developers in balancing privacy with site optimizations

    Key Philosophical Questions

Students walked away with a better understanding of how their activity is being tracked online, which caused them discomfort. That being said, there was a variety of opinions about the reasonableness of these privacy violations given the benefits of tracking software to the company – a variety which signaled that the students were engaged with the material.

  1. When, if ever, do the benefits of tracking user behavior for optimizing a product justify the resulting violations of user privacy?
  2. Can anonymity and aggregation adequately protect the privacy of users whose behavior is tracked?

Materials

    Key Philosophical Concepts

Certain products violate the first value of privacy (privacy promoting the liberty of action) but not the second (privacy as protection from power), and vice versa. Similarly, certain products use anonymity and/or aggregation to reduce these privacy violations.

  • Privacy promoting the liberty of action
  • Privacy as protection from power
  • Anonymity
  • Aggregation

    Assigned Readings

While I recommend this article, it did not play a central role in the day's discussion. I would recommend bringing it into the activity more directly.

Implementation

    Class Agenda

For this module, a classwide regroup after each section of the activity is not necessary, particularly given that students progressed at different paces.

  1. 10 minute lecture reminding students about the two concepts of privacy (as control and as restricted access)
  2. 40 minutes of activity in small groups
  3. 20 minutes to collate the results as a class into a collection of proposed principles about user privacy online

    Sample Class Activity

The module centered on the interactive activity described below. While the role-playing initially made students uncomfortable, they quickly moved past that discomfort and embraced their different characters, and doing so prompted a more thorough understanding of the privacy concerns at play.

The activity is a role-playing game in which each student takes on one of three roles: a developer at a startup, a salesperson at a user-tracking company, or a privacy-watchdog representative. The goal is to determine whether there are reasonable applications of three different user-tracking products: Google Analytics (IP tracking), Datadog (tracking on the backend), and Hotjar (front-end real-time tracking).
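
For instructors who want to gesture at the technical contrast between these levels of tracking, the following hypothetical OCaml type definitions (not part of the activity handout, and not an exact description of what any of these products records) illustrate how each level captures progressively richer, and harder to anonymize, data about individual users.

```ocaml
(* Level 1: IP-based analytics -- which pages are visited, when, and
   from roughly where. *)
type analytics_event = { ip : string; page : string; timestamp : float }

(* Level 2: backend tracking -- per-request behavior and performance
   data tied to an account. *)
type backend_event = { user_id : string; endpoint : string; latency_ms : int }

(* Level 3: front-end real-time tracking -- fine-grained interaction
   data that can reconstruct an entire session. *)
type frontend_event =
  | Click of { user_id : string; x : int; y : int }
  | Scroll of { user_id : string; depth_percent : int }
  | Keystroke of { user_id : string; field : string }
```

Seeing the event types side by side can help students articulate why anonymization and aggregation are easier to apply at the first level than at the third.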

    Module Assignment

This assignment tied the module material more closely to the course material.

The Philosophy for CS 51 is “Perfection is finally attained not when there is no longer anything to add, but when there is no longer anything to take away” (Antoine de Saint-Exupéry). From what we learned in lab, what new considerations might bear on whether there’s anything to add or take away? For example, what considerations would justify a company removing tracking products from their software implementation? What considerations could justify adding tracking products?

Lessons Learned

  • If possible, I would recommend tying the module material to the course material more directly. I do not have a specific recommendation, but one possibility is to find an example where Datadog would have identified “imperfections” of a kind students had already covered in the course.
  • Students enjoy learning about real-life applications.

Except where otherwise noted, content on this site is licensed under a Creative Commons Attribution 4.0 International License.

Embedded EthiCS is a trademark of President and Fellows of Harvard College.