Authors: Margo Boenig-Liptsin, Vance Ricks
Justice refers both to individuals receiving 'what they are owed' and to a social order that makes that possible. Equity is a related concept, focused on people having 'what they need', even if that differs from having 'the same' as everyone else. Issues of justice and equity intersect with computing in a variety of ways, and each intersection is both a reason for discussing justice and equity in relation to computing and an example of what to discuss. This section focuses on three areas: the classroom, the industry/discipline, and products.
To begin, there are justice implications in the structure of the computing classroom and in forms of computing pedagogy. Look around the classroom at the students in the seats. Who are they, and how did they get to be here? Social processes, like which K-12 schools receive what level of resources, and specific institutional dynamics, such as requirements for major declaration and limited class sizes, mean that some underrepresented and minoritized students lack opportunities to take certain computing courses or to major in computing-related fields. Next, we can look at what the students learn in the class. What is the format of their exams? Curved exams, for example, have been shown to have a disproportionately negative effect on women in STEM and to foster competition among students that hurts minoritized and underrepresented students (Hughes et al., 2014), which feeds back into the question of who attains a computing major. Another issue with the structure of classwork is the frequent absence of a culture of discussion -- of structured conversation led by a trained instructor as a means of learning by speaking and listening among peers. The absence of dedicated discussion time, or of trained instructors to lead challenging discussions, makes it harder to engage with issues of justice and equity or to make the classroom an equitable learning space -- one where everyone has what they need to learn.
These issues of justice in the computing classroom exist in part because, like American society at large, computing as a discipline and industry has a long history of largely unexamined systemic racism and exclusion. Computing is a powerful field that has been shaped by the personal, financial, and political capital of mainly white men, who have come to define the kinds of questions the field should address, the forms of knowing and attendant practices it specializes in, the work cultures that are rewarded, and which employees are deemed successful. Until recent work in the history of computing shone a spotlight on the role of women in the development of the field, stories of computing often framed the field as the creation of singular white men who serve as the model of the "innovator" and "entrepreneur." Meanwhile, computing suffers from systematic underrepresentation of women and people of color, both in the field of study and in the tech workplace, and is notorious for its "pipeline problem": the small number of women and people of color who are accepted into the major, and the even smaller number who go on to graduate with computing degrees and work in technologically advanced industries. The history of the field and current industry practices are not only an unavoidable and indispensable part of any conversation about justice, equality, and computer science; they also help connect computing-specific justice issues to broader questions of race and gender equity and to the history of social justice movements in the US and around the world.
Beyond the classroom and the discipline, issues of justice enter into technology products in ways that many students may already be familiar with from their everyday lives. These issues may be the environmental consequences of seemingly "clean" or resource-light industries, disproportionately borne by immigrant or lower-income communities, or the stark gap between the compensation, benefits, and labor conditions of the technology and IT workers who build technology platforms and those of the gig economy laborers whose work is organized through them. They also include discriminatory tools of surveillance and control that predominantly surveil and police communities of color, the unhoused, and the poor. Each of these issues of justice embedded in technology products has historical, structural, organizational, and technical design causes that a discussion focused on justice can help disentangle, so that students not only understand the particular issue but also learn a pattern they can apply to novel circumstances. Also see the related section on Structured Ways of Thinking about Computing and Society for more examples and ideas on how to proceed on similar topics.
A discussion of justice, equity, and computing should give time to each of these three areas (classroom, discipline/industry, product) and focus student attention on their interconnection. In addition, it needs to ground the discussion in the specific cultural contexts where these issues play out. For example, it is difficult to conceive of a conversation about any one of the above three themes without providing students with a primer on racism in American society and history and without a discussion of the Civil Rights Movement. For these kinds of foundational lessons, it is best to consult with faculty at one's institution who are experts in this area and to invite them to give guest lectures or, better yet, to co-design a justice-centered curriculum supported by theory and history. Also see the sections on Difficult Conversations and Working Across Disciplines. In discussing justice, equity, and computing outside of the United States, consider what local context students need to know.
Throughout, it is important to maintain a reflective stance on one's own place and standpoint. Encourage students to examine what they think and which aspects of their identity, experience, and training contribute to their perspective. For example, how are justice and equity mobilized within computing as a field through "fairness," bias mitigation, and differential privacy, and what are the opportunities and limits of these technical approaches in relation to broader social questions? This kind of reflection needs to be facilitated through a space for discussion in which students can explore their perspectives together in an environment that is safe for all.
Key Questions:
- What do the students know about justice in relation to computing? How do they know it (e.g., through other courses, through the media, from personal experience)? In order to have conversations about justice and equality in relation to computing, or in the computing classroom, it's important to meet students where they are. Some students may never have thought about computing as having justice dimensions or a history that includes injustices, while others may have an intuitive understanding of the inequities related to computing (whether as consumers or as students in a computing major). The history of the department and the institution is important for understanding what students are likely to know already and what the instructor can connect to, what they would find surprising or would require more time to understand, and what expectations they might bring into a discussion about justice and computing.
- How can the instructor meet the students where they are? If the students do not have much background in concepts of justice and equity, the instructor should provide a working definition of each. Students should be invited to articulate their own understanding of these concepts, share it with peers, and revisit these definitions often throughout the class. This activity makes students more invested in the conversation and provides an iterative process that resonates with computing students in particular.
- What conceptions of justice and of equity is the instructor already familiar with? This is different from the question above: rather than asking students how these concepts might apply in or to computing, it asks instructors to reflect on their familiarity with these concepts more generally. The instructor might be unfamiliar with "equity" as distinct from equality. The instructor might see that conceptions of justice, equity, and equality can overlap or be mutually reinforcing. It isn't necessary to reach a final consensus on the meanings of these terms, since even the "professionals" haven't necessarily done that! But the instructor needs to understand their own perspectives before trying to manage a class or department discussion.
- Where will these topics be discussed with students? Is the plan to incorporate discussions of justice and equity in a particular course, in a sequence of courses, or separately from courses in the curriculum?
- How will the instructor create spaces for difficult conversations that students can engage in and work through in a safe, non-judgmental, and supported fashion? The instructor may need to work with someone who is a professional in supporting such conversations, for example, staff in the Diversity, Equity, and Inclusion (DEI) office or an educational technology specialist focused on DEI. If no such person or office exists at the institution, the instructor could try reaching out to colleagues who regularly tackle such conversations in their own classrooms for advice and informal support.
- Which justice and equity topics will be covered? The instructor should examine the ways in which they will connect their course, discussion, or entire computing degree curriculum to one or more of these sites of (in)justice and (in)equity:
- The structure of the classroom and forms of pedagogy
- The history of computing as a field of study
- Computing as an industry
- Products/services of computing, and their effects in and on the world
- Connections to broader social, historical, and political contexts of justice and equity, such as histories of racism, sexism, discrimination, income inequality, activism for justice
- The individual designer's/programmer's/engineer's awareness of their own socio-political position and values
- Moving from awareness to action
The instructor should look over the syllabus for opportunities to integrate discussion of justice and equity, keeping in mind the above potential topics. The instructor may need to collaborate with a colleague from the humanities or interpretive social sciences (Philosophy; History; Anthropology; Science, Technology and Society; Geography) who has research and teaching expertise in issues of justice and equity. Also see the section on Working Across Disciplines.
- What project(s) or assignment(s) will give students a chance to incorporate insights and lessons learned from the discussions of justice and equity? Engagement can't end with in-class discussion. Students need assignments or projects later in their studies in which they practice making changes towards equity and justice. Such projects are often best achieved through teamwork involving community partners, with students prepared through training on what it means to engage responsibly and equitably with those partners (again, it is best to seek support for this training from colleagues with expertise in action research or community-engaged scholarship).
- What is the connection between discussions of general examples, or examples from elsewhere, and the department, institution, or community? Examining the history of one's own department and institution in relation to justice is one of the core tenets of anti-racist pedagogy (Kishimoto 2016), but it is also a useful technique for making the material resonate with students and for supporting them in identifying specific opportunities where they can intervene to make changes towards greater justice and equity.
Checklist
☐ Recognize that students will have variable degrees of ability, experience, and comfort with being able to discuss issues of justice and equity related to computing.
☐ Meet students where they are.
☐ Identify instructors familiar with justice and equity concepts and/or determine how instructors not familiar with the topics can be supported in leading a discussion.
☐ Identify target courses to incorporate justice and equity.
☐ Create spaces for difficult conversations.
☐ Identify justice and equity topics to cover in the course.
☐ Design projects and/or assignments that incorporate insights from justice and equity discussions in class.
☐ Connect examples to the department or local community.
Examples
- University of California, Berkeley
- Guilford College
- Haverford College
- Washington University in St. Louis
University of California, Berkeley
UC Berkeley's "The Value of a Home" homework assignment intended for the course DATA 100: Principles and Techniques of Data Science explores the promise and limits of data science tools to deliver more accurate and fair assessments of house values. It is based upon the open data and informational interviews with the Cook County Assessor's Office in Illinois.
Checklist walk-through: Below we show how some of the checklist items in the previous section of this document could be used at Berkeley. (The checklist did not exist when the assignment was designed and given. This walk-through is meant for illustrative purposes.)
- Recognize that students will have variable degrees of ability, experience, and comfort with being able to discuss issues of justice and equity related to computing.
- Our goal in developing the "The Value of a Home" homework assignment for Data 100 at Berkeley was to invite students to consider real-world data and computing projects that promise to deliver justice and equity, and to investigate how that outcome is difficult to achieve without understanding how systems of structural racism operate and have operated historically, and how it depends on the minute details of decisions and actions that students might take as data and computing professionals. We knew, however, that both students and instructors in this class were oriented toward teaching and learning technical skills such as linear regression and model fitting, and had no immediate interest in tackling questions like transparency, the fairness of models, or structural racism. So we started by choosing a publicly available dataset, from the Cook County Assessor's Office, that supports teaching the technical lesson while also opening up a rich real-world context (a minimal sketch of the kind of analysis the assignment builds toward appears after this walk-through).
- Meet students where they are.
- In the homework, we don't take any knowledge of the social context for granted. We take time to explain the context and point students to important terms like "transparency," "bias," and "fairness" to get them to start asking questions about these concepts. Most importantly, we try to situate students in the real-world example, giving them a "front seat" view of the problem and walking them through how the data and computing professionals they might aspire to become went about "solving" it.
- Identify instructors familiar with justice and equity concepts and/or determine how instructors not familiar with the topics can be supported in leading a discussion.
- We developed this homework assignment in a team of students and instructors who were trained in, and had experience teaching, STS, sociology, and history. The students who contributed to the project had real motivation to understand the history of redlining in the Chicago area and to uncover the history of the Cook County Assessor's program, and they did extensive research (historical, theoretical, and interview-based) to pull together the information for the case study.
- Identify target courses to incorporate justice and equity.
- In our case, we had a target course in mind from the start. We believe all of the core data science courses, and many of the computing courses, are ready for material on justice and equity to be integrated, in different ways of course. We picked a course whose teaching faculty we knew, and which the students who worked on the curriculum had taken themselves, so they understood what the assignment needed to do and how to supplement it.
- Create spaces for difficult conversations
- The completed homework assignment ends with many reflection questions that are meant less for thinking through on one's own and more for discussion with others. Luckily, the instructors of Data 100 are trying out new formats for the course this semester, including a mandatory discussion section and a "fireside chat" forum where a discussion guided by an instructor or trained TA can take place.
- Identify justice and equity topics to cover in the course
- In this homework, we wanted to foreground the concept of "transparency" and the related ideas of bias and fairness. These concepts are ubiquitous in computing and data science today, yet they carry different meanings and promises among different communities of professionals. The homework situates them in a real-world context where the struggle over property values is grounded in historical oppression; understanding that history, we want students to learn, is essential to working towards justice with algorithms.
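To make the technical core of the assignment concrete, here is a minimal Python sketch (not the actual Data 100 notebook) of the kind of analysis it builds toward: fitting a simple valuation model and then asking whether its errors fall evenly across neighborhoods. The file and column names ("cook_county_sales_sample.csv", "sale_price", "building_sqft", "land_sqft", "neighborhood") are hypothetical stand-ins for the real data.

```python
# Hypothetical sketch: fit a simple valuation model, then check whether its
# errors are distributed evenly across neighborhoods. File and column names
# are stand-ins, not the actual Data 100 materials.
import pandas as pd
from sklearn.linear_model import LinearRegression

sales = pd.read_csv("cook_county_sales_sample.csv")  # hypothetical file

features = ["building_sqft", "land_sqft"]
model = LinearRegression().fit(sales[features], sales["sale_price"])
sales["predicted_value"] = model.predict(sales[features])

# Assessment ratio: predicted (assessed) value over actual sale price.
# A fair model would show similar ratios across neighborhoods; a regressive
# one tends to over-assess cheaper homes and under-assess expensive ones.
sales["assessment_ratio"] = sales["predicted_value"] / sales["sale_price"]
print(sales.groupby("neighborhood")["assessment_ratio"].median())
```

Even a sketch this small surfaces the discussion questions the assignment raises: what counts as a "fair" distribution of errors, and who gets to decide?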
Guilford College
In the Fall 2019, Spring 2020, and Fall 2020 semesters, the introductory programming course (CTIS 210) included a lab assignment: “Do algorithms do the right thing?” Philosophy and CS instructors collaboratively designed the assignment.
Checklist walk-through: Below we show how some of the checklist items in the previous section of this document could be used at Guilford.
- Recognize that students will have variable degrees of ability, experience, and comfort with being able to discuss issues of justice and equity related to computing.
- For this lab assignment, we asked students to submit their recorded answers to selected discussion questions about their understanding of systemic advantage/disadvantage and fairness, and to explain how they connected those concepts to their data gathering and testing in the first part of the lab assignment. Assessments were based on the students’ ability to explain those connections.
- Meet students where they are.
- Since the phrases “systemic advantage” and “systemic disadvantage” might be unfamiliar to many people, we provided the following example to introduce some of the underlying concepts:
- Suppose that every grocery store puts the full-sized candy bars on the top shelves. It just so happens that the overwhelming majority of the shelf stockers are tall people, and they want to be able to easily reach their favorite full-sized candy bars. In this example, the widespread practice occurs because of what the people deciding where to stock the candy consider 'efficient' or 'normal'. They might or might not mean to make it harder for other people to see or get the full-sized candy bars. However, as a result of the widespread practice, tall people have a consistent advantage and shorter people (including people in wheelchairs) have a consistent disadvantage.
- Identify target courses to incorporate (discussion of) justice and equity
- The instructors decided to design this assignment for the required introductory course for three reasons. First, the CTIS 210 instructor has a longstanding interest in including ethics-focused assignments in their courses and in collaborating with colleagues in other departments, including Philosophy, to make that happen. Second, the instructors felt strongly about the importance of starting, in the students’ very first course, to accustom them to noticing and grappling with some of the ethical implications present in basic computing concepts and methods. Third, the assignment would serve as a proof of concept that students (and the course itself) benefit from such an assignment and that analogous assignments can be incorporated into intermediate- and advanced-level courses as students continue through the curriculum.
- Identify justice and equity topics to cover in the course
- The instructors decided to focus on the notion of fairness, as one that many people care a great deal about, and as one that naturally leads to reflection on the ways in which a programmer’s (perhaps unintended) conceptions of fairness, as expressed in the software they design, create, or maintain, can have demonstrable effects on individuals and groups. The lab assignment requires students to design a program to analyze a hypothetical set of job applicants, using the tools developed in previous labs. The design process required them to scrutinize their own assumptions about those who submit "non-standard" information in job applications, and to explain whether and why there may be systemic advantages or disadvantages that job application evaluation algorithms are likely to promote and amplify (a brief sketch of how such a screening rule can encode systemic advantage appears below).
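As a hedged illustration of the lab's core idea (not the actual CTIS 210 materials), the Python sketch below shows how a screening rule that treats a "standard" credential as the only evidence of qualification quietly excludes applicants whose qualifications are recorded differently. The applicant fields and thresholds are invented for the example.

```python
# Hypothetical illustration: a naive screening rule that assumes every
# qualified applicant has a listed degree silently filters out people
# whose qualifications are recorded in "non-standard" ways.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Applicant:
    name: str
    years_experience: int
    degree: Optional[str]          # None for non-traditional paths
    certifications: tuple = ()

applicants = [
    Applicant("A", 6, "BSc Computer Science"),
    Applicant("B", 9, None, ("Cloud cert", "Security cert")),  # self-taught
    Applicant("C", 2, "BSc Information Systems"),
]

# Naive rule: "qualified" means having a listed degree.
naive_shortlist = [a for a in applicants if a.degree is not None]

# A rule that also recognizes non-standard evidence of qualification.
inclusive_shortlist = [
    a for a in applicants
    if a.degree is not None or a.years_experience >= 5 or a.certifications
]

print([a.name for a in naive_shortlist])      # ['A', 'C'] -- B is excluded
print([a.name for a in inclusive_shortlist])  # ['A', 'B', 'C']
```

The point of the exercise is not that one rule is obviously correct, but that every rule encodes assumptions about whose qualifications "count," which is exactly what the lab asks students to surface and defend.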
Haverford College
We have integrated the ProPublica Machine Bias article and risk assessment analysis into our Introduction to Data Structures course (a traditional CS2 course using the Goodrich, Tamassia, and Goldwasser Data Structures and Algorithms in Java text). We introduced two multi-week programming assignments that are meant to be the first and second programming assignments in the course (after some initial small assignments to teach basic Java programming). The data structures goals are to introduce objects, classes, ArrayList, data structure design, and reading in data. In the first programming assignment, students create the data structures that will hold the per-person data representing a defendant from the ProPublica dataset. In the second assignment, students reproduce ProPublica's analysis showing differences in the risk assessment's error rates for white and Black defendants (a sketch of that comparison appears after the walk-through below). As part of this process, we incorporated the following checklist item.
Checklist walk-through:
- Identify instructors familiar with justice and equity concepts and/or determine how instructors not familiar with the topics can be supported in leading a discussion.
- One difficulty with introducing this assignment as a standardized part of our curriculum (no matter who teaches the course) was some professors’ concern that they didn’t have the depth of knowledge about the criminal justice system and risk assessments to appropriately introduce the topic in class. To aid them in situating the discussion in the appropriate societal context, we gathered and created videos and readings that can be shown in class to introduce the topic and assigned as after-class reading or viewing. Those materials, as well as the assignment, can be found here.
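The course itself is taught in Java, but the shape of the second assignment's comparison can be sketched briefly in Python. The column names follow the public ProPublica COMPAS data ("race", "decile_score", "two_year_recid"); the file name and the risk-score cutoff used here should be treated as assumptions, not as the course's exact specification.

```python
# A rough sketch of the error-rate comparison (the course assignments are
# written in Java; this Python version only shows the shape of the analysis).
import pandas as pd

df = pd.read_csv("compas-scores-two-years.csv")  # public ProPublica data
df["high_risk"] = df["decile_score"] >= 5        # one common "medium/high" cutoff

def false_positive_rate(group: pd.DataFrame) -> float:
    # Among defendants who did NOT recidivate within two years,
    # what fraction were labeled high risk?
    no_recid = group[group["two_year_recid"] == 0]
    return no_recid["high_risk"].mean()

for race in ["African-American", "Caucasian"]:
    subset = df[df["race"] == race]
    print(race, round(false_positive_rate(subset), 3))
```

In the actual assignments, building the per-defendant data structures first is what makes this group-by-group comparison possible, which is why the error-rate analysis is deferred to the second assignment.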
Washington University in St. Louis
At Washington University in St. Louis, we ran two fairness exercises with students in an introductory course:
- Cake cutting -- this protocol cannot be gamed, so it's an example of a provably fair algorithm. The students try to cheat each other out of a stylized cake, but no matter which color they claim as their favorite, if they follow the protocol of the algorithm, each side gets at least ½ of the cake (see the sketch after this list).
- Unfair auction -- here the students enact an auction in which the computer knows every player's bid and cheats to elicit the highest bid possible. Students discover the cheating after the auction is over, when they compare bids. This exercise is intended to show situations where a computer or algorithm could cheat.
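One common cheat-proof cake-cutting protocol is "divide and choose"; whether or not that exact protocol was used in the exercise, the hedged Python sketch below captures the property described above: each player who follows the protocol secures roughly at least half the cake by their own valuation. The cake, colors, and preference weights are invented for illustration.

```python
# Illustrative "divide and choose" over a stylized cake of colored slices.
# With discrete slices the divider's "half" is only approximate.
CAKE = ["red", "red", "blue", "blue", "blue", "blue"]

def value(piece, prefs):
    """Total value of a list of slices under one player's color preferences."""
    return sum(prefs[color] for color in piece)

def divide_and_choose(cake, divider_prefs, chooser_prefs):
    # The divider cuts at the point that splits the cake most evenly
    # according to *their own* valuation.
    total = value(cake, divider_prefs)
    cut = min(range(1, len(cake)),
              key=lambda i: abs(value(cake[:i], divider_prefs) - total / 2))
    left, right = cake[:cut], cake[cut:]
    # The chooser takes whichever piece *they* value more; the divider keeps the rest.
    if value(left, chooser_prefs) >= value(right, chooser_prefs):
        return right, left   # (divider's piece, chooser's piece)
    return left, right

divider_piece, chooser_piece = divide_and_choose(
    CAKE, {"red": 3, "blue": 1}, {"red": 1, "blue": 2})
print("divider gets:", divider_piece)
print("chooser gets:", chooser_piece)
```

Because the divider cannot know which piece the chooser will take, cutting as evenly as possible by their own valuation is their best strategy, which is why neither player can profitably lie about their favorite color.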
Resources
- Thomas Philip, "Analyzing our Classrooms as Sites of Ideological Contestation and Transformations for Discussing and Addressing Race and Racism," A talk presented in UC Berkeley's History Department, March 1, 2021.
- Maurianne Adams and Lee Anne Bell (Eds.), Teaching for Diversity and Social Justice, 3rd edition, 2016.
- Jessie Daniels, Mutale Nkonde, and Darakhshan Mir, "Advancing Racial Literacy in Tech: Why ethics, diversity in hiring and implicit bias trainings aren't enough," Data & Society, 2019.
- Kyoko Kishimoto, "Anti-racism pedagogy: From faculty's self-reflection to organizing within and beyond the classroom" Race, Ethnicity, and Education, 2016.
- Our Data Bodies project
- The Algorithmic Justice League
- Data For Black Lives (D4BL)
- The Digital Equity Laboratory
- Wendy L. Hill, "The Myth of the STEM Pipeline," Inside Higher Ed, October 2, 2019.
- UC Berkeley's Curriculum packages (Jupyter Notebook homeworks, mini-lectures, and discussion questions).
- UC Berkeley's Data 4AC: Data and Justice, being offered for the first time in Spring 2021 (Spring 2021 syllabus).
Related Pages
Margo Boenig-Liptsin (author)
Vance Ricks (author)
Ron Cytron
Sorelle Friedler