Teaching Responsible Computing Playbook

Learning Outcomes and Assessments

Authors: Margo Boenig-Liptsin

What do you hope that students will be able to do after taking your course? How will you evaluate how well they're able to do this? These are the twin questions of Learning Outcomes and Assessments. It is useful to consider learning outcomes at the beginning of course design or the design of a Responsible Computing intervention and work backward from them to inform the shape of the lesson. Responsible Computing-inspired learning outcomes and assessments can come in different shapes and sizes.

Some learning outcomes focused on mastery of understanding might be more micro and closely connected with the technical lessons. For example, you might want to teach students a bit about where the dataset they are working with comes from, or something about the history of the technique they're learning.

Other understanding-focused learning outcomes can be more macro, teaching students concepts from other disciplines that have a bearing on their technical learning. For example, this could include teaching them the concept of power or institutions and asking them to apply these concepts to their lessons.

Another set of learning outcomes can focus on ways of thinking, building capacities of reflection, or communicating about Responsible Computing issues with others. For example, by teaching students to pay attention to their expertise and the credibility and power that go along with it, students can gain confidence and experience verbally exploring these issues with their peers. By practicing expository writing at the scale of a paragraph or an essay, students can develop their capacity for writing analytically on relatively complex issues in ways that go beyond expressing a simple personal opinion.

Evaluating students' grasp of these different learning outcomes requires different forms of assessment. These may also be different in character or extent from what is used in other computing courses in your program. While a multiple-choice/short-answer exam graded along a rubric might work for more micro-focused learning objectives, more macro-focused objectives might require long-form writing where students apply concepts to new cases. Learning objectives that aim to teach students ways of thinking, reflecting, and communicating may need to be assessed continuously, in real time, as students participate in discussions with their classmates. Alternatively, these can be assessed through long-form writing or community-engaged projects, with appropriate guidance (e.g., help with how to identify a topic, think about the audience, write a thesis, or engage with a community partner while considering discrepancies in values and methodologies) and scaffolding of the work through smaller assignments and opportunities for feedback along the way.

Asking about learning outcomes and assessments is useful not just at the level of a single course, but also at the level of the program of study in which students encounter the course. Thinking at the scale of programs helps to recognize the ways in which learning outcomes of single courses connect to, and can be developed through, a series of courses. Keep in mind that learning outcomes that focus on higher-level skills, like the ability to engage critically with a specific technology or to learn a different way of thinking about problem-solving, require time, sustained immersion in the material, and the development of personal maturity. They are not something students can learn once and simply apply; students need to encounter them repeatedly, practice working with them, and develop other supporting capacities (such as an understanding of the history of a field or of race relations in the US, or the ability to read research articles). Finally, thinking about learning outcomes and assessments at the program level might also be useful for accreditation purposes.


Key Questions:

  • What will students be able to do after taking your course? What will students be able to do differently after engaging with a course that has a Responsible Computing component built in? In the context of integrating work in Responsible Computing into an existing technical curriculum, it may be more intuitive to start with the second variant of this question, which focuses on what including Responsible Computing in the course adds. Beyond just adding, it draws attention to how Responsible Computing learning is related to -- and transforms -- technical learning. For example, if a course on image recognition is used to introduce a lesson about the historical origins of facial recognition technology that traces back to the eugenic tools of phrenology, this may transform the way that students consider the broader purposes and social contexts of these algorithms' deployment.
  • What are the student learning outcomes and how do they relate to the program outcomes? At an initial stage, decide on micro or macro learning outcomes. Also, consider whether learning outcomes related to reflecting on and/or communicating responsible computing to others will be included. Once there are Responsible Computing-related student learning outcomes for a few courses, consider how these together form a broader program-level set of learning outcomes.
  • How will the team evaluate whether students have achieved the learning outcomes (and/or develop assessment rubrics)? Depending on how objectives are implemented, it may be necessary to come up with clear, reliable standards for assessing student learning through exercises, discussions, project work, or written assignments. If possible, specify levels of accomplishment in the assessment rubrics. Test these out for clarity with other potential members of your teaching team.
  • What forms of assessment are already used, and how might these need to be adapted to give students the appropriate ways to express their knowledge of the embedded Responsible Computing lessons? In addition to standard ways of evaluating code, it may be necessary to develop more open-ended writing prompts and rubrics (and training for instructors to grade them). Develop scaffolding to support student learning toward the outcomes, including possibly intermediate as well as final assessments.
  • What changes need to be made to the structure of the class in order to allow students to pursue more high-level and reflective learning outcomes as well as to effectively assess these? For example, if introducing a discussion component for students to discuss Responsible Computing issues with their peers, consider whether there is teaching staff (e.g., trained student peer instructors) to lead these discussions and support students in the development of their reasoning and communication.
  • What is the plan to get feedback from experts outside of the department (especially for macro learning objectives)? Consider showing the package of outcomes and assessments to a Teaching & Learning Specialist, a colleague, or a peer from a different department with relevant experience.


Checklist

☐ Set overall goals for your course or course module.

☐ Design an initial set of learning outcomes.

☐ Develop assessment rubrics.

☐ Develop assessment with appropriate scaffolding.

☐ Figure out if the course structure needs to change to support the planned assessments and learning outcomes.

☐ Develop a plan to get feedback from non-computing experts on planned learning outcomes and assessments.

☐ In implementing novel kinds of assessments, observe any difficulties and take feedback from students. Iteratively revise.


Examples

UC Berkeley

At UC Berkeley, we created a Jupyter Notebook homework, "The Meaning of Speech: Sentiment Analysis of President Trump's Tweets" that teaches students how to use sentiment analysis on a corpus of text while considering the contexts and meaning of speech and the opportunities and limits of abstraction. The homework is intended for UC Berkeley's course, DATA 100: Principles and Techniques of Data Science.

You can find this and other Jupyter Notebook homeworks with responsible computing components, discussion questions, and grading rubrics for the homework questions on UC Berkeley's Human Contexts and Ethics Curriculum Packages page.
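The core Responsible Computing points of that homework -- that a word's sentiment score ignores context, and that the choice of aggregation shapes the final score of a text -- can be illustrated with a toy lexicon-based scorer. The lexicon entries and function names below are illustrative sketches for this playbook, not code from the actual notebook:

```python
# Toy lexicon-based sentiment scoring. Illustrates two lessons:
# (1) a word's score is fixed by the lexicon and blind to context,
# (2) the aggregation choice (sum vs. mean) changes the result.
# The lexicon values here are made up for illustration.
LEXICON = {"great": 3.0, "win": 2.0, "sad": -2.0, "terrible": -3.0}

def word_scores(text):
    """Look up each token's sentiment; unknown words score 0."""
    return [LEXICON.get(tok, 0.0) for tok in text.lower().split()]

def sentiment(text, aggregate="mean"):
    """Aggregate per-word scores into one number for the whole text."""
    scores = word_scores(text)
    if not scores:
        return 0.0
    total = sum(scores)
    return total / len(scores) if aggregate == "mean" else total

tweet = "what a great win"
print(sentiment(tweet, aggregate="sum"))   # 5.0
print(sentiment(tweet, aggregate="mean"))  # 1.25

# Context-dependence: in a sarcastic utterance, "great" still
# contributes +3.0 -- the lexicon cannot see the sarcasm.
print(sentiment("great another terrible day", aggregate="sum"))  # 0.0
```

Even this small sketch gives students something concrete to interrogate: who chose the lexicon values, what meaning is lost when a tweet is reduced to one number, and why two aggregation choices can tell different stories about the same text.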

Below is a checklist walk-through of how we prepared this homework.

  • Set overall goals for your course or course module.
    • The overall course goals have mostly been to teach students the technical skills they need to perform data analysis using real-world datasets. The goal of our homework was to show the real-world implications of these data and computing projects in order to give students not only a richer understanding of Responsible Computing but also a more engaging way to learn the technical material.
  • Design an initial set of learning outcomes.
    • We designed the following learning outcomes for this homework assignment.
      • By working through this notebook, students will be able to:
        • Understand how technical data science tools can be applied to real-world problems and how they can be used to investigate and expand one’s understanding, in this case of politics and public communication.
        • Know what sentiment analysis is, how to do sentiment analysis, and understand its use in a social media context.
        • Recognize that the sentiment of a word is context-dependent and consider how this affects the sentiment score that a word is given.
        • Understand the significance of different aggregation and abstraction processes used in creating a sentiment score and their relationship to the meaning and tone of the text.
  • Develop assessment rubrics.
    • We had the homework assignment to work with. It was in the form of a Jupyter Notebook, where students typed their answers right in and undergraduate TAs graded them. Since the TAs who would be grading the homework aren't necessarily familiar with the Responsible Computing content, we provided them with rubrics describing what a student’s answer could look like. We explained what full credit and partial credit looked like, with the main emphasis on students giving the questions a try and demonstrating their thinking. For the students, we included hints following the questions in case it wasn't immediately clear what we were looking for in an answer.
  • Develop assessment with appropriate scaffolding.
    • The homework we developed walked students through the technical lessons and gradually built in a real-world context and more nuanced questions about the stakes for Responsible Computing. Discussion questions at the end of the notebook could be used in a conversational setting, for an in-person discussion led by the instructor that built on the hands-on material in the notebook.
  • Figure out if the course structure needs to change to support the planned assessments and learning outcomes.
    • We worked with the course instructors to give guest lectures that provided additional context on the Responsible Computing dimensions of the homework, advised the undergraduate TAs on how to grade the homework, encouraged instructors to create opportunities for students to discuss the homework's Responsible Computing dimensions (not just the technical lessons), and helped scaffold those discussions with suggested discussion questions. We have also talked with course instructors and Data Science administration about macro changes in the course, such as systematically teaching with the data science lifecycle, that can provide more opportunities to introduce Responsible Computing components into the course and make the best use of this homework.
  • Develop a plan to get feedback from non-computing experts on planned learning outcomes and assessments.
    • We repeatedly brought our homework to undergraduates who had taken the course before and to TAs and instructors who had taught it, and discussed it with program administrators who have in mind the entire arc of the curriculum. The more feedback like this, the better!
  • In implementing novel kinds of assessments, observe any difficulties and take feedback from students. Iteratively revise.
    • We've revised the homework each semester it has been offered, and this process has made it significantly more effective over time. The revisions address the specific needs of different instructors who teach the course, for example with regard to where in the semester the homework is given and which lessons come before and after it, changing formats of teaching (online vs. in-person), and changing social contexts (e.g., Trump's Twitter in Fall 2020, during the national election, had a different significance than in Spring 2021). The broad lessons about the Responsible Computing dimensions of sentiment analysis on speech remain the same, but the specific valences and political significance change and need to be openly acknowledged in the assignment, because they are foundational to how students will engage with the content.

Further, in a dedicated class to Responsible Computing topics that is a required component of the data science major, Data 104: Human Contexts and Ethics of Data, we use the "HCE Vignette" assignment as a final project for which students prepare throughout the semester.

The goals, description, and grading rubric for the assignment are included below.

HCE Vignette Assignment: Goals and Description

In lieu of a traditional research paper, the capstone project for this course is a 1,000-word “vignette”: a brief exposition of a specific ethical or social problem intrinsically related to data science, with practical intent. In essence, the vignette can be thought of as a kind of public service announcement, which enlists your budding expertise as a data scientist to help foster public conversation and democratic deliberation. It represents an opportunity for you to develop a crucial but often overlooked dimension of scientific expertise: the ability to communicate expert knowledge to non-experts to help shape—but not control—public conversation and effectively promote social change.

Although the vignette should be practical and “solution-oriented,” you are not being asked to offer your audience a simple solution to your identified problem. Such a task would exceed what you can realistically achieve in 1,000 words. Instead, the point is to clarify the most significant features of a relevant problem using the analytical tools learned in the course to highlight what you take to be the most promising avenues for collectively addressing it.

Shorter and less formal than a standard academic paper, the vignette nonetheless draws on substantial independent research, which you will conduct throughout the semester. With the guidance and help of your classmates and GSI, you will develop your vignette through a series of graduated stages: developing a topic, articulating a concise thesis, writing an annotated bibliography, and producing a rough draft for peer review.

Since the HCE vignette is an unusual format for a research paper assignment, we will ask that you choose one of three basic “templates” to follow to help you structure your vignette with a specific audience in mind:

  • Write an op-ed for the Daily Cal or another newspaper aimed at your fellow students or a specific public (suggestion: avoid writing to “everybody” or “the public”).
  • Write an internal memo to be circulated among employees of a particular company regarding that company’s work.
  • Write a policy paper that makes recommendations to government legislators or regulators.

Style guidelines for each option will be provided on bCourses. With GSI approval, you may also opt to construct your vignette according to an alternative format, which might better suit your topic of choice.

Examples of vignettes from previous semesters can be found on bCourses.

You may submit your final vignette for consideration for the UC Berkeley Ethics of Data Science Essay Prize, which will be organized in Spring 2020.

Steps to final Vignette and due dates

  • 1) Write a brief proposal (approximately 300 words) describing the topic you would like to write your Vignette about. Describe briefly what is at stake in the topic and why you are interested in it. Indicate who your potential audience might be, and which vignette template you think would be the best fit for your topic.

Due March 1 by midnight.

  • 2) Create an annotated bibliography with at least 10 sources relevant to your topic (newspaper articles about the issue; magazine articles; articles from academic journals; academic books). The annotated bibliography should consist of the bibliographical reference and a two or three sentence description of the work's main argument and how it relates to your thinking.

Due March 15 by midnight.

  • 3) Develop a thesis statement and an outline for the Vignette.

Due April 5 by midnight.

  • 4) Write a first draft of the Vignette complete with a bibliography.

Due April 18 by midnight.

  • 5) Peer-review of classmate's Vignette.

Due in section the week of April 20.

  • 6) Revised final Vignette

Due May 10 by midnight.

HCE Vignette Peer Review Questions and Grading Rubric

Georgia Tech

At Georgia Tech, we developed/adapted the following rubric to assess the ability of our students to work effectively with different types of communication.

Georgia Tech Student Learning Outcome (image)

University at Buffalo

At University at Buffalo, we developed a grading rubric to assess ethics-related assignments. Interestingly, when we started assessing student work, the rubric was not effective. The problem was not with the rubric or the student work, but that the assignments discussed ethical issues without requiring students to perform any analysis. We therefore used the rubric while rewriting existing assignments and creating new ones, to make certain they required students to respond to the ethical issues they raised. Once this was complete, in CSE 442 we presented the rubric to teach students the different criteria to consider when evaluating the ethics of a situation and, finally, to evaluate their responses.


Resources


Related Pages


Authors and Contributors

Margo Boenig-Liptsin (author)

Atri Rudra

Ellen Zegura