Towards Responsible Recommending
Recommendations to platforms and policymakers on how to move towards a more responsible recommending ecosystem.
Recommender systems are a core component of many large online platforms. They significantly shape people’s online (and even offline) experiences in highly automated ways, be it on social media, video streaming, or dating apps. But they are also drivers of online harms.
Much of the debate around holding platforms accountable for this focuses on mitigating negative outcomes — such as disinformation or hate — rather than the root causes, including the recommendation engines facilitating or reinforcing these outcomes. At the same time, these systems remain opaque and removed from public scrutiny. Public-interest researchers studying them struggle to gain access to high-quality data and have even been threatened by platforms. And users often lack the information and the means to effectively shape their experience on a platform.
This paper seeks to contribute to the debate by shifting the focus to recommender systems themselves. It proposes a comprehensive set of actions for platforms and regulators that together can form the basis of a more responsible and accountable recommendation ecosystem, concentrating on the largest and most impactful online platforms promoting user-generated content.
In light of this, we make the following recommendations to large platforms and policymakers:
Layered oversight and scrutiny:
1. Disclose detailed information on how recommender systems work and how they are operated, made accessible on a designated information page.
2. Provide detailed information about content demotion (or reduction) policies, including criteria for when content is demoted and descriptions of how such policies are implemented.
3. Report aggregate data on demoted content as part of existing transparency reporting formats.
4. Provide data and tools to analyze the most widely viewed and engaging content, pages, creators, and links.
5. Provide qualified independent researchers conducting research in the public interest with access to data and documentation, while adhering to data protection rules and principles as well as robust security protocols.
6. Create a safe harbor for researchers, civil society organizations, and journalists conducting public interest research in compliance with research ethics and data protection rules and principles.
7. Conduct or commission audits of recommender systems and publish audit reports.
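To make recommendation 3 above concrete, aggregate demotion reporting could take a form like the following sketch. This is purely illustrative: the schema, field names, and figures are hypothetical assumptions for this example, not any platform's actual reporting format.

```python
from dataclasses import dataclass, asdict

# Hypothetical schema for one row of an aggregate demotion-transparency
# report. All field names and values are illustrative only.
@dataclass
class DemotionReportRow:
    quarter: str                   # reporting period, e.g. "2024-Q1"
    policy: str                    # demotion policy invoked
    items_demoted: int             # pieces of content demoted under the policy
    median_reach_reduction: float  # median relative drop in impressions
    appeals_received: int
    appeals_upheld: int

def appeal_uphold_rate(row: DemotionReportRow) -> float:
    """Share of appeals that reversed a demotion decision."""
    if row.appeals_received == 0:
        return 0.0
    return row.appeals_upheld / row.appeals_received

# Example row with made-up numbers.
row = DemotionReportRow("2024-Q1", "borderline-health", 12000, 0.43, 300, 45)
print(asdict(row))
print(f"appeal uphold rate: {appeal_uphold_rate(row):.2%}")
```

Publishing such rows per policy and per quarter would let outside observers track how often demotion is used and how reliably decisions hold up on appeal.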
Informed and empowered users:
1. Allow users to better customize the recommendations or content displayed to them, by providing additional and more effective controls.
2. Enable users to exert better control over which of their data, including personal data, is collected and subsequently used to inform recommendations.
3. Allow users to opt out of personalized recommendations or to influence the degree of personalization at any point in their product experience.
4. Provide users with easily accessible and meaningful explanations of why specific content is displayed to them.
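Recommendation 4 above could be served by a structured "why am I seeing this?" payload exposed alongside each recommended item. The sketch below is a minimal assumption of what such a payload might contain; every field name, path, and signal is hypothetical, not an existing platform API.

```python
import json

# Hypothetical explanation payload for a single recommended item.
# Field names and settings paths are illustrative only.
explanation = {
    "item_id": "video-8931",
    "primary_reasons": [
        {"signal": "followed_channel",
         "detail": "You follow this creator."},
        {"signal": "watch_history",
         "detail": "Similar to videos you watched recently."},
    ],
    "personal_data_used": ["watch history", "subscriptions"],
    # Each explanation links directly to the relevant control,
    # tying recommendations 1-3 (controls, data, opt-out) together.
    "controls": {
        "mute_creator": "/settings/recommendations/mute",
        "reduce_topic": "/settings/recommendations/topics",
        "opt_out_personalization": "/settings/recommendations/opt-out",
    },
}

print(json.dumps(explanation, indent=2))
```

Pairing each stated reason with the control that can change it is what makes such an explanation "meaningful" rather than decorative.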
These recommendations are not meant to be seen as standalone quick fixes, but rather as complementary pieces that work together to tackle key problems at their systemic roots.