Part I: Inertia

An exploration of how individual data stewardship initiatives may fail.

New norms, initiatives, and technologies fail to take root, or fail to give individuals and communities sufficient protection or agency.

This part explores why individual data stewardship initiatives may fail, in the traditional sense: they don't attract users; they lack the power to achieve their goals; they are legally or financially untenable; they are conflict-riven; and so on. Many of these failure cases are not special: most organizations are susceptible to them, too. But that should not diminish their importance. A data stewardship initiative invites people to depend on it, and to trust it. The lessons extracted from an initiative's failure are ultimately of little use to the people who are let down or harmed by it.


A. Data stewardship initiatives fail from a lack of demand

No matter the form of an initiative — whether a trust, a co-op, a collaborative, or something else entirely — it will not succeed if it cannot attract participants. Many data stewardship initiatives face the dual challenge of attracting participants and educating them on how to benefit from (or mitigate the risks posed by) data. Beyond that, participants may be deterred by an unclear value proposition, a confusing design, or unreasonable expectations of their available time or expertise.

Story: A confusing consumer data trust

A non-profit builds a data trust for consumer-related data. The trust works on behalf of its members to secure better terms for member data, particularly related to consumer activity and targeted advertising.

Potential users aren't persuaded that the trust is beneficial, don't understand the benefits, or object to their data being sold at all. The trust struggles to find vendors interested in purchasing its data, and those that do buy continue to engage with other, exploitative data brokers. Consumer advocates object to the trust for normalizing the exploitation of information about a person's life and relationships for financial gain.

Story: An opaque privacy app

A non-profit builds an app that claims to help users protect their data from companies. The app consists of dozens of links to optimize privacy settings in the user's social media, email, and other online accounts, and notifications about where the user's personal information has been leaked online. Many users find the setup process overly time-consuming, and the long-term benefits of using the app are unclear. The app fails to build a trusted relationship with users, and most eventually delete it.

Story: Overwhelmed by decisions

An application is developed to give users agency over how their health data — from patient records to wearables data — is accessed and used. For legal and business reasons, the app does not make recommendations on whether users should accept or reject an individual request, or when they should negotiate better terms. Users are quickly overwhelmed by the volume and complexity of potential requests for their data. Some users see a request's presence on the app as evidence that it has been vetted for trustworthiness, and file lawsuits against the app's maker when a deal goes bad. Others begin to reject all requests that come in, or accept only requests that pay, however nominal the amount.


B. Data stewardship initiatives fail to overcome collective action challenges

Some data stewardship initiatives are built on theories of collective action: given enough users, collectives can negotiate better terms with platforms that handle personal data.

The success of this “data union” approach may be situational and small-scale: it works where the union has meaningful negotiating leverage, such as a monopoly over a data source or a user community, and where the union is negotiating with only a few platforms rather than many. At scale, however, data unions may share few of the strengths of a labor union while adding unique weaknesses. It is not clear that a data union can gain a critical mass of members to successfully negotiate with a platform, especially when large platforms have chosen to leave entire markets with unfavorable legal environments (e.g., Google News in Spain, Uber and Lyft threatening to leave California). And without that critical mass, it is difficult to attract new members.

Perhaps more importantly, at any scale, a data union may not be able to prevent its members from making side deals with platforms they need to use, undermining the union's negotiating power.

Story: Trading body privacy for affordable health insurance

Xander has Type 2 diabetes. He agrees to allow his insurance provider access to fitness tracker and medical device data in exchange for lower premiums, even though the insurance provider refuses to negotiate with Xander’s data union. When forced to choose between the data union and a service he needs, Xander leaves the union.


C. Data stewardship initiatives fail to win user trust

Some data stewardship theories of change rely on trust: whether in the expertise of a fiduciary, or in an individual’s ability to make informed decisions about how to protect their data.

Initiatives may fail to maintain a trusted relationship with their members. They may engage non-expert fiduciaries, or show bad judgment. Security breaches may further erode members’ trust in an initiative.

Data stewardship initiatives may go one step further: in addition to stewarding and protecting data, they may try to help people analyze data and make decisions about their lives. This risks overreach: a health data steward may not be the ideal health advisor.

To succeed as a movement, data stewardship initiatives need to educate the public on the benefits and risks of data, especially in an age of machine learning. Without that, even successful implementations may fail to counteract a growing public distrust towards data and those who use it.

Story: Data leaks lead to real-world harm

A local government purchases ride-sharing data from a driver-owned co-op. A mistaken response to a FOIA request exposes individual ride data to the public. After a wave of stalking and domestic violence incidents, the data-sharing program is shut down.

Story: Misinterpreted data

A patient-run data co-op collects sensor readings from fitness trackers and other health wearables to build a crowd-sourced, privacy-protecting fall detection notification system. Although the co-op is a responsible steward of the data it collects, it produces a poor-quality fall detection system. After several publicized incidents of falls that do not trigger the detection system, health officials begin warning people not to use the tool, and users begin abandoning the co-op.

Story: Surveillance without representation

A labor union encourages members to install a monitoring application on their phones, promising to use the data collected to improve their negotiating position with employers. The union invests money and time in maintaining the application and the attendant data, at the cost of grassroots, person-to-person organizing support. Ultimately, much of the data collected by the monitoring application goes unused, harming workers’ faith in the union.


D. Data stewardship initiatives are financially unsustainable

The promises a data stewardship initiative makes are only as good as its long-term financial health. Without good planning, a stewardship initiative’s collapse or closure can harm the very populations that depend on it.

Story: Voice data up for grabs after collaboration falls silent

A consortium of local organizations collaborates to build and maintain a voice data set for African languages. The consortium develops text-to-speech and speech-to-text APIs that are affordable and have a well-scoped acceptable use policy.

Because the datasets and software used to create the APIs are open and publicly available, competing speech recognition APIs arise that are targeted at the most lucrative use cases. Eventually, the consortium runs out of money to maintain and steward the datasets, and the project is abandoned. Some of the language datasets are adopted by corporate sponsors, who use them to develop proprietary tools. Others are left online for anyone to use, for any purpose. No one is left to enforce the acceptable use policy. A European company uses the data to develop and sell localized voice surveillance tools to governments.

Story: A failed patient hub

A foundation funds a patient advocacy group to build a shared hub for members’ health data. After the seed funding runs out, the patient advocacy group cannot raise additional money to develop the hub or pay for its hosting. Eventually, the group’s only option is to get funding from a health insurance company. Although the insurer does not explicitly request access to the group’s data as a condition of funding, the group’s members, many of whom have spent years fighting with their insurers over coverage, become wary of contributing to the hub.


E. Data stewardship initiatives are unable to resolve conflicts among stakeholders

Like other multi-party initiatives, data collaborations can fall apart when members cannot reconcile different or opposing goals, and are unable to negotiate terms for the production, use, and control of data and its derivatives.

Story: Should a neighborhood watch?

A neighborhood data commission fractures over a request to make camera, microphone, and sensor data available to a commercial vendor selling gunfire detection software to the city and its police department. After the request is narrowly voted down, individual business owners and residents install cameras and microphones in their own windows, and attempt to make that data available to the vendor.


Notes and further reading

Arguments in 1A, 1C, and 1E are inspired in part by Catherine Leviten-Reid and Brett Fairbairn, "Multi-stakeholder Governance in Cooperative Organizations: Toward a New Framework for Research?" Canadian Journal of Nonprofit and Social Economy Research, Vol. 2, No. 2, 25-36 (Fall 2011).

Leviten-Reid and Fairbairn review the available empirical evidence on multi-stakeholder cooperatives, which they argue offers some indication of what makes governance successful. Citing Ostrom, they highlight four processes that contribute to successful governance: 1) that all actors are involved, or have the opportunity to be involved, in rule-making; 2) that individuals have a clear understanding of the system they are a part of and the rights and obligations they have; 3) that conflicts are resolved quickly, due to in-built mechanisms for identifying and addressing them; and 4) that members perceive a fair relationship between what they invest and the extent to which they benefit.

The examples in 1A about users being overwhelmed are inspired in part by the WEF's report "The Internet of Bodies is here: tackling new challenges of technology governance." More generally, the report explores how body-connected sensors and devices raise new governance challenges. The privacy app example in 1A is also inspired in part by Kate Cox’s article in Ars Technica, “Unredacted suit shows Google’s own engineers confused by privacy settings.”

The argument in 1B is inspired in part by the work of Mancur Olson and others, who discuss the challenges of organizing large groups of people with diffuse interests, especially when pitted against smaller groups with more acutely defined interests. See also Amy Kapczynski, The Access to Knowledge Mobilization and the New Politics of Intellectual Property, 117 Yale Law Journal 804, 811 (2008).

The fall-detection example in 1C is adapted from M. Schukat, et al., "Unintended Consequences of Wearable Sensor Use in Healthcare," Yearbook of Medical Informatics (2016).

The examples in 1D are inspired in part by the RadioShack bankruptcy case, where the company attempted to sell consumer data despite promising that it would never do so. Intervention from the FTC and 38 states ended up limiting the extent of the deal.

The example in 1E is partially a riff on controversies related to Ring, Nextdoor, and ShotSpotter.


This study forms part of a collaborative research series by Mozilla Insights called Data for Empowerment. The research informs the work of the Data Futures Lab.
