This is a blog post by Anouk Ruhaak, a Mozilla Fellow embedded at AlgorithmWatch. Learn more about Mozilla Fellowships.


Why call a doctor when you can ask the internet whether that lingering headache is in fact the first sign of your impending demise? Since the invention of sites like WebMD, our inner hypochondriacs have flourished. Unfortunately, panic-fueled health enquiries are not as private as we might have hoped. A recent investigation by the Financial Times shows how WebMD, and services like it, share your search data with ad companies like Google, Microsoft and Facebook. The good news? This type of non-consensual data sharing and selling is illegal, at least in Europe. The bad news? Google holds a lot more health data than just the search queries you enter online.

In fact, whether you search your symptoms online or consult a doctor in person, your data may end up with Google either way. Earlier this month, the Wall Street Journal reported that Google Health has acquired health records from millions of Americans without their knowledge, let alone their consent. It received data from 2,600 hospitals, doctors' offices and other facilities. Data on a patient's symptoms is sent to Google and used to make treatment recommendations. In addition, Google is reportedly developing software and training machines to provide better recommendations. Neither doctors nor patients were made aware of the data sharing, which, incidentally, appears to be legal.

Why are the institutions we rely on in our time of need — the very institutions that should have our back — so eager to hand Google the keys to the kingdom? My guess is that the promise of better data infrastructure and better health outcomes is too good to pass up. Google itself says its goal is ‘improving outcomes, reducing costs, and saving lives’. But how can we trust that Google will live up to these goals and not, as critics warn, exploit our data for other purposes?

After all, Google has a history of breaking promises. In 2017, DeepMind, another Alphabet company, came under attack after the UK’s National Health Service (NHS) illegally transferred 1.6 million patient records to the company. The data was used in the development of DeepMind’s Streams app, which aimed to help medical staff diagnose patients faster and plan treatments more efficiently. In an attempt to wiggle its way out of the controversy, DeepMind promised never to link patient data to its other services or integrate it with the Google ecosystem. One year later, that promise was broken when the company was taken over by Google Health. The Streams app is now a Google product.

And yet, five out of the six UK hospitals involved in the initial scandal recently renewed their data-sharing agreements with Google. They cite cost reductions and speedier diagnosis of eye diseases as reasons for the decision. Clearly, our health data holds more than just monetary value. Whether that value will continue to benefit society remains to be seen. TechCrunch reports that the Streams app services were made available to the participating hospitals for free for the first three years, but it’s unclear what happens after that.

Trading highly sensitive data for the uncertain promise of better healthcare is a Faustian bargain and one we should not have to make. There is no reason why we cannot make the health benefits of our data available to society at large, without surrendering control over our privacy in the process.


One proposal is to look at data trusts as a way to claw back some control. Instead of handing the data over to Google, we would place it in a data trust. The data would be governed by a board of trustees with a fiduciary responsibility to look out for the interests of the trust’s beneficiaries, for instance the patients. The data trust then licenses the data to third parties, like Google, but can attach very specific conditions on what data can be accessed, how that data may be used and what requirements are placed on the products built on the data. Moreover, should a company violate the terms of its license, its access to the data could be revoked immediately.

In short, placing data in a data trust would mean we do not need to trust that Google will do the right thing; we can demand that it does. Until then, there is one thing we can be certain of: Google will do what is good for Google.

