All data is health data.

Estimated reading time: 5 minutes

Canny readers of my book will have figured out that the Postscript, formally titled “Privacy and health data”, was a coded message. We used those pages to talk about one thing, but we were actually talking about something else.

On the surface, the chapter is about the lessons that the pandemic taught us about the uses and misuses of health data, and what those lessons mean for the future of our work on the web. And in some ways, it is. But anyone who looked closely at Espen Brunborg’s brilliant illustration for that chapter will have sussed its real meaning.

Screen cap of the frontispiece to the Postscript.

You see, that’s a bunch of eyes looking at a covid test: a visual representation of how our private health data became intimately public during the years we all lost, together.

Until you realise that isn’t a covid test, is it.

It’s a “pee stick,” the little plastic wand that changes your life.

It’s a pregnancy test.

Because the chapter is actually about reproductive health data, and the political climate it lives in.

It just doesn’t say so.

And this is why I wrote it:

I cannot hope to address all the privacy risks of the reproductive health climate in the US, and the role that you play in helping to mitigate those risks. But I will draw your attention to just one illustrative example.

This is a podcast episode of Slate’s What’s Next TBD, which I need you to listen to as soon as you can. (There’s a transcript if you can’t.) The episode discusses a case involving two women in the US state of Nebraska who are being prosecuted for arranging an illegal abortion. The two women are a mother and her 17-year-old daughter who bought pills online. And the prosecutor found out about it not by reading a public post on social media, or by accessing the data on an ovulation tracker app.

He found out by subpoenaing Meta for the contents of the private Facebook DMs in which mother and daughter discussed the unwanted pregnancy.

By the way, this was before Roe v. Wade was overturned.

The podcast features the excellent Guardian tech journalist Johana Bhuiyan, who previously wrote about the case here. Motherboard published the legal filings here.

The podcast also offers other examples of how data has been weaponised against women protecting their reproductive health, even in cases of natural miscarriage, which in many US states is now treated as presumptive murder; ta, Savita.

That’s heavy stuff to take in, and you should take as much time as you need to do so.

The point that the podcast makes about the Nebraska case is that the prosecutor did not subpoena what we might term explicit health data, such as the kind I discuss in the book’s Postscript. He subpoenaed a private chat between a mother and her child. And Meta were legally obliged to hand it over.

What does all of that mean for your work on the web?

I want you to take away these lessons from that podcast episode, that wider story, and what you learned in the book’s Postscript:

  1. All data is health data. It is merely contextual. So you need to think of all of your data inputs, collections, and sharing, in that contextual way. If you are serving vulnerable users, their health data is not $condition or $symptom or $treatment; their health data is Google searches, private messages, location data, and intimate conversations recorded on a smart speaker.
  2. All users are vulnerable.
  3. Whatever work you do to safeguard contextual health data may seem futile, given how much contextual data brokers already have, and how much of it they exchange for profit. All you can do is remember that the less data you create, retain, and share in the first place, the less data they have to exploit.
  4. Your threat modelling for your data set should involve the type of threat actor described in the podcast: an aggressive prosecutor who loves filing subpoenas as much as he loves Jesus Christ, and can cloak his vindictive fundamentalism in the guise of “hitting back at Big Tech”, the go-to respectability excuse of the hard right.
  5. Along those lines, if you need to think of your approach to privacy in terms of minimising exposure to coordinated external litigation threats as opposed to doing the right thing by your users, if that is the approach you need to take in the volatile cultural and political context you work in, go for it.
  6. At the point where this starts to become a problem for you, commit to publishing transparency reports which note how many law enforcement requests you have received for your users’ data, and on what grounds. Once word gets around your users, networks, and frenemies that this is a problem – illustrated with hard figures and data – the balance of power can slowly shift.
  7. In the long term, you need to be as creative in your uses of law and policy as people like him are about exploiting its gaps and omissions. At the very least, defend end-to-end encryption; as an extension, demand the strict regulation of data brokers.
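The first and third lessons above – treat every input as potential health data, and minimise what you create and retain in the first place – can be sketched in code. This is a hypothetical illustration, not anything from the book: the field names, allow-list, and retention window are all invented for the example.

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

# Explicit allow-list of fields the feature genuinely needs.
# Everything else is dropped before storage, rather than relying
# on a block-list of fields you already know are sensitive.
ALLOWED_FIELDS = {"user_id", "timestamp"}

# Hypothetical retention window; data past it is deleted, not archived.
RETENTION = timedelta(days=30)


def minimise(event: dict) -> dict:
    """Keep only allow-listed fields before the event is stored."""
    return {k: v for k, v in event.items() if k in ALLOWED_FIELDS}


def expired(stored_at: datetime, now: Optional[datetime] = None) -> bool:
    """True once a stored record has outlived its retention window."""
    now = now or datetime.now(timezone.utc)
    return now - stored_at > RETENTION


event = {
    "user_id": "u123",
    "timestamp": "2023-01-01T00:00:00Z",
    "search_query": "pregnancy symptoms week 6",  # contextual health data
    "location": (41.2565, -95.9345),              # contextual health data
}

stored = minimise(event)  # only user_id and timestamp survive
```

The design choice worth noting is the allow-list: a block-list forces you to anticipate every field that could become health data in context, which lesson 1 says you cannot do.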

Editorial choices and life goals

So in the end, why didn’t we mention the thing we were actually writing about, in the book what I wrote, by name?

For one, I wrote the book to be as general as possible, as opposed to being a topical response to a topical issue. I didn’t want it to be “the book about X privacy problem”, because it isn’t.

For another, the book is written for a global audience; anyone outside the US (hello!) will tell you how frustrating it is to deal with material written from a myopic American viewpoint, one which is barely relevant to our own situations, on a daily basis.

And third, given the vitriol that arises over the situation, I didn’t want the good folks at Smashing dragged into deranged online harassment, and their work decried as the literal deeds of Satan.

Nor did I want to get involved in some controversy over my book being banned.



Oh hell yes I do.

After all, I am who I am today because I had a feisty US high school teacher who would collate lists of books that had been banned in conservative, Republican, and/or racist states, and tell me that those were the very books that I needed to read. She was right. Those lists sent me down rabbit holes that I’m still in today for both fiction and nonfiction reading. (As an aside: the other day, my local village library here in Scotland set up a display with a fresh copy of Zora Neale Hurston at the top, and my word that made me happy.)

So in my heart of hearts, nothing would make me happier than for my book to be banned by some fundamentalist “Moms for Statutory Rape”-type group over that Postscript, after they hear (without reading it) that it explains how to prevent reproductive data from being transferred to authority figures who know best, in ways that might prevent young women from pushin’ out babies for Jesus from elementary school onwards.

Now that’s a career goal. At least it’s mine.

Other women with strong feelings on this, well, they have different career goals. Especially for their own children.

The Author

I’m a UK tech policy wonk based in Glasgow. I work for an open web built around international standards of human rights, privacy, accessibility, and freedom of expression. The content and opinions on this site are mine alone and do not reflect the opinions of any current or previous team.