

Data privacy

Imagining the unforeseen dangers of data harm

For author Cory Doctorow, the impact on an organization’s insurability might hold the answer to effective data privacy management.

Novelist, media commentator and technology activist Cory Doctorow is at the forefront of today’s writers and thinkers exploring the multi-faceted ramifications of digital life. His technology credentials alone would make any cyberpunk proud: a special consultant to the non-profit civil liberties group Electronic Frontier Foundation, which focuses on technology law and policy; a former Fulbright Chair at the Annenberg Center for Public Diplomacy at the University of Southern California; and an MIT Media Lab Research Affiliate.

It is his work as a writer, however, for which Doctorow is perhaps best known. The author of the acclaimed New York Times bestselling novel Little Brother, a story about the perils of post-9/11 government surveillance, he is also known for the influential non-fiction book Information Doesn’t Want to Be Free, which deals with laws and creative rights in the era of the Internet. We asked Doctorow for his insights on how data privacy is forcing the regulatory, commercial and legal landscape to evolve in response; he obliged us with a candor likely to provoke discussion and no small amount of consideration.


The interview below has been edited for length and clarity. An expanded version of the conversation is available as audio.


ANSWERS: There is a big push around privacy regulation with GDPR, and in the United States we’re seeing a growing chorus of voices around social media platforms and the handling of personal data. Do you feel we can effectively reconcile privacy at the regulatory level?

CORY DOCTOROW: I subscribe to Larry Lessig’s idea that we regulate not just by the laws that we make but by the technologies that are available to us, by the norms that we share with one another and by the companies that get funded. That it’s code, laws, markets and norms that ultimately end up regulating our behavior. I don’t think that law on its own is going to suffice.

For one thing, I think that there is a really big danger with law because the people who get a seat at the table when you regulate are the people who are already in business. The people who don’t get a seat at the table when you regulate are the people who haven’t started a business yet.

One of the ways that we get there is through ensuring that there is some platform for competition. Another way we might get there is by having legal intervention to protect privacy. You could get a lot of privacy regulation out of the market just by creating a regime with statutory damages for breaches.

What if you just said that when you breach, the damages you owe to the people whose data you breached cannot be limited to the immediate, cognizable consequences of that one breach, but instead have to recognize that breaches are cumulative? That the data you release might be merged with some other set that was previously released, either deliberately, by someone who thought they’d anonymized it because key identifiers had been removed that your breach now adds back in, or accidentally, through another breach? The merger of those two might create a harm.

For example, imagine a release of data about prescriptions written by doctors. There are no identifiers for whom the prescriptions were written, but you have the date, time and hospital for each one. Then Uber has a breach that releases all of the rides in the area covered by that prescription data.

Now you can re-identify a huge number of those prescriptions. A database of people’s rides might not create any immediately apparent harm on its own, but merged with that NIH or NHS database it suddenly becomes incredibly toxic and compromising.
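As a rough illustration of the kind of linkage Doctorow describes, the minimal Python sketch below joins a de-identified prescription log with a leaked ride log on place and time. The column names and records are entirely hypothetical and invented for this example; the point is only that two datasets that look harmless on their own can identify people once merged.

```python
# Minimal sketch of a linkage ("re-identification") attack of the kind described
# above. All column names and records are hypothetical, invented for illustration.
import pandas as pd

# De-identified prescription log: no patient names, but date/time and hospital remain.
prescriptions = pd.DataFrame({
    "hospital":  ["St. Mary's", "St. Mary's", "County General"],
    "timestamp": pd.to_datetime(["2018-03-01 09:05", "2018-03-01 14:40", "2018-03-02 11:15"]),
    "drug":      ["antiretroviral", "antidepressant", "insulin"],
})

# Leaked ride log: rider identities plus drop-off location and time.
rides = pd.DataFrame({
    "rider":     ["Alice", "Bob", "Carol"],
    "dropoff":   ["St. Mary's", "County General", "St. Mary's"],
    "timestamp": pd.to_datetime(["2018-03-01 08:55", "2018-03-02 11:05", "2018-03-01 14:30"]),
})

# Join on place, then keep pairs where the drop-off happened shortly before the
# prescription was written. Neither dataset is identifying on its own; merged,
# each surviving row links a named rider to a sensitive prescription.
merged = prescriptions.merge(rides, left_on="hospital", right_on="dropoff")
window = pd.Timedelta(minutes=30)
candidates = merged[
    (merged["timestamp_x"] - merged["timestamp_y"]).between(pd.Timedelta(0), window)
]
print(candidates[["rider", "drug", "hospital", "timestamp_x"]])
```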

If, for example, we said, “Okay, in recognition of the fact that once that data is released it never goes away, and that each time it’s released it gets merged with other databases to create fresh harms that are unquantifiable in this moment and should be assumed to exceed anything immediate we can put our finger on, you have to pay fairly large statutory damages if you’re found to have mishandled data.” Well, now I think the insurance companies are going to do a lot of our dirty work for us.

We don’t have to come up with rules. We just have to wait for the insurance companies to show up at these places that they’re writing policies for and say, “Tell me again, why we should be writing you a policy when you’ve warehoused all of this incredibly toxic material that we’re all pretty sure you’re going to breach someday, and whose liability is effectively unbounded?” They’re going to make the companies discipline themselves.

That’s a very powerful privacy regulator that doesn’t ever mention a regulation, that hardly mentions the word privacy and puts almost no brakes on what the companies can do, but instead just asks them to solve the problem of not harming us as opposed to solving the problem of which privacy regimes are good and bad.

ANSWERS: Some have suggested, on the topic of privacy and insurance, that there’s also a concern that if insurers get more data about you, they can use it against you in the form of denials of coverage. But you’re suggesting that insurance companies will be watchdogs for us, because they will be able to look at data handlers and question why they have that data. Is that correct?

DOCTOROW: Yes. Ultimately, this is about externalization. The reason that companies are so cavalier with our data, the reason they collect so much and retain it for so long is because the negative externalities of that data handling or the risks associated with that data handling are borne by the wider society. Identity theft costs individuals huge amounts of money to remediate; it costs society huge amounts of money to ameliorate. Those are borne by us as a body instead of being borne by the entities whose recklessness gives rise to them.

In a regime in which those companies have to internalize those risks, either their shareholders or their insurers will discipline them because they don’t want to be left holding the bag. It’s like pollution. One of the ways that we can discipline companies about pollution is by giving the people who are negatively affected by pollution a cause of action that they can bring in court against those firms, and particularly if that cause of action includes cost recovery for the counsel that represents them, then you completely invert the economics of power abuse in legal regimes. Right now, the more money you have as a polluter, the harder it is to sue you because the better lawyers you can afford.

If there’s cost recovery built into a liability regime, then the lawyers who might potentially sue you will just float their clients. They’re not going to bill their clients a nickel because the deeper your pockets are, the more paralegals they can throw at the problem of figuring out how to sue you into a smoldering wreck because they know they can get the money out of you once they win.

This is why tort reform is such a darling issue for the Chamber of Commerce and other representatives of big business. They recognize that in a regime in which no-win, no-fee attorneys can recover their costs, it is worth those attorneys’ while to spend unlimited amounts of money suing very large companies that have nearly unlimited amounts of money to recover once they secure a victory for their clients. It would make the biggest, most abusive actors the most disciplined and cautious about how they use their data.

Cory Doctorow, novelist, media commentator and technology activist.

ANSWERS: Some have voiced a concern that with the advent of subject access requests, you may see an influx of class action lawsuits. Do you see that becoming a large-scale problem that GDPR and some of these other data privacy regulations might bring about?

DOCTOROW: I wouldn’t necessarily call it a problem. I think you’re right that there will be a fair whack of that in our future. There’s this analogy I’ve been working on, and it’s a bit imperfect, but imagine that someone is sitting around thinking about the pile of oily rags in their garage that they’ve been meaning to do something about. They say, “Oil itself is very valuable. Maybe I can extract some oil from my oily rags and sell it and make some money.” They go off and realize that, in fact, they can.

It’s not much oil, but then they get to thinking, “Everybody has some oily rags lying around. Provided that I didn’t have to take any particular precautions to stop them from bursting into flame, I could probably get a lot of oily rags in one place, extract a kind of crude that has some marginal value at an operating cost lower than that value, and start to turn a profit.”

Taking this further, it turns out that we have a lot of oily rags out there, and the person claims that they’re providing a service and helping people out. They’ve created a new industry, they’re creating jobs, they’re providing liquidity in the oil market and they’re doing all kinds of amazing things that you wouldn’t otherwise have gained, except that they figured out how to make money from oily rags.

Then you start getting giant, out-of-control fires. Society says to them, “Guys, you can’t just store unlimited mountains of oily rags wherever you want and then expect us to put out the fires. You either have to take some precautions or pay the cost of the fires.”

They reply, “But we won’t have an oily rag industry if we don’t have the right to pile up this stuff without any regard for the consequences.” We might say to them, “In that circumstance, I guess you don’t have an oily rag industry. You never really had an oily rag industry, because it was only sustainable by putting the costs onto us instead of bearing them internally.”

I remember when I was a systems administrator: log files were not things that you saved indefinitely to mine for market intelligence. They were things you had to remember to delete, because otherwise your server would fill up and crash. It was obvious that log files contained important insights about your website and its users. It was also obvious that they were potentially very risky to aggregate and to share around. It took a certain kind of person, with a certain reckless disregard for the potential long-term harms of allowing these log files to pile up instead of throwing them away, to create this new surveillance business model. No one came down off a mountain with two stone tablets saying, “Stop rotating your log files and start mining them.”
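By way of contrast with the warehouse-everything model, here is a minimal sketch of the older “rotate and delete” discipline Doctorow is recalling, using Python’s standard logging library. The file name and the seven-day retention window are arbitrary assumptions for illustration; the point is simply that old logs are discarded on a schedule rather than accumulated for mining.

```python
# Minimal sketch of scheduled log rotation: old files are deleted automatically
# instead of piling up. File name and retention period are illustrative choices.
import logging
from logging.handlers import TimedRotatingFileHandler

# Rotate at midnight and keep only the last seven days of logs.
handler = TimedRotatingFileHandler("access.log", when="midnight", backupCount=7)
handler.setFormatter(logging.Formatter("%(asctime)s %(message)s"))

logger = logging.getLogger("webserver")
logger.setLevel(logging.INFO)
logger.addHandler(handler)

logger.info("GET /index.html 200")  # written today, discarded after a week
```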

It’s a practice that’s only a decade or so old. It may be that, just like radium suppositories and asbestos insulation and all the other things that seemed like a good idea at the time, 10 years from now we’ll look back on it and say, “Do you remember when we didn’t throw away our log files? That was dumb.”

In that case, would it be a problem to have a regime that makes firms internalize the cost of what is an otherwise reckless activity? Not really. To me, that might be a feature.


Learn more

For additional content concerning the use of personal data in the digital age, be sure to explore the rest of our multimedia series: A new dawn for data privacy and transparency.

