Last year, the Supreme Court's ruling in Dobbs v. Jackson marked a drastic change in US constitutional law with regard to abortion rights.
The decision itself got plenty of media coverage worldwide, but the privacy crisis it ushered in largely flew under the radar outside the US. So here is what is going on in the US after Dobbs v. Jackson, and what the EU can learn from it.
- Dobbs v. Jackson is a privacy crisis
- Big Tech is not helping
- HIPAA is not enough
- What is the US doing to control the damage?
- What can Europe learn from this privacy crisis?
On 24 June 2022, the U.S. Supreme Court decided the Dobbs v. Jackson case. In doing so, it overturned Roe v. Wade, a 1973 precedent that protected the right to abortion in the U.S. As a result of Dobbs, the Court now holds that the US Constitution does not protect the right to abortion and that States are free to regulate the matter as they please.
The controversial ruling opened the floodgates to a wave of anti-abortion legislation in conservative States. A year after the decision, about half the States have legislation that limits or bans abortion and, in some instances, criminalizes abortion seekers and those who provide help.
The decision was harshly criticized by governments, international organizations, and many voices from academia and civil society. A letter signed by almost 200 NGOs highlights Dobbs' harsh impact on women's rights and bodily autonomy, as well as its disproportionate impact on already disadvantaged communities.
Dobbs v. Jackson is a privacy crisis
Dobbs v. Jackson dealt a blow to women’s rights and autonomy, and also ushered in a large-scale privacy crisis.
Law enforcement in conservative States is currently using women's digital footprints to prosecute abortion seekers, including location data, Google searches, and chats with family members. Women's data are either obtained through a legal mandate or simply bought on the market; the latter is all too convenient, as it requires no involvement from a court of law. Even civilians sometimes buy these data to report abortion seekers to the authorities and cash in on the bounties offered by some States.
How is such a dramatic privacy crisis possible in a first-world country?
A key issue is that the US has no federal data protection law but only laws for specific sectors such as health care and finance, along with State legislation such as California’s CCPA and Colorado’s CPA. A federal privacy law (the American Data Protection and Privacy Act) has been proposed but is nowhere close to being finalized.
Without general privacy protections at the federal level, the online privacy of most US citizens largely depends on the privacy culture and practices of the companies they entrust with their data, and this is bad news. Many companies are willing to sell data to the highest bidder, and most States have no laws in place to prevent them from doing so.
States with privacy laws do not fare much better. Their legislation tends to frame privacy rights as opt-out rights rather than prohibitions. But most people are simply too busy to opt out of invasive data collection by every single service they use and every website they visit.
Bottom line: online services hoard enormous amounts of personal data for profit, and most of those data are fair game if you have the cash.
This is nothing new. Privacy advocates have long been raising awareness of the enormous dangers of the online surveillance economy. Dobbs v. Jackson only made these risks dramatically tangible for American women.
Big Tech is not helping
For all its promises to honor and safeguard privacy, Big Tech is not doing much to protect women. A recent article from Insider found that Meta receives over 400,000 government requests for personal information a year and rarely challenges them in court.
To make things worse, even when Big Tech attempts to control the damage, it may or may not work.
Last year Google promised to delete sensitive locations such as abortion clinics from Google Maps' location history. The Washington Post and Accountable Tech later experimented with Google Maps and found that the deletion of sensitive location data is inconsistent and unreliable.
Why can't Google deliver on its promise even after a year?
Well, Google services are privacy-invasive by design. They were built to grab the data first and worry about privacy and data governance later, if ever. Privacy-preserving measures are now needed, but they are difficult to implement when they were completely absent from the start. It is like trying to install brakes on a car that was never designed to have them and is now running at full speed.
Data hunger is the core issue. And it is much, much bigger than Google. Countless other services hoard all the data they can with little or no concern about privacy and data governance. As a result, the user’s digital footprint grows to the point where not even the companies themselves can keep the data under control.
The same issue recently surfaced in litigation against Meta where the company essentially admitted having little or no control over the monstrous amounts of data they collect.
Again, Google and Meta are the rule rather than the exception. Companies have an incentive to hoard all the data they can profit from. Data hunger leads to poor data governance, and poor data governance leads to privacy disasters because you cannot protect data you have no control over.
HIPAA is not enough
But doesn't the US have HIPAA? Why doesn't that solve the issue?
Here's the thing. The Health Insurance Portability and Accountability Act (HIPAA) is not a privacy law in the proper sense, but rather a sector-specific law for healthcare providers (as we explained in another blog). Its privacy rules are very narrow in scope because HIPAA only covers healthcare providers and the companies working for them.
While HIPAA violations play a part in the US privacy crisis, the main problem is the vast amounts of health data that do not fall under HIPAA in the first place. Googling information about a medication you are taking, or using Google Maps while driving to the hospital, can add dangerously sensitive data to your online footprint. And yet these data do not fall under HIPAA, because Google is not a healthcare provider.
Menstruation apps are a prominent example of the issue. These apps collect very detailed information on the reproductive status of the millions of women who use them. This information does not fall under HIPAA and can be sold with little or no restriction in most States.
In a nutshell, the very narrow scope of HIPAA, combined with the lack of a federal privacy law, results in a dangerous lack of protection for sensitive data.
What is the US doing to control the damage?
Washington was the first State to react to the privacy crisis by adopting the My Health My Data Act in April 2023. The Act provides stronger protections for health data and prohibits geofencing near healthcare providers, that is, the use of location data (typically from smartphones) to figure out who visited a certain location. Connecticut and Nevada later followed suit and passed similar laws to protect health data.
On the one hand, there is hope that this legislative trend will lead to strong protections for health data (and sensitive information in general) in the proposed American Data Protection and Privacy Act (ADPPA). On the other hand, States that already have strong protections for health data will likely push back against any draft of the ADPPA that weakens those protections. So these laws might have the perverse effect of delaying the political negotiations behind the Act by making the already thorny issue of State preemption even more complicated.
Other important developments are coming from California. Since Dobbs v. Jackson, California has reinforced its traditional position as a sanctuary State by passing legislation to prevent the prosecution of women seeking reproductive health care in the State.
Right now, the State is working on an amendment to the California Criminal Code that would shield Californian companies from reverse-keyword and reverse-location warrants. In other words, Californian companies would be allowed to ignore certain highly invasive out-of-State search warrants.
The amendment could be a game-changer because it covers Silicon Valley companies that control vast amounts of personal data. By shielding Big Tech companies such as Apple and Meta from warrants, the amendment could substantially impact the privacy of women outside California. But the political negotiations around the bill are complex because it has the potential to hamper the investigation of crimes unrelated to abortion.
What can Europe learn from this privacy crisis?
Unlike the US, Europe has a general privacy law in the GDPR, which includes specific and strict rules for sensitive data. But that does not mean that our sensitive data are safe. Europe should take a close look at what is happening in the US right now, because there are some important lessons to learn from the mess.
“Health data” is a broad category
When we think of health data, we usually think of medical files, X-rays, and so on. But these data are not the main issue in the US privacy crisis. In fact, some of the most urgent privacy threats come from search histories, location data, and (non end-to-end encrypted) personal communications such as chats and emails.
On the European side, the GDPR does not (explicitly) list these data as sensitive. As a result, many organizations in Europe do not think too much about these data and do not handle them with the required care.
Two important rulings of the EU Court of Justice might change the situation. In light of the Court's recent case law, data that might reveal sensitive data are themselves sensitive data (see our blogs on Sensitive data and the Bundeskartellamt ruling for more information).
This case law is an important step in the right direction and substantially broadens the scope of the notion of sensitive data. However, the Court's approach is a far cry from the more formalistic approach most companies adopt when dealing with sensitive data. It will probably take time before the paradigm shifts in practice, and in the meantime our sensitive data will not be as safe as they should be.
Privacy by design needs to be enforced better
You need to plan for privacy ahead of time. If you do not set up a system in a privacy-friendly way, then it becomes very difficult to implement solid privacy down the road, as shown by Google’s data deletion fiasco.
This is why the GDPR insists on the privacy by design principle. Privacy by design means that you need to plan the processing of personal data with privacy in mind from the start.
Privacy by design is no mere suggestion but rather a binding legal principle. Nonetheless, privacy by design is often ignored by the industry. This is a shame, because a privacy by design approach can greatly reduce digital footprints. We can only hope that GDPR enforcement eventually catches up and brings organizations back in line.
The same goes for other principles connected to privacy by design. For instance, data minimization means that you may only collect the personal data you really need, and storage limitation means that you may not store personal data any longer than needed. Too many organizations violate these principles, and things won't change until more fines start coming.
Location data is more dangerous than you think
Geolocation data plays a key role in the post-Dobbs privacy landscape. This is why the My Health My Data Act prohibits geofencing around health care providers, and why the proposed changes to the California Criminal Code deal with reverse-location requests from law enforcement.
On the European side, the GDPR has no specific provisions to protect location data and does not count them as sensitive data (unlike California's CCPA). So location data are only subject to the general rules of the GDPR.
These general rules are probably not enough to protect location data. Or rather: they would be if enforcement caught up. The principles of privacy by design and storage limitation could play a vital role in the protection of location data, but again, they are still too underenforced to make a real impact.
Bottom line: consumers and companies alike should be very careful with location data. And again, GDPR enforcement needs to catch up!
At the end of the day, vulnerable people pay the highest price for the surveillance economy. To paraphrase the Grumpy GDPR podcast: if you think you have nothing to hide, then you are very, very privileged.
This is nothing new: the impact of privacy practices on vulnerable individuals and communities is well researched by legal scholars and social scientists alike, and is an important topic of discussion in the privacy community.
Sadly, this angle is lost in the public debate around privacy. Hopefully, Dobbs v. Jackson, and the privacy mess it caused, will serve as a reminder that privacy is a necessary condition for a fair society and something we should all be striving for.