Roe v. Wade and the New, Murky Data Privacy Morass

The U.S. Supreme Court overturning Roe v. Wade opens up the potential for scrutiny of digital information to enforce laws set in motion by the decision. What new challenges does that bring to CIOs?

Joao-Pierre S. Ruth, Senior Editor

June 27, 2022

7 Min Read
[Image: The US Supreme Court building, Washington, D.C. (Blakeley via Alamy Stock Photo)]

Friday, the Supreme Court of the United States overturned Roe v. Wade, undoing what many had regarded as settled law on abortion rights. The decision has cleared the way for a cascade of trigger laws that may lead to new questions about data privacy and compliance. Attorneys and digital privacy advocates are weighing in on the matter just as numerous states have already enacted, or are set to enact, laws banning abortion -- laws whose enforcement might call personal data into question.

How enforcement will affect data privacy is still largely unknown, but speculation has included concerns about how personal health information collected from apps might be used to regulate compliance with such laws. Employers who operate across state lines may find the health coverage they offer employees could come under scrutiny in different jurisdictions -- specifically if that coverage includes providing access to abortions in states where it remains legal. Some companies, such as Disney, are even offering to cover travel expenses to states that will continue to permit the procedure.

The corporate world is now left with a question: What if states demand that companies comply with regional laws that include sharing data about customers, users, or employees for the sake of enforcement?

What Does Abortion Law Mean to CIOs?

Hayley Tsukayama, senior legislative activist for the Electronic Frontier Foundation (EFF), says this is a real-world example of the potential harm related to privacy and control that her organization has been worried about. The EFF is a nonprofit organization that advocates for civil liberties in the digital landscape.

“If there are local and state law enforcement agencies that are motivated to pursue prosecutions of people who are seeking abortions or seeking information on reproductive healthcare, we are concerned we will see more warrants and subpoenas and pressure on companies to release information,” she says. Customers' location data or automated license plate reader data, which can track people's movements in the physical world, might be used to support prosecutions or serve as a starting point for them.

Tsukayama compared that to the data infrastructure tapped by law enforcement for immigration enforcement. “That’s the big boogeyman in my mind in terms of what companies will have to deal with.”

Data 'Sanctuary States'

With the country bifurcating as states such as Texas and Oklahoma move to criminalize activities related to abortion, the EFF is concerned about how data will be used to support claims of criminality.

“In California, we’re in support of a bill that basically creates a sanctuary state around reproductive healthcare data so that if another state subpoenas a healthcare provider in California, California wouldn’t have to issue a responding subpoena,” says EFF's Tsukayama.

New York has looked at a similar idea, she says, to lock down data within state borders to nix compliance with cross-state investigations into such data. “For companies, we might see more complications in terms of watching that divide grow and having to navigate if you have people in multiple states,” Tsukayama says.

The country and world already have a myriad of general data privacy policies in place or under debate, Tsukayama says, in terms of who controls personal data, how it is used, and the liabilities companies may face for keeping such information. Compliance with one set of data privacy laws, regardless of how stringent they are, does not guarantee that requirements are met in other jurisdictions.

“It can be kind of confusing,” she says. “We have regulations of varying strengths in different states across the country.” For example, compliance with California’s data privacy legislation, which she regards as the strongest, does not guarantee compliance with all other state laws. “The particular issue with reproductive privacy,” says Tsukayama, “is you are also dealing with state health laws. There are new laws being introduced all the time. It’s a fairly confusing landscape.”

Ted Claypoole, partner with law firm Womble Bond Dickinson, says one of the problems in the ongoing discussions of data privacy is the definition of privacy itself and whether it encompasses too much. Autonomy and secrecy are often focal points of the discussion in terms of government awareness and intervention, but he says obscurity is another area of privacy. “It is an area the law protects in some cases but is also wrapped up in privacy,” Claypoole says.

For example, when a crowd gathers peacefully at a sporting event, they do not expect someone to video the entire crowd, run all the images through facial recognition, and then take some form of action, he says. “You’re not being private, you’re not being secret, you’re not going somewhere that’s hidden from everybody, and yet you don’t expect to be tracked down there.”

He foresees more problems arising because federal courts can be inconsistent in how they apply the term “privacy,” which can leave states unsure of what they can and cannot do. That might see states push the envelope, which Claypoole says might affect companies.


Much like privacy, data is also poorly defined, he says, from a legal and policy perspective. “Data basically means true facts,” Claypoole says. This can include whether or not an individual visits a doctor for a procedure. “The government can say, ‘We find some of these true facts to be private,’ but when people say, ‘This is my data,’ all it really means is it’s data which references you,” he says. “It doesn’t mean you own the data in any way or that you should own the data in any way.”

The other issue in this morass is who is being dealt with, Claypoole says, in terms of privacy concerns and responsibilities. The rules are changing all the time, he says, regarding what the government is allowed to know and what it is not. “The Fourth Amendment says you should have security in your body, in your papers, and in your home,” Claypoole says. “There are lots of specific rules about privacy with regard to you and the government -- and that’s some of what the Supreme Court has changed this month.”

There are increasing layers of concern, he says, regarding privacy in interactions between individuals and companies -- for instance, on the internet, in the payment system, and over who has access to that information. When an individual makes a purchase, that activity might be accessible to the merchant, their bank, the customer’s bank, payment processors in between, the mobile carrier if a phone was used, the company that made the phone, and the software vendor that facilitated the transaction. “Your privacy with regard to all of these various companies that are taking your information is a whole other set of issues that has to be decided in a different way,” Claypoole says.

'Companies Are Not Ready'

Next year, with the California Privacy Rights Act taking effect alongside laws in Utah, Colorado, Connecticut, and Virginia, Claypoole sees new protections coming into play for a category of data designated “sensitive information.” This can include personal geolocation information. “There’s an entire ecosystem of advertising and other data and information sources that companies are using that is built around knowing where a customer is on their phone,” Claypoole says. “That is about to be regulated and restricted in a way that it never has been, and I think companies are not ready for it.”
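As an illustration of what handling geolocation as sensitive information can look like in practice -- a hypothetical sketch, not a requirement drawn from any of these statutes -- a company might coarsen precise coordinates before storing them. The field names and precision threshold below are assumptions for the example:

```python
# Hypothetical sketch: coarsening precise geolocation before storage so the
# retained value no longer pinpoints a specific address. Field names and the
# two-decimal precision (roughly 1 km of resolution) are illustrative choices,
# not drawn from any statute.

def coarsen_location(record: dict, precision: int = 2) -> dict:
    """Return a copy of the record with lat/lon rounded to the given precision."""
    redacted = dict(record)
    for key in ("lat", "lon"):
        if key in redacted:
            redacted[key] = round(redacted[key], precision)
    redacted["location_coarsened"] = True  # flag so downstream systems know
    return redacted

# Example: a precise event record, coarsened before it ever reaches storage.
event = {"user_id": "u123", "lat": 40.712776, "lon": -74.005974}
stored = coarsen_location(event)
```

The original precise coordinates are never written to the data store, so a later subpoena for stored records can only yield the coarsened values.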

One way that companies might avoid headaches related to data collected from customers is to not retain it very long -- especially when it comes to information that might be used for prosecution. “Even if someone comes at you with a warrant, if you don’t have the information, you can’t produce it,” Tsukayama says. Companies could also be transparent with users to make it clear when certain information might be sought by state or law enforcement entities, she says.

Tsukayama says she hopes companies will better understand concerns about law enforcement access to the information they keep, and the harm that may come to consumers through the over-collection and over-retention of personal data that may not be necessary for the service being provided.

There are other conversations playing out in Washington, D.C. about general consumer privacy legislation, Tsukayama says, where arguments persist on what amount of control individuals should have over the collection and processing of information. “We’re going to be talking about it a lot more.”

What to Read Next:

What Federal Privacy Policy Might Look Like If Passed

Why to Create a More Data-Conscious Company Culture

Priorities of Highly Successful Chief Data Officers

About the Author(s)

Joao-Pierre S. Ruth

Senior Editor

Joao-Pierre S. Ruth covers tech policy, including ethics, privacy, legislation, and risk; fintech; code strategy; and cloud & edge computing for InformationWeek. He has been a journalist for more than 25 years, reporting on business and technology first in New Jersey, then covering the New York tech startup community, and later as a freelancer for such outlets as TheStreet, Investopedia, and Street Fight. Follow him on Twitter: @jpruth.

