Privacy Not Included: Federal Law Lags Behind New Tech

The privacy of people using these newer services has been compromised, causing embarrassment or legal repercussions.

Jacqueline Stokes spotted the home paternity test at her local drugstore in Florida and knew she had to try it. She had no doubts about her own family, but as a cybersecurity consultant with an interest in genetics, she couldn’t resist the latest advance.

At home, she carefully followed the instructions, swabbing inside the mouths of her husband and her daughter, placing the samples in the pouch provided and mailing them to a lab.

Days later, Stokes went online to get the results. Part of the lab’s website address caught her attention, and her professional instincts kicked in. By tweaking the URL slightly, she pulled up a sprawling directory that gave her access to the test results of some 6,000 other people.

The site was taken down after Stokes complained on Twitter. But when she contacted the Department of Health and Human Services about the seemingly obvious violation of patient privacy, she got a surprising response: Officials couldn’t do anything about the breach.

The Health Insurance Portability and Accountability Act, a landmark 1996 patient-privacy law, only covers patient information kept by health providers, insurers and data clearinghouses, as well as their business partners. At-home paternity tests fall outside the law’s purview. For that matter, so do wearables like Fitbit that measure steps and sleep, testing companies like 23andMe, and online repositories where individuals can store their health records.

In several instances, the privacy of people using these newer services has been compromised, causing embarrassment or legal repercussions.

In 2011, for instance, an Australian company failed to properly secure details of hundreds of paternity and drug tests, making them accessible through a Google search. The company said that it quickly fixed the problem.

That same year, some users of the Fitbit tracker found that data they entered in their online profiles about their sexual activity and its intensity — to help calculate calories burned — was accessible to anyone. Fitbit quickly hid the information.

And last year, a publicly accessible genealogy database was used by police to look for possible suspects in a 1996 Idaho murder. After finding a “very good match” with the DNA of semen found at the crime scene, police obtained a search warrant to get the person’s name. After investigating further, authorities got another warrant ordering the man’s son to provide a DNA sample, which cleared him of involvement.

The incident spooked genealogy aficionados; AncestryDNA, which ran the online database, pulled it this spring.

“When you publicly make available your genetic information, you essentially are signing a waiver to your past and future medical records,” said Erin Murphy, a professor at New York University School of Law.

The true extent of the problem is unclear because many companies don’t know when the health information they store has been accessed inappropriately, experts say. A range of potentially sensitive data is at risk, including medical diagnoses, disease markers in a person’s genes and children’s paternity.

What is known is that the Office for Civil Rights, the HHS agency that enforces HIPAA, hasn’t taken action on 60 percent of the complaints it has received because they were filed too late or withdrawn or because the agency lacked authority over the entity that’s accused. The latter accounts for a growing proportion of complaints, an OCR spokeswoman said.

A 2009 law called on HHS to work with the Federal Trade Commission — which targets unfair business practices and identity theft — and to submit recommendations to Congress within a year on how to deal with entities handling health information that falls outside of HIPAA. Six years later, however, no recommendations have been issued.

The report is in “the final legs of being completed,” said Lucia Savage, chief privacy officer of the HHS Office of the National Coordinator for Health Information Technology.

None of this was useful to the 30-year-old Stokes, a principal consultant at the cybersecurity firm Mandiant. Four months after she filed her complaint, OCR suggested she contact the FTC. At that point, she gave up.

“It just kind of seems like a Wild West right now,” she said.

Protection of Consumer-App Data Varies

Advances in technology offer patients ways to monitor their own health that were impossible until recently: Internet-connected scales to track their weight; electrodes attached to their iPhones to monitor heart rhythms; virtual file cabinets to store their medical records.

“Consumer-generated health information is proliferating,” FTC Commissioner Julie Brill said at a forum last year. But many users don’t realize that much of it is stored “outside of the HIPAA silo.”

HIPAA seeks to facilitate the flow of electronic health information, while ensuring that privacy and security are protected along the way. It only applies to health providers that transmit information electronically; a 2009 law added business partners that handle health information on behalf of these entities. Violators can face fines and even prison time.

“If you were trying to draft a privacy law from scratch, this is not the way you would do it,” said Adam Greene, a former OCR official who’s now a private-sector lawyer in Washington.

In 2013, the Privacy Rights Clearinghouse studied 43 free and paid health and fitness apps. The group found that some did not provide a link to a privacy policy and that many with a policy did not accurately describe how the apps transmitted information. For instance, many apps connected to third-party websites without users’ knowledge and sent data in unencrypted ways that potentially exposed personal information.

“Consumers should not assume any of their data is private in the mobile app environment—even health data that they consider sensitive,” the group said.
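The Clearinghouse did not publish the offending code, but the pattern it describes is simple to sketch. In the hypothetical Python example below, the endpoint URL and payload fields are invented; the first request sends health data in cleartext that anyone on the network path can read, while the second at least encrypts it in transit:

```python
# Hypothetical illustration of the unencrypted transmission described
# above; the endpoint URL and payload fields are invented.
import requests

payload = {"user_id": "u-12345", "weight_kg": 68.2, "sleep_hours": 6.5}

# Insecure: plain HTTP sends the payload in cleartext, so anyone on the
# same Wi-Fi network or along the route can capture it.
requests.post("http://analytics.example.com/track", json=payload, timeout=10)

# Better: HTTPS encrypts the data in transit, and requests verifies the
# server's certificate by default. The third party still receives the
# data, though, so what is sent matters as much as how it is sent.
requests.post("https://analytics.example.com/track", json=payload, timeout=10)
```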

Consider a woman who is wearing a fetal monitor under her clothes that sends alerts to her phone. The device “talks” to her smartphone via wireless Bluetooth technology, and its presence on a network could be detected by others, alerting them to the fact that she’s pregnant or that she may have concerns about her baby’s health.

“That is a fact that you may not want to share with others around you—co-workers or family members or strangers in a café,” said David Kotz, a computer science professor at Dartmouth College who is principal investigator of a federally funded project that is developing secure technology for health and wellness.
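Kotz’s scenario takes little effort to reproduce. Bluetooth Low Energy peripherals announce themselves in advertisement packets that any nearby radio can receive without pairing. The sketch below uses the real Python `bleak` library; the device name in the comment is invented:

```python
# Minimal BLE scan using the bleak library. Advertisement packets are
# broadcast in the clear, so a scanner sees every nearby peripheral's
# address and advertised name without pairing or authentication.
import asyncio
from bleak import BleakScanner

async def main():
    devices = await BleakScanner.discover(timeout=5.0)
    for device in devices:
        # A descriptive name such as "FetalMonitor-01" (invented here)
        # reveals what the device is, even if its data is encrypted.
        print(device.address, device.name)

asyncio.run(main())
```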

“We’ve seen this in the tech market over and over again,” he added. “What sells devices or applications are the features for the most part, and unless there’s a really strong business reason or consumer push or federal regulation, security and privacy are generally a secondary thought.”

“Walking Through an Open Door”

In Florida, Stokes is one of those people enamored with emerging health technologies. Several years ago, she rushed to sign up for 23andMe to analyze her genetic profile. And when she was pregnant with her daughter, she purchased a test that said it could predict the sex of the fetus. (It was wrong.)

The paternity test kit that piqued her interest earlier this year advertised “accuracy guaranteed” for “1 alleged father and 1 child.” She remembers the kit costing about $80 at a nearby Walgreens. Such tests sell for about $100 online.

“It was kind of a nerdy thing that I was interested in doing,” Stokes said.

The test was processed in New Mexico by GTLDNA Genetic Testing Laboratories, then a division of General Genetics Corp. Stokes was directed to log into a website and enter a unique code for her results. When they appeared, she noticed an unusual Web address on her screen, and she wondered what would happen if she modified it to remove the ID assigned to her.

She tried that and saw a folder containing the results of thousands of other people. She was able to click through and read them. “You wouldn’t call that hacking,” she said. “You would call that walking through an open door.”
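The article does not describe the lab’s software, but the flaw Stokes found is a textbook insecure direct object reference: the URL alone determined what was served, and trimming the ID exposed a browsable directory. Below is a hypothetical Python/Flask sketch of the vulnerable pattern and the usual fix; all routes, paths, and names are invented:

```python
# Hypothetical sketch of the "open door" pattern, using Flask.
# All routes, paths, and names are invented for illustration.
from flask import Flask, abort, session, send_from_directory

app = Flask(__name__)
app.secret_key = "replace-with-a-real-secret"

RESULTS_DIR = "/srv/lab/results"

# Vulnerable: the URL alone selects the file. Codes can be guessed or
# enumerated, and if the server also auto-indexes /results/, removing
# the code from the URL lists every customer's file.
@app.route("/results/<code>")
def results_insecure(code):
    return send_from_directory(RESULTS_DIR, f"{code}.pdf")

# Safer: the server decides which result the authenticated session may
# see; nothing user-controlled picks the file, and there is no listing.
@app.route("/my-results")
def results_secure():
    code = session.get("result_code")  # set server-side at login
    if code is None:
        abort(403)
    return send_from_directory(RESULTS_DIR, f"{code}.pdf")
```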

Stokes downloaded those publicly accessible records so that she would have proof of the lax security. “There were no safeguards,” she said. She complained to the HHS Office for Civil Rights in early February. It answered in June, writing that the office “does not have authority to investigate your complaint, and therefore, is closing this matter.”

Bud Thompson, who until last month was the chief executive of General Genetics, initially said he had not heard about Stokes’ discovery. A subsequent email provided an explanation.

“There was a coding error in the software that resulted in the person being able to view results of other customers. The person notified the lab, and the website was immediately taken down to solve this problem,” he wrote. “Since this incident, we have sold this line of business and have effectively ceased all operations of the lab.”

The DNA testing company 23andMe, which helps people learn about their genetic backgrounds and find relatives based on those profiles, had a highly publicized lab mix-up in which as many as 96 customers were given the wrong DNA test results, sometimes for people of a different gender. A spokeswoman for the California-based company said she was unaware of any privacy or security breaches since that 2010 incident.

Kate Black, its privacy officer and corporate counsel, said that 23andMe tries to provide more protection than HIPAA would require.

“No matter what, no law is ever going to be narrow enough or specific enough to appropriately protect each and every business model and consumer health company,” she said.

California lawmakers have twice considered a measure to prohibit anyone from collecting, analyzing or sharing the genetic information of another person without written permission, with some exceptions.

Then-Sen. Alex Padilla, who sponsored the bill, cited a California company that marketed DNA testing, including on samples collected from people without their knowledge. In a recent interview, he said that he was amazed state law did not protect “what’s arguably the most personal of our information and that’s our genetic makeup, our genetic profile.”

The legislation failed. And Padilla, now California’s secretary of state, remains concerned: “I don’t think this issue is going away any time soon.”

Too Many Complaints to Pursue

While Stokes was troubled by her experience, she was particularly disheartened by the OCR’s response. “It was shocking to me to get that message back from the government saying this isn’t covered by the current legislation and, as a result, we don’t care about it,” she said.

The agency’s deputy director for health information privacy says there is no lack of interest. While it refers certain cases to law enforcement, OCR can barely keep up with those complaints that fall within its jurisdiction.

“I wish we had the bandwidth to do so,” Deven McGraw said. “We would love to be able to be a place where people can get personalized assistance on every complaint that comes in the door, but the resources just don’t allow us to do that.”

For its part, the FTC has taken action against a few companies for failing to secure patients’ information, including a 2013 settlement with Cbr Systems Inc., a blood bank where parents store the umbilical cord blood of newborns in case it is ever needed to treat subsequent diseases in the children or relatives. That settlement requires Cbr to implement comprehensive security and submit to independent audits every other year for 20 years. It also bars the company from misrepresenting its privacy and security practices.

But FTC officials say the number of complaints pursued hardly reflects the scope of the problem. Most consumers are never told when a company sells or otherwise shares their health information without their permission, said Maneesha Mithal, associate director of the FTC’s division of privacy and identity protection.

“It may be done behind the scenes, without consumers’ knowledge,” she noted. “Those are the cases where consumers may not even know to complain.”
