
Workers’ Health Data Collected for COVID Safety Poses Risk to Labor Rights

Many of these technologies pose severe threats to workers’ privacy and other fundamental rights.

An emergency department nurse checks her pre-screening app on her phone while wearing personal protective equipment before starting her morning shift at Zuckerberg San Francisco General Hospital and Trauma Center in San Francisco, California, on April 11, 2020.

With COVID-19 infections soaring again in the United States and around the world, we have to learn how to manage the pandemic’s long-term ramifications for our economies. As people adjust to minimizing the risk of infection in everyday settings, one critical context is work. Even though millions have shifted to working from home over the past months, remote work is not possible in every industry. And while the pandemic has disrupted work and employment virtually everywhere in the world, it has not affected everyone in the same ways. The International Labor Organization notes that the current crisis significantly affects women, workers in precarious situations who lack access to health care or have only limited social security benefits, and informal workers, whose jobs are not taxed or registered by the government. In Latin America, 60% of workers are considered informal, and 58% of informal workers live either in economic vulnerability, on 13 U.S. dollars or less per day, or in poverty, on less than 5.5 U.S. dollars per day. Many have no choice but to work outside the home, putting their health and livelihoods on the line, especially in countries with insufficient public health care or unemployment programs.

As businesses strive to reopen, and many workers depend on them doing so, employers are turning to experimental technologies to manage the risk of infection among their workforces. Over the past months, dozens of new apps, wearables, and other technologies have sought to help mitigate the risks of COVID at work, not counting the many pre-existing workplace technologies already in use for other purposes. Some technologies seek to trace the proximity of one person to another to estimate whether they have been less than approximately six feet (two meters) apart for a sustained period. This data can be used to notify workers of potential exposure to COVID. Decentralized Bluetooth proximity sensing is the most promising approach to technology-assisted exposure notification because it minimizes privacy risks. But while some employers aim for that goal, others are using apps that track workers’ individualized phone location data with GPS. GPS data is extremely sensitive, especially when it captures workers’ movements outside the workplace, and it is not granular enough to identify when two co-workers were close enough to transmit the virus.
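To make the difference between the two approaches concrete, here is a minimal sketch, in Python and purely for illustration, of how decentralized Bluetooth exposure notification can work: each phone broadcasts short-lived identifiers derived from a secret key it keeps on-device, records identifiers it hears nearby, and later matches them locally against keys that infected users choose to publish, so no central server learns who met whom or where anyone went. The key derivation, rotation interval, and exposure threshold below are illustrative assumptions, not the actual Google/Apple protocol or any vendor’s implementation.

```python
# Minimal, simplified sketch of decentralized Bluetooth exposure notification.
# Loosely inspired by decentralized designs such as Google/Apple Exposure
# Notification; the key derivation, interval length, and threshold below are
# illustrative assumptions, not the real protocol.
import hashlib
import os
from collections import defaultdict

INTERVAL_MINUTES = 10          # how often the broadcast identifier rotates (assumed)
EXPOSURE_THRESHOLD_MIN = 15    # cumulative minutes of proximity counted as exposure (assumed)

def rolling_identifier(daily_key: bytes, interval_index: int) -> bytes:
    """Derive a short-lived identifier from a secret daily key (illustrative KDF)."""
    return hashlib.sha256(daily_key + interval_index.to_bytes(4, "big")).digest()[:16]

class Phone:
    def __init__(self):
        self.daily_key = os.urandom(32)   # never leaves the device unless the user opts in
        self.heard = defaultdict(int)     # identifier -> minutes of nearby broadcasts

    def broadcast(self, interval_index: int) -> bytes:
        return rolling_identifier(self.daily_key, interval_index)

    def record_nearby(self, identifier: bytes, minutes: int) -> None:
        """Called when Bluetooth hears another phone at close range."""
        self.heard[identifier] += minutes

    def check_exposure(self, published_daily_keys, intervals_per_day: int) -> bool:
        """Re-derive identifiers from keys voluntarily published by infected users
        and match them against locally stored observations. No location data and
        no central register of who met whom is needed."""
        exposure_minutes = 0
        for key in published_daily_keys:
            for i in range(intervals_per_day):
                exposure_minutes += self.heard.get(rolling_identifier(key, i), 0)
        return exposure_minutes >= EXPOSURE_THRESHOLD_MIN

# Toy scenario: Alice and Bob work a shift near each other; Bob later tests positive.
alice, bob = Phone(), Phone()
for interval in range(3):                 # three 10-minute intervals side by side
    alice.record_nearby(bob.broadcast(interval), INTERVAL_MINUTES)

print(alice.check_exposure([bob.daily_key], intervals_per_day=144))  # True
```

Because the matching happens on the worker’s own device, the system operator learns only what the worker chooses to report, a privacy property that GPS-based tracking cannot offer.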

Other companies ask employees to submit daily symptom checks to their employers. Some checks may be as simple as one or two yes/no questions, while others collect more granular symptom data. The more information a company collects, the greater the risk that it can be used to detect conditions, or side effects of treatments, that have nothing to do with COVID-19. This is an issue because many companies are not subject to the privacy protections of the federal Health Insurance Portability and Accountability Act of 1996 (“HIPAA”). HIPAA has a very limited scope because protections for health information in the U.S. depend on who holds the data. In general, only data created or maintained by health plans, health care clearinghouses, health care providers that conduct certain health care transactions electronically, and their business associates have HIPAA protections. Data collected by any other entity, such as an employer, usually do not. Under the EU General Data Protection Regulation (GDPR), an employee’s personal data concerning their health includes all data about their “health status (…) which reveals information relating to the past, current or future physical or mental health status.” Under the GDPR, people’s rights flow with their data rather than depending on who holds it, and the European Union has long treated such personal data as sensitive and subject to stringent limitations. In short, many of these workplace technologies seriously undermine employees’ privacy and other fundamental rights, collecting information in ways that leave employees with little protection.
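To illustrate how widely the scope of these checks can vary, here is a hypothetical sketch of two symptom-check records in Python; the field names are invented for illustration and are not drawn from any particular product. The minimal record reduces the answer to the single fact an employer arguably needs that day, while the granular record accumulates details that could reveal unrelated conditions or treatments over time.

```python
# Hypothetical symptom-check records, for illustration only; field names are
# not taken from any specific product.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class MinimalCheck:
    # A data-minimizing check answers only what the employer needs to know today.
    worker_id: str
    cleared_for_work: bool                 # single yes/no derived on the worker's device

@dataclass
class GranularCheck:
    # A granular check accumulates data that can reveal unrelated conditions
    # or treatment side effects over time.
    worker_id: str
    temperature_celsius: float
    symptoms: list = field(default_factory=list)       # e.g. "cough", "fatigue", "nausea"
    medications_taken: Optional[str] = None
    household_members_ill: Optional[bool] = None
```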

Health Surveys and Contact Tracing Apps

One common category of technology to mitigate COVID-19 at work is apps that prompt workers to report information about their health status. One is ProtectWell, developed by Microsoft in cooperation with UnitedHealth, a for-profit health care company located in Minnesota. Urging its prospective users not to keep “life on hold,” ProtectWell allows organizations to build custom health surveys. It also offers Microsoft’s healthcare bot to help triage which symptoms are most concerning. When users are considered to be at risk, employers can direct them to undergo a testing process that reports the results directly back to the employer. ProtectWell’s privacy policy clarifies, as it should, that any information disclosed to the app is not considered health information as defined in HIPAA, and hence is not protected as such. The privacy policy further allows UnitedHealth to share test results and responses to symptom surveys with a user’s employer, without requiring the worker’s consent. While both Microsoft and UnitedHealth plan to deploy the app for their own workforces, it is not clear whether their employees have a choice in the matter, or how widely other organizations have taken up the app. But ProtectWell raises many of the privacy concerns associated with workplace wellness programs. Many workplaces offer wellness programs meant to incentivize employees to participate in health screenings or fitness programs. Workers often face a difficult choice: forgo certain benefits by not participating, or give their employers access to potentially sensitive health data that can be abused in myriad ways.

Another example is Check-in, a suite of products developed and marketed by PricewaterhouseCoopers (PwC). Noting that “83% of companies do not have processes and systems in place to track all of their workforces,” PwC offers customers an app that combines GPS location tracking with a tool that monitors employees’ productivity. Once downloaded, the app activates Wi-Fi and Bluetooth to keep track of which workers have been in close contact, and uses the phones’ GPS signals to determine when they are on the company’s premises. As the company itself notes, “app-based contact tracing can result in processing more data than is needed for the intended purpose of notifying affected individuals.” GPS data, in particular, can expose where a worker has been and what they have been doing, both inside and outside the office.

PwC does not provide detailed information about the app’s location tracking capabilities. A spokesman explains that the data collected is made available to managers to help trace workers who might have been in proximity to a COVID-19 patient. PwC has not shown that employees consent to such use before the app shares their health data with their employers. Even if its policy requires consent, that consent can be questionable, since workers—with their livelihoods at stake—may not exercise real choice when their employer tells them to install an app or hand over personal data. In the European Union, under the GDPR, consent cannot be a valid legal ground for processing data when the employee feels compelled to consent or risks negative consequences for refusing. Employers may look for another legal basis to process employees’ health data, such as legitimate interest. However, legitimate interest may not be available if the employer’s interests “are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data.” Such an assessment would likely need to be made case by case.

Beyond its location and proximity tracking features, the app also includes a feature named “Status Connect,” which allows employers to check in with their workers to understand factors that may inhibit their productivity on a given day. Designed to “spot productivity blockers for remote workers,” Status Connect offers employers access to a trove of sensitive information about their employees’ health, location (remote or on-site), and productivity. According to reports, PwC is currently testing the suite.

Blackline Safety, a Canadian company, takes another approach to location tracking, combining its “intrinsically safe” G7C wearable device (designed to detect gas leaks) with a smartphone app to supervise “lone” workers. Blackline is thus an example of a company repurposing existing technology for COVID-19 purposes, with questionable results. Blackline’s G7C wearable uses GPS tracking to locate its wearer. When the product is in use, “employee location data streams to the Blackline Safety Cloud,” allowing companies to “immediately retrace [an] individual’s steps” to see whom they may have been in contact with. This tool may be appropriate for some non-COVID purposes, such as promoting safety for employees who work in remote or hazardous environments. Still, GPS data is too sensitive, too prone to abuse, and not effective enough to serve as the basis for COVID exposure notification.

Enforcing Social Distancing

Besides mobile apps, employers can also deploy hardware in their quest to control COVID infections in their organizations. Several companies have developed machine vision software designed to augment existing camera systems to monitor people’s compliance with social distancing rules. Smartvid.io, a company prominent in the construction industry, claims that its technology can help organizations identify and log the number of people not adhering to social distancing or not wearing protective masks. The software automatically generates reports to help managers “reward COVID-19 safety practices.” It is unclear whether and how Smartvid.io has access to the data generated by cameras equipped with its software, which could be used to compile detailed logs of workers’ locations, productivity levels, and even whom they socialize with at work.
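As a rough illustration of how such camera-based systems work in principle, the sketch below (Python, illustrative only) takes per-frame person detections, the bounding boxes an off-the-shelf detector would output, and flags pairs whose estimated ground distance falls under two meters. The flat-ground assumption and the pixels-per-meter calibration are simplifying assumptions, not Smartvid.io’s actual pipeline.

```python
# Rough sketch of camera-based distance monitoring: given per-frame person
# detections (bounding boxes in pixels), flag pairs closer than a threshold.
# The detector, the flat-ground assumption, and the pixels-per-meter
# calibration are illustrative assumptions, not any vendor's actual pipeline.
import math

PIXELS_PER_METER = 120.0      # assumed calibration for a fixed camera
MIN_DISTANCE_METERS = 2.0

def foot_point(box):
    """Approximate a person's ground position as the bottom-center of their box."""
    x1, y1, x2, y2 = box
    return ((x1 + x2) / 2.0, y2)

def too_close_pairs(boxes):
    """Return index pairs of detections whose estimated distance is under threshold."""
    points = [foot_point(b) for b in boxes]
    flagged = []
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            dx = points[i][0] - points[j][0]
            dy = points[i][1] - points[j][1]
            meters = math.hypot(dx, dy) / PIXELS_PER_METER
            if meters < MIN_DISTANCE_METERS:
                flagged.append((i, j, round(meters, 2)))
    return flagged

# Toy frame with three detected workers (x1, y1, x2, y2 in pixels):
detections = [(100, 200, 160, 420), (220, 190, 280, 430), (800, 210, 860, 440)]
print(too_close_pairs(detections))   # flags the first two workers, roughly 1 m apart
```

The relevant point is that the same pipeline that measures distances necessarily produces a per-person, per-frame location log, which is why the question of who can access the resulting data matters.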

Based in Pune, India, Glimpse Analytics is another company that claims to help employers implement health guidelines. Like Smartvid.io, its “One Glimpse Edge” device connects to pre-existing CCTV cameras and triggers alerts when rooms reach maximum occupancy, or when individuals appear to be too close together or fail to wear masks. The software also “tracks” housekeeping staff tasked with cleaning workspaces. While Glimpse Analytics maintains that its software protects people’s privacy, since it does not recognize faces and all data is encrypted and processed locally, it nonetheless enables sweeping workplace surveillance, amassing large volumes of sensitive data without requiring workers’ consent.

Likewise, Amazon recently introduced its “Distance Assistant.” The software, which was made open source, aims to monitor the distance between workers to enforce social distancing guidelines. Hooked up to cameras, sensors, and a TV screen, the assistant is meant to give instant visual feedback when workers are too close to each other. Amazon has deployed the software, which businesses and individuals can access free of charge, across several of its buildings. Beyond the question of just how useful this piece of technology can be in keeping workers safe, it is unclear how the captured data is stored, used, and shared, and what steps Amazon is taking to maintain workers’ privacy. Data about workers’ movement patterns could readily be abused to give managers information about which employees associate with each other.

Conclusion

Purveyors of a variety of new and repurposed surveillance technologies seek to help employers mitigate the risks of workplace COVID infections. But many of these technologies pose severe threats to workers’ privacy and other fundamental rights. In particular, a technology that creates graphs of interactions between co-workers could stifle workers’ freedom to associate, even when they do so safely, and enable turnkey union-busting. Furthermore, many of these tools are untested and unproven, and may not be as effective as employers hope. While employers must do what they can to keep their workers safe, such efforts should not come at the price of undermining workers’ privacy.
