Thanks to covid-19, the age of biometric surveillance is here

By Michele L. Norris

Imagine a world where you must agree to wear a small device to monitor your movements, sleep or heart rate before you can enroll in school, return to the workplace, attend a convention or book passage on a cruise ship. Thanks to covid-19, the age of biometric surveillance is already here.

Fast-tracked by the NFL, the NBA and colleges eager to bring back students, wearable technologies that help detect coronavirus and hinder its spread are being quickly embraced despite obvious questions about their impact on our privacy.

Wearing a fitness band to monitor your steps is one thing. Wearing a mandatory bracelet so an employer or institution can access continuous data about your movement and health is something very different. Well, brace yourself: This is something for which we are not prepared. Whether it’s bands, lanyards, stickers or monitors that look like wedding rings, wearables are becoming a go-to weapon in the continuing battle against covid and its emerging variants.

Last week, Israel’s Knesset approved a bill requiring travelers returning from abroad who do not have proof of vaccination or antibody status to wear electronic monitoring bracelets that ensure they remain in tight quarantine.

In Ontario, the provincial government is also planning to use contact-tracing wristbands in nursing homes, construction sites, schools and First Nations communities. The bands vibrate or buzz when people get within six feet of each other, and collect location data to trace people who came in contact with someone who is covid-positive.

At Michigan’s Oakland University, students returning to campus last fall were asked to wear BioButtons, a device the size of an egg yolk that, when affixed to the chest, can monitor temperature, heart rate and respiration — as well as location data.

The Professional Golfers’ Association gave its players, caddies and essential staff bracelets or bicep bands from a company called Whoop after one of those devices serendipitously helped golfer Nick Watney realize he was sick.

Watney wore the band at first to monitor sleep and heart rate to maximize his performance. But last June, the device signaled a sudden rise in his respiratory rate while he slept. Though he felt fine, he asked for a covid test before a tournament at Hilton Head. The overnight irregularity in breathing turned out to be no anomaly: Watney tested positive for covid and was quickly ushered off the course — a diagnosis that might otherwise have gone undetected, since he never developed covid symptoms.

His case underscores how wearables can do more than just monitor proximity. They can potentially spot the disease, perhaps better than the temperature checks now common at restaurants and bars. The Center for Genomics and Personalized Medicine at Stanford has been using data from smartwatches that track resting heart rate and respiration to monitor spikes that might indicate a viral invader. Seventy percent of the participants in one Stanford study showed a spike in their heart rate and respiration before the onset of any kind of symptom, said Michael Snyder, chairman of the Stanford genetics department.

There is a lot of understandable excitement around these devices, but understand this: The technology is running well ahead of any oversight or guardrails about who or what owns our biometric data.

So here are a few questions to keep in mind as wearables inch closer to becoming part of our morning routines. Put these questions in your back pocket in case you have to slap a wearable on your arm:

If a school, employer or any institution is providing the device, what happens to the data it collects? Where is it stored? Who has access? How long is it maintained? Is it destroyed after a set period of time?

Does the entity that requires this device clearly spell out its intended use? Does it pledge never to reach beyond those limitations? Since these devices measure such core body functions as heart rate and respiration, what’s to prevent an employer from taking action upon discovering that a worker’s health is less than robust? Or a college from making judgments about a student’s location?

If the data is stored by a third-party vendor, will it be shared or sold to other companies or agencies, or used for other purposes? Are the devices always on? Or can they be turned off or disabled for privacy at certain times of the day?

What is to prevent police departments — or the FBI, or immigration officials — from demanding access to location data?

This is complicated stuff — part of a growing debate about who owns the data that we generate with all of our digital devices. But in the quest to turn the page on the pandemic, it is easy to see how most of us might rush past these questions. Few of us actually read the “terms of use” before activating a new device or agreeing to a software update. The reality is that wearable technology is arriving at a moment when our reliance on smart devices and all kinds of apps has already worn down our expectations about privacy.

Data in any form is wealth. Data governance and data oversight are too often afterthoughts. And companies will always look out for their own interests first. If institutions are going to start handing out wearables, they should also be willing to serve up answers and assurances.
