Your Body Is Betraying Your Right to Privacy
The desire for self-awareness is not new, but these data offer a different twist on enlightenment. Millions of Americans live with a smartwatch that reminds them to stand, breathe, and take a few more steps to meet their daily exercise goals. This helpful (and healthful) algorithmic prompt only works, of course, because your smart device is tracking your bodily activity.

It literally knows you are breathing, which can be helpful to police if for some reason you stop. The data we produce—from our step count to our DNA—is increasingly coming under surveillance. Not all of this surveillance is unwelcome.
Many medical professionals have embraced digital tracking to help their patients. Smart pacemakers measure heartbeats. Digital pills record when someone last took their medication.
Smart bandages can warn of early infection. These innovations offer the potential to improve medical outcomes by linking data in and on our bodies to our digital health records. They rely on small sensors that can be placed in watches or implanted in medical devices, allowing you to monitor your own vital signs or to check on friends and family members with health issues.
Of course, there are potential downsides to making medical data so available. Digital pills might inform your doctor (or parole officer) that you’ve stopped taking your psychiatric medication; it’s no coincidence that the first such pill approved by the FDA treats schizophrenia and other mental health disorders. In addition to helping with your marathon training, the data from your smartwatch can identify times when you are using cocaine or having sex.
Recent laws criminalizing abortion raise the stakes of collecting this kind of information. Almost a third of women use period trackers to monitor their reproductive health. Many of these apps—such as Flo, used by 48 million women—collect information about the user’s mood, body temperature, symptoms, ovulation, and sexual partners, as well as their location.
Even if a user kept the result of her pregnancy test off the app, her missed period, combined with weeks of recorded nausea, would offer a pretty good clue as to her condition. In states that have restricted abortion access, prosecutors could use this data as evidence of a crime.
In states where abortion remains legal, reproductive information might find its way into the hands of marketers instead. In 2023, the Federal Trade Commission fined the “femtech” company Premom for selling data to third parties, including Google and companies in China. Premom, like Flo, which also settled a complaint by the FTC, did not disclose the fact that it was sharing this personal data—which, in the case of Premom, included information about “sexual and reproductive health, parental and pregnancy status, as well as other information about an individual’s physical health conditions and status.”
Some femtech companies have tried to protect personal data by collecting less of it, storing it locally on the device, refusing to log IP addresses, or creating an anonymous mode, but companies and users are still at the mercy of court orders. US companies are bound by US laws, and when abortion is criminalized in a state, data that could provide evidence of an abortion is subject to warrant requests by investigating agents. The only way to avoid turning over the data is not to collect it in the first place, which is difficult for a business predicated on collecting data.
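To make that architecture concrete, here is a minimal sketch, in TypeScript, of what "localizing data on the device" can look like in practice. The data shape, storage key, and function names are invented for illustration; the point is structural: entries are written only to the device's local storage, so there is no server-side database for a subpoena to reach (though the device itself can still be seized).

```typescript
// Illustrative only: the CycleEntry shape and storage key are invented for
// this sketch. Data written with localStorage stays on the user's device,
// and the app never makes a network call to sync it to a server, so the
// company holds nothing to turn over.

interface CycleEntry {
  date: string;          // ISO date, e.g. "2024-03-01"
  periodStarted: boolean;
  symptoms: string[];    // e.g. ["nausea", "cramps"]
}

const STORAGE_KEY = "cycle-entries";

function loadEntries(): CycleEntry[] {
  const raw = localStorage.getItem(STORAGE_KEY);
  return raw ? (JSON.parse(raw) as CycleEntry[]) : [];
}

function saveEntry(entry: CycleEntry): void {
  const entries = loadEntries();
  entries.push(entry);
  // Local write only: no network request, no analytics SDK, no IP log.
  localStorage.setItem(STORAGE_KEY, JSON.stringify(entries));
}
```

The design choice matters legally as much as technically: a company that never receives the data cannot be compelled to produce it.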
The rise of mental health apps and online therapy has exposed another vector of self-surveillance. The online therapy company BetterHelp has over 2 million users of its online and mobile mental health services. You can sign up and answer questions about your mental health (such as problems with depression, intimacy, or medications), and the service provides connections, advice, and resources to help.
Then it turns around and sells your personal data to Facebook and other targeted advertising companies—or at least it did until 2023, when the FTC brought a complaint against BetterHelp and its subsidiaries to stop the practice and ultimately ordered the company to pay $7.8 million. BetterHelp was not alone in marketing information about its users’ mental health. As the Mozilla Foundation reported after an in-depth investigation into the industry, many mental health apps are lax on privacy.
Most failed Mozilla’s privacy audit, neglecting to secure (or even outright profiting from) personal mental health data. Even online suicide prevention services turned out to be providing data to Facebook through automated tracking pixels. While there might be nuanced arguments to make about anonymity when it comes to suicide prevention, it’s hard to make the case that advertisers should get access to people in crisis for commercial gain.
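A tracking pixel is simpler than it sounds. The hypothetical sketch below shows the basic mechanism: the page requests a tiny invisible image from a third-party server, and the image's URL smuggles out what the visitor was doing. The endpoint and parameter names here are invented; real pixels (such as the Meta Pixel) differ in detail but rest on the same trick.

```typescript
// Hypothetical illustration of the tracking-pixel mechanism. The host
// "tracker.example.com" and the parameter names are invented; the core
// technique is encoding visitor context in the URL of a 1x1 image
// request to a third-party server.

function firePixel(event: string): void {
  const params = new URLSearchParams({
    ev: event,                 // e.g. "PageView" on a crisis-resources page
    url: window.location.href, // the page being read leaks to the tracker
    ref: document.referrer,    // ...and so does where the visitor came from
    ts: Date.now().toString(),
  });
  const img = new Image(1, 1); // invisible one-pixel image
  img.src = `https://tracker.example.com/pixel?${params.toString()}`;
}

// Embedded in a page, this reports every visit automatically:
firePixel("PageView");
```

Because the request is an ordinary image load, it fires the moment the page opens, without the visitor clicking anything, which is why audits keep finding pixels on pages their operators assumed were private.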
And of course, if data is available for sale, it is also available to law enforcement and the government. Just imagine how mental health data could be used to establish motive in a crime or embarrass a political opponent. Police are intensely interested in the secrets our bodies can reveal.
The FBI has invested billions of dollars in its Next Generation Identification (NGI) biometrics database, billed as the largest such database in the world. Through this system, the FBI collects “voice profiles, palm prints, faceprints, iris scans, tattoos, and, of course, fingerprints,” with the goal of using this information to identify suspects (and victims). The system also pulls in genetic information from CODIS—the agency’s Combined DNA Index System—which contains 21.7 million DNA profiles of offenders and arrestees (almost 7 percent of the US population).
Many states have built their own similar databases using samples from arrestees, victims, and other sources, which are sometimes collected in ethically dubious ways. The district attorney’s office in Orange County, California, for example, ran a program in which it would dismiss misdemeanor charges in return for a DNA sample. That “spit and acquit” sample, of course, could later be used to match suspects in future prosecutions.
New Jersey police went one step further. Under state law, all newborn babies are required to provide a blood sample to be screened for certain life-threatening genetic disorders. The blood sample goes to the Newborn Screening Laboratory, operated by the New Jersey Department of Health, which shares the results with parents as needed.
After the testing is completed (and unbeknownst to many parents), the lab retains the DNA for 23 years. The result is a rich trove of genetic information that has uses far beyond disease screening—including as evidence in criminal cases. In one instance, state police subpoenaed the laboratory for the DNA of a newborn in order to link the baby’s father to a 15-year-old crime.
In turning over the infant’s DNA, the laboratory provided a critical biological link to identify a suspect. The New Jersey public defender’s office sued to challenge this DNA matching and the laboratory’s lack of transparency, and state lawmakers are working to limit the retention of genetic data to two years. The case—and others like it—demonstrates the danger of large-scale biometric collection.
If available, DNA samples will be used for prosecution. Soon, blood samples may not even be necessary. Next-generation DNA matching can snatch genetic material directly from the physical environment for testing.
Since we all leave our DNA everywhere we go, this will make collection both easier and largely inescapable. New technologies are also allowing DNA to be processed much more quickly.