If you’ve ever assumed that information shared on a mental health app was confidential, you’re in good company: many people assume sensitive medical information is always protected. That is not true, however, and it is important to understand why.
Many of us are familiar with, or are active users of, some kind of digital health app. Whether it’s nutrition, fitness, sleep tracking, or mindfulness, the field of apps that can help us keep track of aspects of our health has never been larger. Similarly, the platforms that help us communicate with health care providers and receive virtual care have become more available, and often necessary, during the pandemic. Online therapy in particular has grown over the years and became a critical resource for many people during quarantines and remote living.
Making health care and resources more accessible is vital, and the appeal of accessing health resources directly from your phone is obvious.
However, among the many heavy implications of Roe v. Wade having been overturned are a number of digital privacy concerns. Recently, there has been significant focus on fertility and period tracking apps, as well as location information, and reasonably so. On July 8, the House Oversight Committee sent letters to data brokers and health companies “requesting information and documents about the collection and sale of personal reproductive health data.”
What has been discussed less is the huge gap in legal protections for all kinds of medical information that is shared through digital platforms, all of which should be subject to regulation and better oversight.
The US Department of Health and Human Services (HHS) recently published guidance on cell phones, health information, and HIPAA, confirming that the HIPAA Privacy Rule does not apply to most health apps because they are not “covered entities” under the law. The Health Insurance Portability and Accountability Act (HIPAA) is a federal law that creates a privacy rule for our “medical records” and “other individually identifiable health information” as it flows through certain health care transactions. Most apps that a user selects on their own are not covered; only platforms that are specifically used by or developed for traditional health care providers (e.g., a clinic’s digital patient portal for sending messages or test results) are.
Mental health apps are a revealing example. Like other digital health apps, they are generally not bound by the privacy laws that apply to traditional health care providers. This is especially concerning because people often turn to mental health platforms specifically to talk about difficult or traumatic experiences with sensitive implications. HIPAA and state laws on this topic would have to be amended to specifically include digital app-based platforms as covered entities. For example, California currently has a pending bill that would bring mental health apps within the scope of its state medical information privacy law.
It’s important to note that even HIPAA has law enforcement exceptions, so bringing these apps within HIPAA’s scope would still not prevent government requests for this data. It would be more useful to regulate the information that is shared with data brokers and companies like Facebook and Google.
One example of the information being shared is what is collected during the “intake questionnaire” that must be completed on leading services such as Talkspace and BetterHelp in order to be matched with a provider. The questions cover extremely sensitive information: gender identity, age, sexual orientation, mental health history (including details such as when or whether you have thought about suicide, or whether you have experienced panic attacks or phobias), sleeping habits, medications, current symptoms, and so on. Jezebel found that all of these intake responses were shared with an analytics company used by BetterHelp, along with the user’s approximate location and device.
Another type is all the “metadata” (i.e., data about the data) generated by your use of the app, and Consumer Reports found that this can include the fact that you are a user of a mental health app at all. Other shared information may include how long you are on the app, how long your sessions with your therapist are, how long you spend messaging on the app, what times you log in, what times you message or talk to your therapist, your approximate location, how often you open the app, and so on. Data brokers, Facebook, and Google were found to be among the recipients of this Talkspace and BetterHelp information. Apps regularly justify sharing information about users on the grounds that the data is “anonymous,” but anonymous data can easily be connected back to you when combined with other information.
Along with the collection and sharing of this data, health apps’ retention of it is incredibly opaque. Several of these apps do not have clear policies about how long they retain your data, and there is no rule requiring them to. HIPAA does not create any record retention requirements; those are governed by state laws, which are unlikely to treat health apps as professionals subject to them. For example, the state of New York requires licensed mental health professionals to keep records for at least six years, but the app itself is not a professional or licensee. Requesting deletion of your account or data may also not delete everything, and there is no way of knowing what is left. It is unclear how long the sensitive information these apps collect and retain about you sticks around, or whether it might be available to law enforcement at some point in the future.
Accordingly, here are a few things to keep in mind when browsing health apps that may share your data:
The access to care that these kinds of apps have created is critical, and everyone should seek the care they need, including through these platforms if they are the best option for you (and they are for many people). The important thing is to be as informed as possible when using them and to take whatever measures are available to you to maximize your privacy.