Over the past few years, the use of mental health apps has grown rapidly, with a reported 54.6% increase between 2019 and 2021. This growth may be linked to the rise in diagnosed mental health disorders during the COVID-19 pandemic, when social distancing measures made traditional face-to-face psychotherapy services less accessible.
Several different types of apps have emerged that are broadly aimed at improving mental health. The first category guides users through practices meant to calm the mind, such as meditation and deep breathing; examples include Calm, which brought in an estimated $355 million in revenue in 2022, and Headspace, which brought in an estimated $235 million. The second category consists of platforms such as BetterHelp that connect people with qualified therapists and facilitate treatment (in the case of BetterHelp, clients and therapists use the app to conduct asynchronous text-messaging sessions). Finally, a more recently emerging category employs AI tools to emulate mental health professionals; examples include Elomia, Wysa, and Woebot, which take the form of automated chatbots that respond to users' messages.
While these apps undoubtedly make mental health services more convenient, they also generate large amounts of sensitive data, raising serious concerns about patient privacy. In May 2022, Mozilla released a report analyzing the privacy policies of 32 mental health apps. Of those, 22 received a "privacy not included" warning label, meaning the app raised concerns in at least two of the following areas: how it uses data, how much control users have over their data, its track record of protecting user data, and whether it meets Mozilla's minimum security standards. These privacy concerns are beginning to reach policymakers. In March 2023, the Federal Trade Commission (FTC) accused BetterHelp of disclosing customers' email addresses, IP addresses, and health questionnaire information to Meta, Snapchat, Criteo, and Pinterest. This data was used for advertising even though BetterHelp had assured users it would be kept confidential and used only for purposes related to operating the service.
Beyond advertising, the data generated by these apps is also collected to derive insights that help improve the apps themselves. Talkspace, a platform through which individuals can exchange text messages with licensed therapists, reportedly "regularly reviews and mines" users' private conversations, both for business insights and to develop an AI bot it incorporated into its service. It is not clear exactly what companies do with the reviewed material or what insights they draw from it.
Especially sensitive data
There are several reasons why mental health data is particularly sensitive. First, many of these companies use intake questionnaires to sign individuals up for their services. These questionnaires may ask for identifiers such as gender identity and sexual orientation, as well as details about a potential client's mental health history. If this information is shared with advertisers, as BetterHelp shared it with Meta, users may find that very intimate details of their lives are reflected in the ads they receive. At a time when aspects of identity such as gender and sexuality are increasingly politicized, this can be alarming for users, and such information could affect an individual's quality of life if it ends up in the hands of employers or other parties. Additionally, mental health apps are unique in that data brokers can make inferences about an individual's mental state simply from knowing that the person uses one of these apps, so the ease of access to this sensitive information is especially consequential. Even if the data is anonymized, the anonymization can often be reversed by combining it with other datasets. This is one of the many reasons to enact comprehensive federal privacy legislation mandating controls on data collection.
What should be done?
So what should be done about these concerns? First, mental health app companies should commit to making their privacy policies understandable to the average user. A recent study found that of the 27 mental health apps examined, 24 had privacy policies that required at least a college-level education to understand. Even when they are comprehensible, these policies are often buried behind additional links or paragraphs of text, putting them out of reach for many people. Given the personal nature of the data these apps collect, and the intimacy with which users engage with them, it is paramount that people understand, and meaningfully consent to, the release of this information from the outset.
Second, policies must be put in place to hold these digital health apps to the same privacy standards as traditional healthcare providers. This is not currently the case, and it is a glaring oversight given that the information shared on these platforms can be just as sensitive as the mental or physical health issues disclosed to in-person providers.
Third, there needs to be a broader discussion about the consequences of introducing digital platforms into the most personal aspects of our lives. In a society that still stigmatizes people who seek mental health resources, the compromise or sale of data that can be tied to a particular individual can have serious consequences. The accessibility benefits of these technologies are clear, but users must remain aware that the services they rely on are operated by private technology companies, not licensed clinical facilities, and that these companies have an unprecedented ability to monitor mental health and other data at scale for commercial gain.
As digitalization permeates every aspect of lived experience, it poses significant risks in the field of mental health. Personal pain and suffering can now be exploited for commercial purposes, and without national data privacy standards and guardrails for these companies, the most intimate parts of our lives may end up for sale, beyond patients' control.