When you tell a healthcare professional about mental health issues, doctor–patient confidentiality protects those conversations. However, that is not necessarily the case when you use one of the many mental health applications. Let’s look at why.
Is mental health application data secure and private?
The short answer is: it depends. One challenge is the wide variety of mental health applications and the different ways they work. They range from mood trackers and meditation libraries to chatbots and one-on-one virtual visits with licensed therapists.
Each application has its own policies and associated data protections. There have also been numerous cases of workers at mental health application companies mining or sharing user data.
According to Salon, former employees of the Talkspace mental health application said that people at the company regularly reviewed transcripts of conversations between patients and therapists to find common phrases, then used them to improve marketing to potential customers.
The investigation also covered the experience of Ricardo Lori, a Talkspace user who got a job in the company’s customer support department. Executives asked him to read excerpts from transcripts of his own therapy conversations, promising anonymity. However, word somehow spread that Lori was the patient described in the sessions.
These incidents underscore the need to carefully read the privacy policy before signing up for a service. It is quick and easy to agree to the terms without reading them, but doing so can jeopardize your privacy.
Mozilla reveals privacy practices for mental health applications
The Mozilla Foundation assessed the privacy and security practices of 32 mental health and prayer applications. The results showed that 25 did not meet Mozilla’s minimum security standards, such as requiring strong passwords. Researchers also raised serious concerns about how 28 of the applications handle user data.
Jen Kaltrider, the project’s leader, said: “The vast majority of mental health and prayer apps are exceptionally creepy. They track, share, and capitalize on users’ most intimate personal thoughts and feelings, like moods, mental state, and biometric data.
It turns out that researching mental health apps is not good for your mental health, as it reveals how careless and craven these companies can be with our most intimate personal information.”
Privacy risks do not end with data sharing
The unfortunate reality is that the healthcare industry is burdened with unscrupulous companies and individuals trying to take advantage of people who are often in desperate situations. For example, the US Department of Justice charged 345 people in healthcare fraud schemes totaling more than $6 billion, as reported by the non-profit organization URAC.
Even when mental health applications store data securely, they are not always upfront about other aspects of their service. According to Newsweek, Katie Mack shared her experience of signing up for the Cerebral mental health application on TikTok. She questioned why the service requested payment information before connecting her with a therapist.
Mack provided her details, but later discovered that only two therapists were available in her area, and neither had the expertise she needed. Mack then learned that the application’s policy was to issue only a 30 percent refund. She called the application a “scam” and threatened to report it to state agencies.
Health data is regularly digitized
Individuals have many ways to stay on top of their health through various applications. The 23andMe DNA testing service can tell people about risk factors for late-onset Alzheimer’s disease.
Meanwhile, people who use Apple Health can monitor everything from inhaler use to asymmetrical walking patterns. UCLA researchers are also using data from volunteers’ Apple Watches and iPhones to gain better insight into depression.
When you install a new app, you will likely see permission settings controlling which apps are allowed to read your data. Apple Health and similar services can pull information from other sources and compile it in one place, making it easier to review trends with your healthcare professionals.
However, breaches of health data are more common than you might think. A Politico study found that nearly three-quarters of health data breaches in 2021 involved hacking.
Another potential risk is how company acquisitions can give cybercriminals access to more data. The fitness company Fitbit, which Google bought in 2019, has more than 30 million active users. Google itself runs a number of health initiatives for consumers and providers.
Find out why apps need your data and what’s going on with it
Given these disturbing findings, what is the most proactive thing you can do to protect your mental health app data?
Take the time to read how and why an application or service provider uses your information. If anything in the privacy policy makes you uncomfortable, think twice before signing up.
