This story includes a discussion of suicide. If you or someone you know is in crisis, you can get help from the Suicide and Crisis Lifeline by calling or texting 988.
It’s a familiar feeling. You’re unwell and have multiple symptoms, so you Google what’s wrong with you. Sometimes the search spirals, and you convince yourself of the worst-case scenario. But this doesn’t only happen with physical illness. CBS News has found that as the mental health crisis among American teens worsens, they are seeking out other forms of support and information. More and more young people are turning to social media platforms like TikTok to diagnose themselves with serious mental health conditions.
Because qualified therapists are expensive and increasingly difficult to find, many young people are searching social media for answers, which are plentiful and free, but not necessarily accurate.
According to a recent Pew Research Center study, one in six Gen Zers uses TikTok as a search engine. TikTok is no longer just a place to watch dance videos and lip syncs; it has become a destination for bite-sized informational content.
“When I’m trying to figure out how to do something, I feel like it’s easier to use TikTok,” said Alexis Diomino, a third-year psychology major.
Read more: Inside America’s youth mental health crisis
But it’s more than just a search engine. It’s a place to give advice, share feelings and experiences, and talk about serious mental health disorders.
“Social media therapy”
Samantha Fridley, 19, was diagnosed with anxiety and depression at a young age. By fifth grade, she had become suicidal. Even with therapy, she felt alone as she battled anxiety and thoughts of self-harm.
“I felt like there was no one to help me, and I felt like there was no cure because I had been through so much,” Fridley said. “I started looking for other people through social media. I looked at Instagram. I looked at every social media platform I could. And TikTok.”
Fridley says she started looking for mental health advocacy and motivation on TikTok. That’s not unusual. The hashtag “mental health” has drawn more than 67 billion views on TikTok.
The phenomenon now gaining attention is being called social media therapy.
“What they’re doing, when they don’t feel like they’ve mastered the environment of the outside world, is entering the interactive media space to calm themselves down, to make themselves feel good, to make themselves masters of that environment,” said Dr. Michael Rich, director of the Digital Wellness Lab at Boston Children’s Hospital.
“Right now, that demand is being met by somebody. The question is: how well, and how safely, is it being met?”
Experts like Rich say talking openly about mental health can provide support and reduce stigma. But there are also concerns that relying on social media influencers as de facto therapists is dangerous.
“I think we need to understand why people are coming to these influencers for help. But we also need some kind of quality control,” Rich said. “Unfortunately, these young people, often untrained but well-intentioned, trying to be there for their peers, can’t necessarily sense how much pain someone is in, what kind of situation they’re in, or how close someone is to actually harming themselves. So there’s a demand here that far outweighs the supply, and I think that’s a real problem.”
An algorithm-fueled spiral
Fridley says her search for mental health content led her down a dangerous rabbit hole. Although she was already seeing a therapist for her diagnosed anxiety and depression, she started watching countless videos of influencers sharing their thoughts about serious mental health conditions, and said her TikTok feed flooded her with hundreds more.
“If you look at TikTok, as the algorithm gets stronger, it turns into diagnosis, and it turns into other things, like ADHD, borderline personality disorder, more depression and anxiety,” Fridley said.
The content that appears in her “For You” feed is the result of TikTok’s proprietary algorithm, which recommends videos based on what users search for, share, or “like.” Fridley says the vague symptoms of various mental disorders resonated with her, leading her to diagnose herself.
“It got to the point where I couldn’t sleep. I’d stay up until like 3 a.m. on TikTok, like I was doing research,” she said. But Fridley had never been professionally diagnosed with any of these conditions.
In an email to CBS News, TikTok said the “For You” feed “reflects each user’s unique preferences,” and that its system ranks videos and recommends content based on a combination of factors, including the videos you’ve liked and shared, the accounts you follow, the comments you post, and the content you create.
TikTok told CBS News it has begun testing ways to avoid recommending a series of similar videos on a given topic, to determine whether its system is inadvertently serving viewers too narrow a range of content.
But there are concerns about the unintended consequences of a steady stream of mental health content, especially when that content is inaccurate or misleading.
In a recent study by the Center for Countering Digital Hate, researchers posed as 13-year-old users who searched for and liked mental health videos. They found that TikTok pushed potentially harmful content to these users every 39 seconds on average. Some accounts were served recommendations for suicide-related content within 2.6 minutes of joining the app.
“What’s online is free, and there’s really no accountability or responsibility for this,” Rich said.
An analysis of popular TikTok videos about ADHD, published in The Canadian Journal of Psychiatry, found that 52% of them were misleading.
TikTok declined an interview request, but in a statement to CBS News, a spokesperson said: “Regardless of intent, we remove misinformation that causes significant harm to individuals, communities, or the broader public.”
The company also said: “We care deeply about the well-being of our community, which is why we continue to invest in digital literacy education aimed at helping people evaluate and understand the content they engage with online. We strongly encourage individuals to seek professional medical care if they need support.”
Debunking misinformation
“I even talked to people at TikTok, and I kept saying, ‘I know at some point you only care about misinformation when it comes to coronavirus or politics, and you’re not concerned about misinformation in psychology, but you need to understand that this is mental health,’” said Dr. Inna Kanevsky, professor of psychology at San Diego Mesa College.
For the past few years, Kanevsky has been fighting psychological misinformation on TikTok, debunking mental health myths video by video.
“When you know where you’re coming from, you can give people advice based on your experience,” Kanevsky says.
With over 1 million followers and 36 million views, Kanevsky has become a TikTok star and a counter-influencer in her own right. But her outspoken stance on mental health misinformation hasn’t always been well-received, especially by users on the receiving end of her debunking videos.
“When I correct people … people get very angry with me. [They say] they’re just talking about their personal experiences, they’re not harming anyone.”
But Kanevsky says she intervenes because there is a real possibility of harm.
“People believe all sorts of things that aren’t actually true just because they’re told by someone they feel they can relate to. They think that person is more empathetic than a doctor or a scientific author, and they want to value personal experience.”
Social media experts say that’s the crux of the problem.
“There are content creators who are trained doctors and trained clinicians working in this space to try to counter misinformation, but it’s almost like they’re salmon swimming upstream,” said Robin Stevens, associate professor of communication at the University of Southern California in Los Angeles.
“For us to see real change, it really needs to change at the platform level, and there needs to be a significant amount of content moderation,” she said.
Stevens runs the Health Equity and Media Lab at USC, where she researches how Black and Latinx youth use social media to find solutions to the public health issues they face, including mental illness.
Stevens has spent most of her career researching and critiquing social media platforms. But last year, she began collaborating with Instagram’s Well-being Creator Collective, a pilot program that aims to educate and train influencers and content creators on how to responsibly create mental health content.
Meta, the parent company of Facebook and Instagram, recently held a two-day summit with these content creators in Los Angeles. Stevens is one of the program’s expert advisers.
“When they created Reels, we analyzed the content to see the level of misinformation, how well teens responded to it, and which content teens watched over and over,” she said. “And we feed that back to them to help them create better Reels.”
“I was a little bit skeptical about what they would actually do. Was this just PR? But the people working on the Well-being Creator Collective actually understand how young people feel, and I would say they’ve taken a great approach to providing more supportive content that shows young people they have a presence in the media,” Stevens said.
But until there is more content moderation at the platform level, the way young people consume media means users themselves need to be aware of how their feeds are populated.
Samantha Fridley says it took a complete detox to finally free herself from the feeds and self-diagnoses that were affecting her mental health. She spent 56 days in a residential treatment facility, away from her cell phone and TikTok. She still uses the app, but the way she uses it has changed dramatically. She stopped watching mental health videos and sought out K-pop and comedy content to reset her feed.
“It’s a great resource for funny videos,” Fridley says. “But it’s not a good source of information for diagnosing yourself. And once you start diagnosing yourself, you end up in a spiral that’s very difficult to get out of.”
Advice for teens and their parents
Dr. Kanevsky and Dr. Rich say parents need to take an active role in how their children engage with mental health content on social media. Rich compares social media to a power tool: it takes education to use it safely.
One strategy teens can use when their feeds are flooded with negative posts is to reset the algorithm by changing the types of videos they watch, like, and comment on. Engaging with positive posts can crowd out negative content. They can also delete the account and start from scratch.
TikTok, Snapchat, YouTube and Meta have been named in federal lawsuits by families across the country who say the platforms’ algorithms have fueled depression, eating disorders and suicide among young people. Statements from Snapchat, YouTube owner Google, and Meta to CBS News can be found here.