
Suicide Prevention in the Age of Social Media and AI
Editor’s Note: The following story references suicide. It includes a discussion of risk factors and means of death, subjects that may be difficult for people personally affected by suicide or who are currently struggling with mental health. Help is available at the 988 Lifeline.
With soaring rates of youth suicide coinciding with the proliferation of social media, many have blamed platforms like Instagram and TikTok for the crisis. But keeping young people away from smartphones and social media may not be feasible or even a good idea, asserted Katherine Keyes, professor of Epidemiology at Columbia Mailman School, in a recent public conversation on advancing suicide prevention in a digital world. In fact, smartphones and other digital technologies could be a powerful way to predict suicidal behaviors and intervene to prevent them, added Hanga Galfalvy, associate professor of Biostatistics (in Psychiatry).
Keyes and Galfalvy made the remarks on October 23 at the second in a series of Dean’s Grand Rounds talks on the theme “Public Health Responses to a Changing World.” The discussion was presented in partnership with the Columbia University Seminar on Global Mental Health and moderated by Kathryn Lovero, assistant professor of Sociomedical Sciences.
Interim Dean Kathleen Sikkema, a clinical psychologist whose research focuses on mental health, opened the event by acknowledging that suicide can be a painful topic for people affected by it or currently struggling with their mental health. On the research front, interdisciplinary scholarship is critical in a rapidly shifting landscape in which “the emergence of digital assets introduces both significant opportunities and profound concerns,” she said.
Shining Light on the Dark Corners of the Internet
What role does online content play in suicidal behaviors? Keyes, who leads the interdisciplinary mental health initiative SPIRIT and the Psychiatric Epidemiology Training Program, and colleagues are studying pro-suicide web forums—sites often hosted outside of the U.S., where people can learn about suicide methods. In an analysis of aggregated web searches, the researchers found a significant correlation between searches for pro-suicide websites and the monthly rate of young people dying by poisoning and suffocation. “Young people who evolve into a suicidal crisis have a reinforcing cycle of getting deeper and deeper into these areas of the Internet and social media with this really inappropriate content,” Keyes noted.
It may be tempting to try to shield young people from online content altogether, but Keyes said that a common analogy used in the field is that while we put a fence around a swimming pool to prevent drownings, “you can't put a guardrail around an ocean, and I think that's what we’re dealing with. Guardrails around the Internet and AI are necessary but difficult, because it’s too vast,” she said. Young people will find a way to go online to talk about their mental health, so it’s important to build safe spaces for supportive discussions. “We don't want teens to not talk about their problems, including suicidal crisis, but we want to arm them with literacy and health information in order to do that in the safest way,” said Keyes.
Can Smartphones Predict Risk for Suicide?
The proliferation of smartphones and smartwatches is opening new avenues for suicide prevention, according to Galfalvy. In an example of so-called “active monitoring,” a smartphone text intervention asks participants six times a day whether they have had any suicidal thoughts in the last minute. By contrast, in “passive monitoring,” researchers analyze data automatically collected through smartphones and other devices. This active/passive distinction isn’t black and white, she noted. Data collected passively can drop off for the same reasons people stop actively responding to surveys. For instance, they might decide to put their phone down.
In a study of phone location data, Galfalvy found that how much time someone spent at home was predictive of suicidal behavior. The insight has real potential to lead to an intervention, particularly through the use of advanced analytic techniques such as machine learning and neural networks. But the time frame presents its own challenge. “If we can predict a suicide attempt two hours in advance … our intervention also has to be digital,” she said. What might this look like? “You might get a message on your phone that says something like, ‘let’s go out.’”
AI and Mental Health Risks
The mental health risks of AI chatbots have made headlines this year. Parents of young people who died by suicide are suing AI companies, alleging that the companies were negligent. Keyes read testimony by one parent about how ChatGPT encouraged their son’s darkest thoughts, even offering to write the suicide note. Recently, OpenAI, the maker of ChatGPT, estimated that hundreds of thousands of users may be experiencing symptoms like delusional thinking, mania, or suicidal ideation, and said it had tweaked GPT-5 to respond more effectively. Because AI chatbots are still so new, public health researchers are playing catch-up, although some preliminary data suggest that more than half of young people use them. “We don’t have a basic epidemiology of how many kids use these products, how often they use them, what they use them for,” Keyes noted.
Partnerships and Data Access
Another research challenge is accessing online data from digital platforms. Keyes doesn’t see a productive path forward in working with digital technology companies. “Not only do they not have incentives to improve public health, but they have actively worked against research initiatives, blocking researchers from getting even simple data,” she said. As an alternative, Galfalvy suggested that scientists source data directly from everyday Americans, who could donate data from their smartphone or smartwatch and donate their time by responding to survey questions. The potential upside would be “huge,” she said. For instance, researchers could implement n-of-1 trials, which compare interventions in a single participant over time.
What to Do If a Loved One Is in Crisis
Nearly all of us have been touched personally by a severe mental health crisis, whether our own or a close family member’s. Yet there is still a gap in understanding about what to do when confronted with a loved one in crisis. When a friend or family member dies by suicide, survivors often express regret that they didn’t know the individual was suicidal. “Asking about suicidal behavior does not increase suicide risk,” Keyes emphasized. It’s important to learn how to talk to our family and friends about their mental health, especially if they show warning signs like withdrawal and irritability, she added.
One place to find guidance is a suicide safety plan, Galfalvy noted. Available online, these plans vary in quality, and clinicians tailor them to the patient’s needs. Nevertheless, she said, they “can give you an idea of topics you can raise.”