AI figured out the words people text when their suicide risk is high

If you were asked to guess the words people use when they're most at risk for suicide, you'd be right to think of obvious nouns and verbs like die, overdose, and, yes, the word suicide itself.

So when Crisis Text Line, a free mental health support service, built an algorithm to flag high-priority texts, it included those terms among the 50 words meant to signal that the person messaging desperately needed help.

But when Crisis Text Line began using artificial intelligence last summer to analyze the 22 million messages about emotional distress in its database, its researchers made a surprising discovery: the word ibuprofen was 16 times more likely than the word suicide to predict that the person texting would need emergency services.
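
Crisis Text Line hasn't published its method, but a "16 times more likely" comparison reads like a relative-risk calculation: for each term, measure how often messages containing it end in an emergency dispatch, then compare terms against each other. Below is a minimal Python sketch of that idea; the toy conversations list, the rescue_rate helper, and the outcome labels are all hypothetical, not Crisis Text Line's actual data or pipeline.

```python
# A toy relative-risk calculation, NOT Crisis Text Line's pipeline.
# Each record pairs a message with whether it ended in an active rescue.
conversations = [
    ("i took a bottle of ibuprofen", True),
    ("found more ibuprofen under the sink", True),
    ("i keep thinking about suicide", True),
    ("is suicide a sin", False),
    ("just feeling really alone tonight", False),
]

def rescue_rate(term: str) -> float:
    """Fraction of messages containing `term` that ended in an active rescue."""
    outcomes = [rescued for text, rescued in conversations if term in text]
    return sum(outcomes) / len(outcomes) if outcomes else 0.0

# Relative risk of "ibuprofen" versus "suicide". At real scale this would
# also need smoothing and significance testing over millions of messages.
print(rescue_rate("ibuprofen") / rescue_rate("suicide"))  # 1.0 / 0.5 -> 2.0
```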

Another highly predictive signal wasn't a word at all but a crying face emoji. When people included that character in their messages, Crisis Text Line supervisors were 11 times more likely to call 911 for assistance. In total, Crisis Text Line has integrated 9,000 new words or word combinations that indicate high risk, and it expects to add more in the future.

Nancy Lublin, the nonprofit's CEO, says these unexpected data points have made a huge difference. Before AI surfaced the new words, volunteers responded to high-risk texters in under two minutes. Now the average response time is down to 39 seconds. Lublin believes that's because the algorithm is much better at identifying the people most at risk and sending them to the front of the line, much as a hospital emergency room triages its patients.
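
Her emergency-room analogy maps naturally onto a priority queue: score each incoming conversation and serve the highest-risk texter first. Here's a hedged sketch of that triage step; the RISK_TERMS weights and every function name here are illustrative assumptions, not Crisis Text Line's implementation.

```python
import heapq
import itertools

# Hypothetical term weights; the real system reportedly draws on roughly
# 9,000 learned words and word combinations.
RISK_TERMS = {"ibuprofen": 16.0, "😢": 11.0, "suicide": 1.0}

def risk_score(message: str) -> float:
    """Sum the weights of every high-risk term found in the message."""
    return sum(w for term, w in RISK_TERMS.items() if term in message.lower())

_tiebreak = itertools.count()  # keeps the heap from ever comparing strings
queue: list[tuple[float, int, str]] = []

def enqueue(message: str) -> None:
    # heapq is a min-heap, so negate the score to pop the highest risk first.
    heapq.heappush(queue, (-risk_score(message), next(_tiebreak), message))

def next_texter() -> str:
    """Pop the highest-risk waiting message, emergency-room style."""
    return heapq.heappop(queue)[2]

enqueue("just want to talk")
enqueue("i have a bottle of ibuprofen 😢")
print(next_texter())  # the ibuprofen message jumps the line
```

A structure like this is one plausible reason triage stays fast as volume grows: each push and pop is logarithmic in the number of waiting texters.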

Julie Cerel, a clinical psychologist and president-elect of the American Association of Suicidology, says the practical implications of the technology are important. But she also believes the approach reflects a significant change in the way researchers and public health professionals try to prevent suicide. 

"What this speaks to is we are finally listening to people who are suicidal and using what they’re telling us to figure out how to help them," she says. 

In the past, it was effectively impossible to comb through and code transcripts of conversations with suicide attempt survivors at anything close to Crisis Text Line's scale. Now machine learning makes it possible for researchers to analyze digital conversations and look for signals that someone may be close to attempting suicide.

That approach has gained considerable momentum in the last year. Facebook recently announced that it's incorporating AI into its suicide-prevention efforts, and the research project OurDataHelps launched last spring by asking people to "donate" their social data so scientists could better understand suicide risk. 

At Crisis Text Line, conversations that end in what's known as an active rescue are rare: only 1 percent of exchanges require intervention by the authorities, and Lublin considers that step the last line of defense. The goal, she says, is to help texters create a safety plan and to encourage them to feel capable of handling a crisis. But sometimes that approach doesn't work, which is why Lublin wants the system to be as fast and as accurate as possible at singling out high-risk texters.

Even though the new words and phrases Crisis Text Line identified might not seem immediately useful to doctors and therapists who work with patients in person, Cerel says they're evidence that people don't always choose the most obvious words to talk about suicidal feelings. 

"It's a reminder to keep asking the question," says Cerel, "and make it clear we want to hear the answer." 

If you want to talk to someone or are experiencing suicidal thoughts, text the Crisis Text Line at 741-741 or call the National Suicide Prevention Lifeline at 1-800-273-8255. Here is a list of international resources. 
