“Well, I love you with no regrets and respect the ending of the story,” Mao Kankan wrote in his last post on his WeChat moments social media feed before he ended his life at the age of 35. Mao, a young entrepreneur who co-founded digital game firm MaJoy, gassed himself at home on January 25, 2018. He was an idol for young people, starting his own business in 2004 and appearing on the popular China Central Television program Dialogue at the age of 21. His company’s huge debt burden may have driven him to take his own life.
Mao’s death sparked a huge online discussion over suicide rates among the young after several other prominent young artists also took their own lives in 2017. Award-winning director and writer Hu Qian (also known as Hu Bo) hanged himself at home on October 13. Talented actor and singer Kimi Qiao Renliang, who struggled with depression, committed suicide at the age of 28 on September 16. Ren Hang, a leading Chinese photographer and poet of the new generation, who also documented his struggles with depression on his blog, jumped to his death, aged 29.
According to 2017 statistics from the Chinese Ministry of Health, suicide has become the leading cause of death among young Chinese aged 15 to 35. China accounted for 26 percent of the world’s suicides, the World Health Organization said in 2009, and suicide was the fifth leading cause of death nationwide. Generally, more women than men kill themselves in China, the opposite of the pattern in Western countries, the report said.
But now there is a high-tech method to try to stage early interventions when people express suicidal thoughts online. A top research center has launched an Artificial Intelligence program called PsyMap to identify Internet users at risk of committing suicide on Weibo, the Chinese version of Twitter. The system, developed by the Institute of Psychology under the Chinese Academy of Sciences (CAS) in Beijing, can scan and analyze web users’ posts and comments associated with suicidal thoughts and feelings, and give support and resources to people in need.
‘Secret Garden’
“I’ve got depression, so I want to try death. There is no particular reason for it. You don’t need to care about me going. Goodbye.” On March 18, 2012, this microblog post by a user called “Zoufan” dropped a bombshell on Sina Weibo. Zoufan, real name Ma Jie, was a talented college student in Nanjing. She had hanged herself in her dorm room hours before the farewell note auto-posted online.
Zoufan published 1,896 posts in total, detailing her daily life and personal feelings in a witty, humorous but deeply sad tone. After her death, her microblog gradually became a platform where depression sufferers and other distressed people shared their secrets and personal stories.
“How I wish I could pass away while I’m asleep so I won’t wake up to face a bleak world of suffering. Please let me go, let me plunge, let me die,” an Internet user called “Daianzhi” commented.
“This time I’ve prepared everything: gas, charcoal and liquor. If this way is too excruciating, then I will jump. Everything is over. No one will save me. I won’t be saved again. Wait for me. This time, it’s not a lie,” user “Walking Corpse 3” commented.
Zoufan’s posts were flooded with suicidal comments. Her last message had 1.12 million comments, of which 500,000 were written in the last six months alone. Every few minutes a new comment popped up. More and more people gathered there to tell of their own suffering and misery.
“No matter how happy I pretend to be in front of people, when I’m alone, I can’t help feeling depressed. Zoufan’s Weibo is my secret garden, allowing me to put everything deep within my heart into it without being mocked,” Liu Ya, a Zoufan follower who suffers from depression, told NewsChina.
Luo Fu, a 24-year-old with depression, is another of Zoufan’s followers. She has read every post of the deceased girl, appreciates her humor and personality, and admires her courage in facing death.
Luo regards Zoufan’s microblog as a shelter where she can freely express her disgust toward life and longing for death. Luo’s own personal microblog imitates the style of Zoufan: no pictures, very few reposts, lots of detailed descriptions of daily trivialities, nihilistic and pessimistic.
Then, to her surprise, Luo received a message from the PsyMap account. It said: “We saw your comments on Zoufan’s Weibo. Is everything all right? How are you feeling now?” The message encouraged Luo to interact with volunteers from PsyMap or contact a local psychological crisis intervention center.
AI Against Suicide
The PsyMap team is led by Zhu Tingshao, an expert in cyberpsychology and data mining at CAS.
In 2014, Zhu’s team conducted research into the behavior and language patterns of suicidal social media users. Compared with users showing no suicidal tendencies, those with suicidal tendencies focus more on self-expression, interact less with other users, use more negative words and expressions, and pay more attention to content related to death and religion rather than work, the research found.
This research laid the theoretical foundations for the team’s later suicide identification and prevention effort. In July 2017, the team launched an AI system that automatically collects suspect comments based on the topics and expressions suicidal people commonly post.
“It doesn’t look for keywords, but uses a prediction model that makes a judgment about the content,” Zhu told NewsChina.
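The distinction Zhu draws can be illustrated with a toy statistical classifier. This is a hypothetical sketch, not PsyMap’s actual model: a tiny naive Bayes text classifier trained on a handful of made-up labeled comments. Unlike keyword matching, it weighs the evidence of every word in a comment, so a post can be flagged even when it contains no single “trigger” word.

```python
import math
from collections import Counter

# Toy training data (invented for illustration): 1 = at-risk, 0 = neutral.
TRAIN = [
    ("i want to disappear and never wake up", 1),
    ("everything is over no one will save me", 1),
    ("i cannot face another bleak day", 1),
    ("had a great lunch with friends today", 0),
    ("looking forward to the holiday trip", 0),
    ("finished my homework early tonight", 0),
]

def train(data):
    """Count word occurrences per class and the class frequencies."""
    counts = {0: Counter(), 1: Counter()}
    docs = Counter()
    for text, label in data:
        docs[label] += 1
        counts[label].update(text.split())
    vocab = set(counts[0]) | set(counts[1])
    return counts, docs, vocab

def score(text, counts, docs, vocab):
    """Log-odds that the text belongs to the at-risk class (label 1)."""
    total = sum(docs.values())
    log_odds = math.log(docs[1] / total) - math.log(docs[0] / total)
    for word in text.split():
        if word not in vocab:
            continue  # words never seen in training carry no evidence
        # Laplace-smoothed per-class word likelihoods
        p1 = (counts[1][word] + 1) / (sum(counts[1].values()) + len(vocab))
        p0 = (counts[0][word] + 1) / (sum(counts[0].values()) + len(vocab))
        log_odds += math.log(p1) - math.log(p0)
    return log_odds

counts, docs, vocab = train(TRAIN)
# This sentence contains no explicit keyword like "die" or "suicide",
# yet words such as "wake" and "face" appeared only in at-risk
# examples, so the model still judges it risky (positive log-odds).
print(score("i do not want to wake up and face anyone", counts, docs, vocab) > 0)
```

A production system would of course use far richer features and training data, but the principle is the same: a judgment about the whole content rather than a lookup of forbidden words.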
At-risk users receive a message from the PsyMap account containing a hotline number run by Huilongguan Hospital in Beijing, along with suggestions for ways to get support, including talking with PsyMap volunteers, who are all professional psychiatrists, or contacting a local mental health crisis center. This is similar to the Samaritans network in the West, which people in crisis can call.
Every 24 hours, the system sifts through the comments under Zoufan’s microblog and flags users who appear to be on the brink of suicide. Every day from 6pm to 10pm, volunteer psychotherapists interact with those who reply via text message.
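The daily cadence described above amounts to a simple batch job. The following is a hypothetical sketch, not PsyMap’s code: `fetch_new_comments` and `looks_at_risk` are stand-ins for the Weibo API and the team’s prediction model.

```python
from datetime import datetime, timezone

def fetch_new_comments():
    """Stand-in for the Weibo API: (user, text) pairs posted in the
    last 24 hours under Zoufan's microblog. Data here is invented."""
    return [
        ("user_a", "wait for me, this time it's not a lie"),
        ("user_b", "lovely weather for a walk today"),
    ]

def looks_at_risk(text):
    """Stand-in for the prediction model. A crude phrase check is used
    here only so the sketch runs end to end; the real system makes a
    model-based judgment about the content, not a keyword lookup."""
    risky_phrases = ("wait for me", "let me die", "no one will save me")
    return any(p in text for p in risky_phrases)

def daily_scan():
    """Once every 24 hours: score the new comments and queue flagged
    users for the 6pm-10pm volunteer shift."""
    flagged = [user for user, text in fetch_new_comments() if looks_at_risk(text)]
    return {
        "scanned_at": datetime.now(timezone.utc).isoformat(),
        "queue_for_volunteers": flagged,
    }

print(daily_scan()["queue_for_volunteers"])  # only the at-risk commenter is queued
```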
Currently, the AI system only scans comments under Zoufan’s posts instead of all the posts across social media platforms, as the team believes people who follow the deceased girl’s account are more likely to exhibit suicidal tendencies.
“Most commenters on Zoufan’s microblog are high school and college students, and are on average 21 years old,” Zhu said.
Since the system launched, PsyMap has supported 14,435 users who had expressed suicidal feelings, with an accuracy rate of 92.2 percent. So far, Zhu said, about 4,000 have replied and 8,000 people have accessed the online counselling tools.
Zhu told NewsChina that many people went online to communicate their feelings rather than talking face-to-face with a specialist or acquaintances, and Weibo was one place where suicidal thoughts were more freely expressed.
Liu Mingming, a member of Zhu’s team, said the comments people post are not just responses to Zoufan’s words; commenters also recount their own life stories. This has provided a wealth of information for analysis.
The AI has direct-messaged 4,222 at-risk users through Weibo and received 725 replies. Seventy-eight percent of respondents said the message they received was acceptable and that they would welcome more support, and more than half suggested the message could contain more specific guidance on dealing with particular mental health problems.
In one case, one evening in September 2017, the PsyMap team received a reply from a woman who had posted a suicide note on her personal microblog. She said she was planning to take sleeping pills to kill herself at midnight that night. A PsyMap volunteer managed to make contact with her immediately, confirmed her address and called the police, preventing the woman’s suicide attempt.
“The woman’s case proves that sometimes people who publicly state their intent to commit suicide on social networking platforms are actually making a cry for help,” Liu said.
“What we’re doing can’t solve everything,” Liu told NewsChina. “If a person intends to commit suicide and we intervene, we can’t guarantee that this person won’t try again.”
“We will try our best to postpone the suicide in order to get more time to make the situation a little bit better, especially by helping them connect to friends and to organizations that can help them,” Liu said, indicating that the main role of PsyMap is to build a bridge between the sufferer and support networks.
Global Response
In response to this global epidemic, social network giant Facebook rolled out an AI system of its own in March 2017, scanning posts, videos and Facebook Live streams for signs of suicidal thoughts and, when necessary, sending mental health resources to at-risk users or their friends and loved ones.
Facebook chief executive Mark Zuckerberg said that the company would use AI to help identify problem posts across its network, but he acknowledged that this was a very difficult problem to address. “No matter how many people we have on the team, we’ll never be able to look at everything,” he said in May 2017. Facebook announced in November 2017 it would expand the tool to all users in the US.
Nevertheless, there is still a long way to go in devising methods to tackle online suicide prevention.
The idea of AI scanning the content of people’s posts stirs up fears about privacy, given that tens of thousands of accounts have effectively become the subjects of a psychological experiment.
Janis Whitlock, director of the Cornell Research Program on Self-Injurious Behaviors, says machine-learning systems on Facebook and elsewhere may startle some users and discourage them from seeking help in the future.
In an article on multimedia platform Mashable, Munmun De Choudhury, an assistant professor at the School of Interactive Computing at the Georgia Institute of Technology, commends the social media company for focusing on suicide prevention, but she suggests Facebook can be more transparent about its algorithms.
Transparency instills trust, she argues, otherwise people may worry about technology’s potential to fundamentally disrupt their professional and personal lives.
“This is not just another AI tool – it tackles a really sensitive issue,” she said. “It’s a matter of somebody’s life and death.”
Without enough trust in the tool, at-risk users may decline to share emotionally vulnerable or suicidal posts.
PsyMap also faces the challenge of mistrust. Some people did not want to receive the team’s messages and reported them to Weibo. As a result, PsyMap is restricted to sending a maximum of 200 messages to other Weibo users per day.
Weibo user and depression sufferer Liu Ya told our reporter that when she received the first message from PsyMap, she thought it was a scam. It was not until she received three messages that she began to trust the account.
The woman whom the team saved last September later said she worried that her online behavior was being closely watched. She and the PsyMap volunteers made a mutual promise: the team would not call the police except in the most severe situations, and the woman would tell the volunteers if she was going to attempt suicide.
Currently, in China, no other organization except PsyMap is focusing on social media suicide prevention. In the future, Zhu’s team plans to apply the technology to more social media platforms.
Zhu’s team is now cooperating with researchers from Brigham Young University and the University of Maryland to extend the Chinese-language service to Twitter users with an English version, Zhu told the South China Morning Post on February 3.
The Chinese service is based on simplified characters, which are used in the Chinese mainland, but the team also plans to add traditional characters for Weibo users in Hong Kong, Macao and Taiwan.