Facebook Taking Proactive Stance Against Suicides
Facebook is working with other social media sites to help suicidal users.
March 29, 2012 -- In the wake of the suicide of a 31-year-old Taiwanese woman who told friends on Facebook she was planning to kill herself, Facebook's managers tell ABC News they plan to work with other leading websites to provide more robust suicide prevention resources to web users.
"We're working with other internet companies at formulating a list of best practices, so that there's an understanding and a consensus, along with experts in the suicide prevention community, for online properties dealing with this issue," Frederic Wolens, a spokesman for Facebook, told ABC News.
Wolens said the suicide of Claire Lin, who killed herself on her 31st birthday on March 18, highlighted a problem that social networks have grappled with for years: suicidal individuals often reflect their despair on their social networking profiles, chatting with friends about it or leaving other signs.
"More and more, as Facebook becomes more widespread and pervasive, it's becoming a better and better mirror for what's going on in the real world," Wolens said. "With suicides going on in the real world, the suicide touches some part of Facebook, whether it's the signs leading up to it, or people who wrote things on their Facebook."
In Lin's case, the connection to Facebook was particularly gruesome. Lin chatted with nine Facebook friends while she slowly killed herself by asphyxiation, inhaling the fumes from a charcoal barbecue in a closed room and typing messages about her slow death.
The friends begged her to open a window and put the fire out, but did not call police.
In other instances, individuals have written Facebook "status updates" confessing they wanted to kill themselves, or sent messages to friends expressing suicidal thoughts. Rutgers University freshman Tyler Clementi brought widespread media and public attention to the issue after he killed himself in 2010. Moments before, he had posted a Facebook message saying, "Jumping off the george washington bridge. Sorry."
Currently, Facebook offers resources to users in the U.S. who ask for them. If a person planning suicide mentions it on Facebook, and friends report it to administrators, Facebook will send messages to the person and his or her friends, offering help.
A private, one-on-one Facebook chat with a suicide prevention counselor pops open on the person's Facebook page, offering counseling free of charge. The person is also offered local resources that can be found offline, Wolens said.
For a user who reports suicidal postings by a friend, Facebook offers resources on how to help the friend through the crisis, along with suggestions for whom the friend could contact for help.
"So in the U.S. specifically, we already have a system where when we receive a report of a user that's in distress, that goes into our safety team, which reviews the report to make sure it's an authentic report, and after we've verified it, we reach out to the person who has reported it and the distressed user," Wolens said. Facebook then offers the specific chat and local resources, a model the company plans to duplicate abroad.
Facebook also houses helpline phone numbers and other resources in its Help Center.
What the company won't do is scan users' online activity for warning signs or mentions of suicidal thoughts, Wolens said. The sheer volume of messages -- billions each day -- coupled with the nuance and context needed to interpret phrases like "kill myself," would make sorting through the data impractical.