Facebook Aims To Prevent Suicides With Online Help

If you're considering suicide, Facebook now stands ready to get you some help.

The gigantic social-networking site said Tuesday that if any of its 800 million users type a post saying they are contemplating suicide, the site will offer to connect them to a crisis counselor through its chat system.

But the system requires human intervention, in the form of a friend who clicks on a link next to a troubling comment, the Associated Press reports today. Facebook says it will then send an email to the people concerned, encouraging them to call a crisis hotline or click through to a confidential chat with a counselor.

But a quick cruise over to Facebook shows no friendly button, so it's not clear exactly how this will work in real life. As of this afternoon, Facebook's help center recommends that people who've come across a direct threat of suicide "immediately contact law enforcement or a suicide hotline."

Google has tweaked its search engine so that the number for the National Suicide Prevention Lifeline turns up first when a person types in "suicide," but this appears to be the first active effort by a social media site to connect users to health care professionals.

Facebook has been trying to make its site more socially responsible. In March, the company announced new tools to protect users from online bullying, including a way to report threats to Facebook and to let a parent, teacher, or trusted friend know.

Last year, the social-media giant started partnering with gay rights organizations to combat anti-gay cyberbullying.

But the anti-suicide effort is the first that isn't intended to reduce malicious use of Facebook. Instead, it's using Facebook's vast networks to try to identify people in the midst of a mental-health crisis, and get them help.

"This is really problematic," says Pam Dixon, executive director of the a nonprofit public interest research group. We all want to prevent suicide, she says, "but I'm not sure this is the right way to do it."

The biggest problem, Dixon says, is that Facebook is a public forum. Companies regularly scrape the site for information, and could use that to market worthless treatments to people in the midst of a mental health crisis. And because the site is public, health information posted there is not protected by HIPAA, the federal medical privacy law.

Information on a person's mental state might be subpoenaed from Facebook, Dixon adds, for a custody battle or other litigation. And Facebook could also be liable for the quality of mental health care delivered as part of its recommendation.

Despite those issues, many people say that sharing medical information with others on Facebook has helped them manage serious health issues, as I reported earlier this year.

This latest move by Facebook sounds like it could open the door to dozens of other potential interventions. Before too long, hearty eaters could perhaps start getting referrals to Weight Watchers or the American Diabetes Association. And the legions of teenage binge drinkers who post their misadventures on Facebook could suddenly be hearing from Alcoholics Anonymous.

Copyright 2020 NPR. To see more, visit https://www.npr.org.
