Facebook has begun screening its users for suicide risk and contacting local authorities when it deems intervention necessary, a practice that in some instances has led to users being forced to undergo a hospital screening against their will, Natasha Singer reports for the New York Times.
After multiple people live-streamed their suicides on Facebook Live in early 2017, the social media company ramped up its efforts to screen its users' posts for indicators of suicide risk.
Now, Facebook uses computer algorithms to scan its users' posts, comments, and videos for any indication of suicide risk, Singer writes. If the algorithm or another Facebook user flags a post, it's sent to a human reviewer at Facebook, who can contact local law enforcement if the user appears at imminent risk of self-harm.
According to Facebook, the company worked with suicide prevention experts to develop its program and trained teams experienced in law enforcement and crisis response to handle urgent cases, Singer writes.
The new system has been implemented worldwide except in the European Union, where data protection laws restrict the collection of personal health details, according to Facebook. Facebook CEO Mark Zuckerberg said, "In the last year, [Facebook has] helped first responders quickly reach around 3,500 people globally who needed help."
But a New York Times review of four police reports suggests the program does not always work as intended.
Some experts also have expressed concern that Facebook's program could cause harm by unintentionally precipitating suicide, forcing nonsuicidal people to undergo psychiatric evaluations, or leading to arrests or shootings, Singer writes.
In at least one case, police contacted by Facebook found a user whose post had been flagged as suicidal and forced her to go to a hospital for a screening, even though she said she wasn't having suicidal thoughts, Singer writes.
John Torous, director of the digital psychiatry division at Beth Israel Deaconess Medical Center, said, "It's hard to know what Facebook is actually picking up on, what they are actually acting on, and are they giving the appropriate response to the appropriate risk. It's black box medicine."
Mason Marks, a health law scholar and fellow at Yale Law School and New York University School of Law, argues in a forthcoming article that Facebook's software constitutes the practice of medicine because it can lead to mandatory psychiatric evaluations, Singer reports. "In this climate in which trust in Facebook is really eroding, it concerns me that Facebook is just saying 'Trust us here,'" he said.
However, John Draper—director of the National Suicide Prevention Lifeline, which has received funding from and has advised Facebook—praised Facebook's efforts, saying the company "has always been way ahead of the pack, not only in suicide prevention, but in taking an extra step toward innovation and engaging us with really intelligent and forward-thinking approaches."
Emily Cain, a spokesperson for Facebook, said in a statement, "While our efforts are not perfect, we have decided to err on the side of providing people who need help with resources as soon as possible." Cain added, "These are complex issues, which is why we have been working closely with experts" (Singer, New York Times, 12/31/18).