Gillian Luchejko, a 46-year-old client services manager in New Jersey, first found out that her sister, Pamela Elarabi, had posted a photo of herself on the social-networking site Facebook as she prepared to take her own life when the image appeared at the top of Luchejko’s own news feed.
Around 9 p.m. on Friday, June 22, after several cryptic text posts on her page throughout the day, Elarabi posted an image in which she appeared to be attempting suicide. The photo was public, so anyone looking at Elarabi’s Facebook page could see it.
Elarabi had been publishing increasingly despairing status updates on her page in the preceding months, Luchejko said, so Luchejko was not sure what to make of the image. She called their brother and asked him to check on Elarabi, who had struggled with suicidal thoughts and depression for most of her life.
Luchejko wasn’t the only one who found the post alarming. By the time her brother arrived, paramedics were already on the scene — a neighbor had seen the image and called 911. Elarabi was rushed to the hospital, where she died on June 23. She was 49.
In the midst of the chaos that evening, while Elarabi’s siblings were deciding whether to remove life support, breaking the news of her death to her two adult children and making arrangements for a burial, the disturbing image of Elarabi’s last moments remained on Facebook, visible to her 792 friends and to anyone else who visited her page.
“People kept texting us, asking what was going on, and what the Facebook post was about, and I was thinking, ‘I can’t answer you right now because she is dying,’ ” Luchejko told MarketWatch. “Everyone felt helpless.”
But it took more than 72 agonizing hours to get the photo taken down. Luchejko said the family considers itself relatively private, and was horrified to watch the news of her sister’s death spread across their small hometown and the Facebook community.
The experience highlights the painful challenges that social-media companies like Facebook and their users must confront as more people put practically their entire lives online. Facebook walks a fine line between intervening to save lives and avoiding censorship. The company has the ability to step in when users appear poised to hurt themselves, but as Elarabi’s case shows, it doesn’t always act in a timely manner, even in the aftermath of tragedy.
Luchejko said that, with limited cellphone service in the hospital, the family could do very little to respond to the texts pouring in from concerned friends who had seen the post. Dozens of people continued to interact with it after Elarabi’s death, leaving sad-face and heart emojis and writing comments like “What’s going on?” and “Are you OK?” Luchejko said.
She began to publicly reply to comments on the post to tell people Elarabi had passed away. She begged followers to report the photo, as recommended in Facebook’s “Help Center” guidelines on removing images. More than 200 of Elarabi’s Facebook friends told Luchejko they had reported it.
Facebook has struggled to remove alarming content in the past. In December 2016, footage of a 12-year-old girl taking her life continued to circulate despite multiple efforts to remove it. In January 2017, a 14-year-old girl killed herself on Facebook Live. That April, a Pennsylvania man posted a video of himself shooting and killing a man on Facebook.
Following these events, Facebook implemented more suicide-prevention tools and announced it would hire 3,000 more moderators to review videos and prevent the live streaming of crimes. The company has also been developing artificial-intelligence tools to intervene when users may be a danger to themselves or others, and to take down alarming content more quickly.
Phrases in a post like “Are you OK?” or “Can I help?” can trigger alerts to first responders to help the person in question. As of March, Facebook had called first responders to conduct more than 1,000 wellness checks, the company said.
Facebook’s community standards do not allow posts promoting self-injury and suicide, but such posts may only be removed through Facebook’s online reporting tool. Once a user makes a report by filling out an online form, trained content moderators at Facebook will review the post to see if it qualifies for deletion. And that takes time.
See also: What happens when you Google for suicide methods
A spokeswoman for Facebook said the company cannot comment on Elarabi’s case because the post was deleted and cannot be investigated. The Menlo Park, Calif.-based company has a team of 10,000 to 20,000 content reviewers around the world monitoring content at all hours, she said. Many of them are trained in suicide prevention.
If someone reports a post threatening suicide or self-harm, Facebook will automatically suggest the user contact law enforcement to help their friend. However, the spokeswoman could not say how long it takes, on average, to respond after someone reports a troubling post.
Facebook’s policy is not to take down live videos showing self-harm while they are streaming, in case somebody watching can help, the company’s vice president of global policy management, Monika Bickert, explained in 2017.
“Experts in self-harm advised us that it can be better to leave live videos of self-harm running so that people can be alerted to help, but to take them down afterward to prevent copycats,” she said.
Facebook has to strike a delicate balance between censorship and safety, said Jen Golbeck, a computer scientist and associate professor at the University of Maryland’s College of Information Studies. Although the company is working on artificial-intelligence solutions to automatically detect suicidal posts, detecting the content of specific images is particularly difficult.
That means many of these cases continue to be reviewed by hand, Golbeck said. “Usually that is done quickly, but once humans are brought into the loop — especially people who are not reporting problems with their own accounts — there can be delays,” she said.
“Imagine all the requests from parents to have their children’s suggestive photos taken down or complaints about bullying,” she added. “These parents’ requests are in that stream; that’s not to say they are unimportant, but rather that once you ask a real person to do something, you have to accept that lots of other people are asking for things and it can be slow.”
Internal documents leaked from Facebook in 2017 showed moderators escalated 4,351 reports of self-harm in a two-week span in 2016, 63 of which had to be dealt with by law enforcement. The figures were higher in 2017, with 5,016 reports in one two-week period and 5,431 in another.
In these documents, moderators were told to ignore suicide threats when “intention is only expressed through hashtags or emoticons,” when the proposed method is unlikely to succeed, or when the user’s threat to take his or her own life appears to be planned for more than five days in the future. Trying to translate these policies into algorithms is difficult, leaving the responsibility to a relatively small team of moderators, Golbeck said.
“Facebook is trying to balance a lot of concerns, not just about privacy and the sensitivity of the content, but also the impact on people who are suffering and potentially suicidal,” she said. “A complex policy seems to be the right solution, but it does mean that the human machine will move more slowly.”
If you or someone you know may be struggling with suicidal thoughts, you can call the U.S. National Suicide Prevention Lifeline for confidential support 24 hours a day at 1-800-273-8255.