(If you feel suicidal or know someone in distress, please reach out to them with kindness and contact local emergency services, helplines, and mental health NGOs)
Discussing why he chose to portray the emotional turmoil of a teenage girl in his 2018 film Eighth Grade, American comedian Bo Burnham said in an interview, “There’s so much commentary about the internet. There’s not a lot of description about it. And I feel like we’re all in it.”
Yet in 2017, prepared or not, the United Kingdom was forced to reckon with the risks that social media poses to children – a conversation spurred by the tragic death of a 14-year-old schoolgirl named Molly Russell.
By her father Ian Russell's account, Molly was a “positive, happy, bright young lady who was indeed destined to do good.” But online, she had fallen into the “bleakest of worlds.”
Although Molly died over five years ago, the coroner's recent findings serve as a damning indictment of social media platforms. Let's take a closer look at the allegations against platforms like Instagram and Pinterest, the content moderation efforts that have been made since, and why Molly Russell's suicide should be a wake-up call for parents and children in India.
The coroner's inquest found that Molly liked, shared, and saved over 16,300 Instagram posts, of which 2,100 pertained to depression, self-harm, and suicide. The last time she accessed the photo and short-video sharing app was at 12:45 am on the day of her death, to save a post that conveyed a slogan linked to depression.
Revealing further details about the disturbing content shown to Molly on Instagram, the coroner said that she viewed a string of graphic videos categorised as content related to suicide and self-harm. Molly watched 138 such videos, sometimes in succession, including clips from the controversial Netflix drama 13 Reasons Why, which revolves around a teenager's suicide.
All in all, the court reviewed six months of Instagram content over a two-week hearing. But perhaps the most incriminating revelation was that Molly had penned a note before her death that quoted a depressive post she had seen on Instagram.
Instagram was not the only platform under the scanner; the digital pinboard platform Pinterest also drew scrutiny. The inquest found that Molly viewed disturbing visuals on the platform after searching for posts using keywords such as “depressing quotes [sic] deep” and “suicidal [sic] quotes.”
She had also created a board titled “nothing to worry about…” that featured a total of 469 images pertaining to self-harm, suicide, anxiety, and depression. Crucially, Pinterest went one step further and reportedly sent Molly recommendation emails with subject lines like “10 depression pins you might like.”
2021 proved to be a tumultuous year for Instagram and its parent company Meta after whistleblower Frances Haugen leaked internal documents initially known as 'The Facebook Files' and later dubbed 'The Facebook Papers'.
One such internal Facebook study accessed by The Wall Street Journal showed that teenagers, particularly teenage girls, reportedly attributed a significant amount of their anxiety and mental health problems to Instagram.
Amid public outcry and scrutiny from legislators, Instagram responded by announcing a litany of features, such as 'nudges' prompting teen users to move away from a particular topic they're spending too much time on. It also enhanced parental controls.
In addition, new accounts created by kids under 16 are private by default, ensuring that only approved followers can view the content posted by a minor.
Adult users are banned from sending private messages to teens who don't follow them. On the advertising front, Instagram users under 18 won't be shown targeted ads based on their interests and habits, but their age, gender, and location data can still be used to serve ads to them.
Mechanisms to verify the age of users are also being tested out by the platform.
Pinterest, on the other hand, has resorted to hiding search terms related to self-harm and suicide. "If a user searches for content related to suicide or self-harm, no results are served, and instead they are shown an advisory that directs them to experts who can help if they are struggling,” said Jud Hoffman, the platform's global head of community operations.
Are these measures enough? It's hard to say, especially when it comes to Meta-owned Instagram. The minimum age to sign up on Instagram is 13 years, and all of Instagram's child-focused measures appear to be built around that key requirement.
But the platform is yet to devise a way to properly verify a user's age using AI or some other technology. Hence, the chances of these measures being circumvented by teens who simply claim to be above 18 are high. Not to mention that the prompts or "nudges" to steer teens away from certain topics work only if the user acknowledges and complies with them.
Meanwhile, the testimony of Meta's head of health and well-being policy, Elizabeth Langone, as part of the inquest, was criticised for being evasive. The big tech firm appeared to walk the tightrope between protecting children on its platform and ensuring freedom of speech.
While admitting that some of the posts Molly accessed on Instagram would have violated its policies, Langone at one point also went on the defensive and said that it was "safe for people to be able to express themselves."
“Pinterest helpfully provided material about Molly’s activities on Pinterest in one go, including not just pins that Molly had saved but also pins that she [clicked] on and scrolled over,” Merry Varney, another lawyer representing the Russell family, said.
Instagram's user base has surged in India over the past few years, but there hasn't been a proper examination of how the platform's content is affecting the mental health of children in the country. In an attempt to bridge this gap, The Quint spoke to specialists in the field of psychology to shed light on whether a pattern emerges involving social media content and depression among Indian children.
"There is a correlation between negative social media experience and low mood and depression that could lead to suicide," Shreya Singhal, a counselling psychologist, said. Due to all the pictures that are being uploaded, it could lead to body image concerns as well, she added.
Reaffirming the connection between online content and depression, intermodal therapeutic practitioner Arati Kedia said that she had a case where someone who was watching stark Netflix documentaries had a psychotic episode.
While communicating in a healthy manner and building trust with the child are strongly advised measures, that can get tricky in the realm of social media, as it's very hard for parents to monitor. "The way to log in to a website or YouTube is very easy. You can have access to adult content, content meant for over 18 years of age. But what parents can do here is continue communicating so that we're not just keeping a check on the child, it's not a matter of just putting a lock on it," Singhal said.
We need to keep discussing with the child what they are seeing and what they are learning, she added.
Published: 06 Oct 2022, 05:00 PM IST