Social media content ‘probably’ contributed to Molly Russell’s death

Content on social media sites, including Instagram and Pinterest, “likely” contributed to the death of British teenager Molly Russell, who took her own life after viewing thousands of posts about suicide, depression and self-harm, a coroner ruled Friday.

Nearly five years after Russell’s death in November 2017 at the age of 14, senior coroner Andrew Walker said she died from “an act of self-harm while suffering from depression and the negative effects of online content.”

The ruling marks a reckoning for social media platforms as regulators around the world grapple with how to make the internet safe for children, and it will put renewed pressure on companies developing apps used by teenagers.

Although not a trial, the inquest put social media in the dock, with executives from Meta, which owns Instagram, and from Pinterest grilled for the first time in an English court over the potential harm to a generation of young people growing up online.

It also puts pressure on the UK government at a time when it is expected to water down already long-delayed online safety rules that will govern how tech sites are policed.

In the last six months of her life, Russell liked, saved, or shared 2,100 posts related to depression, suicide, or self-harm on Instagram and went just 12 days without engaging with this harmful content on the site.

Ian Russell, her father, told the inquest that social media “helped kill my daughter”.

“You can see what your child is doing in the [offline] world much more easily,” he said. “You can see if they go to the corner shop. . . smell alcohol on their breath. . . The effects of the digital world are invisible.”

Molly Russell © PA

Ian Russell, Molly Russell’s father © Joshua Bratt/PA

According to Ofcom, the UK communications regulator, the majority of children under the age of 13 now have a profile on at least one social media site, despite 13 being the minimum age. Russell had a secret Twitter account where she documented her true state of mind and asked celebrities for help.

Walker said Friday it was “likely that the material viewed by Molly, who was already suffering from a depressive illness and was vulnerable because of her age, negatively affected her mental health and contributed more than minimally to her death.”

Platform design

Over the past year, social media companies have come under mounting pressure as concern has grown about how the platforms’ design might affect vulnerable minds.

Instagram and Pinterest are visual apps known for showcasing glossy, aspirational imagery, where individuals post idealized and often edited photos.

Last year, Frances Haugen, a former product manager at Facebook, which is owned by Meta, disclosed a trove of internal documents showing how its algorithms can drag people down psychological rabbit holes. In particular, Instagram’s internal research suggested that the app could have a negative impact on the well-being of teenage girls – findings that Instagram says have been misrepresented.

A few weeks later, Instagram announced it was pausing plans to launch Instagram Kids, a product aimed at under-13s.

On Thursday, Walker said he was concerned that children and adults are not separated on Instagram and that children’s accounts are not linked to those of an adult.

Algorithms, the computer rules that control the order of posts social media users see, were at the center of the Russell case. Pinterest emailed her content related to depression, and Instagram suggested accounts to follow related to suicide and self-harm.

Russell was able to “binge” on harmful videos, images and clips, “some of which were selected and provided without Molly’s request,” Walker said.

Engagement is often a key metric in algorithm development: promoting content that users are likely to comment on, like or share. Meta described its earlier recommendation systems as “content-agnostic,” but says its technology now aims to proactively identify harmful content and to avoid recommending any self-harm-related material that is allowed to remain on the platform.

Elizabeth Lagone, Head of Health and Wellbeing at Meta © Beresford Hodge/PA

Judson Hoffman, Global Head of Community Operations at Pinterest © James Manning/PA

Meta and Pinterest apologized to Molly’s family during the inquest for allowing her to view content in 2017 that violated their policies. Both said they have since updated their technology and their content rules.

Former Meta AI researcher Josh Simons, a research associate in technology and democracy at Harvard University, said what happened with Russell “is not just about the responsibility of platforms to monitor malicious content.”

“It’s about the algorithms that drive content and decide what our kids see and hear every day — what drives those algorithms, how they’re designed, and who gets to control them,” he said.

Moderation efforts

Since 2019, Instagram has banned all graphic self-harm or suicide images, having previously only removed images that encouraged them, and has ramped up automated technology that detects this type of content and reports it to human reviewers. Meta said the company acted on 11.3 million pieces of content related to suicide and self-harm between April and June 2022 on both Instagram and Facebook.

Some self-harm and suicide content, such as images of healed self-harm scars, is allowed on Instagram when individuals are seeking supportive online communities.

“Clumsy and misinformed approaches to social media moderation risk removing content that, while superficially sensitive, is enabling important social conversations that may not be happening elsewhere,” said Ysabel Gerrard, a lecturer at Sheffield University and an unpaid consultant on Meta’s advisory committee on suicide and self-harm.

“As much as there are people who attribute social media to [negatively] affecting their mental health, there are also many people who say it has helped save theirs,” she added.

Meta said the moderation of this type of content is nuanced, making it difficult both for artificial intelligence systems to detect and for humans to understand.

It has 15,000 moderators around the world covering financial fraud and political misinformation, as well as self-harm. Last year, the company announced it would hire 10,000 people who would be dedicated to building the Metaverse, its virtual world.

“It seems unlikely that Facebook has sufficient resources, on both the tech product side and the human reviewer side, to solve the problem when the numbers are about as high as what they want to spend on people playing video games,” Haugen told the Financial Times.

Meta and Pinterest both acknowledge that their moderation will never catch everything. On a platform with more than 1 billion users, as in Instagram’s case, failing to identify even 1 percent of harmful posts can mean millions slip through.

Users can also thwart algorithms by misspelling words, mixing them with numbers, and using code words.

In just a few minutes of scrolling on Instagram, the FT was able to find self-harm content that violated Instagram’s policies, using search terms previously reported to the company. The content was subsequently removed.

The inquest concludes as the government revises the Online Safety Bill, legislation that will force technology platforms to tackle harmful content online. In its current iteration, companies are expected to comply with age-verification standards and to carry out full risk assessments or independent audits of their algorithms.

Last year, the UK introduced the Children’s Code, also known as the Age Appropriate Design Code, which places stricter restrictions on companies handling children’s data. The legislation has inspired similar regulations in California, Europe, Canada and Australia.

Baroness Beeban Kidron, who proposed the code, said: “There is a version of technology that puts the welfare and safety of children above the bottom line. . . It is not unreasonable to insist that children’s welfare should come before growth; it is simply a cost of doing business.”

Anyone in the UK affected by the issues raised in this article can contact the Samaritans for free on 116 123.

https://www.ft.com/content/3f9e7abc-9fbd-4339-ab11-dac16aac91e3
