Several organizations join in response to research by The Bright Initiative by Bright Data and the Molly Rose Foundation
NEW YORK, Nov. 30, 2023 /PRNewswire/ -- The Bright Initiative by Bright Data, a global program providing pro-bono data technology and expertise to drive positive change, and the Molly Rose Foundation (MRF), a suicide and self-harm prevention organization, released a first-of-its-kind report demonstrating the shocking prevalence of life-threatening design choices on Instagram, TikTok and Pinterest, choices that expose a fundamental systemic failure at the hands of Big Tech.
The research shows that young people are routinely recommended large volumes of harmful content by high-risk algorithms that amplify self-harm and suicide material.
In response to this groundbreaking report, several US-based non-profits voiced their collective support and echoed the call to make the internet safer for children and young adults:
- Design it for Us
- Accountable Tech
- ParentsTogether Action
- Fairplay
- Center for Countering Digital Hate (CCDH)
"This week, when we should be celebrating Molly's 21st birthday, it's saddening to see the horrifying scale of online harm and how little has changed on social media platforms since Molly's death," said Ian Russell, father of Molly Russell and Chair of Trustees at the Molly Rose Foundation. "The longer tech companies fail to address the preventable harm they cause, the more inexcusable it becomes. Six years after Molly died, this must now be seen as a fundamental systemic failure that will continue to cost young lives."
This week it was revealed that Molly's case was cited in the fully unredacted lawsuit brought by 42 attorneys general against Meta over child safety risks on social media. Molly's family started MRF after her death in November 2017, and the subsequent inquest marked the first time a tech platform had been held formally responsible for the death of a child. Six years after that inquest, today's report shows how algorithms continue enabling harmful content to reach staggeringly high audiences.
Some of the key insights from the report include:
- 54% of the most engaged harmful posts surveyed on TikTok were viewed more than one million times.
- Instagram has user prompts to 'use hashtags' such as #letmedie, and TikTok recommends search results such as 'the quickest way to end it' and 'attempt tonight'.
- 99% of the Instagram Reels content recommended to researchers was considered harmful.
- Researchers are concerned social media companies prioritize growing their user base, at the expense of user safety, in a race for market share.
"The report has some incredibly disturbing findings regarding the significant failings of social media in terms of their inconsistent and at times erratic moderation of harmful content," said Or Lenchner, CEO of Bright Data. "Tech giants must take responsibility for the implications on individuals, often children and young people, who consume large amounts of harmful material on their platforms."
Key recommendations from the report include:
- Ensure that tech companies are incentivized and actively required to prioritize safety-by-design in the design and delivery of their products, by delivering on legislative priorities like the Kids Online Safety Act and the Age-Appropriate Design Code in the United States.
- Ensure open access to data by protecting and expanding the ability of civil society and researchers to access data, including that which is held by social media companies.
- Establish clear research priorities and develop new mechanisms to share research findings, data, and other emerging insights more effectively to improve our understanding of risks of harmful content and platform design and neutralize industry influence.
- Big Tech companies should take urgent action to address the risks posed by harmful content, including how their design choices and algorithms contribute to both immediate and longer-term risk profiles.
"In memory of Molly Russell, this critical report paints a clear and urgent picture of the consistent and unabated harm to young users across the world at the hands of Big Tech companies," said Zamaan Qureshi, co-chair of Design it For Us. "Molly and I were 14 in 2017. I celebrated my 21st birthday this year, but Molly didn't have that opportunity. However pervasive, the harms caused by platforms like Instagram are entirely preventable – our leaders in the U.S. and abroad must step up to hold platforms accountable and prioritize our safety."
"We are seeing a groundswell of concern among parents who are deeply troubled by the onslaught of harms served up by social media. This research affirms what parents and families are up against: social media companies whose profit models prioritize keeping kids, their most vulnerable users, constantly engaged without any regard for their well being," said Shelby Knox, Campaign Director for ParentsTogether Action. "A generation of children is in peril because social media platforms like Instagram and TikTok choose their profits, their stockholders, and their companies over children's health, safety, and even lives over and over again. Something has to be done, which is why parents across the country are calling on lawmakers to take decisive action to protect our children's futures in the digital age by regulating social media platforms."
"This important new report documents in harrowing detail how social media platforms are designed to maximize engagement by any means necessary, even promoting content glorifying suicide and self-harm to vulnerable young people," said Josh Golin, Executive Director of Fairplay. "Policymakers must understand: Every day they allow the status quo to continue, they are complicit in allowing preventable and serious harms to young people to occur."
Contacts:
[email protected]
[email protected]
[email protected]
SOURCE Bright Data Ltd.