A bipartisan coalition of US attorneys-general is investigating how Facebook targets young people on its Instagram photo-sharing app and the harms it may cause.
At least nine states are examining whether Facebook violated state consumer protection laws and “put the public at risk” by promoting Instagram to children and young adults, “despite knowing that such use is associated with physical and mental health harms”, according to Maura Healey, Massachusetts attorney-general.
The investigation is focused in particular on the techniques used by Facebook — which recently renamed its parent company Meta — to increase how often and how long young people use the app.
“Today I am co-leading a nationwide coalition to get to the bottom of this company’s engagement with young users, identify any unlawful practices, and end these abuses for good,” Healey said in a statement. “Meta can no longer ignore the threat that social media can pose to children for the benefit of their bottom line.”
Other attorneys-general involved include those from California, Florida, Kentucky, Nebraska, New Jersey, New York, Tennessee and Vermont.
The investigation comes after a Wall Street Journal report, based on documents leaked by Facebook whistleblower Frances Haugen, which showed that the company’s internal research had found Instagram had a mixed effect on the health of young people — something that the company had refused to acknowledge publicly.
While some users suffering loneliness reported a positive experience using Instagram, for example, it was also found to deepen teenage girls’ preoccupation with body image.
The documents, viewed in redacted form by the Financial Times, also reveal that Facebook’s pace of growth among users in key markets such as the US is slowing. In particular, younger users are spending less time on Instagram and producing less content, lured away by rivals such as short-form video app TikTok. Facebook has previously disputed the Journal’s presentation of its research.
Facebook said on Thursday: “These accusations are false and demonstrate a deep misunderstanding of the facts. While challenges in protecting young people online impact the entire industry, we’ve led the industry in combating bullying and supporting people struggling with suicidal thoughts, self-injury, and eating disorders.”
The company added that it was building “new features to help people who might be dealing with negative social comparisons or body image issues”, and new parental supervision controls.
Demands to improve safeguards for children on social media apps have gathered steam this year. At a US House of Representatives hearing, Mark Zuckerberg, Facebook’s chief executive; Jack Dorsey of Twitter; and Sundar Pichai, chief executive of Google, faced multiple accusations that their platforms are deliberately designed to get young users hooked early, track children online and expose them to toxic content and predators.
Facebook suspended plans in September to launch Instagram Kids, a version of the app for under-13s, following a backlash over similar concerns from the public, members of Congress and a bipartisan coalition of 44 attorneys-general.
However, the company has refused to scrap the plans altogether, arguing that the pause would allow it more time to incorporate feedback from policymakers, parents and child-safety campaigners.