The social media giant said it still wanted to build a child-focused Instagram product but would postpone the plans in the face of criticism.
Facebook said on Monday that it had paused development of an Instagram Kids service that would be tailored for children younger than 13, as the social network increasingly faces questions about the app’s effect on young people’s mental health.
The pullback preceded a congressional hearing this week about internal research conducted by Facebook, and reported in The Wall Street Journal, that showed the company knew of the harmful mental health effects that Instagram was having on teenage girls. The revelations have set off a public relations crisis for the Silicon Valley company and led to a fresh round of calls for new regulation.
Facebook said it still wanted to build an Instagram product intended for children that would have a more “age appropriate experience,” but was postponing the plans in the face of criticism.
“This will give us time to work with parents, experts, policymakers and regulators, to listen to their concerns, and to demonstrate the value and importance of this project for younger teens online today,” Adam Mosseri, the head of Instagram, wrote in a blog post.
The decision to halt the app’s development is a rare reversal for Facebook. In recent years, the social network has become perhaps the world’s most heavily scrutinized corporation, grappling with privacy accusations, hate speech, misinformation and allegations of anticompetitive business practices. Regulators, lawmakers, journalists and civil society groups around the world have criticized the company for its effects on society.
With Instagram Kids, Facebook had argued that young people were using the photo-sharing app anyway, despite age-requirement rules, so it would be better to develop a version more suitable for them. Facebook said the “kids” app was intended for ages 10 to 12 and would require parental permission to join, forgo ads and carry more age-appropriate content and features. Parents would be able to control what accounts their child followed. YouTube, which Google owns, has released a children’s version of its app.
But since BuzzFeed broke the news this year that Facebook was working on the app, the company has faced scrutiny. Policymakers, regulators, child safety groups and consumer rights groups have argued that such an app would hook children at a younger age rather than protect them from problems with the service, including predatory grooming, bullying and body shaming.
Mr. Mosseri said on Monday that the “project leaked way before we knew what it would be” and that the company had “few answers” for the public at the time.
Opposition to Facebook’s plans gained momentum this month when The Journal published articles based on leaked internal documents that showed Facebook knew about many of the harms it was causing. Facebook’s internal research showed that Instagram, in particular, had caused teen girls to feel worse about their bodies and led to increased rates of anxiety and depression, even while company executives publicly tried to minimize the app’s downsides.
On Thursday, Facebook’s global head of safety, Antigone Davis, is scheduled to testify at a Senate Commerce subcommittee hearing titled “Protecting Kids Online: Facebook, Instagram and Mental Health Harms.”
Simply pausing Instagram Kids was insufficient, said lawmakers, including Senator Richard Blumenthal, Democrat of Connecticut and the chairman of the subcommittee holding Thursday’s hearing. In a statement, he and others said Facebook had “completely forfeited the benefit of the doubt when it comes to protecting young people online, and it must completely abandon this project.”
The lawmakers added that stronger regulation was needed. “Time and time again, Facebook has demonstrated the failures of self-regulation, and we know that Congress must step in,” they said.
A children’s version of Instagram would not fix more systemic problems, said Al Mik, a spokesman for 5Rights Foundation, a London group focused on digital rights issues for children. The group published a report in July showing that children as young as 13 were targeted within 24 hours of creating an account with harmful content, including material related to eating disorders, extreme diets, sexualized imagery, body shaming, self-harm and suicide.
“Big Tobacco understood that the younger you got to someone, the easier you could get them addicted to become a lifelong user,” Doug Peterson, Nebraska’s attorney general, said in an interview. “I see some comparisons to social media platforms.”
In May, attorneys general from 44 states and jurisdictions signed a letter to Facebook’s chief executive, Mark Zuckerberg, asking him to end plans for building an Instagram app for children.
American policymakers should pass tougher laws to restrict how tech platforms target children, said Josh Golin, executive director of Fairplay, a Boston-based group that was part of an international coalition of children’s and consumer groups opposed to the new app. Last year, Britain adopted an Age Appropriate Design Code, which requires added privacy protections for digital services used by people under the age of 18.
Mr. Golin also called on Facebook to conduct a major public education campaign to tell parents to get their children under the age of 13 off Instagram.
The Instagram revelations have also set off discontent inside Facebook. On Thursday, during a companywide meeting led by Mr. Zuckerberg, employees demanded to see the Instagram research for themselves and asked what executives planned to do about the findings, according to one attendee, who was not authorized to speak publicly.
“Teen suicide rate has increased 20 percent in the last 4 years,” read one of the top-voted employee questions to Mr. Zuckerberg. “It’s proven that Instagram is toxic for teen girls. What is Facebook doing to address this?”
During the meeting, Mr. Zuckerberg passed the question to Mr. Mosseri, who said the research actually showed that Instagram mostly improved body image issues for teenagers, according to the attendee. Those points were publicly reiterated in a company blog post on Sunday.
It’s unclear what will happen to the team that led development of the Instagram youth product. Facebook last year hired Pavni Diwanji, who previously oversaw the development of YouTube Kids, to build a similar experience for Instagram.
Liza Crenshaw, an Instagram spokeswoman, said the team focused on youth products would continue working on efforts related to teenagers, including “building meaningful tools to help parents and guardians support their teens.” She added that pausing the Instagram youth app was “a collective decision” by the company’s leadership and pointed to comments from Nick Clegg, Facebook’s vice president for global affairs, who said on Monday that the company planned to release the research “in the next few days.”
On Monday, Mr. Mosseri defended the company. He said Facebook’s internal research was used to help guide product decisions, including new features that allow people to pause their account or block certain words that could be used for bullying or harassment.
He pledged to introduce new parental control features for the existing Instagram app in the coming months while the company continues to consider a version for children. “Critics of Instagram Kids will see this as an acknowledgment that the project is a bad idea,” Mr. Mosseri said. “That’s not the case.”