Age limits

As teens left Facebook, company planned to lure 6-year-olds, documents show

Internal posts and job listings show plans were moving forward.

Facebook's Messenger for Kids app is available to children as young as six.
Hakan Nural / Anadolu Agency

Facebook has a demographic problem. Even before investigations revealed that the company’s products were destroying teens’ mental health, interest in its flagship product was dropping off a cliff. Since 2019, teen usage of the app has declined by 13 percent, and over the next two years, it’s expected to drop another 45 percent.

“Aging up is a real issue,” a researcher wrote in an internal memo revealed last week. Perhaps that’s why Facebook was considering new products targeted at children as young as six years old, according to a new document handed over to Congress by whistleblower Frances Haugen.

“Our company is making a major investment in youth and has spun up a cross-company virtual team to make safer, more private, experiences for youth that improve their and their household’s well-being,” the internal post from April 9 said. “For many of our products, we historically haven’t designed for under 13 (with the exception of Messenger Kids) and the experiences built for those over 13 didn’t recognize distinctive maturity levels across the age spectrum.”

Legal limits

There’s a reason that Facebook has mostly avoided targeting kids under 13: the Children’s Online Privacy Protection Act (COPPA) limits how companies can target children and how they can collect and share children’s data. For example, companies cannot share children’s data with third parties without parental consent.

A brief aside: Facebook recently announced that it was rebranding its corporate entity as Meta. The Facebook name will still apply to its core app and platform. The rebranding comes as the company is increasingly under fire from governments and regulators for its role in fomenting ethnic cleansing, hate speech, insurrections, mental health crises, and more. Since the documents referred to in this article were created when Facebook, the company, went by its old name, we will use that name throughout.

Facebook does currently have one product aimed at children under 13—Messenger for Kids—and its terms of use say the company doesn’t sell users’ information to third parties. But, as Common Sense Media points out, the terms do not rule out showing kids targeted advertising. Facebook believes its app complies with COPPA.

Two years ago, though, a bug in the Messenger for Kids app allowed users to create group chats with unauthorized participants. Facebook took nearly a year to discover the bug; once it did, it patched the flaw the next day but didn’t notify parents for another month.

In the wake of the disclosure, Senators Ed Markey (D-Mass.) and Richard Blumenthal (D-Conn.) pressed Facebook on whether the company was violating COPPA. Kevin Martin, Facebook’s vice president of public policy, replied that the company takes children’s privacy seriously and believes the app complies with COPPA.

Yet the senators weren’t entirely convinced by Martin’s letter. “Facebook’s response gives little reassurance to parents that Messenger Kids is a safe place for children today,” they wrote back. “We are particularly disappointed that Facebook did not commit to undertaking a comprehensive review of Messenger Kids to identify additional bugs or privacy issues.”

Facebook was apparently undeterred and pressed forward with its initiative to get younger users onto its platforms. The company did not reply to a request for comment before publication.

Bullying and harassment

Beyond alleged COPPA violations, children can be harmed through social media apps in other ways. Facebook’s own research shows that 7 percent of teens on Instagram reported being bullied, and 40 percent of that bullying occurred through private messages. In other words, restricting kids' social media contacts to their “friends” could still expose them to significant harm.

At the time of the internal post, Facebook had seven job listings across a range of products, including several for Instagram Youth, a “paused” product that Facebook claims was aimed at kids ages 10–12. Another listing was for an undefined position that would encompass both Messenger for Kids and a proposed “Youth Platform.”

The internal post said that the new team was working to formulate “strategies on experiences for the spectrum of age groups” ranging from six to 16-plus.

It appears that Facebook wasn’t seeking to limit its focus on children to any one app. In addition to developing new strategies to attract kids, the company said that it was working to “redefin[e] existing products to take into account cognitive and social development needs that different stages of maturity have.” In other words, Facebook was interested in bringing children into every app in its portfolio.
