In December, when Facebook launched Messenger Kids, an app for preteens and children as young as 6, the company stressed that it had worked closely with outside experts to safeguard younger users. What Facebook didn’t say is that many of those experts had received funding from Facebook.

Equally notable are the experts Facebook did not consult. Although Facebook says it spent 18 months developing the app, Common Sense Media and Campaign for a Commercial Free Childhood, two large nonprofits in the field, say they weren’t informed about it until weeks or days before the app’s debut. “They reached out to me personally the Friday before it launched, when clearly it was a fait accompli,” says Josh Golin, executive director of Campaign for a Commercial Free Childhood. Facebook, he says, is “trying to represent that they have so much more support for this than they actually do.” Academics Sherry Turkle and Jean Twenge, well-known researchers whose work on children and technology is often cited, didn’t know about the app until after it launched.

The omissions quickly came back to bite Facebook. Eight weeks after the Messenger Kids debut, Golin helped organize a group of nearly 100 child-health advocates who asked Facebook CEO Mark Zuckerberg to kill the app because it could undermine healthy child development. That same week, Common Sense Media announced that it would help fund a lobbying campaign around the dangers of addictive technology, including a curriculum distributed to 55,000 public schools that would highlight concerns such as a possible link between heavy social media use and depression.

Antigone Davis, Facebook’s global head of safety, says Facebook solicited, and listened to, input from a variety of people before launching the app. “We took much of what we heard and incorporated it into the app,” she says. “For example, we heard from parents and privacy advocates that they did not want ads in the app, and we made the decision not to have ads.”

Fixing Problems

Facebook’s approach to outside voices about Messenger Kids is echoed in its efforts to “fix” other controversial issues, such as fake news and election interference. As pressure mounts, Facebook touts its commitment to solving a difficult problem, often citing partnerships with third-party experts as a sign of its seriousness. Behind the scenes, however, the company sometimes obscures financial ties with experts, dismisses high-profile critics, or co-opts outside efforts to address the same concerns.

Last week, for example, frustrated fact-checkers pressured Facebook into a meeting at the company’s headquarters, claiming they had been shut out of vital data necessary to assess whether their efforts to combat fake news were working. Days after social-media analyst Jonathan Albright discovered that Russian propaganda may have been viewed millions of times around the presidential election, Facebook called Albright, but then scrubbed the data from the internet. Cindy Southworth, one of the experts often cited in support of Facebook’s controversial project to combat revenge porn, works for a nonprofit that has received funding from Facebook. After former Google design ethicist Tristan Harris popularized the phrase “time well spent” to warn against the dangers of addictive technology, Zuckerberg adopted the phrase as well. Several times in recent months, he has promised to make sure that “time spent on Facebook is time well spent.” But Harris doesn’t think Facebook is sincere. “It’s sad to see Facebook co-opt the term without taking its meaning seriously, beyond asking what are ‘meaningful interactions,’” he tweeted Monday.


The debate over kids and smartphones is far from resolved, including disagreement over the study tying social media use to depression in teens, which was conducted by Twenge. One side argues that kids are already on social media and need guidance to learn how to use it safely. The other side says tech giants have crossed the line by targeting young children and are charging ahead without understanding the effects. The only thing everyone agrees on? The need for more research and better parental controls. In this polarized climate, Facebook initially deflected criticism by presenting Messenger Kids as the result of careful consultation with a range of outside experts, even as it subtly stacked the deck.

Facebook has toyed with targeting kids under 13 since at least 2011, when Zuckerberg vowed to someday “fight” the Children’s Online Privacy Protection Act, which requires companies to obtain parental permission before gathering information on anyone under 13. Until December, though, it had not overtly targeted younger children.

Facebook’s blog post announcing Messenger Kids emphasized that the app was “co-developed” with parents and experts, through discussions with the National PTA, Blue Star Families, and an advisory board of more than a dozen experts from groups such as the Yale Center for Emotional Intelligence, Connect Safely, and Sesame Workshop. In an accompanying press release, Facebook cited remarks from roundtable discussions held by the National PTA and a mother who gave feedback to New Mexico State University’s Learning Games Lab.

Financial Ties

One Facebook post said the company had “collaborated” with the National PTA, but it did not mention Facebook’s financial ties to the group, or to others among its advisers. The National PTA says Facebook donated money for the first time in 2017, which the organization used to fund a survey and roundtables. Facebook says it previously donated “small amounts” unrelated to the app to Blue Star Families, a nonprofit for military families. Facebook funded the research at New Mexico State. At least seven members of Facebook’s 13-person advisory board have some kind of financial tie to the company. In 2017, Facebook donated money to the Family Online Safety Institute, which has two representatives on the board, as well as Connect Safely, the Yale Center for Emotional Intelligence, and Telefono Azzurro, which each have one representative on the board. In 2017, Facebook also donated at least $50,000 to MediaSmarts, which has two members on the board. One member of the advisory board, former Sesame Workshop executive Lewis Bernstein, now works as a consultant advising Facebook on developing content for teens, unrelated to Messenger Kids. Bernstein and other board members have gone on to write op-eds in The Hill and the San Jose Mercury News supporting Facebook’s app. WIRED previously reported that Facebook had donated to FOSI, Connect Safely, and MediaSmarts.

“There was no attempt to not be upfront about it,” says Davis, the Facebook executive. Many of the groups on the Messenger Kids advisory board are also on Facebook’s safety advisory board, which was created in 2009. Davis says that Facebook’s financial support for safety advisory board members has previously been reported. “We’ve had that dialogue publicly many, many times,” she says. The board is featured at the top of Facebook’s Safety Center, without disclosing that some members receive funding. On a linked page, the company says “Facebook consults” with these organizations. In a statement, Davis says, “We do not want there to be a financial burden to working with Facebook.” She says the company sometimes offers “funding to cover programmatic or logistics expenses” of partner organizations.

Funding from Facebook may not have affected the feedback or research around Messenger Kids. The Facebook advisers who spoke to WIRED offered thoughtful views, based on personal experience or supported by research. Board member Michael Rich, who founded the Center on Media and Child Health at Boston Children’s Hospital, also partnered with Apple shareholders on a widely circulated letter asking the company to research the effects of smartphones on children and build better tools for parents. Kristelle Lavallee, a content strategist at Rich’s center, who is also on Facebook’s kids advisory board, compared the desire to shut down Messenger Kids to abstinence-only education. “Nobody is saying they have the answers, because nobody does,” she says, but as researchers and educators, “It’s actually our job to understand these tools.” National PTA president Jim Accomando says, “It is important that families are armed with resources and tools to help them take advantage of the opportunities that the digital world offers while building good digital habits and ensuring children have the skills they need to be responsible online.” Barbara Chamberlin, who runs the New Mexico lab, says she agreed to work with Facebook only after the company promised that her lab’s research would be fully integrated into the development process.

Participants in the discussions say some of the outside views helped shape Messenger Kids, but that Facebook appeared to have already decided some issues. Bernstein, the former Sesame Workshop executive, says that at the one session he attended in Palo Alto, advisers brought up the age range. “We said 6 is really young, 7 is young for this, 8 is even young,” he says. Facebook responded by saying the children of deployed service members would find it useful. “We said OK, but build in those safeguards,” Bernstein says.

Parents have to set up their child’s account on Messenger Kids, verifying their identity by logging into Facebook. Children can’t be found in search, so parents control initiating and responding to friend requests. The app does not include advertising, and the company says it will not use the data for ad purposes, subjects that Facebook’s Davis says came up often. But the terms of service allow the company to collect information like the content of children’s messages, photos they send, and what features they use, and then share that data among companies owned by Facebook, as well as with third-party vendors handling tasks including customer support and analysis.

Facebook’s financial support of academics and advocacy groups is not uncommon. Google’s academic influence campaign has been well documented, and Google has also donated to both the Family Online Safety Institute and Connect Safely. The Family Online Safety Institute’s board includes executives from Facebook, Google, Comcast, Amazon, Twitter, Microsoft, AT&T, Netflix, and others. Common Sense Media works with Apple, AT&T, Comcast, DirecTV, Netflix, Microsoft, and others as distribution partners for its content. Comcast and DirecTV donated a combined $50 million in media and airtime for the anti-tech-addiction lobbying campaign.

Golin, of the Campaign for a Commercial Free Childhood, says Facebook, for all its flaws, was more responsive than Google. Golin says his organization offered to meet with YouTube over its concerns about YouTube Kids but didn’t hear back. At least, he says, “I’ve met with Facebook.” Still, he says Facebook’s refusal so far to consider shutting down the app is telling. “If the parameters are just ‘How can we make this app a little safer and a little less harmful?’ then the conversation is already so limited,” he says.
