In December, when Facebook launched Messenger Kids, an app for preteens and children as young as 6, the company stressed that it had worked closely with outside experts in order to safeguard younger users. What Facebook didn’t say is that many of those experts had received funding from Facebook.
Equally notable are the experts Facebook did not consult. Although Facebook says it spent 18 months developing the app, Common Sense Media and Campaign for a Commercial Free Childhood, two large nonprofits in the field, say they weren’t informed about it until weeks or days before the app’s debut. “They had reached out to me personally the Friday before it launched, when clearly it was a fait accompli,” says Josh Golin, executive director of Campaign for a Commercial Free Childhood. Facebook, he says, is “trying to represent that they have so much more support for this than they actually do.” Academics Sherry Turkle and Jean Twenge, well-known researchers whose work on children and technology is often cited, didn’t know about the app until after it launched.
The omissions quickly came back to bite Facebook. Eight weeks after the Messenger Kids debut, Golin helped organize a group of nearly 100 child-health advocates who asked Facebook CEO Mark Zuckerberg to kill the app because it could undermine healthy child development. That same week, Common Sense Media announced that it would help fund a lobbying effort around the dangers of addictive technology, including a curriculum distributed to 55,000 public schools that would highlight concerns, such as a possible link between heavy social media use and depression.
Antigone Davis, Facebook’s global head of safety, says Facebook solicited, and listened to, input from a variety of people before launching the app. “We took much of what we heard and incorporated it into the app,” she says. “For example, we heard from parents and privacy advocates that they did not want ads in the app, and we made the decision not to have ads.”
Fixing Problems
Facebook’s approach to outside voices about Messenger Kids is echoed in its efforts to “fix” other controversial issues, such as fake news and election interference. As pressure mounts, Facebook touts its commitment to solving a difficult problem, often citing partnerships with third-party experts as a sign of its seriousness. Behind the scenes, however, the company sometimes obscures its financial ties to experts, dismisses high-profile critics, or co-opts outside efforts to address the same concerns.
Last week, for example, frustrated fact-checkers pressured Facebook into a meeting at the company’s headquarters, claiming they had been shut out of the data necessary to assess whether their efforts to combat fake news were working. Days after social-media analyst Jonathan Albright discovered that Russian propaganda may have been viewed millions of times around the presidential election, Facebook called Albright, but then scrubbed the data from the internet. Cindy Southworth, one of the experts often cited in support of Facebook’s controversial project to combat revenge porn, works for a nonprofit that has received funding from Facebook. After former Google design ethicist Tristan Harris popularized the phrase “time well spent” to warn against the dangers of addictive technology, Zuckerberg adopted the phrase as well. Several times in recent months, he has promised to make sure that “time spent on Facebook is time well spent.” But Harris doesn’t think Facebook is sincere. “It's too bad to see Facebook co-opt the term without taking its meaning seriously, beyond asking what exactly are ‘meaningful interactions,’” he tweeted Monday.