YouTube executives bet that an algorithm could safely curate videos for children. They learned, several years after launch, that they were wrong, according to former employees.
From the early days of YouTube, videos directed at children were among the most popular content. Channels featuring animated nursery rhymes, children engaged in pretend play and adults opening new toys have garnered billions of views since the platform started in 2005.
As the site grew in popularity and advertising revenue tied to the videos poured in, an increasing number of parents complained that their children stumbled upon unseemly content through its algorithm-driven recommendations. Holding on to young viewers involved a difficult reckoning at Alphabet Inc.-owned YouTube, the first tech giant to build a children’s version of its main product. Ultimately, it decided that humans, not machine-trained algorithms, could best select content for children.
That decision to prioritize quality at the cost of user and revenue growth is significant as tech companies wrestle with attracting and keeping young users while addressing criticism that their products aren't doing enough to promote the safety of younger consumers.
Instagram, owned by Meta Platforms Inc., paused the development of a children’s version of its namesake product after lawmakers and parents raised concerns. ByteDance Ltd.’s TikTok has started to test filling users’ feeds with age-based content in response to criticism that adolescents are often exposed to mature and potentially harmful videos.
YouTube Kids was introduced in 2015. Former employees say it was an effort to show parents and regulators that the company took children’s safety seriously while grabbing its next generation of users.
Employees at YouTube Kids were divided on the best way to build a product specifically for children, the former employees said. They debated how big a role, if any, humans should play in picking which videos were allowed in the children’s version. The founding team backed an open platform and opposed employees selecting content, these people said, while others argued that a product made for children couldn’t be left entirely to an algorithm.
The feeling among many product managers and engineers was that if a platform was 99% safe, that was enough, according to one former executive. The point was to find a way to avoid human curation, the person said.
The employees pushing for little editorial interference won out.
Within months, users were spending an average of 90 minutes a day watching videos, smashing internal goals, according to two former executives. Like its parent site, YouTube Kids was self-governed: It relied on users to flag inappropriate or troubling content and was subject to minimal review by human moderators.
Controversy and negative publicity followed. Watchdog groups and parents identified disturbing videos in which popular cartoon characters from Mickey Mouse and Paw Patrol to Peppa Pig were put in obscene or violent situations. Researchers chronicled the prevalence of videos they considered to have little educational merit, such as “unboxing” videos that showed children and adults opening new toys.
“I would’ve flunked it when it launched,” said Josh Golin, executive director of Fairplay, a group that advocates reducing companies’ interactions with children.
By late 2018, YouTube executives acceded to more editorial intervention and began to pour more resources into YouTube Kids. In early 2019, the Federal Trade Commission began investigating YouTube’s collection of data from minors.
The trust-and-safety team expanded from a handful of people to hundreds, former employees said. Human reviewers were hired to vet channels. The number of human curators, who pick featured videos on users’ home screens, increased fivefold to about 25 people, they said.
Alicia Blum-Ross, a researcher on children and technology, was hired by YouTube to minimize risk to children and ramp up quality content while maintaining the wide range of user-generated content that makes YouTube unique.
“There is an inherent tension there,” said Ms. Blum-Ross.
She formed an advisory board of about a dozen academic researchers specializing in disciplines such as child development and media literacy. The board informed product decisions, such as how to recommend new content to children based on their viewing patterns.
At the end of 2019, YouTube took its most drastic step yet: It removed millions of what it deemed low-quality videos from YouTube Kids, a decision that ultimately cut the library by about 80%, according to a former executive. Over the next few months, YouTube cleared out videos of cartoon parodies and many unboxing videos that violated its policy on commercial content. The children’s platform is now mostly made up of videos that are created by preapproved content partners, according to a former executive and people who work closely with the company.
YouTube’s increased scrutiny of its children’s site coincided with a $170 million fine from the FTC for violating children’s privacy. Following the settlement, YouTube also began to repeatedly remind users to switch over to the children’s version whenever they watched a video targeting children on the main platform.
Many of the changes put in place haven’t helped growth. YouTube Kids continues to attract a fraction of the eyeballs of the main platform. In January, YouTube said the children’s version has 35 million weekly viewers, up from 11 million in November 2017. The main YouTube platform has 2 billion monthly users.
Many children also prefer the main platform. About 67% of children used the main YouTube platform in 2020, compared with 4% who said they used YouTube Kids, according to a report from Qustodio, a maker of digital-safety apps.
A continuing question for YouTube—and other tech companies thinking about creating a youth version of their main products—is tweens. The 9-to-12-year-olds, the group of kids advertisers most want to reach as they start to acquire more pocket money, have largely stuck with the main site because they think YouTube Kids is for preschoolers, according to internal research and company analysis.
The company has added more content like sports and gaming to the children’s site to appeal to preteens, but with little change in usage. It has also added safety measures on the main site, called “supervised experiences,” to limit preteens’ exposure to mature content, but a former executive said adoption has been low.
Many experts say there is still more work to be done.
YouTube has done a good job of turning the children’s platform into a safer place, said Jill Murphy, editor in chief at Common Sense Media, a nonprofit group that offers guidance on age-appropriate entertainment and technology.
But she and other experts caution that safe doesn’t mean high-quality or educational content. YouTube Kids remains rich with videos that often feel like commercials and have minimal educational value, she said.
“That has a slow-burn effect, and that’s more of what kids are consuming on a regular basis,” Ms. Murphy said. “You have to wonder what’s the long-term impact of that.”
Whitney Hanna, a Houston-based mother of two, returned to YouTube Kids during the pandemic, desperate to find ways to occupy her two young sons while she and her husband worked from home.
Ms. Hanna, an education consultant, had sworn off the site in 2017, after her then-4-year-old started to see troubling videos, including one that featured a bloody scene between Batman and the Joker. She said she noticed a difference when her children returned to the site two years ago.
“They’ve worked on the safety issue, and I feel like they’ve added more things that are of educational value,” said Ms. Hanna. Her now-9-year-old likes to watch educational tutorials on YouTube Kids, including videos about phonics and numeracy.