Court says YouTube channels with children’s audiences like Ryan’s World, CookieSwirlC, Cartoon Network must defend against state-level children’s privacy class action lawsuit

By Franklin Graves

Photo credit: Diego Thomazini/Shutterstock, Grapho Mind/Shutterstock (Licensed) by Caterina Cox

An appeals court in California is allowing a class action lawsuit to continue against Google and YouTube, as well as several kids-and-family media companies and creators that operate YouTube channels—including Dreamworks Animation; Hasbro; Mattel; Cartoon Network; Ryan’s World, a kidfluencer channel formerly known as Ryan’s ToysReview; Pocket.watch, which has a partnership deal with Ryan’s World; ChuChu TV Studios; and CookieSwirlC, an independent VTuber with 19 million subscribers.

This December opinion from the appeals court potentially expands legal exposure and liability for creators with children in their audience, despite Google and YouTube arguably being responsible for user tracking and data gathering of audiences.

The class action was first filed in October 2019 by a class of children through their parents and guardians. The claims center on the use of targeted advertising, powered by persistent identifiers, that allowed Google and YouTube to collect data and track the online behavior of children without proper parental consent. In August 2021, the district court granted a motion to dismiss filed by YouTube and the channel owners, so the class of children appealed to the Ninth Circuit—aka, the federal court of appeals—to overturn the lower court’s decision and allow the class action to proceed.

This most recent court opinion, delivered by Judge M. Margaret McKeown, overturns portions of the lower court’s decision to dismiss the class action. In short, YouTube and the channel owners were unsuccessful in having the class action cut short. The Ninth Circuit’s opinion, which can now be cited in similar cases to support state privacy claims against creators, potentially expands creators’ legal exposure and liability to include violations of state privacy laws.

Creators may recall that the Federal Trade Commission (FTC) made it clear as part of its 2019 settlement with YouTube that channel owners could be held liable for violations of the Children’s Online Privacy Protection Act (COPPA) in the same way as any website or online service provider. In the 2019 complaint against YouTube, the FTC specifically mentioned several defendants in this class action, including CookieSwirlC—prior to the channel’s April 2022 unveiling of a virtual avatar and transition from unboxing videos to Let’s Play content—and Mattel for the operation of channels associated with its Barbie, Monster High, Hot Wheels, and Thomas & Friends brands.

“I think the message from the FTC is that they are ramping up enforcement of these cases and that states are also looking to make an example of companies that cannot abide by these regulations around children’s data,” Debbie Reynolds, a data privacy consultant and host of The Data Diva Talks Privacy podcast, told Passionfruit.

Depending upon each state’s privacy laws and related consumer protection regulations, claims could be brought against creators and their businesses by either state agencies or individual citizens whose privacy rights have been affected by a creator’s content being targeted toward children. It’s important to note that lawsuits alleging a creator violated COPPA cannot be brought by individuals, such as the parents of the children in this class action. Instead, COPPA enforcement is managed by the FTC.

The potential for creator liability is largely determined by whether or not a content distribution platform adequately complies with laws and regulations. The liability could also expand to brands that have entered into partnerships with creators for sponsored content if the content is directed toward children and distributed in a manner that doesn’t comply with state privacy laws or COPPA.

For example, if a creator uploads content to TikTok that is intended for children, and TikTok does not have adequate safeguards in place to protect children’s privacy, then the creator, in addition to TikTok, could be held liable. Through its official Business Blog, the FTC confirmed in November 2019 that “COPPA applies in the same way [to YouTube channel owners] it would if the channel owner had its own website or app.”

“The FTC does not have the resources to go after every family vlog, [Minecraft] channel, animator, and animal channel,” vlogger, author, and creator advocate Hank Green noted in a series of tweets following the FTC’s publication of its guidance for creators. “The law, as written, just doesn’t work. But creators are still required to comply.”

Creators can potentially reduce their exposure by focusing on what they are able to control, as opposed to what a platform may control, when it comes to both the content itself and its delivery to an underage audience. Whether this argument would hold up in court remains to be seen until the FTC is directly challenged on the issue, or it updates the regulations and releases additional guidance.

For now, the FTC looks at content through the lens of several factors when determining whether content is considered “directed to children.” These factors can be broken down into creator-controlled, platform-controlled, or a mix of both.

Creator-controlled factors include subject matter, visual content, the use of animated characters or child-oriented activities and incentives, music, audio, age of models, and the presence of child celebrities or celebrities who appear to be children. Both creators and platforms are deemed responsible for language on the site, whether advertising targeted toward children appears on the site, and whether there is “competent information” about the age of the audience.

The rise of VTubers and similar video styles that prominently utilize animated characters presents a challenge for proper implementation of the FTC’s guidance. For example, the FTC states that “just because your video has bright colors or animated characters doesn’t mean you’re automatically covered by COPPA.”

Until the FTC releases additional guidance or a change to the regulations, creators can rely on platforms that provide self-service tools to flag their content. YouTube is one example of a platform that offers a robust set of tools and guidance for creators making content for children. However, the tools alone do not resolve the gray area between content children may enjoy and content that directly targets children.

“Creators know that they will lose features, audience, and money if they mark their content ‘For Kids,’” Green explains in the Twitter thread. “But they also risk getting fined or sued if they don’t.”

Adult creators crafting content specifically targeted at children is nothing new, especially since platforms started sharing advertising revenue with creators. “Kidfluencers,” like Ryan Kaji of Ryan’s World, have been the subject of headlines for years, often focusing on the role parents and guardians play behind the scenes. In a 2021 interview with Time, Kaji’s mom LoAnn Guan said, “If I could do it over, I would try to incorporate more of the educational component right from the get-go.”

Prior to the FTC’s 2019 crackdown on YouTube, kids’ content delivered through the YouTube Kids app remained largely unmoderated compared with programming from traditional children’s producers and distributors, such as PBS and Paramount Global’s Nickelodeon, whose content is now also distributed in various forms on YouTube.

As of January 2023, the class action heads back to the lower court, where the class of children can pursue their claims and potentially amend their complaint to add more support for their arguments, while YouTube pursues any remaining arguments for dismissal. The companies that operated the channels may continue asking the court to dismiss them from the case—since it was arguably Google and YouTube that were responsible for the use of data trackers on children.

Passionfruit reached out via email to the companies and creators named in the class action lawsuit, as well as the attorneys representing the children and families, and did not hear back in time for the publication of this article.