Traditional media once offered a clear separation between ads and entertainment, with obvious commercial breaks. But now, the lines are more blurred than ever before — especially for kids. Whether it’s game publishers like Epic Games saving parents’ credit card information for kids to make easy one-click “Fortnite” purchases, creators like MrBeast creating a “Minecraft” world full of cows on behalf of Big Milk, or companies like Walmart jumping into “Roblox” with an immersive “Walmartland” universe, new research from the Federal Trade Commission (FTC) reveals it’s easy for kids to get confused about what is being sold to them.
While these examples contain “sponsored” disclosures, some advertising campaigns are marked less clearly than others. “Blurred” and “stealth” advertising are the most common names for these types of advertising and upsell campaigns, which blend into content without any clear indication that the viewer is being sold a product or service.
The practice is in large part driven by the popularity of unboxing, “Let’s Play,” or similar content production styles that center around a creator’s engagement — sponsored or not — with a toy, game, or other item that the viewer otherwise would have to buy to experience. In these videos, it often becomes impossible to distinguish between advertising grifts, content upsells, and entertaining content.
But brands and content creators should beware: The FTC is cracking down on stealth advertising. The FTC’s newly published guidance document contains staff perspectives and recommendations as a follow-up to its 2022 workshop, “Protecting Kids from Stealth Advertising in Digital Media.” The paper warns platforms, influencers, and marketers: “Read carefully.”
Five Staff Recommendations
The FTC Act gives the agency the authority to pursue regulatory action against brands, influencers, and platforms. Historically, that action has included monetary fines and penalties; prohibitions on certain activities (such as making a company’s content or app available to minors); bans preventing corporate executives from working in a specific industry for a period of time; and routine FTC audits or regular reports on how the company has been complying with regulations.
The FTC’s staff guidance on how to avoid regulatory action highlights five recommendations. These recommendations apply to brands and advertising agencies, creators and influencers, and platforms, with the specific caution that implementing just one or two of them alone is likely insufficient to fully address blurred advertising. The FTC also states that not every recommendation applies in every instance to all of the stakeholders (creators, brands, and platforms).
The first recommendation is for creators and brands not to blur advertising, but instead to create a clear separation between advertising content and entertainment or educational content through verbal and visual cues and formatting techniques. FTC staff explain that, in practice, this could include short bumpers, different backdrops, different on-screen talent, or different editing styles (such as no music).
The second recommendation expands on the use of disclosures to avoid stealth advertising and applies to all three stakeholders: platforms, brands, and creators. YouTube, for example, currently offers creators a toggle when uploading a video to indicate whether the video is sponsored. However, the resulting disclosure displays only during the first 10 seconds of the video, which would not satisfy the FTC’s recommendation that a timely disclosure happen “at the point in time the product is introduced, and at reasonable intervals throughout the content, if the product is discussed or referenced for long periods of time or keeps re-appearing.”
The FTC also explains that gaming and immersive experiences should include a watermark-type disclosure that re-appears at regular intervals if a part of the world is heavily sponsored.
The third recommendation is straightforward: platforms, creators, and advertisers should create and use easy-to-understand icons that disclose advertisements in a manner easily seen by a viewer.
The fourth recommendation asks that creators, advertisers, and platforms work together to offer education to kids, parents, and teachers, noting that “media proficiency and digital citizenship, including ad literacy specifically, are widely recognized by government agencies and others as important to children’s development.” This educational material is already rolling out across several states.
The fifth recommendation asks that platforms consider two things: require that creators self-identify advertising content, and offer parental controls that allow parents to limit or block such content. As noted, YouTube currently offers means of flagging such content but does not give the option to limit content that contains ads.
A Blurry Road Ahead
Zooming out and looking at the big picture, things are still a bit blurry. The recommendations place a large amount of onus on the platforms to develop product solutions within their platforms that would allow brands and creators an easy, uniform method of complying with the disclosure and blocking recommendations.
However, it’s unclear how much responsibility will ultimately fall on creators to self-regulate their content. The first recommendation advises creators and brands against using blurred advertising, while the fifth tells platforms to develop means for creators to self-identify sponsored content so that parents can toggle it off. In this lawyer’s opinion, the FTC should be more direct and explicitly require platforms to take action if content for kids is going to include blurred advertising, or potentially require a higher degree of human-led content moderation review.
The FTC staff recommendations also present an opportunity for creators, brands, and platforms — either together or individually within their representative industry groups — to adopt a set of standards or specific guidance for sponsored content. For example, the industry could adopt universal icons to indicate sponsored content.
Yet ultimately, the question remains: Can everyone simply agree that stealth advertising is off the table? For comparison, the Federal Communications Commission (FCC) imposes strict rules on broadcast, cable, and satellite TV providers, including explicit limits such as a cap of 12 commercial minutes per hour and a requirement that children’s programs “be separated from commercials by intervening and unrelated program material.”
The FTC’s recommendations potentially create a gap by shifting more of the work onto parents, who must both stay educated on all the various platforms and piece together if and when platforms implement these recommendations. A parent will have to search within every app for the toggle to disable sponsored content because, without standards, such a control would be presented differently in the YouTube app than in the Facebook app.
Just the Beginning of Regulation
Many argue that social media and video platforms are the next Big Tobacco problem. The U.S. Surgeon General has even expressed concerns about the impact of social media and screen time on youth. Now, the FTC is shifting the focus (and blame) away from parents and onto the platforms and content operators, as in its recent data-tracking crackdown on Epic Games.
The agency’s recent activity in this area also shows a desire to make examples of content creators and brands, moving beyond its historical focus on platforms. Its 2019 settlement with YouTube over data-tracking of children specifically named prominent creators and brands in the toy and gaming industries, but stopped short of pursuing any action against them at the time.
Non-government organizations, such as Truth in Advertising and the Children’s Advertising Review Unit (CARU) of BBB National Programs, regularly pursue investigations and audits of content creators, platforms, and advertisers. In the absence of strong legal and regulatory frameworks, CARU offers a range of guidelines and best practices to help content creators, brands and advertisers, and platforms navigate regulatory complexities, do the right thing, and protect children in online environments.
Researchers are also taking action. An August 2023 report by Adalytics challenged Google’s control over its own advertising tools, finding that ads from Fortune 500 companies geared toward adults had been appearing on “made for kids” content as of July 2023 (Google disputes the accuracy of the report).
At the end of the day, it’s important to remember that the FTC is going to look for and investigate instances where conduct is deceptive or unfair to children. It remains to be seen whether future legal options will be available for parents and children to pursue brands, creators, or platforms directly. For now, the FTC at least recognizes that “‘plac[ing] the burden entirely on parents to protect their children from these harms’ ignores the monumental changes to the digital ecosystem that have made a simple ‘No TV on school nights’-style approach impractical.”
“We now live in a world where kids spend many hours a day online, often in immersive environments,” Samuel Levine, Director of the FTC’s Bureau of Consumer Protection, said in the recently released report. “Creators, advertisers, and platforms must take responsibility for preventing and addressing the harms associated with blurred advertising. And this responsibility is about more than additional disclosures. Disclosures, while necessary, are not the silver bullet. As experts have made clear, there is no silver bullet.”
What are your thoughts on the FTC’s new Stealth Advertising guidelines? Email [email protected] to let us know.