YouTube Ads May Have Led to Online Tracking of Children, Research Says

This year, BMO, a Canadian bank, was looking for Canadian adults to apply for a credit card. So the bank’s advertising agency ran a YouTube campaign using an ad-targeting system from Google that employs artificial intelligence to pinpoint ideal customers.

But Google, which owns YouTube, also showed the ad to a viewer in the United States on a Barbie-themed children’s video on the “Kids Diana Show,” a YouTube channel for preschoolers whose videos have been watched more than 94 billion times.

When that viewer clicked on the ad, it led to BMO’s website, which tagged the user’s browser with tracking software from Google, Meta, Microsoft and other companies, according to new research from Adalytics, which analyzes ad campaigns for brands.

As a result, leading tech companies could have tracked children across the internet, raising concerns about whether they were undercutting a federal privacy law, the report said. The Children’s Online Privacy Protection Act, or COPPA, requires children’s online services to obtain parental consent before collecting personal data from users under age 13 for purposes like ad targeting.

The report’s findings raise new concerns about YouTube’s advertising on children’s content. In 2019, YouTube and Google agreed to pay a record $170 million fine to settle accusations from the Federal Trade Commission and the State of New York that the company had illegally collected personal information from children watching kids’ channels. Regulators said the company had profited from using children’s data to target them with ads.

YouTube then said it would limit the collection of viewers’ data and stop serving personalized ads on children’s videos.

Adalytics identified ads from more than 300 brands for adult products, like cars, running on nearly 100 YouTube videos designated as “made for kids”; the ads were shown to a user who was not signed in and linked to advertisers’ websites. It also found several YouTube ads with violent content, including explosions, sniper rifles and car accidents, on children’s channels.

An analysis by The New York Times this month found that when viewers who were not signed into YouTube clicked the ads on some of the children’s channels on the site, they were taken to brand websites that placed trackers (bits of code used for purposes like security, ad tracking or user profiling) from Amazon, Meta’s Facebook, Google, Microsoft and others on their browsers.
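
How such trackers work is simpler than it may sound. The sketch below is a minimal, hypothetical illustration of the server side of a “tracking pixel”: a tiny endpoint that assigns a browser an ID cookie the first time it is seen and recognizes that ID on every later page embedding the same pixel. The endpoint, the cookie name and the use of the Flask framework are illustrative assumptions, not code from any company named in this article.

```python
# Minimal, hypothetical sketch of a third-party "tracking pixel" endpoint.
# Any page that embeds <img src="https://tracker.example/pixel"> causes the
# browser to call this endpoint, sending back any cookie set on earlier visits.
import uuid

from flask import Flask, request, make_response

app = Flask(__name__)

@app.route("/pixel")
def pixel():
    # Reuse the browser's existing ID if it has one; otherwise mint a new one.
    visitor_id = request.cookies.get("uid") or uuid.uuid4().hex

    # Log which page referred the browser here; across many embedding sites
    # this builds a cross-site browsing profile keyed to visitor_id.
    app.logger.info("visitor=%s page=%s", visitor_id, request.referrer)

    # Real pixels return a 1x1 transparent GIF; an empty 204 suffices here.
    resp = make_response("", 204)
    resp.set_cookie("uid", visitor_id, max_age=60 * 60 * 24 * 365)  # persistent, 1 year
    return resp

if __name__ == "__main__":
    app.run(port=8000)
```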

As with children’s television, it is legal, and commonplace, to run ads, including for adult consumer products like cars or credit cards, on children’s videos. There is no evidence that Google and YouTube violated their 2019 agreement with the F.T.C.

The Times shared some of Adalytics’ research with Google ahead of its publication. Michael Aciman, a Google spokesman, called the report’s findings “deeply flawed and misleading.” Google has also challenged a previous Adalytics report on the company’s ad practices, first reported on by The Wall Street Journal.

Google told The Times it was useful to run ads for adults on children’s videos because parents who were watching could become customers. It also noted that running violent ads on children’s videos violated company policy and that YouTube had “changed the classification” of the violent ads cited by Adalytics to prevent them from running on kids’ content “moving forward.”

Google said that it did not run personalized ads on children’s videos and that its ad practices fully complied with COPPA. When ads appear on children’s videos, the company said, they are based on webpage content, not targeted to user profiles. Google said that it did not notify advertisers or tracking services whether a viewer coming from YouTube had watched a children’s video — only that the user had watched YouTube and clicked on the ad.

The company added that it did not have the ability to control data collection on a brand’s website after a YouTube viewer clicked on an ad. Such data-gathering, Google said, could happen when clicking on an ad on any website.

Even so, ad industry veterans said they had found it difficult to prevent their clients’ YouTube ads from appearing on children’s videos, according to recent Times interviews with 10 senior employees at ad agencies and related companies. And they argued that YouTube’s ad placement had put prominent consumer brands at risk of compromising children’s privacy.

“I’m incredibly concerned about it,” said Arielle Garcia, the chief privacy officer of UM Worldwide, the ad agency that ran the BMO campaign.

Ms. Garcia said she was speaking generally and could not comment specifically on the BMO campaign. “It should not be this difficult to make sure that children’s data isn’t inappropriately collected and used,” she said.

Google said it gave brands a one-click option to exclude their ads from appearing on YouTube videos made for children.

The BMO campaign had targeted the ads using Performance Max, a specialized Google A.I. tool that does not tell companies the specific videos on which their ads ran. Google said that the ads had not initially excluded children’s videos, and that the company recently helped the campaign update its settings.

In August, an ad for a different BMO credit card popped up on a video on the Moolt Kids Toons Happy Bear channel, which has more than 600 million views on its cartoon videos. Google said the second ad campaign did not appear to have excluded children’s videos.

Jeff Roman, a spokesman for BMO, said, “BMO does not seek to nor does it knowingly target minors with its online advertising and takes steps to prevent its ads from being served to minors.”

Several industry veterans reported problems with more conventional Google ad services. They described receiving reports of their ads running on children’s videos and compiling long lists of videos to exclude, only to see their ads later run on other kids’ videos.

“It’s a constant game of Whac-a-Mole,” said Lou Paskalis, the former head of global media for Bank of America, who now runs a marketing consulting firm.

Adalytics also said that Google had set persistent cookies — the types of files that could track the ads a user clicks on and the websites they visit — on YouTube children’s videos.

The Times observed persistent Google cookies on children’s videos, including an advertising cookie called IDE. When a viewer clicked on an ad, the same cookie also appeared on the ad page they landed on.

Google said it used such cookies on children’s videos only for business purposes permitted under COPPA, such as fraud detection or measuring how many times a viewer sees an ad. Google said the cookie contents “were encrypted and not readable by third parties.”

“Under COPPA, the presence of cookies is permissible for internal operations including fraud detection,” said Paul Lekas, head of global public policy at the SIIA, a software industry group whose members include Google and BMO, “so long as cookies and other persistent identifiers are not used to contact an individual, amass a profile or engage in behavioral advertising.”
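
What makes a cookie “persistent,” rather than scoped to a single browsing session, is simply an Expires or Max-Age attribute that keeps it on disk after the browser closes. The sketch below shows how that distinction can be read off a raw Set-Cookie header using Python’s standard library; only the cookie name IDE comes from the reporting above, and the other attribute values are made up.

```python
# A cookie survives browser restarts ("persistent") only if the server sets
# an Expires or Max-Age attribute; otherwise it is discarded with the session.
from http.cookies import SimpleCookie

# Illustrative Set-Cookie header; the domain and attribute values are made up.
raw = "IDE=abc123; Domain=.example-adserver.test; Path=/; Max-Age=31536000; Secure; HttpOnly"

jar = SimpleCookie()
jar.load(raw)

for name, morsel in jar.items():
    persistent = bool(morsel["max-age"] or morsel["expires"])
    print(f"{name}: domain={morsel['domain']} persistent={persistent}")
```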

The Times found an ad for Kohl’s clothing that ran on “Wheels on the Bus,” a nursery rhyme video that has been viewed 2.4 billion times. A viewer who clicked on the ad was taken to a Kohl’s web page containing more than 300 tracking requests from about 80 third-party services. These included a cross-site tracking code from Meta that could enable it to follow viewers of children’s videos across the web.
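
Figures like the roughly 300 tracking requests from about 80 third-party services can be approximated independently by loading a landing page in an automated browser and grouping its outbound requests by host. Below is a rough sketch using the Playwright browser-automation library; the URL is a placeholder, the host comparison is deliberately simplified, and a careful audit would also map each host to the company that operates it.

```python
# Rough sketch: load an ad landing page in a headless browser and tally
# requests that go to hosts other than the page's own. Requires `playwright`
# (pip install playwright && playwright install chromium).
from collections import Counter
from urllib.parse import urlparse

from playwright.sync_api import sync_playwright

LANDING_PAGE = "https://www.example-advertiser.test/landing"  # placeholder URL

def audit(url: str) -> Counter:
    first_party = urlparse(url).hostname
    hosts = Counter()

    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        # Record the host of every network request the page triggers.
        page.on("request", lambda req: hosts.update([urlparse(req.url).hostname]))
        page.goto(url, wait_until="networkidle")
        browser.close()

    # Keep only hosts that are not the landing page's own (subdomains of the
    # first party are not merged here, which a fuller audit would handle).
    return Counter({h: n for h, n in hosts.items() if h and h != first_party})

if __name__ == "__main__":
    for host, count in audit(LANDING_PAGE).most_common(20):
        print(f"{count:4d}  {host}")
```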

Kohl’s did not respond to several requests for comment.

A Microsoft spokesman said: “Our commitment to privacy shapes the way we build all our products and services. We are getting more information so that we can conduct any further investigation needed.” Amazon said it prohibited advertisers from collecting children’s data with its tools. Meta declined to comment.

Children’s privacy experts said they were concerned that the setup of Google’s interlocking ecosystem — including the most popular internet browser, video platform and largest digital ad business — had facilitated the online tracking of children by tech giants, advertisers and data brokers.

“They have created a conveyor belt that is scooping up the data of children,” said Jeff Chester, the executive director of the Center for Digital Democracy, a nonprofit focused on digital privacy.
