
Research Article

Convergence: The International Journal of Research into New Media Technologies
2022, Vol. 28(3) 822–837
© The Author(s) 2022
Article reuse guidelines: sagepub.com/journals-permissions
DOI: 10.1177/13548565211066490
journals.sagepub.com/home/con

“YouTube’s predator problem”: Platform moderation as governance-washing, and user resistance

Emily Tarvin and Mel Stanfill
University of Central Florida, Orlando, FL, USA
Abstract
YouTube experienced large-scale criticism in early 2019 for predatory behavior toward children on
the platform. To address concerns about children’s safety, YouTube acted quickly by demonetizing
and deactivating comments on videos featuring minors. In this paper, we analyze both the company’s
response to this scandal and how users received that response. We argue that YouTube’s reaction
was governance-washing, which presents the appearance of vigorous platform moderation and
leverages popular perceptions of technology to create the look of authority while deflecting
questions about substance. While YouTubers and users did not dispute that the pedophilic
comments were heinous, they questioned the effectiveness of the company’s solutions, arguing that
YouTube’s platform governance actions did not solve the problem. Ultimately, we show that users
have cogent critiques of governance policies that pretend to be comprehensive but fail to solve what
they purport to address, and offer up the term “governance-washing” as a useful framework to
make sense of such cases.
Keywords
YouTube, platform governance, child safety, online communities

Corresponding author:
Emily Tarvin, Texts and Technology, University of Central Florida, 4000 Central Florida Blvd., P.O. Box 161889, Orlando, FL 32816-2368, USA. Email: entarvin@Knights.ucf.edu
In February 2019, YouTube user Matt Watson (username MattsWhatItIs) uploaded a video
revealing a YouTube subculture that comments on videos featuring “young girls doing gymnastics,
playing Twister or stretching” (Wakabayashi and Maheshwari, 2019). As one journalist noted, “The
videos aren’t pornographic in nature, but the comment sections are full of people time stamping
specific scenes that sexualize the child or children in the video” (Alexander, 2019). Other
commenters asked whether the children wore underwear or left “a string of sexually suggestive emojis”
(Wakabayashi and Maheshwari, 2019). Further, the exposé video demonstrated how, because of
YouTube’s recommendation algorithm, “It only took two clicks for Watson to venture away from a
video of a woman showcasing bikinis she’s purchased to a video of a young girl playing”
(Alexander, 2019). In addition to terrible comments, The New York Times reported, many of these
videos “carry advertisements for major brands” (Wakabayashi and Maheshwari, 2019). As a result,
“Fortnite publisher Epic Games said it had paused all pre-roll advertising on YouTube” and “a
Peloton spokesperson said it was working with its media buying agency to investigate why its
adverts were being displayed against such videos” (Orphanides, 2019).
The YouTube company took steps to solve the problem; the comments were likely particularly
concerning because family channels have been the source of YouTube’s greatest growth of viewing
time (Burroughs, 2017). However, the purported solutions caused frustration and confusion among
YouTubers.1 Initially, reports that YouTube would demonetize videos simply because they were
subject to predatory or hateful comment sections sparked outrage. This subsided quickly as
YouTube stopped demonetizing videos and turned off comments sections instead (Robertson,
2019). However, many users did not feel that this was a good solution either, since it deprived them of the
community-building power of comments because of the actions of predatory third parties.
The early 2019 controversy that is our case study was not the first one YouTube faced. The
company regularly struggles to balance the desires of the platform community with advertisers and
legal requirements such as copyright. A particularly notable escalation of these tensions was
“Adpocalypse,” which began in April 2017. Cunningham and Craig (2019: 112) describe
Adpocalypse as “a rolling series of crises,” explaining that “in 2017, investigating journalists revealed
that multinational and national brand advertising was appearing programmatically alongside
YouTube videos featuring terrorist organizations, antisemitic clips discussing a ‘Jewish World
Order,’ and Swedish neo-Nazi groups.” After this news broke, more than 250 large advertisers
pulled their YouTube ads, and the company took steps to “crack down immediately on this flagrant
failure of programmatic advertising to maintain baseline community standards” (Cunningham and
Craig, 2019: 112). This resulted in tighter restrictions about what content was monetizable, and
many content creators had videos demonetized for not meeting these criteria. Kumar (2019: 4)
explains that the most substantial change after Adpocalypse was that advertisers could remove their
ads from broad categories of content; the resulting demonetization of many videos by LGBT
creators received significant attention and backlash. Adpocalypse was a watershed moment for
YouTube as creators realized that the corporation would prioritize advertisers over them, which
“appeared to contradict its longstanding support of certain marginalized and alternative creators and
communities” (Cunningham and Craig, 2019: 113). Adpocalypse gave rise to a narrative that
YouTube was “anti-creator,” which we discuss further below.
In addition to this previous large-scale advertiser revolt, there had also been a previous
controversy about children’s videos. In late 2017 came Elsagate, which revealed that seemingly
innocuous videos using characters from children’s media were full of disturbing content. As one
journalist reported, what was “hiding in plain sight on some of YouTube’s most popular children’s
channels: creators were drawing children in with familiar characters — most notably Elsa from
Frozen, but also Spider-Man and the Joker — then arranging them in bizarre situations involving
cheating spouses or public urination”; other videos used “innocent thumbnails” but led to videos
of children engaged in sex acts (Brandom, 2017). In the aftermath, “major advertisers have
responded to the growing controversy by pulling ads from the streaming platform. Mars Inc. and
Adidas are among the major brands to suspend advertising on YouTube while it cleans up the site”
(Whigham, 2017). It was against the background of Adpocalypse and Elsagate that the company
responded to the renewed controversy over harm to children and advertiser backlash that we discuss
here.
In the case of the disturbing comments on videos of children, concerns about child safety
required the YouTube company to quickly develop a detection and enforcement apparatus, but while
YouTubers did not dispute that these comments were abhorrent, they did contest the response. In this
paper, we analyze both the company’s response to this scandal and how that response was received
by YouTubers and other users. First, in the tradition of work like Hokka’s (2021: 147), we examined
“YouTube’s presentation of themselves” and the “different kinds of publicly available user guidance
and promotional materials from YouTube.” We conducted a systematic collection of both the
company’s official posts on their Creator Blog and Help sites as well as responses their
spokespeople gave to journalists, moving through search results until we had reached saturation; this
happened quickly because the statements in both types of document were so similar. We focused on
how they explained what they did both because it indicates what they wanted the public to focus on
and because, like YouTubers impacted by the situation, we simply don’t have visibility into the
underlying platform technology itself. In this, much as Gillespie (2018: 72) argues that platform
policies are “texts into which we must peer closely, to see what values they represent and upon what
theories of governance and rights they are premised,” we turn to public statements as texts and
uncover how they represent governance.
Second, similar to the research of Berryman and Kavka (2018), we conducted case studies of
high-profile response videos made by YouTubers. Since there were so many reactions, we engaged
with YouTubers who directly interacted with the YouTube company about its governance policies:
Phillip DeFranco, whose summary of the updated policies was retweeted by YouTube’s official
account; Colleen Ballinger, whose tweets about the incident received a response; and Special Books
by Special Kids, who met with YouTube representatives. We analyzed only the videos directly
discussing this controversy. We conducted a discourse analysis on these two corpuses, attending to
patterns of language use, how the problem was framed, and the underlying relations of power that
shaped this social phenomenon. We treated the company’s response separately from the user
response as two articulated moments in the overall incident.
We argue that the YouTube company’s response, far from being as robust as it claimed to be, was
an instance of governance-washing, or producing the appearance of governance without substantive
action. The ways YouTubers pushed back on these actions, staking out a demand for an effective
response rather than one that was expedient but generated both false positives and false negatives,
further reinforces that the platform’s response was the outward trappings of governance without
substance. Ultimately, we show that users have cogent critiques of governance policies that pretend
to be comprehensive but do not solve what they purport to address, and offer up governance-
washing as a useful framework to make sense of such cases.
“Hundreds of millions of comments”: Creating the appearance of
governance
YouTube’s formal response to the predatory comments, we argue, was aimed at showing itself more
than doing the work of resolving the issue. That is, it was style over substance in platform
governance, or what we’re calling governance-washing. Governance-washing builds from previous
concepts like “greenwashing” and “pinkwashing.” Greenwashing, the earliest of these terms, dates
from the mid-1990s and describes “the dissemination of false or incomplete information by an
organization to present an environmentally responsible public image” (Furlow, 2010). There is also
what Jasbir Puar calls “‘pinkwashing,’ or Israel’s promotion of a LGBTQ-friendly image to reframe
the occupation of Palestine in terms of civilizational narratives measured by (sexual) modernity”
(Puar, 2013: 337); others have looked at other national contexts and broadened the term out “to
the tendency to ‘coopt’ queer politics or to tout a nation’s ‘gay-friendliness’ as a marker of
modernity, civilization and desirable progress” (Dreher, 2016: 119). Pinkwashing has also been
used in the context of breast cancer, describing “companies marketing pink ribbon products while at
the same time producing and/or selling products that are associated with breast cancer” (Mart and
Giesbrecht, 2015: 1542).
What these practices have in common is that they are “disinformation from organizations seeking
to repair public reputations and further shape public image” (Laufer, 2003: 253). Two of the key
characteristics Laufer (2003: 256) identifies seem especially relevant to our analysis: -washes are
“projects that have negligible value but appear on [the] surface to be significant” and “promote [the]
image of a committed corporate culture.” As we will show, this is ultimately what the YouTube
company did in the realm of platform governance–and its users knew it. Governance-washing is
related to ethics-washing, or “tech companies’ self-interested adoption of appearances of ethical
behavior” (Bietti, 2020), in that both are -washes by corporations that seek to avoid governmental
regulation and advertiser revolt. Where the two terms differ is their target: ethics-washing is about
the technologies companies build, whereas governance-washing is about actions in relation to users.
It therefore fills a gap in our terminology for ways companies project an appearance of morality.
The first key characteristic of social media governance-washing is that it presents the appearance
of large-scale and vigorous platform governance. Thus, the YouTube company’s statements about
the incident, both in its own spaces like Help pages and the Creator Blog and responding to news
organizations, emphasized the strength of their response. First, they stressed speed. The company
insisted that “We’re taking swift action to ensure we’re identifying as much of this content as
possible” (YouTube Help, 2019). Similarly, they claimed, “When we find content that is in violation
of our policies, we immediately stop serving ads or remove it altogether” (Orphanides, 2019).
Through terms like “swift” and “immediate,” the speed of governance actions is a synecdoche for
governance itself. The platform also insisted that their enforcement had been robust, telling a
journalist, “We enforce these policies aggressively, reporting it to the relevant authorities, removing
it from our platform and terminating accounts” (Alexander, 2019).
Discussions of speed and intensity of governance actions were joined by discussions of their
quantity; there was a tendency to use large numbers to prove the platform’s dedication to solving the
problem. They “terminated hundreds of viewer accounts for the comments they left on videos”
(YouTube Help, 2019). They “disabled comments from tens of millions of videos that could be
subject to predatory behavior” (YouTube Creator Blog, 2019). They “have been removing hundreds
of millions of comments for violating our policies” (YouTube Creator Blog, 2019). Overall, the
emphasis is on the scale of the response, as in “we’re going above and beyond our existing
protections in the near term on content that may include or endanger minors” (YouTube Help, 2019).
If a platform responds quickly, intensely, and expansively to a problem, this could simply be
governance. It becomes governance-washing to the extent that these statements engage in sleight of
hand, as speed, intensity, and quantity take the place of effectiveness and make it more difficult to
analyze. These statements do not explain what they are doing or whether it works, only that there is a
lot of it.
A second characteristic of governance-washing is leveraging popular perceptions of technology
to deflect questions about the substance of governance. This is to say that statements such as the
YouTube company’s invoke what Nye (1996: xiii), among others, calls the technological sublime,
“an essentially religious feeling, aroused by the confrontation with impressive objects”; in par-
ticular, the sublime involves both awe and fear, which tends to be leveraged by contemporary
technology companies as something like an argument that “technology can do many wonderful
things but it is also beyond your understanding, so just trust it.” As in the case Gillespie (2020)
describes, in which companies use the buzzword “AI” to describe nothing more sophisticated than
pattern matching and identifying subsequent copies of the same content, leveraging the tendency to
be impressed by technology can suppress questions about what is actually happening. Technological
awe tactics were demonstrated by the YouTube company in this incident, as one key aspect of
insisting it had the situation under control was about its technologies. Thus, a spokesperson insisted
that they were “fighting the issue by developing new tools and strategies” (Alexander, 2019). What
tools and strategies these were went unspecified. Particularly, this meant spending money on
technologies: “We continue to invest heavily in technology, teams and partnerships with charities to
tackle this issue” (Alexander, 2019). However, there is little explanation of what such technologies
actually do.
In fact, even when there was more specific discussion of the technology, it still relied on the
technological sublime. The YouTube company announced that “we had been working on an even
more effective classifier, that will identify and remove predatory comments. […] We accelerated its
launch and now have a new comments classifier in place that is more sweeping in scope, and will
detect and remove 2X more individual comments” (YouTube Creator Blog, 2019). This may or may
not have been intentionally opaque, but nevertheless uses the term “classifier,” which as a technical
term referring to an algorithmic or machine learning process does not appear in standard dictionaries
like Merriam-Webster, without explaining what it means. While from context a reader could discern
that this is a technology that classifies, the fact that it determines which one of a fixed number of
categories a particular comment fits into, based on whatever criteria the tool was given, is spe-
cialized rather than general knowledge. The statement also invokes the rhetoric of speed discussed
earlier. What it doesn’t do is specify what will be classified (usernames associated with that behavior
in other comments, word choice in comments, use of time stamps to call attention to particular parts
of the video, something else?), using what criteria (with 51% certainty? 90%?), nor indeed what
number it is they claim to have doubled. While, to some extent, the more they explain their tools, the
easier it becomes to circumvent them, there is also a reliance on both buzzwords and the low
technical literacy of the average person to discourage interrogating these solutions. In such ways,
governance-washing leverages the mystique around technology to achieve the appearance of action
without need for substance.
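To make concrete what a “classifier” of this kind does, consider a minimal sketch in Python. It is illustrative only: YouTube has disclosed none of its actual signals, categories, or thresholds, so the features below (word choice, time stamps) and the cutoff are hypothetical stand-ins for exactly the unknowns enumerated above.

import re

# Purely hypothetical signals; nothing here reflects YouTube's actual system.
SUSPECT_TERMS = {"gorgeous", "beautiful", "cute"}   # a word-choice signal
TIMESTAMP = re.compile(r"\b\d{1,2}:\d{2}\b")        # time stamps like "4:37"

def score(comment: str) -> float:
    """Return a score in [0, 1]; higher means the comment looks more predatory."""
    s = 0.0
    if SUSPECT_TERMS & set(comment.lower().split()):
        s += 0.5
    if TIMESTAMP.search(comment):
        s += 0.5
    return s

def classify(comment: str, threshold: float = 0.5) -> str:
    # The threshold is the unstated criterion the statement leaves open
    # (51% certainty? 90%?); lowering it removes more comments at the cost
    # of more false positives.
    return "remove" if score(comment) >= threshold else "keep"

print(classify("so beautiful at 4:37"))      # remove
print(classify("Great tutorial, thanks!"))   # keep

On this toy model, a claim to “detect and remove 2X more individual comments” could be satisfied simply by lowering the threshold, without the underlying criteria becoming any more apt.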
Our critique here is not that the YouTube company doesn’t get it right every time; that would be
impossible. As Gillespie (2018: 9) notes, “moderation is hard because it is resource intensive and
relentless.” On one hand, algorithmic solutions lack precision, producing both false positives and
false negatives. On the other hand, human review is impossible at the scale of contemporary social
media platforms. However, the company not only does not acknowledge that it is dealing with a very
challenging problem, but acts as if it has it under control when it does not. A key component of
governance-washing is thus the creation of a false sense of control where the primary interest is
managing public perception, not effective governance.
“Channels will be required to moderate”: Offloading governance onto
users
Another way the YouTube company’s governance was governance-washing is that they offloaded
responsibility for governance onto users. That is, at the same time as emphasizing their actions,
especially speed, volume, and technology, the company also asked those who use the platform to
contribute to platform governance–not in the sense of involvement in setting policy, but as the eyes
and ears and reporting clicks of enforcement. In other words, in a playbook common across social media,
users are asked to do some of the work of content moderation. After announcing that videos that
include minors will have reduced or no ads, the YouTube Help page goes on to say, “Additionally, if
you see any type of content (videos, comments, etc.) that you think exploits minors, please flag it for
review and select ‘child abuse’ in the reporting tool” (YouTube Help, 2019). This appears to be at
odds with the company’s public statements that it had everything under technological control. This
is, as Gillespie (2018: 65) notes, “convincing users to produce quality content and proper
participation—work that benefits the platform. Many of these platforms depend to a great degree on
their users producing the bulk of the content, labeling it, organizing it, and promoting it to others,” as
well as, in this case, reporting it. In this way, some of the labor of keeping YouTube from becoming a
cesspit of awful content is outsourced to users—tasks which “would cost much more capital if they
were performed by regularly employed wage labour” (Fuchs, 2010: 143), or indeed, given the
massive volume of content involved, could be so prohibitively expensive the platform might never
turn a profit.
Similarly, individual creators are asked to take governance actions on their channels. The Creator
Blog announces that “A small number of creators will be able to keep comments enabled on these
types of videos [that feature children]. These channels will be required to actively moderate their
comments, beyond just using our moderation tools, and demonstrate a low risk of predatory
behavior” (YouTube Creator Blog, 2019). This is in some ways extortionate: creators must take on
additional labor—specifically, labor not supported by the platform’s own tools—or else lose access
to the platform’s “social core” of “contributing content, referring to, building on and critiquing each
other’s videos, as well as collaborating (and arguing) with one another” (Burgess and Green, 2009:
24). YouTube’s management is aware that this interaction is the lifeblood of a YouTube fan base; in a
post to the YouTube Creator Blog, the YouTube company’s CEO Wojcicki (2019a) wrote, “We
know how vital comments are to creators. I hear from creators every day how meaningful comments
are for engaging with fans, getting feedback, and helping guide future videos.” They even brag
about their platform’s interactive features; in an earlier post, Wojcicki (2019c) explained, “it’s the
engagement between creators and viewers that truly sets YouTube apart from traditional media like
TV.”
The third indication of offloading governance is that the YouTube company refuses respon-
sibility by telling users it’s their own fault for allowing children onto YouTube in the first place.
When asked for a comment on the pedophile problem, the spokesperson seemed to try to change the
subject: “YouTube said it has invested significantly in YouTube Kids and heavily markets the
product to parents, to encourage them to direct children to YouTube Kids rather than the main
service” (Martineau, 2019), essentially arguing that parents should instead use what the company
describes as “a filtered version of YouTube, built just for kids to explore their interests in a
contained, age-appropriate experience” (YouTube Kids, n.d.).2 In such ways, at the same time that
the company emphasizes how large and intense its response to the crisis is, it also demands that
others take on some of the responsibility. These rhetorical moves to deflect not only blame but also the
labor of platform governance itself, are part of what makes this incident a clear case of governance-
washing.
Overall, while in some ways this is a standard showy response to a scandal, we argue that its
particular features indicate a larger formation we are calling governance-washing. This was sleight
of hand intended to repair the platform’s reputation, taking steps that were only minimally useful
while making a show of their efforts. It speaks to a desire to be seen as acting responsibly, but also
to an attempt to overcome criticism through displays of extravagant rigor. However, the YouTuber responses we will
discuss in the next section show it was clear to many that this response was not what it claimed to be.
“Under the guise of protecting children”: Failing to address the
intended problem
While there was no visible dispute that these comments on videos of children were heinous,
YouTubers and users questioned the costs and effectiveness of the company’s response. In par-
ticular, the accounts we studied demonstrated a sense that YouTube had not solved what it purported
to solve. First, there were false positives identifying innocuous content as harmful. One YouTuber
making this critique was Colleen Ballinger. Ballinger has several YouTube channels with large
followings, including parody channel Miranda Sings, which has almost eleven million subscribers;
she also uploads non-parody videos to a personal channel with more than 8.5 million subscribers
and a vlog “Colleen Vlogs” with three million subscribers. During mid-2018, Ballinger announced
that she was pregnant and began discussing pregnancy and motherhood on her personal and vlog
channels; after the birth of her son, she regularly featured him in videos–making her vulnerable to
these policies. After YouTube’s restrictions on videos featuring minors, Ballinger uploaded a vlog
critiquing the new rules. As many who critiqued this policy did, she underscored her commitment to
protecting children, saying, “I want to keep children safe. Like, that is my top priority… I have a
baby”; she also made a point of insisting her complaint was not for personal gain: “me complaining
about videos getting demonetized or the comments taken off is not me being like, ‘man, that’s not
fair for me’” to legitimate her critique as disinterested (Colleen Vlogs, 2019). She was particularly
critical of demonetization, explaining that the family channel run by her brother Chris Ballinger and
his wife Jessica Ballinger had several videos demonetized despite being “very ad friendly, very, very
wholesome” (Colleen Vlogs, 2019). By emphasizing that her sister-in-law’s content was
inoffensive, Ballinger implied YouTube was wrongfully punishing innocent creators.
Another channel reporting false positives was Special Books by Special Kids (SBSK), run by
spouses Chris Ulmer and Alyssa Porter, which “seeks to normalize the diversity of the human
condition under the pillars of honesty, respect, mindfulness, positivity and collaboration. This multi-
media movement supports the acceptance and celebration of all members of the neurodiverse/
disability community regardless of diagnosis, age, race, religion, income, sexual orientation, gender
or gender expression” (Special Books by Special Kids, n.d.). In their first video addressing the new
policies, Ulmer and Porter explained that all of their comments were deactivated by YouTube, even
on videos without minors, because their channel had been labeled “high risk for predatory
comments” (Special Books by Special Kids, 2019b). They expressed their confusion over this
categorization because they had not seen troubling comments on their videos. Ulmer says, “And if
you know our channel, you know that all of the comments are positive … and [YouTube] will not
communicate with us what puts us as ‘high risk for predatory comments’ as they’ve stated it.”
Moreover, like Ballinger’s defense of her sister-in-law, they described their videos as wholesome
and uplifting. Thus, the YouTubers we examined contested the company’s claim that its response
was effective.
Additionally, in contrast to the technological sublime promoted by the company, the YouTubers
in our study are critical of technology as a solution to this problem. SBSK contends YouTube’s
algorithms misidentify some comments as predatory. For example, Ulmer explains that, though
supportive comments about one interviewee’s appearance were about an adult, they were shut
down; “Since our channel name has ‘kids’ in it, [YouTube] saw these comments as pedophilic,
because it saw ‘gorgeous’ and ‘beautiful’ and the word ‘kids’” (Special Books by Special Kids,
2019a). He argues that such automated decision making causes videos without children to have
comments deactivated. In such ways, both SBSK and Ballinger argued that their videos were not in
the wrong and that YouTube’s supposed solutions punished innocent creators rather than predators.
Ultimately, this is a contention that these acts of platform governance did not solve what they
purported to solve.
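A brief hypothetical sketch, in the same spirit as the earlier one, shows how the failure mode Ulmer describes can arise: channel metadata (the word “kids” in a channel name) combined with innocuous praise words trips a flag, even though no feature captures who the comment is actually about. All names and rules here are invented for illustration.

PRAISE = {"gorgeous", "beautiful"}

def flagged(channel_name: str, comment: str) -> bool:
    minor_signal = "kids" in channel_name.lower()           # metadata, not video content
    praise_signal = bool(PRAISE & set(comment.lower().split()))
    return minor_signal and praise_signal

# False positive: praise aimed at an adult interviewee, on a channel whose
# name happens to contain "kids".
print(flagged("Special Books by Special Kids", "she looks gorgeous"))  # True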
Second, in addition to false positives, the YouTubers we studied argued that there was uneven
enforcement of the new policies. Ulmer calls the decision to deactivate their comments
“discriminatory,” explaining, “The reason this is discriminatory is because they’re doing it under the
guise, under the mask of protecting children from predators. But they’re only selecting certain
channels featuring minors.” Porter further argued that mainstream businesses and advertisers do not
have their content policed as intensely, “even if their content is more subjective to the type of
predation that they’re trying to combat” (Special Books by Special Kids, 2019b). For example,
Porter and Ulmer point out, videos uploaded by companies like Disney or Nickelodeon, which
heavily feature children, have comments enabled. However, content made by YouTube creators,
even those with millions of subscribers, has comments deactivated if the video features a child.
Such apparent differential treatment fueled the narrative that YouTube’s decision to leave large
companies’ comments enabled reflected favoring advertisers over independent content creators, and
so was seen as “anti-creator,” as reported by DeFranco (2019a), who runs a popular daily YouTube
show covering news and pop culture events.
If some channels have comments deactivated while others do not, even if their videos include
minors, this raises questions about the purpose of comments. On a video uploaded by a company,
comments enable discussing the content or product advertised, and so let companies promote
their business–though comments also let users talk with and back to companies. Corporations
rely on social media’s presentation that users have the potential to interact with them through
commenting. However, this kind of interaction is even more important for YouTube
microcelebrities, who, as multiple scholars have shown, operate on intimacy (Berryman and Kavka,
2017, 2018; Jerslev, 2016; Raun, 2018) and a sense that they can be accessed through social
media platforms (Berryman and Kavka, 2017, 2018; García-Rapp and Roca-Cuberes, 2017; Hou,
2019; Jerslev, 2016; Raun, 2018) such as via YouTube comments. Companies also benefit from
this apparent authenticity and intimacy, and these benefits, and incentives to keep advertisers
happy, may be why YouTube allows comments on videos featuring minors uploaded by large
companies.
Moreover, the YouTubers we studied argued the YouTube company was favoring larger channels
over smaller ones. Ballinger’s video discussed above was initially demonetized and had comments
deactivated. As of June 2021, the video description still says, “Youtube demonetized this video and
disabled all the comments. Interesting... I call them out, and they punish me and make it so that no
advertisers can see it, and make it so no one can comment and start a discussion. Wowwww....”
(Colleen Vlogs, 2019), though YouTube has since remonetized the video and reopened comments.
Comparing this to SBSK, DeFranco questions, “Why did the big YouTuber get their stuff reinstated?
But SBSK, who once again uploaded a video yesterday, it features this just adorable little girl talking
about being blind and having a growth deficiency, why is that disabled?” (DeFranco, 2019a).
However, not all popular YouTubers had such success. Popular content creators that either are
minors or heavily feature minors, like teenage vlogger, pop singer, and dancer JoJo Siwa and Ryan’s
World, formerly RyanToysReview,3 still have deactivated comments. Thus, the YouTubers in our
study argued that enforcement of the policy was both opaque and inconsistent.
Third, the YouTubers we analyzed contended that the policy did not solve the problem of
predatoryactions toward children. Ballinger argues YouTube’s solution is problematic and actually
aids pedophiles: “Because now, the pedophile doesn’t have to sit through an ad to [get to] watching a
video of a kid, and prey on a child, and watch a video, and be disgusting about a child, a victim. Like,
this child now is a victim of someone doing a disgusting thing, and now they can watch it with no
ads and they can’t comment, which means there’s no way to find them” (Colleen Vlogs, 2019). She
added that she views these restrictions as punishing creators, paralleling DeFranco’s comments
about many users seeing YouTube’s response as “anti-creator,” but also as aiding and abetting
predators. Ballinger compares this to other forms of harmful comments, giving a hypothetical
situation of an LGBT vlogger getting homophobic comments on a video and then having their
videos demonetized because advertisers didn’t want their ads next to homophobic comments, which
would compound the harm of the homophobia. In such ways, she argues, YouTube’s governance
actions do nothing to stop abuse at the same time as they cause creators to lose revenue. Overall, the
perception among the YouTubers whose videos we analyzed was that predatory actions against
children weren’t solved.
“I feel like YouTube took that from us”: Unintended consequences
Beyond not solving the problem, the YouTubers in our study argued, disabling comments to control
predatory behavior had inadvertent effects. Ballinger (2019) expressed concern about unintended
consequences of the policy in a tweet, saying, “So now YouTube can punish creators by disabling
the comment section & demonetizing videos if the comments aren’t ad-friendly? If this is true, every
YouTuber needs to start looking for a new job. There are hardly any videos on YouTube that lack
vile comments. How is this fair?” This is a contention that the policy is too broad and harms the
wrong people. Ballinger’s concern that YouTubers may need new jobs reflects that many creators
seek to earn revenue. Typically, an ad will play before a monetized video, or ads may appear near a
video while it plays. The uploader earns money from each ad viewed, but monetization also means
that YouTube promotes the video.
Since the YouTube company benefits from users watching ads, the platform recommends advertiser-
friendly content to attract as many views as possible. By contrast, if a video is
deemed nonmonetizable, YouTube suppresses it. Gillespie (2014: 172) explains, “YouTube
‘algorithmically demotes’ suggestive videos, so they do not appear on lists of the most watched, or on
the home page generated for new users.” Thus, when YouTube deactivated comments on videos
featuring minors, many viewed this as better than demonetization, as demonetization leads to lost
income and YouTube’s algorithms not suggesting the video. As DeFranco (2019a) says in a
demonstration of this perspective, “I’m generally of the mindset of ‘Well, if you still have ads but
you’re just not getting comments, what’s the big deal?’” While the consequences of removing
monetization are relatively obvious, removing comments was also seen as harmful to creators. For
example, comments also influence YouTube’s algorithms and signal the popularity of a video; a
video with many comments often indicates that many users are watching and discussing it (Postigo,
2016).
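A hypothetical sketch illustrates the point: if comment counts are one engagement signal feeding a recommendation score, the same video ranks lower once its comments are disabled, regardless of how viewers feel about it. The function name and weights below are invented; YouTube’s actual ranking features are not public.

# Invented weights for illustration only.
def recommendation_score(views: int, likes: int, comments: int) -> float:
    return 1.0 * views + 5.0 * likes + 10.0 * comments

before = recommendation_score(views=10_000, likes=800, comments=300)
after = recommendation_score(views=10_000, likes=800, comments=0)
print(before, after)  # 17000.0 14000.0: the same video scores lower with comments off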
Moreover, comments facilitate conversation between content creators and viewers, which
supports the development of YouTube culture through regular communication. According to
Strangelove (2010: 103), “There remains the simple fact that many Internet users see themselves as
part of a community; this is particularly true of YouTubers.” Users regularly refer to the collection of
regular platform users as the “YouTube community” or discuss smaller communities formed around
specific channels. Community building is thus an essential component of the platform, and
comments are one of the main ways users develop a sense of community. The absence of comments
sections therefore removes an important social affordance of the platform. YouTube prides itself on
enabling two-way conversation, but without comments, it is closer to traditional forms of media,
like television, with only one-way communication. While users still have a voice through uploading
videos, removing comments removes a major affordance that enables building social connections.
While this may not affect creators who are predominantly business-motivated or users mainly
seeking entertainment, deactivating comments can be devastating for marginalized groups or those
who struggle to find community in other ways. In such ways, the YouTube company’s governance-
washing actions undermine the platform’s selling points, doing specific kinds of harm in the name of
governance.
In particular, SBSK argued that comments are imperative to their mission because their
interviewees use comments to interact with, educate, and receive kindness from viewers. Ulmer said
that neurodiverse and disabled people “have a voice, but we give them the platform to use that voice.
And the second purpose is to let the world communicate with them. To show the people who we
meet that there is good in the world” (Special Books by Special Kids, 2019a). Ulmer explained that,
for the people featured in SBSK’s videos, it is important “that so many people understand them and
love them and want them to be included. And I feel like YouTube just took that from us.” He gave
the example of a then-upcoming video featuring a young girl battling cancer. Ulmer notes that the
girl struggles to make friends at school and eats lunch with the principal every day, saying he had
hoped the video would let viewers support her and help her not feel alone, but without comments
that was not possible (Special Books by Special Kids, 2019a). DeFranco (2019c), who as noted
above thought removal of comments might be a reasonable solution, explains,
I do look to channels like SBSK, and I see the impact there. You have these kids in a variety of situations
that- where they probably put themselves in the ‘other’ box. You know, they might feel disconnected,
they’re sharing this story, they’re in a very vulnerable place, and those comments may make them feel
more tethered to society. It may make them feel good about what they’re going through, and now they’re
not getting that, you know.
Thus, not only did affected channels perceive this to be an unintended consequence, but so too did
general commentators on YouTube such as DeFranco. Ulmer also says that removing comments
means people cannot look back on past conversations, which is particularly upsetting for those who
want to remember people: “There are videos of people who passed away and their parents read the
comments as a way of keeping their kid alive. And now they’re gone” (Special Books by Special
Kids, 2019a). There was a shared sense among the accounts we studied that, in preventing users
from commenting about children at all in an effort to prevent inappropriate comments, YouTube
removed many beneficial uses of comments.
A further unintended consequence is more discursive. SBSK’s videos that only show adults are
still deemed “high risk” and have comments removed. In that these are neurodiverse and/or disabled
adults, this reproduces and reinforces a long history of treating such adults like children (Kafer,
2013; Thomson, 1996). As Ulmer explains, since many YouTube users are aware of why some
videos have comments deactivated, this influences viewers’ perceptions of these marginalized
groups. He says, “And now you go to one of our videos, and you see, no, you can’t have a
conversation with that person. You’re not allowed to talk to them! ‘No, this person is at high risk for
predatory comments.’ People want to make fun of them. People want to tease them.” He insists that
harmful conversations have rarely happenedon their channel, but a deactivated comment section
creates the impression that this is routine (Special Books by Special Kids, 2019b). Ulmer also points
out that this label of “high risk” reinforces the notion that neurodiverse and disabled people are
vulnerable and need to be separated from others for their own protection rather than letting them
make those decisions for themselves. In such ways, the YouTubers we studied present a complex
critique of the platform’s actions to solve the predator problem, ultimately rejecting it as governance.
The YouTubers in our case studies do recognize the difficulty of the problem. Both Ballinger and
SBSK acknowledge that YouTube’s scale increases the difficulty of stopping pedophilic comments.
Although Ballinger heavily critiques the company, she explains, “in YouTube’s defense… This is a
hard thing to solve. There are billions of comments on this platform, and there are so many channels,
so many videos. Like there is no way to monitor it all.” She adds that she hopes the platform is
working on truly solving the issue, but she has not heard anything about it (Colleen Vlogs, 2019).
Similarly, Ulmer says, “We know that YouTube is huge. We know it’s so impossible to moderate all
of this” (Special Books by Special Kids, 2019a). DeFranco similarly explains, “YouTube’s scale,
how massive the site is, how many minutes of video are uploaded every single second, there is really
no perfect way for them to crack down without some people that should not be getting hit getting
hit” (DeFranco, 2019b). He understands people’s frustrations, but sees the issue as inevitable. The
scale of the platform, DeFranco (2019c) notes, “brings up the question of: well can YouTube
actually fully sanitize the site in all ways.” He adds, “There is not a doubt in my mind that YouTube
is actively trying to combat this problem. This narrative that YouTube doesn’t care is stupid, not
because I’m like well a company can’t have morals but because it hurts their bottom line”
(DeFranco, 2019c). In such ways, DeFranco presents YouTube as not immoral but potentially
amoral–preserving the “bottom line” of advertising revenue demands that YouTube take action. It is
here that the framework of governance-washing may be most useful: the platform needs to be seen
to do something, but that stands in an unclear relationship to actually doing something.
All three YouTubers offered possible alternative governance approaches to dealing with this
problem. SBSK called for more open communication between the platform and creators. Ulmer
explained that if SBSK did have predatory comments, he and Porter would want to discuss that with
the company. Porter adds, “Every creator that’s been affected deserves to have that type of
conversation, so that they can better protect their channel, they can better protect their communities
wherever their content lives” (Special Books by Special Kids, 2019a). Porter and Ulmer urge
improved communication with those uploading content labeled “high risk.” For his part, DeFranco
suggests more human moderation: “is there a situation where several channels can get together and
put money towards like a 24–7 comment moderating service? You know, like in addition to the
banned words, so that YouTube isn’t scared that some bad actor is going to leave a bunch of shitty
comments, screenshot it, and try and reignite this whole controversy.” He explains that he does not
know all of the answers, but if YouTube approved and verified a service like he described, he would
help financially support it to help channels like SBSK (DeFranco, 2019a). Finally, Ballinger is
adamant that prohibiting comments on videos featuring minors actually helps pedophiles by taking
away a method to identify them. Instead, she says, “the right thing to do would be to find the people
commenting these gross pedophile comments about children, figure out who posted those com-
ments, see if they can, on the back end, figure out where those accounts were created.” She does
clarify that she is only assuming the company can do this technologically, but says that if they can,
moderators could report pedophilic comments to the police. She advocates making such users
“suffer the consequences of being a pedophile and preying on young, innocent children,” as without
the comments, predators can more easily escape punishment (Colleen Vlogs, 2019). The YouTubers
we studied, then, do not merely have critiques of YouTube’s governance as governance-washing,
but put forward their own theories of governance—as involving open communication and
technology for mapping users back to the people committing the potentially criminal actions, while
relying on human moderation.
Conclusion
Ultimately, we argue that YouTube’s response to its predator problem is best understood as
governance-washing whose purpose is presenting the appearance of effective governance and
improving the public perception of the platform—effectiveness is irrelevant. This was apparent to
some users, who, as our analysis shows, resisted the narrative that these measures solved what they
purported to, while highlighting that they undermined the platform’s ability to create intimacy with
audiences through the interactivity of comments.
To illustrate the usefulness of our governance-washing model beyond our case study here, we
turn to a subsequent incident with many of the same features. In February 2020, Andy Parker, whose
daughter was murdered during a live news broadcast in 2015, filed a Federal Trade Commission
complaint against Google and YouTube over the persistence of videos of the murder. YouTube’s
public statement about this case invoked the same appeals to the power of technology as we saw
with comments on videos of children, saying “We rigorously enforce these policies using a
combination of machine learning technology and human review” (Kelly, 2020). This case also
showed the offloading of governance onto users that our analysis identified. As Parker said in a CNN
op-ed, “in early 2017, Google suggested that I view and flag the content I found offensive myself.
They wanted me to watch my daughter’s murder and explain why it should be removed” (Parker,
2020). Specifically, “YouTube requires users to flag content, record time stamps, and describe the
violence within the offending videos” (Kelly, 2020)—a terrible cost for Parker or others harmed by
the content they seek to have removed. Moreover, the complaint said, the videos are not always
successfully removed, and in fact some videos of Parker’s daughter’s death remained on the
platform for years despite being reported. As a point of contrast, YouTube’s Content ID system
identifies copyrighted materials on upload and stops them before they are ever posted, again
reinforcing the narrative that the company prioritizes large companies over non-corporate users.
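The contrast can be sketched in simplified form: Content ID-style enforcement fingerprints each upload and checks it against a reference database before the video is posted, rather than waiting for victims to flag it afterward. The exact-hash matching below is a deliberate simplification; the real system uses perceptual audio/video fingerprints that survive re-encoding, and the database entry here is invented.

import hashlib

# Hypothetical reference database mapping fingerprints to rights holders.
reference_db = {
    hashlib.sha256(b"<bytes of a registered reference video>").hexdigest(): "Studio X",
}

def check_upload(video_bytes: bytes) -> str:
    # Fingerprint the upload and look it up before publishing.
    fingerprint = hashlib.sha256(video_bytes).hexdigest()
    owner = reference_db.get(fingerprint)
    return f"blocked at upload (matches {owner})" if owner else "published"

print(check_upload(b"<bytes of a registered reference video>"))  # blocked at upload
print(check_upload(b"unrelated home video"))                     # published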
In such ways, the patterns in the case we focus on in this article can be seen to be broader than our
single case. Certainly, the recurrence of the same issues the following year suggests that YouTube
was not inspired to change its platform governance by the comments controversy. Governance-
washing is therefore a problem on YouTube beyond the instance we examine here–and also a useful
term to describe various superficial attempts by platforms to appear vigilant against harmful content.
Declaration of conflicting interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or
publication of this article.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.
ORCID iD
Emily Tarvin  https://orcid.org/0000-0002-0269-8583
Notes
1. Throughout, we distinguish YouTube’s content creators (YouTubers) vs. regular users (users) vs. the
company’s management (the YouTube company) vs. the platform as the aggregate of these things plus its
technological infrastructure (YouTube).
2. Indeed, in a similar sleight of hand, the creation of YouTube Kids in the first place was, as Burroughs (2017)
argues, intended to head off criticisms that children were watching too much content that was not age
appropriate.
3. Ryan’s World is a children’s channel in which a young boy does “science experiments, music videos, skits,
challenges, DIY arts and crafts and more” (Ryan’s World, n.d.). In 2019, Truth in Advertising filed a
complaint with the Federal Trade Commission accusing RyanToysReview of “deceptive advertising, saying
it’s difficult for preschoolers to discern” paid advertisements (Chmielewski, 2020), and later the channel
changed its name to “Ryan’s World.” Another FTC complaint filed that same year said YouTube and Google
violated the Children’s Online Privacy Protection Act of 1998 “by collecting personal information—in the
form of persistent identifiers that are used to track users across the Internet—from viewers of child-directed
channels, without first notifying parents and getting their consent. YouTube earned millions of dollars by
using the identifiers, commonly known as cookies, to deliver targeted ads to viewers of these channels,” and
even “touted its popularity with children to prospective corporate clients” (Federal Trade Commission,
2019). In response, much like the case we explore here, YouTube both proposed technological solutions and
offloaded governance: “In order to identify content made for kids, creators will be required to tell us when
their content falls in this category, and we’ll also use machine learning to find videos that clearly target
young audiences” (Wojcicki, 2019b).
References
Alexander J (2019) YouTube still can’t stop child predators in its comments. Available at: https://www.theverge.com/2019/2/19/18229938/youtube-child-exploitation-recommendation-algorithm-predators (accessed on 7 February 2020).
Ballinger C (2019) ‘So now YouTube can punish creators by disabling the comment section & demonetizing videos if the comments aren’t ad friendly? If this is true, every YouTuber needs to start looking for a new job. There are hardly any videos on YouTube that lack vile comments. How is this fair? https://t.co/qOgWkwQ8Q6’ / Twitter. Available at: https://twitter.com/ColleenB123/status/1098784930790760449 (accessed on 13 April 2020).
Berryman R and Kavka M (2017) ‘I guess a lot of people see me as a big sister or a friend’: The role of intimacy in the celebrification of beauty vloggers. Journal of Gender Studies 26(3): 307–320. DOI: 10.1080/09589236.2017.1288611.
Berryman R and Kavka M (2018) Crying on YouTube: Vlogs, self-exposure and the productivity of negative affect. Convergence 24(1): 85–98. DOI: 10.1177/1354856517736981.
Bietti E (2020) From ethics washing to ethics bashing: A view on tech ethics from within moral philosophy. In: Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency (FAT* ’20), New York, NY, USA, 27 January 2020, pp. 210–219. Association for Computing Machinery. DOI: 10.1145/3351095.3372860.
Brandom R (2017) Inside Elsagate, the conspiracy-fueled war on creepy YouTube kids videos. Available at: https://www.theverge.com/2017/12/8/16751206/elsagate-youtube-kids-creepy-conspiracy-theory (accessed on 11 March 2020).
Burgess J and Green J (2009) YouTube’s social network: The extending boundaries of a major phenomenon for millions. Intermedia, December.
Burroughs B (2017) YouTube kids: The app economy and mobile parenting. Social Media + Society 3(2). DOI: 10.1177/2056305117707189.
Chmielewski D (2020) How Ryan’s YouTube playdate created an accidental (eight-year-old) millionaire. Available at: https://www.forbes.com/sites/dawnchmielewski/2020/02/28/how-ryans-youtube-playdate-created-an-accidental-eight-year-old-millionaire/#3ab155183fe0 (accessed on 28 April 2020).
Colleen Vlogs (2019) Body update & my thoughts on youtube. Available at: https://www.youtube.com/watch?
v=C2PKdLkEYrg (accessed on 26 March 2020).
Cunningham S and Craig D (2019) Social Media Entertainment: The New Intersection of Hollywood and
Silicon Valley. New York: NYU Press.
DeFranco P (2019a) Who youtube’s new crackdown is hurting, why venezuela might get even crazier, &
facebook’s new ban. Available at: https://www.youtube.com/watch?v=9cFDrXEZrGU (accessed on 26
March 2020).
DeFranco P (2019b) Why Colleen Ballinger & Top YouTubers Are Freaking Out, Zion Nike Controversy, & More. Available at: https://www.youtube.com/watch?v=HVbXTUCnhIY (accessed on 10 March 2020).
DeFranco P (2019c) YouTube's Predator Problem Stokes Fear and Concern, Malia Obama 'Exposed', & More. Available at: https://www.youtube.com/watch?v=uxkdQgLZMRc (accessed on 10 March 2020).
Dreher T (2016) Pinkwashing the Past: Gay Rights, Military History and the Sidelining of Protest in Australia. SSRN Scholarly Paper ID 2724515, 29 January. Rochester, NY: Social Science Research Network. Available at: https://papers.ssrn.com/abstract=2724515 (accessed on 11 March 2020).
Federal Trade Commission (2019) Google and YouTube Will Pay Record $170 Million for Alleged Violations of
Children’s Privacy Law. Available at: https://www.ftc.gov/news-events/press-releases/2019/09/google-
youtube-will-pay-record-170-million-alleged-violations (accessed on 3 February 2021).
Fuchs C (2010) Class, knowledge and new media. Media, Culture & Society 32(1): 141–150.
Furlow NE (2010) Greenwashing in the New Millennium. The Journal of Applied Business and Economics
10(6): 22–25.
García-Rapp F and Roca-Cuberes C (2017) Being an online celebrity: Norms and expectations of YouTube's beauty community. First Monday 22(7). DOI: 10.5210/fm.v22i7.7788.
Gillespie T (2014) The relevance of algorithms. In: Gillespie T, Boczkowski PJ and Foot KA (eds) Media Technologies: Essays on Communication, Materiality, and Society. Cambridge, MA: The MIT Press, pp. 167–194.
Gillespie T (2018) Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions that
Shape Social Media. New Haven: Yale University Press.
Gillespie T (2020) Content moderation, AI, and the question of scale. Big Data & Society 7(2). DOI: 10.1177/2053951720943234.
Hokka J (2021) PewDiePie, racism and YouTube's neoliberalist interpretation of freedom of speech. Convergence 27(1): 142–160. DOI: 10.1177/1354856520938602.
Hou M (2019) Social media celebrity and the institutionalization of YouTube. Convergence 25(3): 534–553. DOI: 10.1177/1354856517750368.
Jerslev A (2016) In the time of the microcelebrity: Celebrification and the YouTuber Zoella. International Journal of Communication 10.
Kafer A (2013) Feminist, Queer, Crip. Bloomington, IN: Indiana University Press.
Kelly M (2020) YouTube receives FTC complaint over videos of journalist’s death. Available at: https://
www.theverge.com/2020/2/20/21145959/google-youtube-journalist-death-video-ftc-complaint-parker
(accessed on 3 February 2021).
Kumar S (2019) The algorithmic dance: YouTube's Adpocalypse and the gatekeeping of cultural content on digital platforms. Internet Policy Review 8(2): 1–21. DOI: 10.14763/2019.2.1417.
Laufer WS (2003) Social accountability and corporate greenwashing. Journal of Business Ethics 43(3):
253–261.
Mart S and Giesbrecht N (2015) Red flags on pinkwashed drinks: contradictions and dangers in marketing
alcohol to prevent cancer. Addiction 110(10): 1541–1548. DOI: 10.1111/add.13035.
Martineau P (2019) YouTube has kid troubles because kids are a core audience. Wired, 6 June. Available at: https://www.wired.com/story/youtube-kid-troubles-kids-core-audience/ (accessed on 16 January 2020).
Nye DE (1996) American Technological Sublime. Cambridge, MA: The MIT Press.
Orphanides KG (2019) On YouTube, a network of paedophiles is hiding in plain sight. Wired UK, 20 February. Available at: https://www.wired.co.uk/article/youtube-pedophile-videos-advertising (accessed on 3 February 2020).
Parker A (2020) The video of my daughter’s murder is still on YouTube and Facebook. They should have to
take it down. Available at: https://www.cnn.com/2020/07/28/perspectives/alison-andy-parker-murder-
youtube-facebook/index.html (accessed on 3 February 2021).
Postigo H (2016) The socio-technical architecture of digital labor: Converting play into YouTube money. New
Media & Society 18(2): 332–349. DOI: 10.1177/1461444814541527.
Puar J (2013) Rethinking homonationalism. International Journal of Middle East Studies 45(2): 336–339. DOI: 10.1017/S002074381300007X.
Raun T (2018) Capitalizing intimacy: New subcultural forms of micro-celebrity strategies and affective labour on YouTube. Convergence 24(1): 99–113. DOI: 10.1177/1354856517736983.
Robertson A (2019) YouTube says it’s not restricting ads based on creators’ comment sections. Available
at: https://www.theverge.com/2019/2/22/18236688/youtube-ads-comment-section-monetization-child-
predatory-adpocalypse-policy (accessed on 7 June 2021).
Ryan’s World (n.d.). Available at: https://www.youtube.com/channel/UChGJGhZ9SOOHvBB0Y4DOO_w/
about (accessed on 17 April 2020).
Special Books by Special Kids (n.d.). Available at: https://www.youtube.com/channel/UC4E98HDsPXrf5kTKIgrSmtQ (accessed on 26 March 2020).
Special Books by Special Kids (2019a) Our Last Chance for Inclusion (#UnsilenceSBSK). Available at: https://www.youtube.com/watch?v=oAEOnW25jNE (accessed on 9 April 2020).
Special Books by Special Kids (2019b) YouTube’s discriminatory new policies are destroying our mission of
inclusion (we need your help). Available at: https://www.youtube.com/watch?v=Wy7Tvo-q63o (accessed
on 26 March 2020).
Strangelove M (2010) Watching YouTube: Extraordinary Videos by Ordinary People. Toronto; Buffalo, NY: University of Toronto Press.
Thomson RG (1996) Extraordinary Bodies: Figuring Physical Disability in American Culture and Literature. New York: Columbia University Press.
Wakabayashi D and Maheshwari S (2019) Advertisers boycott YouTube after pedophiles swarm comments on videos of children. The New York Times, 20 February. Available at: https://www.nytimes.com/2019/02/20/technology/youtube-pedophiles.html (accessed on 10 March 2020).
Whigham N (2017) Parents warned to look out for disturbing ‘Elsagate’ videos. New York Post, 29 November.
Available at: https://nypost.com/2017/11/29/parents-warned-to-look-out-for-disturbing-elsagate-videos/
(accessed on 11 March 2020).
Wojcicki S (2019a) Addressing creator feedback and an update on my 2019 priorities. In: YouTube Creator
Blog. Available at: https://youtube-creators.googleblog.com/2019/04/addressing-creator-feedback-and-
update.html (accessed on 27 February 2020).
Wojcicki S (2019b) An update on kids and data protection on YouTube. Available at: https://blog.youtube/news-and-events/an-update-on-kids/ (accessed on 3 February 2021).
Wojcicki S (2019c) YouTube in 2019: Looking back and moving forward. YouTube Creator Blog.
Available at: https://youtube-creators.googleblog.com/2019/02/youtube-in-2019-looking-back-and-
moving.html (accessed on 27 February 2020).
YouTube Creator Blog (2019) More updates on our actions related to the safety of minors on YouTube.
YouTube Creator Blog. Available at: https://youtube-creators.googleblog.com/2019/02/more-updates-on-
our-actions-related-to.html (accessed on 16 January 2020).
YouTube Help (2019) Update on our actions related to the safety of minors on YouTube. In: YouTube Help.
Available at: https://support.google.com/youtube/thread/1805616 (accessed on 16 January 2020).
YouTube Kids (n.d.) A Safer YouTube Experience for Kids. Available at: https://www.youtube.com/kids/safer-experience/ (accessed on 13 April 2020).