Child Predators Use Twitch to Systematically Track Kids Livestreaming

The gaming platform’s design enables people to find and exploit kids in real time


Twitch, the Amazon subsidiary where millions of people congregate every day to watch skilled gamers play franchises like Fortnite and Minecraft, is one of the most popular websites on the internet. But the factors that have contributed to its rapid growth, such as the ease with which anyone can open an account and begin broadcasting themselves live, have also enabled predators to target young users, according to an analysis from October 2020 through August 2022 by a researcher who studies livestreaming websites.

In an emailed statement, Twitch said the problem plagues many online platforms and that it has taken steps over the last two years to address it. “Preventing child harm is one of our most fundamental responsibilities as a society. We do not allow children under 13 to use Twitch, and preventing our service from being used for harm is one of our biggest priorities,” a Twitch spokesperson wrote. “We know that online platforms can be used to cause harm to children, and we have made extensive investments over the last two years to better stay ahead of bad actors and prevent any users who may be under 13 from accessing Twitch.”

In that time, Twitch has quadrupled the size of its law enforcement response team working with the National Center for Missing and Exploited Children, known as the NCMEC, and the Tech Coalition, an industry-wide alliance combating online child sex abuse. After Twitch verifies instances of grooming, the company’s law enforcement response team reports them to authorities and investigates the groomers’ networks. The spokesperson said Twitch has “numerous additional updates in development” to detect and remove child streamers and predators. Twitch can’t share “much of the work in this area” publicly because bad actors could use that knowledge to evade accountability, the spokesperson said.

Still, the reporting shows that Twitch’s existing moderation tools have proved insufficient at preventing children from broadcasting and at stopping adults from finding and grooming them.

The researcher assembled databases of Twitch accounts by manually identifying livestreams of young people and determining which of their followers also track other children. One data set represents predatory users — those believed to be adults targeting children — each of whom follows a set of accounts composed of at least 70% kids or young teens. The researcher created a separate data set of apparent children targeted by these accounts and reported instances of overt predation to the NCMEC and to Twitch. The researcher has studied internet harassment and extremism for years and requested anonymity due to concerns over potential career repercussions from being associated with such a disturbing topic.

Bloomberg verified that the 1,976 accounts the researcher flagged as predatory had numerous children in their following lists. Bloomberg also reviewed live video recordings and other documentation and analysis by the researcher. In the course of reporting, Bloomberg discovered additional live videos and predatory accounts not cataloged by the researcher, suggesting the problem could be even more widespread than the data portrays.

“Even one single instance of grooming is abhorrent to us, and if it’s valid, the data you reference demonstrates that we are not offering the level of protection we strive for yet — which is deeply upsetting,” Tom Verrilli, Twitch’s chief product officer, wrote in an email. “This work is vitally important to everyone at Twitch, and we’ll never stop.”

The reporting shows how alleged predators are able to target multiple children simultaneously using common grooming tactics. In late July, one Twitch voyeur entered the livestreams of four different young users and asked them to perform acts including “spicy dares,” slang for requests that may be indecent or vulgar. One stream attracted 290 viewers. In that case, Twitch shut down the broadcast after 30 minutes. But others go unmoderated.


Safeguarding children online is an industry-wide challenge. Meta Platforms Inc., which runs the largest social networks, perennially tops the list of reports of apparent incidents of child exploitation online, according to the NCMEC. A whistle-blower last year accused the Facebook and Instagram parent of failing to devote adequate resources to removing child sexual abuse material. The US government is reportedly looking at how ByteDance Ltd.’s TikTok handles the problem, and Alphabet Inc.’s YouTube algorithm has recommended videos of children to predators who help disseminate them further. Spokespeople for Meta and TikTok have said they have no tolerance for such abuse. YouTube spokesperson Ivy Choi said the company has devoted significant resources to child safety. YouTube said it has also changed its systems since past reports of predatory networks there.

Kids’ vulnerability to abuse is a particular concern in the world of online games, where Twitch is a cornerstone. Some three-fourths of US children play video games, and Twitch says the majority of its users are between 16 and 34.

There are few barriers to prevent kids from livestreaming on Twitch, particularly on mobile. Whereas a new Twitch user can begin streaming live immediately, doing so on YouTube’s mobile app requires the person to have at least 50 subscribers and wait at least 24 hours. TikTok requires creators to be over 16 and have at least 1,000 followers. Facebook, which stipulates that people must be 13 to open an account, has no other age or follower requirement for livestreaming on mobile, according to the company, but unlike Twitch, new users must connect a phone number or email address and validate their accounts.

Critics say efforts by social media companies to enforce their age restrictions are inadequate. “The platforms aren’t doing a great job keeping young kids off their sites, and it creates the opportunity for child predators to groom in such an expedient way,” said Ben Halpert, founder and president of Savvy Cyber Kids, an organization promoting children’s safety online.

Live predation has been a problem on Twitch for years, but there are signs the pandemic made it worse. While reports of apparent child sexual abuse online skyrocketed 73% from 2019 to 2021, according to the NCMEC, they increased 1,125% on Twitch. A Twitch spokesperson attributes the influx of NCMEC reports to the company’s improved detection methods as the online platform grew.

Children and teens stuck at home were on their phones and computers for nearly twice as many hours as they were pre-pandemic. On Twitch, the average number of live channels broadcasting concurrently more than doubled between late 2019 and the start of 2022 to 105,000. Average viewership for those channels increased to 2.9 million from 1.2 million over that period. “Covid has had an unprecedented impact,” said Iain Drennan, executive director of child safety firm WeProtect Global Alliance. “There are more potential victims online, more offenders online, and we’re seeing a dramatic increase in reports worldwide.”

Twitch Grows During Pandemic

[Chart: monthly active streamers, 2018–2022, on a scale of 0 to 10 million, with an annotation marking the March 2020 pandemic lockdowns. Source: TwitchTracker]

Even though the early effects of the pandemic are fading, kids are still flocking to Twitch. The average number of children and young teens broadcasting monthly doubled between July 2021 and July 2022, the researcher’s data shows.


Child predation has been a public problem for Twitch since at least 2020, when a Wired investigation revealed dozens of children were livestreaming in a category known as Just Chatting. Predators could sort broadcasts on the site’s directories by “recently started” to reveal accounts with low numbers of followers. In response, the company removed the sorting feature for Just Chatting and the Roblox channel. It’s still available in Twitch’s thousands of other categories, including ones where children regularly livestream.

Although the platform showcases its most popular users on its front page, a viewer can, with little effort, find the broadcasts of kids mixed in with those of teens and adults and join those livestreams. Alleged predators who have discovered and followed children can receive notifications when they go live. Some kids attract hundreds of live viewers within minutes of starting a broadcast — far more than would be expected for an account that isn’t a featured streamer.

In mid-August, 650 live viewers watched a girl who said she was 11 and alone in her bedroom. She displayed a sign with her Twitch username and the number 12 — the age she would turn in a few weeks. Viewers in the chat asked her to do a “fashion show” and show her legs. An archived clip of the performance was viewed 3,700 times.

It’s challenging to determine with certainty how old someone is by watching a video or reviewing an account. The researcher classified a set of accounts as likely belonging to children based on how they described themselves in addition to their physical appearance. For example, someone who says her favorite game is Roblox or poses with a school backpack is not likely to be an adult.

Streamers sometimes give their ages explicitly, as happened in April, when two girls appeared sitting on a couch eating ice cream. A viewer asked how old they were, and they replied they were 12 and 13. In another instance, a child said she was 8 and lived in Florida.

Children on Twitch often mimic their favorite gaming streamers, responding to comments and prompts by their viewers in order to keep their audience engaged. This makes young streamers susceptible to predation because viewers can communicate through text anonymously with broadcasters who are on live video.

In September, a girl who said she was 9 streamed on Twitch alone in her room using a smartphone. A viewer entered her chat room within the first minute of her broadcast and asked if she was a girl. Then the person said, in Spanish, “Show your butt to prove it.” Three minutes later, the viewer said, “I want to see your underwear.” Throughout the livestream, the viewer repeatedly threatened to leave if she did not perform to meet his demands.

It’s possible some of the accounts believed to be adult predators are actually kids, but the behavior exhibited by many members of this group follows typical grooming techniques. Some use the same tactics on multiple streams, sometimes only seconds apart. They can start out by asking innocuous questions about a child’s favorite color and work their way up to demands for live sexual acts.

Some predatory viewers send messages in Twitch’s text chat that appear to be donations to the children — a popular monetization feature on the platform — in an effort to coerce them into sexual acts.

As Twitch saves streams for several days, videos with inappropriate material can sometimes attract thousands of views before they are automatically removed. In April, one young boy streamed himself shirtless in his room. In the accompanying chat, one viewer asked him to “show your trunk.” Another asked him to “show off ur skills,” which prompted him to flex. A video of the livestream was viewed over 230 times — an outsized number for someone with 18 followers.

The researcher’s data from Twitch text chats contains over 5,700 references to other social media platforms, such as Discord, Snapchat and TikTok, indicating predators may be seeking to move conversations to more private venues. “One thing we see offenders try to do is connect with children on one platform and move somewhere they can have a private communication or private video stream,” said Lindsey Olsen, a former executive director for the NCMEC’s exploited children division.


Identifying predatory behavior, especially in real time, would be a daunting task for any technology company. Twitch broadcasts over 2.5 million hours of live content every day in 35 languages, relying on a combination of user reports and live moderation to shut down inappropriate streams. Like Meta’s Facebook Live, where the shooting in Christchurch, New Zealand, was broadcast, Twitch has also inadvertently hosted live videos of extremists and terrorists on their way to committing heinous acts. In May, a gunman used Twitch to broadcast his attack in a Buffalo, New York, supermarket. The video attracted 22 viewers within 25 minutes before the company ended the stream. (Twitch said it removed the livestream “less than two minutes after the violence started.”)

Algorithms can scan websites and apps for abusive material by comparing images and videos to content that’s already been identified as inappropriate. Livestreams showing abuse are brand new, so it’s hard for companies to quickly identify them as inappropriate and remove them, said Rebecca Steinbach, senior producer at NCMEC.

Twitch successfully deters other forms of child sexual abuse material, said Verrilli, the product chief, including problematic recorded videos that are shared or rebroadcast. “The majority of established solutions in the space are ineffective in combating the type of grooming we see people attempt on Twitch, as they either do not work well in a live context or are not capable of catching the types of grooming behavior that are intended to escape detection,” he said. Over the last year, Twitch said, it terminated almost three times the number of accounts belonging to underage streamers as it did the year before. Twitch relies on user reports to root out abuse, in addition to “automated solutions” that can identify child streamers and predatory behavior, Verrilli said.

Critics say Twitch is over-dependent on crowdsourced reports. The researcher reported 1,200 accounts apparently belonging to children to Twitch. In March, the company updated its reporting form to make it easier for users to flag child streamers. Still, only about 37% of accounts reported by the researcher in May were removed.

Even in the many instances when Twitch has eliminated inappropriate content or suspended clearly underage users, it’s easy enough to create a new account. When one viewer “dared” a child to pull down her pants during a livestream in late May, the child responded that she wouldn’t do so because she had already had one of her three accounts suspended. The viewer, who followed primarily young girls on Twitch, replied in the chat: “u can always make a new account easy.” Later in the broadcast, she pulled her shirt up.