Richard Lewis: Is Twitch doing enough to protect children on its platform?

Amazon-owned Twitch remains the number one live streaming platform on the market, with 15 million daily active users at last count. Policing such a sizeable platform is no mean feat, and the company has often found itself publicly criticized as simultaneously overzealous in some matters and ineffective in others.

Among the priorities any such platform must get right, though, none can rank higher than protecting the children and young adults who use it. Based on new findings, we call into question whether Twitch is doing enough in this regard.

In February 2019, YouTube found itself at the center of a controversy after a user called Matt Watson highlighted what he called a “softcore pedophile ring,” in which particular search terms could be used to find videos of children in revealing clothing on the platform. Once one such video was found, the suggestions put forward by YouTube would all be near-identical content with the same search terms in the metadata.

The comments on these videos would be filled with users telling the children how good they looked and requesting other types of clothing for them to wear in future videos. Although YouTube didn’t directly address Watson’s findings, shortly after they were made public it disabled comments on videos featuring children. Explaining their rationale in a blog post, they said:

“Over the past week, we disabled comments from tens of millions of videos that could be subject to predatory behavior. These efforts are focused on videos featuring young minors and we will continue to identify videos at risk over the next few months. Over the next few months, we will be broadening this action to suspend comments on videos featuring young minors and videos featuring older minors that could be at risk of attracting predatory behavior.”

Naturally, there were several Reddit threads discussing the shocking find. Among that discussion, an account called ‘ThePonzo’ listed evidence of a similar situation occurring on Twitch. They posted several screenshots of children under 13 streaming via mobile phones or laptops while people in Twitch chat made sexually suggestive requests. The post was gilded and heavily upvoted, and many users suggested the poster share their findings with news outlets, several offering ideas as to which would be best.

Twitch’s Travel & Outdoors section has become a hotbed for pedophiles.

During this time, the anonymous user took that advice and contacted several publications they thought would be interested in the story. Nothing happened, except that Twitch suddenly deleted the incriminating VODs and banned a handful of the more egregious accounts. In the interests of transparency, it is worth noting that they suspect this publication was the one that contacted Twitch, although it ultimately decided not to run the story.

The Redditor continued to monitor the “Travel & Outdoors” section of Twitch. Because this is the default category for streams started from the Twitch mobile app, many young children who do not know how to configure their channel end up broadcasting there. The Redditor would scroll down to the children who were streaming without adult supervision and had only one or two viewers. Consistently, they would find Twitch accounts in the chat that appeared to be engaged in sexually inappropriate activity.
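
The Redditor’s method, scanning a single directory for unattended, low-viewer broadcasts, is simple enough to approximate programmatically. Below is a minimal sketch using Twitch’s public Helix API to list live streams in a named category and filter them by viewer count; the client ID, access token and viewer threshold are placeholders, and this is an illustration of the monitoring approach rather than the tool the Redditor actually used.

```python
# Minimal sketch: list low-viewer live streams in a Twitch category via the Helix API.
# CLIENT_ID and APP_TOKEN are placeholders; real credentials are required.
import requests

CLIENT_ID = "your-client-id"
APP_TOKEN = "your-app-access-token"
HEADERS = {"Client-ID": CLIENT_ID, "Authorization": f"Bearer {APP_TOKEN}"}

def category_id(name: str) -> str:
    """Look up a category's ID by its display name."""
    resp = requests.get("https://api.twitch.tv/helix/games",
                        params={"name": name}, headers=HEADERS)
    resp.raise_for_status()
    return resp.json()["data"][0]["id"]

def low_viewer_streams(game_id: str, max_viewers: int = 2) -> list:
    """Return live streams in the category with at most max_viewers viewers.

    Results come back sorted by viewer count descending, so a real monitor
    would follow the pagination cursor to reach the bottom of the directory.
    """
    resp = requests.get("https://api.twitch.tv/helix/streams",
                        params={"game_id": game_id, "first": 100}, headers=HEADERS)
    resp.raise_for_status()
    return [s for s in resp.json()["data"] if s["viewer_count"] <= max_viewers]

if __name__ == "__main__":
    for stream in low_viewer_streams(category_id("Travel & Outdoors")):
        print(stream["user_name"], stream["viewer_count"], stream["title"])
```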

Over the course of just 10 days, they found multiple instances of children, many of them apparently under the age of 13 – the minimum age required to stream on Twitch – receiving sexually suggestive comments from users in chat. Many of these users would also make requests of the children, ranging from acts appealing to particular fetishes to direct requests for sex acts to be performed on camera. The most concerning thing was the frequency with which they saw it happening. They could find instances most days they checked.

By April, the user, tweeting as ‘SorryKnutILoveYou’, had grown frustrated at Twitch’s lack of action and publicly tweeted at Twitch Support with examples of the grooming. They tagged esports journalist Rod ‘Slasher’ Breslau, who quote-tweeted the post to ask what Twitch were doing about it, but nothing further came of that exchange.

In May, they made several more posts containing examples in r/livestreamfail, a subreddit dedicated to discussing streaming culture. These were similarly upvoted, and users offered suggestions about the best way to draw attention to what was happening. Somewhere in those threads my name came up, and I awoke to a DM on Twitter the same day. After a brief conversation, they sent me an archive of videos and screenshots they had saved and told me to go through them. What I found within turned my stomach.

One example shows a user called “littlejimmy7676” repeatedly requesting that two young girls strip for him, then stating that he wants to see one of them perform a sex act on camera. While the VOD of this stream is no longer publicly available on the site, videos left on the girls’ channel show them referring to a “little Jimmy” and discussing which one of them will undress for him.

The offending account is no longer available on Twitch, although it isn’t clear whether it was banned or the user themselves deleted it in a bid to evade any reprisals.

In another example, a screenshot shows three users making suggestive remarks and requests: the two underage girls are called “sexies,” one user asks if they would put on high heels for them, and another makes multiple requests for the girls to lick each other’s faces and then engage in a “twerk battle.” Again, two of these accounts are no longer on Twitch, but one remains on the site with seemingly no action taken against it.

In the most shocking and direct instance we were shown, a user talked about sending pictures of penises to two underage girls, asked them for “lewd” images and told the girls to tickle each other. Although the VOD has lapsed, clips from that streaming session still on the channel show the girls responding to requests to spread their legs and “try and do the splits.”

In addition, there were several videos in which children, many under the age of 13 and therefore not permitted on the platform at all, were told by viewers to change clothes and adopt sexual positions. They would also be asked whether they had a boyfriend. One Twitch user appeared consistently in these channels making sexually suggestive remarks. They had made no attempt to cover their tracks, posting these comments from their main account, which as of writing has 326 followers and a fairly regular streaming output.

At first, I assumed that these examples had to be outliers, so I spent my time following the same methodology the Redditor had. Across late May and early June, I too found examples just like the ones I was originally sent. The children would turn on their mobile phones, stream to the Travel & Outdoors section, and within ten minutes someone would appear in their chat and begin behaving in a sexually inappropriate manner towards them.

I even found one account, called ‘lolitafarm’, that followed multiple Twitch accounts belonging to children; on multiple occasions I saw the person behind it asking children to do handstands and other activities on stream.

I forwarded my findings to a senior member of Twitch staff, who said they would pass them on to the trust and safety team and request that someone make contact with me for additional comment. Much like before, some of the accounts belonging to children were banned, so I knew my complaints had been seen, yet no contact was forthcoming. I had made clear to the Twitch staff member that I hadn’t shared everything, yet Twitch’s Trust & Safety team still made no attempt to get in touch.

Much like the Reddit user who had uncovered this before me, I grew frustrated and publicly tweeted at Twitch Support on June 6. Still nothing.

While covering this story, something happened that I figured might be serendipitous. After the Artifact section of Twitch was hijacked by people broadcasting all manner of inappropriate content, from footage of the Christchurch shooting to pornography, the company took swift action to quarantine the situation. Temporary restrictions were put in place to stop new accounts from streaming unless they had two-factor authentication enabled. This, I figured, would be enough to deter most children from unsupervised streaming at least. It was made clear, though, that this would only be a temporary measure, which raises the question of why it isn’t implemented as a matter of course.
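
To illustrate the kind of gate Twitch applied, here is a hypothetical sketch of a stream-eligibility check. It is not Twitch’s actual implementation, and the field names and cutoff date are assumptions: it allows a broadcast only if the account predates the restriction or has two-factor authentication enabled.

```python
# Hypothetical sketch of a "who may start a stream" gate; not Twitch's implementation.
from dataclasses import dataclass
from datetime import datetime

RESTRICTION_START = datetime(2019, 5, 25)  # assumed cutoff defining a "new" account

@dataclass
class Account:
    created_at: datetime
    two_factor_enabled: bool

def may_start_stream(account: Account) -> bool:
    """New accounts may only stream if two-factor authentication is enabled."""
    if account.created_at >= RESTRICTION_START:
        return account.two_factor_enabled
    return True

print(may_start_stream(Account(datetime(2019, 6, 1), False)))  # False: new account, no 2FA
print(may_start_stream(Account(datetime(2019, 6, 1), True)))   # True: new account with 2FA
print(may_start_stream(Account(datetime(2018, 1, 1), False)))  # True: pre-existing account
```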

Twitch’s terms of service are very clear that to stream on its platform you must be at least thirteen years of age, and that if you are under eighteen a parent or guardian must agree to the ToS on your behalf. However, when creating a new account on Twitch there is no age verification process; all that is required is an email address. It also means that people who only want an account for the purposes of chat do not have to submit any identifying information to sign up. With Twitch being so popular and so many video games targeting a younger audience, the fact that children can gain access to a streaming platform so easily should be of great concern to everyone.
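
By way of comparison, even a basic age gate at signup is a small amount of logic. The sketch below is hypothetical and not Twitch’s code: it rejects under-13 signups outright and flags 13-to-17-year-olds as requiring recorded parental consent, mirroring the thresholds in the ToS.

```python
# Hypothetical sketch of a signup age gate; not Twitch's actual implementation.
from datetime import date

MIN_AGE = 13    # ToS minimum age to use the service
ADULT_AGE = 18  # below this, a parent or guardian must accept the ToS

def age_on(today: date, birth_date: date) -> int:
    """Whole years between birth_date and today."""
    years = today.year - birth_date.year
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years

def check_signup(birth_date: date, today: date, parental_consent: bool = False) -> str:
    """Return 'rejected', 'needs_consent' or 'allowed' for a signup attempt."""
    age = age_on(today, birth_date)
    if age < MIN_AGE:
        return "rejected"
    if age < ADULT_AGE and not parental_consent:
        return "needs_consent"
    return "allowed"

signup_date = date(2019, 6, 6)
print(check_signup(date(2010, 5, 1), signup_date))                         # rejected
print(check_signup(date(2004, 5, 1), signup_date))                         # needs_consent
print(check_signup(date(2004, 5, 1), signup_date, parental_consent=True))  # allowed
```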

We contacted the National Society for the Prevention of Cruelty to Children (NSPCC) and shared our findings. In recent years the charity has issued multiple warnings about the dangers of online technology, from YouTubers interacting with underage children to Fortnite potentially being a gateway for pedophiles. They were as disturbed by what we showed them as we were.

Andy Burrows, Associate Head of Child Safety Online at the NSPCC, said:

“Streaming sites like Twitch pose particular risks because they allow for two-way communication, making it easy for groomers to contact large numbers of children and leaving them exposed to inappropriate behaviour and content. Tech companies have shown time and again they won’t take the comprehensive steps needed to protect children from sexual grooming on their sites.”

Burrows also added that if tech companies won’t take the proper preventative measures unprompted, it will fall on the government to make them do so.

“It is therefore crucial the Government stays true to its word and sets-up an independent regulator as quickly as possible that has the power to make platforms adhere to child safety standards and ensures children are protected when they’re watching and playing their favourite games.”

We are still waiting for comment from Twitch regarding our findings.