Imagine that your daughter is playing Fortnite, and a troll in another country joins the game and begins sexually harassing her. Imagine that your son is watching his favorite video gamer live-stream an eSports game, and the streamer begins to shout obscenities.
Disruptive behavior like this has become routine in video games. The system that rates the appropriateness of content on these platforms is obsolete, and children who play video games are routinely exposed to inappropriate and abusive behavior. To fix this, and to make video games safe for children, we need a new content rating system for the eSports era.
With the rise of eSports, video games are no longer just a hobby, as they were in my elementary school days when I used to play Super Mario on my Super Nintendo. In eSports, professional gamers compete in video game leagues and tournaments like the Overwatch League and The International Dota 2 Championships, where the prize money can reach into the millions of dollars.
The eSports industry wants legitimacy in the world of sports, and it’s succeeding. Video games are evolving to the point where they are treated like athletic sports, and gamers like athletes. Pro gamers sign yearly contracts with teams and practice for hours on end to sharpen their skills and build team strategies. Those who make it to the top enjoy worldwide fame, like the South Korean pro gamer Lee Sang-hyeok, known as Faker, who has been called “eSports’ Michael Jordan.”
At the Asian Games 2018 in Indonesia, six video games were played as a demonstration sport. At the next Asian Games, in four years, gamers will be awarded medals. Even officials of the 2024 Paris Olympics are considering video gaming as a demonstration sport.
The finals of the 2018 League of Legends World Championship, which ended last weekend, attracted more than 200 million viewers. This year, thanks to streaming sites like Twitch and YouTube, game live-streaming will reach an audience of 380 million viewers worldwide.
Yet eSports platforms and video games are plagued by trolls. Even as eSports rise to the same worldwide standing as athletics, and gamers are treated as athletes, the level of sportsmanship within video games lags far behind.
In League of Legends, I can form a virtual team with players around the world in less than five minutes and play against millions of others. As soon as the match starts, someone types a message in the chatbox. “Feel like trolling,” the anonymous person says, and everyone else on the team knows what’s coming next. Soon, the troll who was supposed to be your teammate kills himself in the game and types obscenities — misspelled to avoid text filters — which flash across the screen. There’s nothing my teammates or I can do about the troll. I tell him that I’ll report him. “LOL!” he says. I start another game, only to be trolled again.
The same toxicity plagues streaming sites like Twitch and YouTube, where people watch other gamers play. One pro gamer was playing a live public match on a Korean national TV channel, where thousands of children were watching. Forty seconds into the stream, his anonymous opponent typed a string of obscenities into the chatbox for all to see. In another stream, the N-word was used about 60 times. In another instance, a streamer used explicit and homophobic language when talking to a gay pro gamer.
Four years ago, in what became known as Gamergate, anonymous online miscreants harassed women in the video game industry, drawing worldwide outrage well before the #MeToo movement began. Today, 48 percent of gamers are women, and some — like Kim Se-yeon, known as Geguri, the first female gamer in the Overwatch League — make it to the pro level. But trolls pester female gamers with obscene, sexist and misogynistic language, without any consequences. In the world of gaming, there are no rules.
The most troubling aspect of this is that most games are intended for children and young adults. Approximately 64 million children in the United States play video games. We don’t allow our children to watch things on television that contain this kind of language or behavior, and we certainly don’t want them to think that the language and behavior of the trolls is acceptable.
My family moved to the United States two years ago from South Korea. My 3-year-old daughter loves learning English through YouTube videos and by playing puzzle video games. I fear that she’ll be exposed to this toxicity soon, unless we eliminate trolls, put a new ratings system into place and demand better of the video-gaming industry. There are very real dangers to be feared: In June, a 7-year-old girl’s avatar was sexually assaulted in a Roblox game.
The regulations and ratings systems now in place are not doing enough to stop the trolls. If an Elmo character shouts the F-word or kills his teammates in a game, it doesn’t affect the game’s rating, as assigned by the industry’s self-regulating organization, the Entertainment Software Rating Board. The E.S.R.B. reviews how violent and sexual a game’s content is, but not how toxic gamers are becoming in the games. The Federal Trade Commission says on its website that “you may be able to block the player, or notify a game’s publisher or online service.” But while I’ve blocked and reported hundreds of trolls to the video game companies, I’ve never heard back.
Even though this problem is widespread and many of the trolls live outside the United States, the F.T.C. and the rating board can take steps to make video game communities less toxic.
First, they need to clearly define what is and is not acceptable in video games and streaming sites. Video games should be rated based on the amount of trolling that happens, and streaming sites should be rated just as video games are. Streaming channels should be dynamically filtered based on the user. Children under a certain age should not be able to see videos that use explicit language or sexual visuals.
Second, the F.T.C. should run extensive tests on video games and streaming sites to understand the toxicity and trolling of gaming communities. To do so, it can work with gamers across the country to collect data, as well as work with game companies and streaming sites.
Third, the F.T.C. should require game companies and streaming sites to share public reports on how they are managing and preventing toxicity across their platforms. Whenever trolling occurs — especially in children’s games — the users and their parents should be notified, and the games or platforms should be required to address the trolling and share the steps that were taken to stop or prevent it.
The F.T.C. can apply these rules to companies outside of the United States that want to sell their products here, where 178 million players generate $30.4 billion in revenue every year.
Parents also need to shoulder some of the responsibility. In the United States, 91 percent of children ages 2 to 17 play video games. We need to protect our children by deleting the toxic games from their personal computers, consoles and phones. We need to share their stories online to expose the problem. It’s time to make the online world safe for them.
Won Sang Choi is an eSports fan who recently received an M.B.A. from the Stanford Graduate School of Business.