In 2020, the work of volunteer internet moderators is harder and more important than ever
The stress of the COVID-19 pandemic has made online forums more heated, says researcher
COVID-19 is testing the limits of social media networks' voluntary moderators.
In January, a new coronavirus discussion forum launched on the website Reddit. It was soon bombarded with misinformation and disinformation — content that volunteers like Rick Barber had to sort through for hours a day.
"Early on, it was well over 10 hours a day," the University of Illinois PhD candidate told CBC Radio's Day 6.
The subreddit, as communities on the platform are known, is home to news and information about the novel coronavirus pandemic and has more than 2.3 million subscribers.
Each day, thousands of articles and comments are posted, and many of them end up flagged for review by moderators like Barber.
"The role of every moderator is to enforce the rules of the subreddit — and these are rules that we came up with. The rules could have been, I suppose, anything we wanted them to be," he said.
While services like Facebook and Twitter use paid content moderators to monitor most content on their platforms, unpaid volunteers are responsible for keeping an eye on Reddit's communities, as well as community-led Facebook groups.
The coronavirus subreddit lists nine rules. Among them: "Be civil," "Keep information quality high" and "Avoid politics."
That last one has proven difficult to enforce.
"For a while our line was discussing policy was OK," he explained. "But if you wanted to do the partisan, tribal identification thing and just kind of hit each other with slogans and stereotypes of the other side, then like literally every other place on the internet seems to be a home for that."
"It's really hard to define that politics rule well, and it's really hard for people to resist being political. And that's not necessarily their fault because various players in that space have made it more political."
'Pick out the worst of the worst'
With the pandemic heightening anxiety and stress, Amy Bruckman says it's not surprising that conversations online have gotten even more heated in recent months. The Georgia Institute of Technology professor has studied online communities and moderation for years.
"Of course, that makes a lot of conversations more intense," she told Day 6 host Brent Bambury.
"It's great that we have places that you can have a controversial conversation, and places where that's not allowed. The nice thing about a platform like Reddit is that if you don't like the subreddit … you can create a new one."
Like Barber, Bruckman is a moderator of Reddit's coronavirus forum. She says the team's work to create a source of high-quality information for users "regardless of their political persuasion" is, in effect, "inherently political" in nature.
"That is stressful on the moderators to try and figure out where to draw the line," she said.
Posts that have crossed that line, Bruckman recalls, include comments featuring obscenities, racism and ad hominem attacks.
"I tend to go through people's comments and just pick out the worst of the worst," she said.
Difficult work
Monitoring what should stay — and what should go — can be draining for moderators, says Bruckman.
"When I first joined the mod team, I thought it was just all goofy fun, and what I didn't see is that what the mods do is actually deadly serious a lot of the time," she said, referring to her experience as a moderator on the Georgia Tech subreddit.
In the past, for example, users have written about suicide in the community, posts the team is quick to manage.
"There will be a thread saying, we're sorry to say that we have lost this person, then we sit on the thread watching it carefully, reloading every second to delete the method of suicide because it's been established that sharing the method increases the risk of copycats," she explained.
Moderating posts in the coronavirus subreddit has also pushed some volunteer moderators too hard, Barber says.
"Every single person on the team has at least at one point said, like, 'I need a break.' You know, 'I'll be back in a few days' or 'I'll be back in a week.' Some people just didn't come back," he said.
Group organizers can design online spaces in ways that encourage positive interactions, but ultimately the group's topic plays a significant role in how people act.
"I met this guy once and I asked him advice on the algae in my fish tank and he was very helpful ... And then I saw the same guy 10 minutes later in a different group where there was a horrible flame fest," where the user had posted inflammatory comments, Bruckman recalled.
"I sent him a message saying, 'Wow, this is crazy,' and he responded back to me, 'Don't assume people always behave the way they behave when they talk about fish.'"
"It turned out that he was one of the site's most legendary nasty trolls," she added.
Written by Jason Vermes. Produced by Samraweet Yohannes.