Russell was an admin for a Facebook group with 32k members. Then someone sent him a photo of his daughter


Facebook groups are often blamed for being superspreaders of misinformation and hate speech. Now there are calls for the social media giant to do more to support the volunteer admins who run the company’s major money earners.

Russell Redmond used to run a Facebook group in his spare time, but after experiencing the darker side of the platform, he quit his role as a volunteer administrator.

“I’ve equated it to like being an umpire for a local football team, you’re constantly having to deal with people, and they know who you are and everyone hates you,” he said.

Facebook groups showcase the best and worst of social media — they are either a safe, supportive space or a haven for misinformation and hate speech.

They are mostly run by unpaid admins who set the rules and decide which posts to approve.

Mr Redmond said he stopped being an admin for Chit Chat — Launceston, Tasmania, a group with 32,000 members, because of the nature of the posts being constantly uploaded.

He said the posts were often angry and malicious, and conspiracy theories were rife.

“I’ve had people send me a photo of my daughter from my personal profile having a go at me over my child,” he said.

“I spent a solid year in a constant grump, old ladies would smile at me up the street and I’d be like, ‘Yeah right, I don’t trust you’.”

The difference proper governance makes

Liz Crane has had a completely different experience as an administrator of a Facebook parent support group, HMN Chat.

As it is run by a professional organisation, the Child Health Association, strict rules were put in place that keep trolls out and provide clear guidance for admins, particularly in relation to health advice.

“An example [of a problematic post] is, ‘my child has this rash’ and you pop a photo up on the group and you get 50 comments saying it might be this, it might be this, that’s actually really dangerous,” Ms Crane said.

She said having strict rules in place allowed the group to be a safe, non-judgemental space where people can seek support and guidance.

“There have absolutely been posts from people who have said this group has been an absolute lifeline for me,” she said.

“Having access to a safe online community might be the only social outlet that you have, and that’s really important as an organisation supporting families — to be able to make sure that it’s as safe and friendly as possible.”

Havens for misinformation and hate speech

Groups that do not have strict guidelines have been blamed for being superspreaders of hate speech and misinformation during the coronavirus pandemic.

Local community Facebook groups were found to have helped millions of people view the conspiracy video Plandemic before it was removed from the platform.

In response to concerns about the way Facebook is being used for hate speech, more than 1,100 businesses boycotted advertising on the platform throughout July.

Not long after the campaign launched, Facebook announced it would begin labelling newsworthy content that violates its policies.

But the group behind the #stophateforprofit campaign said the changes did not go far enough, and it plans more advertising boycotts if Facebook does not take “substantive action” to meet its list of demands.

The problem is complex, yet one major issue is that Facebook group admins take on the job with little training or support.

Mr Redmond said that, other than Facebook alerting him when someone flagged a post as not meeting community standards, he was not offered any training.

“If the options were there it was probably so well hidden you probably couldn’t find it,” he said.

Ms Crane said that, beyond a general email about access to information sessions, little guidance was offered.

“Having been a moderator of this page for over five years, I’ve never had any interaction with Facebook directly,” she said.

That is despite Facebook Groups being a central part of the social media giant’s business model and growth strategy.

Facebook’s gold mine

Facebook founder Mark Zuckerberg said the company is aiming to attract 1 billion “meaningful” users of groups within five years.

“There are rumours they spent $10 million on a 60-second Super Bowl ad to promote groups,” according to California-based social media commentator Thomas Smith.

The technology consultant said although groups are known to be a major revenue earner, it is not known exactly how much money they attract.

“It’s hard to piece out the exact revenue numbers, but because of the importance of data for their advertising model I think groups are a really big and important piece,” Mr Smith said.

It makes the work of group admins central to Facebook’s business model.

What’s the solution?

Mr Smith suggested one way Facebook could address some of the problems caused by groups is to connect unpaid admins to the company’s professional moderators.

“Facebook could probably start to identify admins that might be under a lot of stress and suggest resources, and maybe have a system where admins can elevate something where they don’t feel comfortable,” he said.

“One thing that would help a lot is to have more professional content moderators.

“They have 15,000 but research has said they need at least 30,000 to effectively moderate posts.”

Facebook says group admins do have access to support and the company recently compiled a playbook specifically to help them navigate the extra challenges created by the pandemic.

It also recently banned hundreds of groups tied to the worldwide conspiracy theory QAnon.

A spokeswoman for Facebook told the ABC the company is about to roll out more tools for group admins to help them manage their pages.
