Q&A: Facebook's content moderators on cleaning up the murkiest parts of the web

Like many things at Facebook, the way the notoriously secretive social media giant moderates its content is shrouded in opacity. In recent months it has come under fire from the public and from the advertisers that fund it, both for being overzealous in its removal processes and for not purging distressing content from its walls quickly enough.

Last week, the company’s Community Standards Enforcement Report detailed how moderators from around the world – employed by subcontractors on behalf of Facebook – had deleted 3.4bn fake accounts and 7.3m hate speech posts between October 2018 and March 2019.

But who are the people making these decisions?

On Thursday (30 May) The Drum – along with journalists from the BBC and the Telegraph – was given access to Facebook’s content moderation centre in the Jean Nouvel-designed Torre Glòries skyscraper in Barcelona.

In the gleaming tower not far from the bustle of Las Ramblas, some 800 staff employed by a company called Competence Call Centre monitor the murkiest pockets of the internet – policing everything from child exploitation to bestiality and hate speech.

It’s done via a system that surfaces content flagged either by users or by Facebook’s increasingly sophisticated AI – which is being trained every day to recognise, and sometimes purge, nuanced forms of hate speech, racism and more.

Staff moderate different markets based on their native language – which Facebook says positions them to understand the cultural and political nuances of potentially harmful content in each region.

Recent investigations into Facebook’s content moderation hubs in the US painted a picture of overworked staff, shackled by NDAs, who were in some instances psychologically damaged by the toll of the content they were reviewing.

“Collectively, the employees described a workplace that is perpetually teetering on the brink of chaos,” wrote The Verge earlier this year in its dissection of Facebook’s content moderation site in Phoenix, which is managed by Cognizant.

The group of six moderators (some of whom did not answer questions) that Facebook put in front of The Drum, however, painted a very different picture of life behind a moderation screen.

Flanked by a senior comms representative and a senior employee from the centre itself, the diverse group of European millennials spoke on the condition of anonymity about the role they play in regulating what we see on our news feeds.

Come back next week for The Drum's full report on how Facebook is moderating content from the European hub.

Answers have been condensed and lightly edited for clarity and brevity. 

800 people work here – when you look around do you see there as being a particular type of person who takes on this job?

Moderator 1: No, we all have different backgrounds. I’m Swedish and I work on a floor with Swedish, Danish, Belgian and Norwegian moderators. It’s a mix of many different people.

Moderator 2: I’m on the Swedish team and when I look around I don’t think ‘all of these people are just like me’. What everyone has in common is that they have the same values around what should be on the platform and they agree on policy.

What happens if you disagree on whether a certain piece of content should be removed or left on Facebook?

Moderator 1: That’s called a dispute. You need to dispute it and try to get your point across. It can be a fun part of the job, but not always.

Moderator 2: Let’s say I make a decision [to take something down] and the [more senior] moderator who is cross-checking my decision tells me it’s wrong. I’d have to look up the relevant policy points and argue why that particular post was a violation.

Moderator 3: Our decisions are individual. At the end of the day you’re responsible for your own decisions.

Moderator 2: Of course, there are times when I disagree with a piece of content but understand it’s not violating any policies. That brings a bit of inner conflict around what you choose to do – what’s the right thing?

Does that happen quite a lot?

Moderator 2: Sometimes I know people are saying racist things, but it’s [vague] so I can’t delete it. I’ll want to remove it but you just have to take a breather and let it go – let that little racist go be on their own and embarrass themselves.

Can you talk me through your typical day? How many posts do you moderate each shift on average and what kind of content are you coming up against most?

Moderator 1: We do between 100 and 300 posts per day, and when you come in you take a look at the disputes from the previous day, then you get started. My market (Danish and Dutch) is mostly bullying and political stuff. Not really graphic stuff – well, maybe sometimes. But most of the political stuff is very funny.

You all work across different markets. Are some worse than others for graphic or distressing content?

Moderator 2: The markets that have the worst type of content are those where you hear there are a lot of nasty things happening, so South America and the Middle East.

Moderator 1: Italy has a lot of nudity. Sometimes I wish I worked on that floor – I’m jealous.

The Christchurch attack in New Zealand put a spotlight on the moderation protocol. How does it change in an instance like that?

Moderator 1: We immediately got an update. When something like that happens we’re all alerted. The rest of the world sees it, but later than us. We’re there as it’s happening, so we can act on it.

Doesn’t it feel a bit like firefighting? That particular terrorist video was uploaded and repurposed so many times.

Moderator 3: In some ways yes, you just have to deal with it.

Do the mental health facilities Facebook provides to you (staff get access to therapists as well as break-out rooms and 'wellness' sessions) help?

Moderator 2: We’re encouraged to take 45 minutes each week to disconnect from the platform, away from our desks, and that works well. I feel like CCC are good at compromising and making the difficult part of the job a small part of it. It’s an open-space office and if you need any support you have it.

New tools have been released that let you blur or mute content that is graphic or distressing – have those helped your own mental health?

Moderator 4: Yeah, because we sometimes have to see things like self-injury – so of course it helps.

How often do you need to step away from your desk to go to the ‘wellness’ area because of something you’ve just seen?

Moderator 3: I’d say that’s happened once since I started here about four or five months ago.

Moderator 2: We’re lucky to be working in our markets; it means it would take a reshare of a reshare of a reshare before something that caused me to have to do that would come up on my screen.

Moderator 1: Also before you press the play button you see a thumbnail, which alerts you if the content is graphic. So the software helps.

You’re subject to confidentiality rules, meaning you can’t talk to friends and family about what you see. Is it difficult having to hold that back?

Moderator 2: You can always talk to each other. If there’s something really heavy you might not be able to discuss it with anyone outside, but there are always people in your office going through the same thing.

Moderator 3: It’s also not the kind of job you bring home with you. When you’re done with the day you’re done.

The press has painted a bleak picture of working conditions (Facebook has come in for some criticism about the way subcontractors treat moderation staff). That’s not the impression I’m getting from you here. How would you respond to that criticism?

Moderator 2: It’s funny and a misinterpretation. People think about reported content and they think of murder videos or images, but we can be rifling through so many stupid things – somebody saying happy birthday to someone’s boyfriend and then the girlfriend complaining, for instance.

The misinterpretation is that all we’re watching every day is ‘war, war, war’ and that we must be heartless psychopaths sitting in dark rooms watching things, but it’s not like that.

Moderator 3: I also find a lot of meaning in my work. We know we can handle the content and that’s why we’re here. We can actually respond to real-life situations and make a difference.

What do you think Facebook would look like then if you weren’t doing this job?

Moderator 1: You would have a lot of violations.

Moderator 5: Like World War Three.

Moderator 2: It would be disgusting, the filtering is needed.
