Ex-Facebook moderator: More must be done to block harrowing uploads

‘I saw horrors posted on Facebook daily… and it nearly destroyed me’: Man hired to moderate content – including beheadings, graphic sex and even child murder – says more MUST be done to block harrowing uploads

To be honest, I don’t remember the first time I watched an execution. When you’ve seen so many that they start to blur into one, it’s impossible to remember the first.

Then there’s the animal cruelty, torture and sickening pornography I’ve witnessed.

You must think I’m some sort of twisted individual with disturbing online tastes. Or else that I’m a police investigator with years of training in how to deal with online content that’s the stuff of nightmares.

Former Facebook moderator Chris Gray and his wife Mildka Gray, pictured in Dublin.

The truth is neither. Instead, these memories are the deeply disturbing legacy of the nine months I spent working for Facebook as a moderator, or as I was called, a ‘community operations analyst’. It is a role that has left me with post-traumatic stress disorder and a deep cynicism that social media can ever really be controlled.

This week, Facebook announced it would be tightening its rules on political advertising in the run-up to the U.S. presidential election in November, including allowing users to turn off political advertisements altogether. It’s the latest development in a strategy to clean up the image of Facebook, which now has former Lib Dem leader Sir Nick Clegg as its vice president of global affairs and communications.

But with this focus on policing political content, it’s easy to forget the dark side of Facebook — the immeasurable amounts of deeply disturbing content being uploaded every second.

As a moderator, I was employed to filter through the content UK and Irish users see every day. I scrolled through it on my screen for eight hours a day.

My job was to attempt to help keep control of this vast universe of uploaded material and make sure it didn’t breach the company’s community standards.

I was based in Facebook’s head office in Dublin and, later, in another building.

While I was employed by a contractor, CPL Resources, when I did the job in 2017-2018, the link between what I was doing and the tech giant was clear. I am now suing both Facebook and CPL Resources, to which Facebook outsourced moderation services, for the psychological injuries I believe I suffered as a result of my work.

I’m not alone. To date, more than 20 former moderators are in the process of suing Facebook in Dublin.

Meanwhile, Facebook has recently agreed to compensate moderators in the U.S. following a class action suit.

Becoming one of the tens of thousands of analysts around the world tasked with deciding whether a Facebook post is hateful, harmful, satirical, innocent or just plain stupid was not the most obvious role for a man who spent the 1980s running his own construction business in Bath, then left the UK to teach English in China.

They were interesting times, but when my wife Mildka and I got married, I found myself wanting to be back in a country where English was the native language, so we settled in Ireland.

I was 50 and needed a new job. Facebook, via CPL, was looking to employ 800 people within three months. It wasn’t well paid, a basic rate of less than £12 an hour, but it was a foot in the door.

It was not long after a teenage girl from London, Molly Russell, had killed herself after viewing graphic self-harm and suicide material on Facebook-owned Instagram, and Facebook boss Mark Zuckerberg was coming under pressure to take stronger action to moderate content.

At the single interview I had before I was employed, I was asked whether I understood I would sometimes see content that I might find a bit disturbing.

As I discovered, ‘a bit disturbing’ didn’t even come close to describing what I’d have to look at. Nothing can prepare you for the depravity that is out there on social media.

My first day at the company’s big offices was exciting. There were 50 of us in the reception area, from all over the world. It was hectic and noisy, but I think we all shared the same feeling that we were on a mission that was important. That soon evaporated.

There was a week and a half of training, pretty basic stuff with an instructor armed with a script and PowerPoint. By the second week, we were assumed to be ready to make judgment calls that could affect people’s lives forever.

I worked evening shifts, coming in at 6pm and working until 2am. There were about 20 others, filling one corner of a large, airy, open-plan floor. Anyone visiting would have thought it was a model place to work — as long as they didn’t glance at a computer screen.

As more moderators were employed, we were allocated different roles. At first, I mainly looked at pornography, which on my second day included a scene of bestiality. But, after the first month, I mostly worked on the high priority queue — hate speech, violence, the really disturbing content.

My days always followed a similar pattern. I started by checking for updates to the Facebook rules, which were many and ever-changing, and it was my responsibility to disseminate that information to the other moderators on my team.

Then on with the day, looking at my ‘gameplan’, as it was called, which told me what I had to focus on first, and how many ‘tickets’, the tech industry term for tasks, I had to deal with.

In my case, this was assessing hundreds of bits of different content that had been flagged by Facebook users as questionable.

Obviously not all the content is disturbing — some people use the report function in anger and, as moderators, we found ourselves scanning family spats, marital disputes and teenage name-calling.

There were even fluffy kittens and puppies to look at, because someone might have reported a Facebook user for selling pets.

I’d say ten to 20 per cent of what I had to look at was disturbing, but when you’re looking at hundreds of clips or images every day that quickly adds up.

Working to an average handling time, or AHT (because everything has an acronym), of 30 seconds per ticket, it felt relentless. Imagine having seen so much depravity that you can’t remember the detail.

Much of it was not suitable for retelling in a family newspaper.

But I do remember the day I had to look at a montage of images relating to ethnic cleansing in Myanmar. There were burned villages, refugees carrying bags — then an image of a baby lying on the ground with its eyes closed and a human foot pushing down on its chest.

I thought this was an easy decision, a photo of a baby which had met a violent death, so I deleted it.

Because of the foot, I thought somebody had crushed that baby and it had stopped breathing.

But, a short time later, I was audited, a process that happens regularly to ensure moderators are making decisions that adhere to Facebook’s rules.

It has even got a name, your ‘quality score’, which at that time had to stay at 98 per cent.

The auditor questioned my decision; there was no confirmation of death, he said, without any blood or visible broken bones. How did I know that baby had died?

At the time, my overwhelming fear was, ‘I can’t afford to let my quality score go down, I need this job’.

I pointed out that the baby was not resisting — if it was alive surely there would be some flicker of movement, some resistance to the force bearing down, but its arms were flat on the ground, palms down. To my relief, my decision was allowed to stand.

And I’m ashamed to admit my thoughts were not about the violent death of an innocent child, but about my job security.

That’s what the job did to me. I became numb not just to the atrocities, but to the insidious drip-drip of the awfulness of human behaviour.

It was only two years later, when I was speaking on stage at an event, that I suddenly recalled that incident and burst into tears.

It was, I realised, the first time I’d actually cared about that baby. It was an overwhelming thing.

Having witnessed films of terror executions, I can tell you what happens when a person is shot point-blank in the head — and it’s not what happens in a video game.

Sometimes there are tragic stories where people talking on their phones have stepped out in front of a bus and been killed.

Did you know there are people who collect the CCTV footage, edit it all together, add celebratory music over the top and post it online? That ended up on my screen, too.

Through sheer weight of experience, I also came to know gruesome details about the damage people do to themselves.

If someone is hurting themselves, you have to look at how they have done it to discover whether it can be deemed self-harm or attempted suicide.

It’s not that one is deemed OK and one not, but rather that each is categorised differently.

If you believe there is imminent danger, for instance while watching a live video feed, there is a chain of escalation to follow.

This was the big fear for all of us. When you believe someone you are watching is going to harm themselves or someone else, as has happened on Facebook, there is a process to follow.

There are more than 100 options to choose from when actioning content. The priority is speed but, as a moderator, you have no idea what happens after you press the buttons to escalate. There is a horrible feeling of powerlessness.

What compounds the stress of exposure to disturbing images and videos is the pressure to make the right decision about what to do with them.

You’re in constant jeopardy of making a wrong decision and being penalised for it — such as categorising something as violence when actually there is a flash of a nipple, because nudity is a different category.

And the rate was relentless. Press a button, make a decision, press a button, make a decision, for hours on end. Because I’d signed a non-disclosure agreement, I wasn’t allowed to discuss anything I saw with my wife.

But I was on a hair trigger the whole time, ready to argue with anybody about anything. Frankly, I wasn’t a nice person.

Yes, the company did make attempts to look after us, albeit in a very American way. The same week I joined, a wellness coach started, a role that later developed into a ‘wellness team’.

We would get emails saying ‘we are having yoga at 11 o’clock’ or ‘we are doing finger painting in the canteen’ and I did have meetings with a wellness coach. But try caring for yourself when you have a 98 per cent accuracy target to meet and your boss is agitating because there’s a backlog in the child sexual abuse images queue.

In the end, I was there just under a year. The firm let me go, it said, because my quality score wasn’t high enough.

When I came home and told my wife she was so relieved, she said: ‘Oh my God, you were such a miserable, awful person the past six months. I’m so glad you are out of that job.’

It took me a year to realise just how much the job had damaged me. I was a mess: someone who had become aggressive, had trouble sleeping and whose relationship had suffered.

I became someone even I didn’t like. But it was only when I decided to go public and speak out that the emotional floodgates opened and I realised how much I’d been affected.

I’ve had moderators from all over Europe get in touch and I encourage them to speak to the not-for-profit organisation Foxglove, which campaigns to stop abuse of digital technology, and has supported me.

Talking about it brings flashbacks, but it’s vital I do talk about it because I know I’m not alone.

So what’s the answer? If the content on Facebook is to be policed, which I think most people agree it should be, how can that be done?

I do think there should be moderators, but they shouldn’t be poorly paid, super-stressed workers with no specialist training.

They should be well-trained, well-paid professionals who understand what is going on and are fully supported.

With a £14 billion net profit last year, Facebook can afford it. The role of a moderator might not be as visible as that of a police officer on the ground, but it is just as important.

Facebook says it introduced a new set of standards last year.

A spokesperson said: ‘We are committed to providing support for those that review content for Facebook, as we recognise that reviewing certain types of content can sometimes be difficult.

‘Everyone who reviews content for Facebook goes through an in-depth, multi-week training programme on our community standards and has access to extensive psychological support to ensure their wellbeing.

‘This includes 24/7 on-site support with trained practitioners, an on-call service, and access to private healthcare from the first day of employment.

‘We are also employing technical solutions to limit their exposure to graphic material as much as possible. Providing this support is really important, and we are committed to getting it right.’

These days, I work as a guide, taking tourists around Ireland by coach, though the pandemic has halted that for the moment. Even so, I’d still rather be where I am now than where I was then.

Meanwhile, I don’t have a Facebook account and I’m not on Instagram or Twitter. Not surprisingly, I don’t have much appetite for social media these days.
