When Great Places to Work Outsource Jobs That Are... You Guessed It, Not Great...
July 10, 2018
Part of the game of building a great place to work is that you never let down your guard.
--Never admit that things are less than perfect...
--Never agree with someone that suggests things are less than perfect...
--Keep adding benefits or features of your culture that are cool but few people will actually use...
And today, I'm adding one. Here's how it goes:
--When faced with a job that is so objectionable it will burn people out in 7 months, deem it "non-core", outsource it to another company and transfer the cultural liability.
That's what Facebook has traditionally done with the people they need to review flagged posts. A job reviewing flagged posts exposes the worker to all types of objectionable humanity, and let's face it, after a year in that job, you hate life and hate people. That doesn't transfer well to employee survey scores or other measures of cultural health, so high-end companies make the obvious choice to outsource it.
Problem is, the job is still ruining someone's life and you're still responsible. More on the "reviewing flagged posts" job at Facebook:
"A former Facebook moderator said the pressure to churn through a never-ending pile of disturbing material eventually made her desensitized to child pornography and bestiality.
Sarah Katz, 27, worked as a content reviewer at Facebook's headquarters in Menlo Park, California, through a third-party contractor, Vertisystem, for eight months in 2016. Her job was simple: figure out whether posts reported to Facebook violated the company's detailed community standards.
Practically, this meant eyeballing new and potentially horrific material every 10 seconds and making a snap decision about whether it needed to be ditched. Posts that needed reviewing were called "tickets," and there were about 8,000 every day.
To deal with this onslaught, Facebook had 4,500 moderators like Katz on its books last year, and in May 2017 it announced plans to hire another 3,000 to help it in the fight against the darkest corners of its user output. Facebook is also investing in artificial intelligence to help police posts that break its rules."
Any guesses whether those 3,000 additional hires will be contractors or full-time employees?
They're going to be contractors. To be fair to Facebook, you can't hire that many people in this type of role without help. BUT - you can bet a lot of them - if not all - will stay contractors, because Facebook will consider this a non-core part of its people business.
The dirty side of maintaining a great place to work is how you define a Great Place to Work. By contracting out the toughest, lowest-level jobs, you're playing with definitions - to your benefit.
I'm not saying I wouldn't do the same thing. But when it comes to the culture you have, if you outsource the dirty/shitty jobs, people are getting an incomplete view of happiness and engagement at your company.
The real win for Facebook is when AI can do it all and humans don't have to touch this stuff. That will be awesome - until the machines take over, of course.