Facebook headlines have been bloody these past few months.
Murder, suicide — practically anything at all — can flood the News Feed in an instant, yet the company's solutions, excuses and ignorance are even more jarring.
In early May, Mark Zuckerberg announced that, in response to the barrage of horrific live-streamed videos, the company would hire more people to screen for content like this.
On one hand, this is a big surprise.
The company already replaced its own human editors with algorithms to curate news, of all things, so what caused the sudden change?
Maybe it’s the backlash, maybe it’s their damaged image, or maybe it’s their wallets being hurt from all of the above.
On the other hand, it could create new problems. I mean, it's not as if Facebook ever exposed the real names of the people who screened beheading videos to the very terrorists they were policing. Right? There is already clear danger involved for those tasked with monitoring Facebook Live.
But there's a less obvious danger, too. Adding 3,000 people to an already 4,500-member team means one thing: a bigger therapy bill.
Whatever the pay is, subjecting individuals in an office setting to videos that range from child pornography to murder is incredibly taxing. And considering how often the job is outsourced to countries like the Philippines, the conditions are often just terrible.
Take a former YouTube content moderator, identified only as Rob in a Wired story, who described some of his experiences while working in San Bruno, California:
The Arab Spring was in full swing, and activists were using YouTube to show the world the government crackdowns that resulted. Moderators were instructed to leave such “newsworthy” videos up with a warning, even if they violated the content guidelines. But the close-ups of protesters’ corpses and street battles were tough for Rob and his coworkers to handle.
On top of that, it is clear the problems will continue as they have in the past: little actually gets done, even when people do flag posts or alert moderators.
Zuckerberg also vaguely mentioned more tools to help spot these videos and get law enforcement involved more quickly. Facebook hasn't released any details about what exactly those tools are, but at this point it doesn't matter.
With more than two billion people to supervise, an empire like Facebook has decided to spew sweet nothings, even after years of suffering put on display for everyone to see.
From live-streaming death to fake news, when will Silicon Valley stop saving a pretty penny and take actual responsibility for what it's doing to us?
Send this article to a friend who thinks Silicon Valley is bullshit.