Facebook will hire 3,000 new employees to scan the site for violence and hate
Looking to halt a trend toward the streaming of violent acts in real time, Facebook announced today that it will hire 3,000 new employees to monitor the site for acts of violence and hate. The 3,000 new hires will join the company over the next year, adding to the 4,500 employees already tasked with scanning Facebook for violence. Both live video and video posted after the fact will be examined.
Facebook CEO and co-founder Mark Zuckerberg said that besides violence, the team of 7,500 will remove posts that are not allowed on Facebook, including "hate speech and child exploitation." The executive added that Facebook will continue to work with law enforcement to help stop anyone who hints at suicide. In a post he published today, Zuckerberg relayed the story of a Facebook member who talked about committing suicide while streaming video on Facebook Live. Facebook called in the police, and they were able to prevent the man from ending his own life. Ominously, Zuckerberg added that "in other cases, we weren't so fortunate."
The site has hosted a number of high-profile violent videos in real time, such as the recent murder of a 74-year-old grandfather who was gunned down in cold blood while his assailant streamed it live. A little while after that, a man in Thailand posted videos showing him killing his 11-month-old daughter. Both videos remained on the site for a long time before being taken down, and that is exactly what Facebook is trying to avoid by adding the extra monitors.
For those Facebook investors worried about the added expense of the new hires, Roger McNamee, managing director at tech fund Elevation Partners, says that the move won't affect the social media company's earnings. Others are calling the move "the bare minimum" that Facebook could get away with, which means that if this plan doesn't work out, the next step will likely have to be more drastic.
"This is not going to hurt their earnings much. The company is a monopolist in three or four different categories. And it's going to grow really dramatically in its revenues in at least a couple of those businesses. The challenge I think they face, fundamentally, is that at their level of market share, they've become an issue for personal safety as well as personal privacy. I would be surprised if at some point the European regulators, in particular, didn't start to question the impact of internet monopolies on the rights of citizens. I think Facebook and Google have done has done the bare minimum to deal with this issue. "-Roger McNamee, managing director, Elevation Partners
According to Eric Hippeau, managing partner at Lerer Hippeau Ventures, the move by Facebook shows the limitations of using algorithms to find and remove posts related to hate and violence. On CNBC this morning, Hippeau said, "What it shows is the algos cannot do the work of a human editor."
Things that are NOT allowed:

source: Mark Zuckerberg via CNBC