In our social media, big tech world, anyone can post pretty much anything online, good or bad. Earlier this year, Facebook gave itself a pat on the back for hiring 3,000 people to monitor the content its users post.
This came after a string of bad press over the platform allowing horrible people to broadcast beatings, rapes, and even murders on its site. Yes, it was a step in the right direction, but ultimately it's not enough: this content still gets through, and Facebook makes billions of dollars every year. They could do more, period.
Google’s YouTube is now following suit.
Google to hire 10,000 YouTube moderators
Google has faced issues similar to Facebook's, and advertisers are fed up, pulling their ads from YouTube because national brands don't want to appear next to what Google calls "problematic content." Most recently, this "problematic content" (a.k.a. pedophile-related content) started popping up in their kid-friendly