Google, Facebook and Twitter won’t tell MPs how many staff moderate content in a major grilling by the Home Affairs Select Committee
Several of the world's biggest tech companies faced a major grilling from MPs over their efforts to tackle online abuse and hate crime, as German authorities pushed further, threatening them with multimillion euro fines over such failures.
Top executives from Google, Facebook and Twitter faced a lashing from MPs on the Home Affairs Select Committee for failing to do enough to stop content related to terrorism, child abuse and hate speech appearing on their platforms.
In a heated exchange, the firms refused to tell an exasperated Yvette Cooper, the committee chair, how many staff they have working on moderation and monitoring.
Read more: Facebook, Google and Twitter could face €50m fines for fake news
"On the basis of the answers you have given it's unacceptable for you not to tell us how many people you have working on these public safety issues," she shot back.
Twitter's head of policy for the UK, Nick Pickles, said it had more than a hundred, while Google's Peter Barron, vice president of communications and public affairs for the region, said it had thousands of people working on assessing millions of pieces of flagged content.
Pushed further by Cooper, Pickles continued: "The important thing to note is that we've been through two restructuring processes where a number of staff have lost their jobs, but we have actually increased those processes. There are more people working on trust and safety within the company than there were."
However, the firms would not be drawn on exact numbers or where staff were based, with Pickles citing safety concerns in identifying the locations of people who may be dealing with removing terrorism-related content. Facebook's policy director for the region, Simon Milner, said the figure was commercially confidential but was also in the thousands.
"It's not necessarily a linear relationship between the number of people you employ and the effectiveness of the work that you do," he said, adding that Facebook would consider giving numbers to committee members, but only confidentially.
In the nearly three-hour session, several committee members put the companies on the spot by searching for potentially offensive content as they sat in Portcullis House in Westminster, with one asking why the firms could not proactively remove content he was able to find so easily. Labour MP Chuka Umunna accused Google of making money from "hate peddlers", while Labour's David Winnick accused the company of engaging in commercial prostitution, after an investigation by the Times found adverts appearing on YouTube videos from supporters of extremist groups.
Read more: What exactly is fake news? These MPs have launched an inquiry to find out
Concluding a long and arduous session, Cooper poured scorn on Google's efforts to ensure standards on YouTube, while for Twitter and Facebook she said there is still considerable concern over the pace of change.
"[There is] a general feeling that for all the things you've said and that you're working on, in the end, it is still not enough," she said.
"Frankly, Mr Barron, your answers on how you are implementing community standards do feel a bit of a joke and do not feel as if you're taking your own community standards seriously enough and playing even by your own rules in terms of what counts as hate crime and what should be removed."
She added that there was "a very strong sense from across parliament, not just from this committee, that we need you to do more."