City watchdog calls on internet giants to do more in fight against financial crime ‘epidemic’
The chair of the Financial Conduct Authority has called on internet giants to play a greater role in tackling financial crime, which he said has reached “epidemic proportions” in the UK.
Charles Randell told the Cambridge Economic Crime Symposium on Wednesday night that “financial crime, specifically fraud against individuals, has reached epidemic proportions”.
Randell said it was not clear how much financial crime there was, “but it’s a very serious epidemic”.
The FCA has been heavily criticised by retail investors for being slow to shut down risky investment schemes advertised online, such as mini-bond firm London Capital & Finance (LCF), which collapsed in January owing £237m to more than 11,000 bondholders.
Randell said the FCA “couldn’t take on the investigation and prosecution of all investment activity which is fraudulently promoted on the internet”.
He called on internet platforms, which are often used to promote scams and risky investment schemes, to do more to help protect consumers.
“I welcome the decision by Facebook to contribute to the Citizens Advice Scams Action service and to create a scams ad reporting tool, as part of the settlement of litigation against them by Martin Lewis. I hope that Facebook will continue to invest in further anti-scam protections,” he said. “And I hope that other internet giants will follow suit.”
Randell said: “Google searches of ‘high return investments’ continue to reveal numerous very doubtful offers high up the search rankings. We know from the London Capital & Finance case that a large proportion – over £20m – of clients’ payments to the firm were spent on Google advertising to attract more customers.”
LCF’s collapse is being investigated by the FCA and the Serious Fraud Office.
Randell said the internet giants should, “as a minimum”, immediately take down suspected fraudulent content when requested to do so by the authorities.
“I would expect them to use their extraordinary resources to work with law enforcement and regulators to develop algorithms and machine learning tools to identify potentially fraudulent content,” he said.