FCA issues warning to financial firms over AI fraud activity
The Financial Conduct Authority (FCA) has highlighted the risks associated with ‘deepfake’ fraud and the benefits that artificial intelligence (AI) can offer.
Nikhil Rathi, CEO of the FCA, has said that AI could disrupt the financial services sector in “ways and at a scale not seen before.” He also warned that the regulator would be forced to take action against AI-based fraud.
As AI use grows, cyber-attacks and identity fraud have increased in both scale and sophistication. Regulators and government organisations must therefore keep working to combat ransomware and phishing attacks on global businesses.
FCA work to better regulate AI and the wider tech sector
According to the FCA, UK Prime Minister Rishi Sunak hopes to make the UK a centre for AI regulation. The FCA’s work on AI forms part of a broader effort to determine how to regulate big technology companies as they increasingly offer financial products.
New research into scam activity recently revealed that cyber criminals keep their attacks under the radar, extorting money without alerting security teams. They do this in part by making moderate payment demands in bitcoin.
In his speech at The Economist, London, Rathi warned of the increased risks posed by AI for financial firms. Senior managers at those firms will be “ultimately accountable for the activities of the firm,” including decisions taken by AI, he said.
“As AI is further adopted, the investment in fraud prevention and operational and cyber resilience will have to accelerate simultaneously,” he continued.
“We will take a robust line on this – full support for beneficial innovation alongside proportionate protections.”
Deepfake technology on the rise
In his speech, Rathi used the example of a recent “deepfake” video of the UK personal finance campaigner Martin Lewis appearing to sell speculative investments. Lewis himself described the video as “terrifying” and called for regulators to force big technology companies to take action to stop similar scams.
The FCA has also highlighted the sector’s reliance on Critical Third Parties: as of 2020, nearly two thirds of UK firms used the same few cloud service providers. The regulator says it must be clear where responsibility lies when things go wrong, as part of its efforts to mitigate potential systemic impacts that could be triggered by a Critical Third Party.
Derek Mackenzie, CEO of global skills provider Investigo, said: “With AI set to have a seismic impact on the financial services industry, tackling the digital skills shortfall should be a top priority. From compliance to coding, businesses operating in this area are crying out for the latest tech talent to help them explore and implement AI in the workplace, yet many remain woefully understaffed.
“The FCA is right to get ahead of the game on this important issue, but without the right talent pipeline in place, far too many firms will find themselves falling behind in terms of AI capabilities and open to risk due to a lack of in-house regulatory expertise.”