Implementing an AI programme? Find out how AI affects your regulatory requirements with our new AI toolkit

Artificial intelligence is moving into the mainstream and raises a number of practical, legal and ethical challenges. If you are looking to exploit this technology, you need to address these issues, many of which are new and demand new responses. Financial services firms in particular must ensure that their approach to artificial intelligence reflects the additional regulatory requirements placed upon them, and the rapid developments in this space continue to raise interesting questions.

AI in the financial context

Artificial intelligence is used in a variety of financial contexts from the provision of robo-advice to trading decisions. Financial services firms must comply with both their broader regulatory obligations, and the specific controls in areas such as algorithmic trading. The use of artificial intelligence should also be factored into the firm’s overall risk management framework.

Regulatory obligations for financial advice apply

Financial advice powered by artificial intelligence (or any form of automation) is subject to the same regulatory obligations as more traditional financial advice delivered by humans, and the obligations will fall on the firm offering the system rather than (for instance) a third-party provider who creates the relevant artificial intelligence.

For example, in the case of “robo advice”, the Financial Conduct Authority has stated previously that in its view there is nothing particularly special about robo advice in comparison with other forms of financial advice. It is up to regulated firms to ensure that any advice offered by them using artificial intelligence is “suitable” for the client.

Oversight and validation

A well-designed model could potentially reduce the risk of mis-selling by removing human error or certain elements of discretion on the part of human advisers.

Equally, however, firms will need to ensure that they maintain appropriate oversight of the activities of the robo advice service, and are able to validate the suitability of its advice in the same manner as they would for human advisers.

Current compliance failings

Demonstrating the potential pitfalls of automated advice, the Financial Conduct Authority conducted a review (published in May 2018) of firms offering online discretionary management and retail investment advice through automated channels.

This review found various deficiencies in the provision of such services, including that several firms had failed to give adequate disclosures to clients, or to seek and maintain the client information required to ensure suitability, given the nature of their offerings.

This example makes clear that firms need to think carefully about how they will comply with their regulatory obligations when embarking on AI programmes.

Launch of the AI toolkit

To find out more on this, and other issues relating to AI in fintech, explore our new toolkit for artificial intelligence projects, which includes a specific chapter on AI in financial services.