Information Commissioner John Edwards delivers the UK Finance keynote speech on 31 October 2023.
Check against delivery
Kia ora. I hope you’ve had an interesting and thought-provoking conference. Thank you for sticking around until the end of the day to hear me speak – it’s a difficult gig closing a conference, so your eyes and ears are much appreciated. And thanks to David Postings, CEO of UK Finance, for inviting me here today.
This afternoon I want to talk a little bit about trust and its importance, especially when developing and adopting innovative tech. How your sector can build trust with your customers and clients. How we, as the regulator, are working to build trust with the financial services sector. And how, by working together, we can build public trust in new innovations and emerging technologies. I’ll also slip in a little bit of data protection in there as well, and some reassurance for the future.
--
Let’s start there, to address the law-shaped elephant in the room – data protection legislation is changing. And I know that you may be feeling some nervousness about how this affects your work in financial services. I’m here to provide some reassurance – the Data Protection and Digital Information Bill, or DPDI, is an evolution of the law, not a revolution. It allows the ICO to keep our independence and protects people’s rights and freedoms. It also encourages growth and innovation for organisations through greater regulatory certainty. It’s a logical next step, allowing us to regulate more efficiently and effectively in the modern world.
The bill introduces more flexibility for businesses, making the data protection framework easier to navigate and use. Responsible use of personal information that people can trust has significant potential to contribute to the UK’s economic prosperity and global competitiveness. The bill emphasises accountability and ensures that it remains at the heart of working with people’s information.
Having effective, flexible and modern data protection legislation as a foundation will empower organisations to use, share and innovate with personal information responsibly and within the proper guardrails. This means that organisations in the financial services sector will be supported to bring privacy-respectful products and services to market quickly and safely.
--
This idea works nicely with the theme of today’s conference and what you’ve been discussing all day. The theme of driving forward the future of finance. At the ICO, we’re also committed to anticipating and embracing the changes that the future brings. Whether that’s the economy, technology, or protecting people’s rights.
Data protection enables and supports innovation. It doesn’t block it.
Our Innovation Advice service gives organisations the opportunity to talk directly to us as the regulator. You can come to us with a data protection-related query to do with your new, innovative product, service or business model. We provide advice and our opinion, within a short turnaround time, to allow you to continue your innovation journey. This service is available to any organisation, of any size, from small fintech start-ups to larger, more established firms. We’re here to help.
And of course, if you want to take it further, and require longer-term ICO advice, you can apply to take part in our Regulatory Sandbox scheme. We are currently taking applications for 2024. This provides a “safe space” for you to innovate and ensure that you’re baking in data protection by design and default from the very beginning. We’ve recently worked with the Home Office, UK Finance and participating banks. This sandbox project is exploring the legal framework and practicalities of sharing information about customers who pose the highest financial crime risk – so banks can uphold laws concerning financial crime through compliant data sharing.
However, given that we are the regulator, I do need to caveat my comments with the fact that we will not tolerate or condone reckless innovation that puts people at risk of harm. Innovation, when done properly and with the correct checks and balances in place, is a force for good. We expect, and require, all firms including financial services firms to consider and comply with their data protection obligations. You have to ensure that data protection compliance is considered and included from the outset. Data protection is not a tick-box exercise – it is something that needs to be included from the very start.
This applies across the board, not just in your innovative projects. You have to get the fundamentals right, otherwise you will lose your customers’ trust. If a financial services firm cannot process data responsibly, or keep their systems secure, who is going to trust them with their business or money? If you aren’t getting the fundamentals right, your customers may complain to us about your actions. For example, the bulk of the complaints we received last year about the finance, insurance and credit sector related to subject access requests. Your sector made up 11% of our overall complaints last year. There can be no excuse for not getting the basics right, and we are willing to take action to protect people’s information rights under the law.
--
So, as I mentioned at the start, I want to talk a bit about trust. I’ve already touched on why I believe it’s important for your customers to trust you. I think it’s equally important that you feel you can trust us, the regulator, to empower you to use people’s information responsibly and compliantly. Given the importance of data protection in your work – it’s almost ubiquitous – we expect financial services firms to set high standards when upholding people’s information rights. And we’re here – and very active – in helping you do that.
For example, in the past few months we’ve backed data sharing schemes that protect gamblers from harm, giving a clear steer that data protection law does not stop gambling companies from conducting financial risk checks. We’ve engaged with your sector – stressing that banks must uphold data protection rules when collecting information about politically exposed persons. And we’ve released a joint letter with the FCA making it clear to banks that direct marketing regulations are not a blocker and do not prevent them telling their customers about better deals.
We’re also calling on the financial sector to work with us on other issues where the use of personal information can empower and protect people. This speaks directly to our goals in our three-year strategic plan, ICO25, where we are clear we want to work with firms to support compliance and address risks that are specific to their sector. Another way we’re looking to work with your sector is through a series of voluntary audits that we’ve been undertaking with financial services firms. This exercise allows us to hear directly from you about the data protection questions you may have and how we can help.
So, I encourage you to give us your views on risks and opportunities where working with us will benefit the public – be that data sharing initiatives that prevent scams and fraud, data sharing schemes that extend data sharing beyond “open banking”, or new uses of digital assets and the proposed introduction of a central bank digital currency. We are active in your sector and welcome your engagement.
Of course, we can’t address all risks and opportunities alone. Developments in your sector cut across multiple regulatory boundaries and work areas. That’s why we work so closely with our regulatory partners – in particular the Financial Conduct Authority, the Competition and Markets Authority and Ofcom through the DRCF, or Digital Regulation Cooperation Forum.
Through the DRCF, we are piloting a multi-agency advice service, helping innovators to develop their ideas with regulatory compliance in mind. This “AI and Digital Hub” will provide tailored advice to help businesses navigate the process and remain compliant. We are aiming to launch the pilot service in 2024 so keep an eye out for that if you’re interested.
--
To finish, I’d like to look ahead to the future, and the potential innovations that are coming down the track. If any of you in the room attended our annual conference at the start of the month, you will have heard me talk about a little thing called ChatGPT. I’m sure that has been mentioned once or twice today as well. A year ago, no one had heard of it. Now, it’s the fastest-growing app ever. Artificial intelligence has become an almost integral part of our daily lives – it's likely that we use it at work, when we’re travelling, even when we’re choosing what to watch on TV.
As AI has become more ubiquitous, we’ve been clear that organisations using AI have to consider the risk, as well as the benefits. We appreciate many of your firms may be considering how AI can improve your front and back-office operations – be that cutting financial crime, improving customer service, assessing insurance premiums or the likelihood of arrears or default on a loan. Where personal information is processed there can be privacy risks, as with any adoption of technology. Going back to the theme of trust – can you trust the AI system that you’re employing to safely handle your customers’ data? And can your customers trust the technology?
In particular, we’ve been looking into generative AI, like that used by ChatGPT. While the benefits of using this technology may be huge, that cannot come at the expense of people’s privacy rights. My colleague Stephen Almond spoke about this earlier this year, calling for businesses to ensure that they can prove how they’ve addressed the risks to people before rolling out the technology. And where we see non-compliance, or where we think that the risks haven’t been properly identified or assessed, we will take action.
For example, we recently announced a preliminary enforcement notice against Snap, Inc over the potential failure to properly assess the privacy risks posed by Snap’s generative AI chatbot ‘My AI’. Our investigation is ongoing so I can’t say much at this point, but our provisional findings suggest a worrying failure by Snap to adequately assess the privacy risks to children and other users before launching their chatbot.
We’ve been clear that organisations must consider the risks associated with AI, alongside the benefits. And we want to help organisations get this right. That’s why we’ve produced guidance on AI and data protection, which provides a roadmap to data protection compliance for developers and users of AI. We’ve also published an AI risk toolkit to help organisations recognise and understand potential risks and mitigate them during their development.
--
So, to bring my segment to a close – I'd like to reiterate my key message. If you want any advice, guidance, support, reassurance – we are here to help. If you need to run a question past us or want to take part in our sandbox or Innovation Advice service for a deeper dive into your potential product or service, then please let us know.
However, we will not look kindly on organisations who flout the law and who do not respect people’s information rights. You need to get the basics right before you begin looking at innovation. It is not an either/or situation – protecting people’s personal information is non-negotiable.
Thank you.