
  • People’s concerns about AI are starting to outstrip their excitement
  • Tech developers must embed privacy from the very start to maintain trust

UK Information Commissioner, John Edwards, will today warn that 2024 could be the year people lose trust in AI – and call on tech developers to embed privacy into their products from the very start.

Referring to research which shows people are growing ever more nervous of AI, Mr Edwards will set out the steps the ICO has taken to support businesses using the smart technology and make it clear that there are no excuses for “bad actors” who do not comply with data protection laws.

Delivering the keynote address at techUK’s Digital Ethics Summit 2023, Mr Edwards will warn:

“If people don’t trust AI, then they’re less likely to use it, resulting in reduced benefits and less growth or innovation in society as a whole. This needs addressing. 2024 cannot be the year that consumers lose trust in AI”.

Mr Edwards will acknowledge the important role AI plays for business, from new innovations that improve customer service to quicker resolutions for common technical issues – but he will stress that these benefits cannot come at the expense of people’s privacy, and that where the ICO finds wrongdoing it will take action.

He will say:

“By virtue of your attending this summit, engaging with techUK and listening to me talk, I believe I’m safe to assume that all of you in this room today understand and appreciate both the benefits and dangers of AI. I believe I can also assume that you understand and appreciate that our existing regulatory framework allows for firm and robust regulatory intervention as well as innovation.”

He will move on to say:

“We know there are bad actors out there who aren’t respecting people’s information and who are using technology like AI to gain an unfair advantage over their competitors. Our message to those organisations is clear – non-compliance with data protection will not be profitable. Persistent misuse of customers’ information, or misuse of AI in these situations, in order to gain a commercial advantage over others will always be viewed negatively by my office. Where appropriate, we will seek to impose fines commensurate with the ill-gotten gains achieved through non-compliance.”

Mr Edwards will also set out his expectations of the industry and highlight the help already available from his office, including its AI guidance, award-winning Innovation Advice Service and Sandbox. Drawing to a close, he will say:

“Privacy and AI go hand in hand – there is no either/or here. You cannot expect to utilise AI in your products or services without considering privacy, data protection and how you will safeguard people’s rights. There are no excuses for not ensuring that people’s personal information is protected if you are using AI systems, products or services.

“We can help you here, as I’ve laid out this morning”.

Notes to editors
  1. The ICO is the UK’s independent regulator for data protection and information rights law, upholding information rights in the public interest, promoting openness by public bodies and data privacy for individuals.
  2. The ICO has specific responsibilities set out in the Data Protection Act 2018 (DPA2018), the United Kingdom General Data Protection Regulation (UK GDPR), the Freedom of Information Act 2000 (FOIA), Environmental Information Regulations 2004 (EIR), Privacy and Electronic Communications Regulations 2003 (PECR) and a further five acts and regulations.
  3. The ICO can take action to address and change the behaviour of organisations and individuals that collect, use, and keep personal information. This includes criminal prosecution, non-criminal enforcement and audit.
  4. To report a concern to the ICO, call our helpline on 0303 123 1113 or go to ico.org.uk/concerns.