Let’s face it... if you’re building anything in fintech right now, chances are you’re using (or seriously thinking about using) AI. From onboarding to fraud checks to smarter savings nudges, it’s a powerful tool. But using AI in the real world means thinking carefully about how you handle personal data, especially when you’re in a regulated space and especially when that data is what makes your product tick.
That’s why two of the UK’s biggest regulators, the Financial Conduct Authority (FCA) and the Information Commissioner’s Office (ICO), have teamed up. Their goal? To help startups and fintechs strike the right balance between innovation and data protection so you can build responsibly without tripping up on privacy pitfalls.
Here’s what you need to know 👇
Let’s kill the myth: regulation doesn’t exist to slow you down.
Done right, it’s actually the opposite. It gives founders and product teams the clarity and confidence they need to build bold, AI-powered products that users trust not just for their functionality but for how they treat personal data.
Take this example: thanks to a small tweak in data protection regulation, banks were suddenly able to alert customers to better savings rates. The result? Over 100 million “you could be earning more” messages went out. A big win, a simple change, and proof that data privacy rules can enable innovation, not block it.
This kind of joined-up thinking is exactly what the FCA and ICO want to encourage through their AI guidance, helping firms build smarter, safer tools that keep compliance and customer trust front and centre.
Yep. The FCA and ICO have been building out support for fintechs and tech startups: think the FCA’s AI Sandbox, the ICO’s Regulatory Sandbox, and now their joint work through the Digital Regulation Cooperation Forum (DRCF).
The idea? Help startups navigate complex areas like automated decision-making, lawful data use, and AI explainability without needing a PhD in regulation. They’re also trialling a new initiative, the AI and Digital Hub, to give firms one place to get consistent, joined-up guidance across multiple regulators (including Ofcom and the CMA).
It's a move that could make building with AI and personal data feel a lot less murky.
The FCA and Bank of England recently ran a survey across financial services firms and found that 85% are already using or planning to use AI. No surprises there.
But here’s the thing: around one-third said data protection was a major constraint. That’s a pretty clear signal that the rules feel hard to navigate, especially when you're working with sensitive or high-risk datasets.
So, the regulators did what good collaborators do: they listened. From founder roundtables to privacy workshops, they’ve started building a better picture of where the pain points are and how to support innovation without compromising the rights of data subjects.
That’s why the ICO is working on a statutory code of practice for AI and automated decision-making: to give businesses clearer examples of how to use AI in line with the UK GDPR without having to decode the fine print solo.
Meanwhile, the FCA’s AI Lab is helping fintechs stress-test their models for privacy risks, fairness, and transparency rather than just accuracy. Because when your AI is processing personal data, especially at scale, getting it wrong can cost more than just reputational damage.
So, if you're wondering, “What do the rules say about using AI in UK fintech?” the answer is: they're starting to say more, and it's all about building in privacy by design.
If you’re building a product, whether it’s AI-driven credit scoring, personalised finance tools, or a slick digital identity flow, then privacy compliance isn’t just a tick-box exercise. It’s a foundational piece of building trust with your users and staying on the right side of the regulators.
The good news? There’s support out there. The FCA and ICO’s work is a big step towards a more startup-friendly approach to AI + data regulation, helping you move fast and stay compliant.
🎯 Need help making sense of it all?
At Founders Law, we help fintech startups get their AI and privacy set-up sorted, without killing momentum. Whether you need a quick data compliance check, a deep dive into automated decision-making risk, or help drafting policies and procedures that meet your legal obligations and help satisfy the ICO, we’ve got bundles, retainers and flexible options to suit your stage.
So, if you want to get your AI + data game right (and avoid fines and headaches later), drop us a line at hello@founders-law.co.uk.
Let’s make legal stuff feel less… legal.