I’ve spent the last twenty years working in digital across multiple sectors. Financial services has been a big part of that journey.
I’ve seen branch-first become web-first.
Web-first become mobile-first.
I’ve seen online banking go from slightly nerve-wracking to completely normal.
Each wave of change has brought the same question:
“Can I trust this?”
What feels different now is the speed and scale of AI.
And at the centre of it sits a very human tension: trust versus technology.
We’ve faced trust gaps before
When online banking first appeared, people worried about security.
When comparison sites took off, people questioned impartiality.
When apps replaced phone calls, people missed the reassurance of speaking to someone.
Over time, experience caught up with anxiety. Security improved. UX improved. Brands learned how to communicate better. Trust followed.
But AI is not just a new interface.
It’s a new layer of judgement.
We are no longer asking customers to trust a digital channel.
We are asking them to trust a digital decision.
That is a bigger psychological leap.

Many UK adults (including 3.2 million cash savers) fall into the advice gap and have low confidence, yet many have the means to invest.
The advice gap makes this more urgent
The advice gap in the UK is not new. Traditional financial advice is expensive to deliver, which means millions sit outside of it.
At the same time:
Products are more complex
Personal responsibility for retirement and wealth has increased
Economic uncertainty isn’t going anywhere
In theory, AI is well placed to help.
It can analyse data at scale.
It can personalise guidance.
It can reduce cost to serve.
It can make support more accessible.
On paper, it looks like the solution.
But potential is not the same as adoption.
The chicken and egg of AI trust
Here is the tension.
To benefit from AI-driven guidance, a customer has to use it.
To use it, they have to trust it.
But trust often only comes after a positive experience.
So we have a circular challenge.
AI needs engagement to prove its value.
Customers need proof before they engage.
For marketing and digital leaders, this is not just a product problem. It’s an experience and positioning problem.
Trust is not built by saying “powered by AI”
If I’m honest, much of the messaging in the market still leans heavily on the “what”.
Machine learning.
Predictive modelling.
Advanced algorithms.
Most consumers don’t care about the plumbing.
They care about:
- Is this safe?
- Is this fair?
- Is this in my interest?
- Can I understand what it’s telling me?
Trust in AI-enabled financial services will not be built through technical language. It will be built through clarity and reassurance.

What I’m seeing work
Across digital more broadly, and increasingly within financial services, a few principles stand out:
1. Explain it in plain English
If a recommendation is made, show how.
What data was considered? What wasn’t? Where are the limits?
Ambiguity kills confidence.
2. Keep humans visible
AI plus human reassurance tends to outperform pure automation in trust terms.
Even knowing there is a human fallback changes perception dramatically.
3. Start small
Budget insights. Spending nudges. Savings prompts.
Let customers build confidence through low-risk wins before you ask them to rely on higher-stakes recommendations.
4. Show outcomes, not capability
Case studies. Real scenarios. Clear improvements in financial wellbeing.
Experience builds belief.
5. Brand matters more than ever
AI does not sit in isolation. It sits within a brand.
If your brand already stands for fairness, competence and customer centricity, AI can strengthen that promise.
If trust is fragile, AI can amplify anxiety.
That’s the uncomfortable bit.
AI maturity without brand maturity is risky.
6. Efficiency is not the same as reassurance
In boardrooms, AI is often framed around cost reduction and margin improvement.
Those benefits are real.
But in financial services, emotional reassurance is often more powerful than speed.
A well-designed AI journey might deliberately include:
- Clear explanations of scope and limitations
- Visible confidence indicators
- Easy access to a human adviser
- Simple summaries that reinforce understanding
None of that is wasted effort. It is trust architecture.
7. Technology should follow trust, not the other way round
The best organisations I’ve worked with do not start with:
“What can technology do?”
They start with:
“Where are customers excluded, confused or underserved?”
Then they look at how technology can responsibly close that gap.
That shift in mindset is critical.
Technology as enabler.
Trust as strategy.
This is a real opportunity
If this is handled well:
- More people could access meaningful financial guidance
- Confidence in decision making could increase
- The advice gap could narrow, not widen
But it requires discipline.
AI cannot be positioned as a silver bullet.
It needs to feel like a co-pilot.
Because in financial services, the fundamentals have not changed.
If people do not trust it, they will not use it.
And if they do not use it, they will never experience its value.
The winners in this next chapter won’t just be the most technologically advanced; they will be the most trusted.
