
Why the Mills Review is a wake-up call for marketers in the financial sector

Nick Phillips
Digital Strategist
5 minutes to read

AI is fast becoming the norm across the UK financial services sector. From predictive credit scoring to hyper-personalised marketing funnels, it has shifted from a futuristic concept to the baseline for competitiveness. But as firms lean harder into automation, the regulatory wind is shifting.

The Mills Review, initiated by the Financial Conduct Authority (FCA) in January 2026, is a strategic inquiry into the long-term effects of advanced Artificial Intelligence on UK retail financial services through to 2030 and beyond. Directed by Sheldon Mills, the review aims to establish a regulatory framework that encourages AI innovation while robustly managing risks such as fraud and market manipulation. 

The final findings are expected in the summer of 2026. The review marks a pivotal moment in this evolution, signalling that the regulator is moving beyond general high-level principles and into the specifics of how AI impacts everyday retail customers.

For years, the strategic question in boardrooms has been: "What can AI do for us?" The Mills Review changes the question to: "What will AI do to the customer?"

Why marketers are on the regulatory front line

Traditionally, AI implementation has been the domain of IT and Data Science. However, the outcomes of these AI systems (including the ads seen, the offers received, and the prices paid) are the domain of the marketer.

The Mills Review aims to answer three core concerns that every UK marketer working in the Financial Services sector needs to have on their radar:

1. Consumer protection

Could AI-driven models lead to "digital exclusion" or unfair pricing for vulnerable customers? This could happen in two ways:

  • Systemic bias: AI doesn't need to know a customer's race or disability status to discriminate; it can do so through proxy variables. For example, a model might find that living in a certain postcode, or having a "non-traditional" work pattern (such as gig economy work), correlates with higher risk. Even if no human marketer intended to exclude these groups, the algorithm draws a "digital red line" that prevents them from even seeing an ad for a competitive loan or mortgage.
  • The literacy gap: as firms move toward AI-only interfaces (like advanced chatbots), customers who are not "digitally native" or have cognitive impairments may struggle to navigate the journey. If there is no "human fallback," these customers are effectively excluded from accessing support or better deals.

2. Competition

Is the dominance of "Big Tech" AI models creating an unfair barrier to entry for smaller firms? The Mills Review warns that AI could fundamentally alter the structure of the financial services market, potentially shifting power away from traditional banks and smaller FinTechs towards a handful of "Big Tech" giants.

3. Operational resilience

What happens if every firm relies on the same third-party AI models (a phenomenon known as "herding behaviour")?

The Mills Review identifies "herding behaviour" as one of the most significant systemic risks of the AI era. In simple terms, it is the risk of a "Digital Monoculture." When every financial firm uses the same small pool of AI models (like GPT-4, Gemini, or Claude) to drive their marketing and decision-making, they stop acting as independent players and start acting as a single, massive, and potentially unstable block. For an operational resilience strategy, this means your firm's stability is no longer just about your own servers—it’s about your collective dependency on a few "Critical Third Parties" (CTPs).
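One way to make that collective dependency concrete is to measure provider concentration across your own stack, for example with a Herfindahl-Hirschman index over which third-party model each internal system relies on. The sketch below is illustrative only: the provider names and inventory are hypothetical, not real market data.

```python
# Hedged sketch: quantifying "digital monoculture" risk in a firm's own stack.
# Provider names and the inventory below are illustrative, not real market data.
from collections import Counter

def herfindahl_index(dependencies):
    """Herfindahl-Hirschman index of model-provider concentration.
    1.0 = every system depends on one provider; values near 0 = well diversified."""
    counts = Counter(dependencies)
    total = sum(counts.values())
    return sum((n / total) ** 2 for n in counts.values())

# Hypothetical inventory: the third-party model behind each internal system.
stack = ["ProviderA", "ProviderA", "ProviderA", "ProviderB", "ProviderC"]
print(f"Provider concentration (HHI): {herfindahl_index(stack):.2f}")
```

A rising index over successive audits would signal growing dependency on a small pool of Critical Third Parties, which is exactly the exposure the review highlights.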

Key takeaway

The Mills Review does not exist in a vacuum; it is the enforcement arm of the UK Consumer Duty. The FCA expects firms to provide "good outcomes". If an AI-driven marketing algorithm inadvertently targets high-interest products only to people in financial distress, the FCA will not view it as a technical "glitch"; they will view it as a regulatory failure.

Three ways the Mills Review will impact your marketing strategy

1. The end of "black box" marketing

In the past, marketers could use AI to optimise "Next Best Action" (NBA) models without fully understanding the logic behind them. Those days are over. The Mills Review emphasises explainability. If a customer asks why they were served a 29% APR offer instead of a 10% one, "the algorithm said so" is no longer an acceptable answer. You need to be able to show your work.

To do this, firms must transition from opaque algorithms to Explainable AI (XAI) frameworks. This involves deploying tools such as SHAP (SHapley Additive exPlanations) or LIME to produce "feature importance" reports, which quantify exactly which variables (such as repayment history or current debt levels) most heavily influenced a specific marketing outcome. Additionally, firms should implement Model Cards to document training data and potential biases, alongside Counterfactual Explanations that can tell a customer exactly what would need to change in their profile to qualify for a better offer. By maintaining a robust audit trail of these automated decisions, marketers can justify their strategies to the FCA and provide transparent, evidence-based answers to consumer enquiries.
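As a minimal illustration of what a feature-importance report captures: for a purely linear scoring model, each feature's contribution to a single decision is simply its weight times the feature's deviation from a baseline average (in the linear case this coincides with the exact SHAP value). The weights, baseline, and feature names below are invented for the example; a real deployment would run SHAP or LIME tooling against the production model.

```python
# Hedged sketch: per-decision "feature importance" for a linear scoring model.
# For linear models, contribution = weight * (value - baseline mean), which is
# the exact SHAP attribution in that special case. All numbers are illustrative.

WEIGHTS = {"missed_payments": 4.0, "utilisation_pct": 0.05, "account_age_yrs": -0.8}
BASELINE = {"missed_payments": 0.5, "utilisation_pct": 40.0, "account_age_yrs": 6.0}

def explain(customer):
    """Return each feature's contribution to this customer's risk score."""
    return {f: WEIGHTS[f] * (customer[f] - BASELINE[f]) for f in WEIGHTS}

customer = {"missed_payments": 2, "utilisation_pct": 85.0, "account_age_yrs": 1.0}
for feature, contrib in sorted(explain(customer).items(), key=lambda kv: -abs(kv[1])):
    print(f"{feature:>18}: {contrib:+.2f}")
```

The sorted printout is exactly the evidence trail the review asks for: a ranked, quantified answer to "why did this customer get this offer?".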

2. Hyper-personalisation vs profit targeting

AI allows us to segment audiences with extreme precision. However, the review explores whether this leads to "price walking" or the exploitation of behavioural biases. As marketers, we must ensure that personalisation leads to better service, not just higher margins at the expense of the consumer’s best interest.

3. Data ethics as a brand moat

As AI tools become commoditised, trust becomes your biggest differentiator, and this is particularly true in financial services. Firms that proactively adopt transparency will build higher brand equity than those that wait for the regulator to come knocking.

How marketers can proactively prepare for the review

1. Gain a better understanding of your use of AI data in marketing

Most marketers are using AI without even realising it because it is baked into their tools. To audit this, you need to map your "data-to-outcome" flow.

Step A: Create a spreadsheet of every platform in your stack (e.g. HubSpot, Salesforce, Google Ads, Sprout Social).

Step B: For each tool, identify the automated features. To surface hidden automation, scour your admin panels for labels like "Optimisation" or "Predictive", and scan vendor documentation for keywords such as "Dynamic" or "Neural". Is the tool using predictive sending? Auto-generated copy? Lookalike audiences?

Step C: Document what data is feeding these models. Mapping your data lineage involves listing all inputs (including first-party, behavioural, and third-party signals) and scrutinising them for protected characteristics or indirect proxies, such as postcodes or device types, that could trigger biased outcomes.

Organising these findings into a structured audit template allows you to categorise compliance risks for every tool in your stack, creating a transparent evidence trail for your compliance officer. In particular, ask: are you feeding the AI protected characteristics (like age) or proxies for them (like postcode) that could lead to biased outcomes?

This “AI register” will list the tool, the AI function, the data used, and the specific marketing objective it serves.
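A minimal sketch of what that AI register could look like in code, using only the Python standard library. The tools, features, and risk labels below are illustrative placeholders, not audit findings.

```python
# Hedged sketch: a minimal "AI register" exported to CSV for the compliance trail.
# Tool names, AI functions, and risk labels are illustrative examples only.
import csv
import io

REGISTER = [
    {"tool": "Email platform", "ai_function": "predictive send-time",
     "data_used": "open history", "objective": "engagement", "proxy_risk": "low"},
    {"tool": "Ad platform", "ai_function": "lookalike audiences",
     "data_used": "postcode, device type", "objective": "acquisition", "proxy_risk": "high"},
]

def write_register(rows):
    """Serialise the register to CSV text (ready to share with compliance)."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0]))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

print(write_register(REGISTER))

# Surface the entries that need escalation to a compliance officer.
high_risk = [r["tool"] for r in REGISTER if r["proxy_risk"] == "high"]
print("Escalate to compliance:", high_risk)
```

Even a spreadsheet achieves the same end; the point is a single, queryable record of tool, AI function, data used, and marketing objective.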

2. Establish algorithmic transparency, and add a human into the loop

The FCA is wary of "Black Box" models where no one can explain why a specific decision was made.

You need to insert human oversight at the point of decision-making. Here's how:

  • Ask your data team to produce "feature importance" reports for your marketing models. This shows which factors (e.g. browsing history vs. credit balance) most heavily influenced the AI's decision to serve a specific ad.
  • Define "boundary conditions". For example, if an AI-driven dynamic pricing model suggests an interest rate above a certain threshold for a customer identified as "vulnerable", the system should automatically flag it for a manual review by a human compliance officer before the offer is sent.
  • Once a month, take a random sample of 50 AI-generated "next best action" offers and have a strategist manually verify if they align with the firm's duty of care.
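The boundary-condition idea above can be sketched as a simple routing rule. The threshold, field names, and vulnerability flag are all illustrative assumptions for this example, not FCA guidance; real rules would come from your compliance team.

```python
# Hedged sketch of a "boundary condition" gate before an AI-priced offer is sent.
# The threshold and offer fields are illustrative assumptions, not FCA guidance.
MAX_APR_FOR_VULNERABLE = 15.0  # hypothetical limit set by compliance

def route_offer(offer):
    """Return 'manual_review' or 'auto_send' for an AI-generated offer."""
    if offer["customer_vulnerable"] and offer["apr"] > MAX_APR_FOR_VULNERABLE:
        return "manual_review"  # a human compliance officer must sign off first
    return "auto_send"

print(route_offer({"apr": 29.0, "customer_vulnerable": True}))   # manual_review
print(route_offer({"apr": 29.0, "customer_vulnerable": False}))  # auto_send
```

The value of a gate like this is that it turns a vague duty of care into a testable, auditable rule that sits between the model and the customer.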

3. Review the MarTech stack

When you use tools like Meta’s Advantage+ or Google’s Performance Max, you are essentially "outsourcing" your compliance to a third party. The Mills Review suggests this is a risk.

How to execute the review

Send a standardised AI questionnaire to your account managers at these platforms.

Ask them:

  • How does your model ensure it is not targeting customers based on traits associated with vulnerability?
  • Can you provide a breakdown of audience exclusion filters applied by your AI?
  • How do you mitigate "herding behaviour" where your AI targets the same small pool of "high-value" customers as every other bank, potentially driving up costs and excluding others?

If a vendor cannot answer these, you should consider moving to "manual bidding" or "custom audiences" where you have more granular control, even if it is slightly less efficient.

4. Overhaul your language to make it simple to understand

Standard privacy policies are written by lawyers for lawyers. To satisfy the FCA's focus on consumer understanding, you need to translate your AI usage into plain language.

  • Traditional privacy policy text: "We use proprietary algorithms to optimise user experience and offer relevance."
    Plain English (Mills Review standard): "We use AI to look at your past spending habits so we can suggest products that are more likely to be useful to you."
  • Traditional privacy policy text: "Automated profiling may be utilised for creditworthiness assessment."
    Plain English (Mills Review standard): "An automated system helps us decide if a loan is affordable for you. You can ask for a human to review this decision at any time."

Why this matters for the 2026 findings

By the time Sheldon Mills releases the final findings of the review in the summer of 2026, firms that have already documented these steps will be seen as "leaders in digital maturity". This moves your brand from a defensive position (waiting for rules) to a proactive one (setting the standard).

Ultimately, the Mills Review is not just a regulatory exercise: it is a blueprint for the future of British finance. Far from stifling progress, it provides the framework for a more resilient, inclusive, and human-centric industry. By championing "Responsible Innovation", the review ensures that AI becomes a tool for genuine connection rather than just automated calculation.

Firms that embrace these principles today will be the ones that define the next decade of financial services: turning compliance into a competitive moat and data ethics into a brand's greatest asset. The future of the sector lies in the perfect balance of technological power and human integrity, where every algorithm is designed with the customer's best outcome at heart.

This is our chance to prove that in the age of the machine, the values of trust and transparency remain the most valuable currency of all.
