
Debates about artificial intelligence (AI) use in the charity sector often begin with a simple question - is the public comfortable with AI, or not?

But recent research from CharityTracker suggests the reality is far more nuanced. Public attitudes on the subject are not uniform - they are shaped by experience, by values and by how people already relate to charities.

For legacy fundraisers, this matters more than it might first appear. Legacy giving is rooted in long-term trust, personal identity and a sense of human connection. Understanding how different audiences feel about technology and the role it should play in charity work will increasingly shape how that trust is built and maintained.

Familiarity changes attitudes

One of the most striking findings in the research is simple - familiarity breeds acceptance.

People who report never having used AI are the only group that is net negative about charities adopting it. As exposure increases, comfort rises. This mirrors the early days of the internet, when uncertainty gradually gave way to everyday use.

For legacy fundraising teams, this tells us something important about timing and perception. The supporters most engaged with legacy giving are often older and may have less day-to-day experience with emerging technologies. Their starting point may therefore be caution, rather than curiosity.

This doesn’t mean they will reject innovation. But it does mean charities will need to explain clearly how technology supports their mission, rather than replacing the human relationships that underpin it.

Five different groups, five different opinions

The CharityTracker report suggests five distinct attitude groups within the UK public:

  • Practical enthusiasts
  • Power-aware progressives
  • Critical sceptics
  • Risk-averse traditionalists
  • The disengaged

Each of these groups approaches the use of AI from a different perspective. At one end sit the practical enthusiasts. They are comfortable with technology, see its practical benefits and are the most positive about charities using AI. They also tend to be among the highest value donors.

Alongside them are power-aware progressives. This group is optimistic about the potential of AI but wary about who controls it. Their focus is accountability rather than rejection.

At the more cautious end of the spectrum are critical sceptics and risk-averse traditionalists. These groups are more concerned about the loss of human judgement, the concentration of power and the security of personal data.

Finally, there are the disengaged, who are not strongly opposed to AI but are also not strongly engaged with charities.

For legacy fundraisers the lesson is clear. There is no single public view. Different audiences bring different expectations about how charities should use technology.

The legacy audience and the human factor

Two of the more cautious groups skew older: both critical sceptics and risk-averse traditionalists include a higher proportion of people aged over 65.

These groups express strong concerns about data security, mistakes made by automated systems and the possible loss of the human factor in charity work.

However, these groups should not be mistaken for disengaged audiences. They are often active supporters who donate goods, buy items from charity shops and give in person to collectors. They also tend to look for clear evidence that a charity is making a difference.

For legacy teams this is a familiar profile. Many legacy pledgers are motivated by exactly these factors. They want to know that their gift will create real change and that the organisation they support reflects their principles.

If charities begin to use AI in supporter journeys, communications or internal decision making, these audiences will want reassurance that people remain responsible for the decisions that matter.

Shared expectations

Despite differences between groups, there are also areas of broad agreement.

One is the use of AI for fraud and scam detection. Protecting supporters and safeguarding donations is widely seen as a legitimate use of technology.

Another is the expectation that human oversight should remain part of the process in many contexts. Even among more enthusiastic groups, the idea that people remain accountable is important.

This aligns closely with the values that underpin legacy fundraising. Transparency, stewardship and responsibility are central to how charities talk about gifts in wills, and those same principles will matter when explaining how technology is used.

A balanced path forward

The implication for charities is not to take a single position on AI but to understand the range of attitudes within their supporter base.

Legacy fundraising offers a useful lens for this. It reminds organisations that trust is cumulative and deeply personal. Technology can support that trust by improving efficiency, protecting supporters and helping charities work more effectively.

But it cannot replace the human motivations that lead someone to include a charity in their will.

The challenge ahead is therefore not technological, but cultural. As AI becomes more common within organisations, charities will need to show how it strengthens their mission, while keeping people at the centre of their work.

For legacy fundraisers, that message will be familiar. The future of giving may involve new tools, but the reasons people choose to leave a gift remain profoundly human.