Nov 10, 2022 4 min read

Interesting Survey of Public Attitudes to Data and AI

The UK Government’s unit for enabling trustworthy digital innovation has published the second wave of its survey of public attitudes to data and AI. Given the low levels of trust that consumers have in how insurers handle their data, the survey’s findings are worth a closer look.

Understanding what the Public Thinks about Data and AI

The Centre for Data Ethics and Innovation undertook their second survey this summer. It was comprehensive and so is a reliable indicator of current sentiment across the UK public. An insurer planning to update or revise their digital strategy will find that the CDEI survey provides a useful backdrop against which to gauge the opportunities and risks their strategy could encounter.

The CDEI analysis of their survey is pretty lengthy, so I’ll focus on their seven key findings. In each, I’ll bring in more specific mentions of finance (into which insurance falls) and draw some conclusions of my own.

1. Health and the economy are perceived as the greatest opportunities for data use.

It’s no surprise that a society learning to live with coronavirus, while also experiencing a ‘cost of living’ crisis, sees health and the economy as the areas of greatest opportunity for data use. At the same time…

“UK adults remain broadly optimistic about data use, with reasonably high agreement that data is useful for creating products and services that benefit individuals, and that collecting and analysing data is good for society.”

This positive outlook then needs to be read in the context of the other findings.

2. Data security and privacy are the top concerns, reflecting the most commonly recalled news stories.

Not only were data security and privacy the top two perceived risks of data use, but public concern about them had increased significantly since the first survey in December 2021. Concern that “data will be sold onto other organisations for companies to profit” went up by a third, from 18% to 24%, the larger of the two rises. At the same time, there was some confidence (at 41%) that when firms misuse data, they are then held accountable. That however is conditional upon the next key finding.

3. Trust in data actors is strongly related to overall trust in those organisations.

Trust in actors to use data safely, effectively, transparently, and with accountability remains strongly related to overall trust in those organisations to act in one’s best interests. So what does this mean for the insurance sector? The main study of consumer attitudes to data and insurance (by the Association of British Insurers in 2020) found that trust was low. What the CDEI survey tells us is that insurers that take steps to build trust will find that this should spill over into trust around data. Each feeds into the other.

4. UK adults do not want to be identifiable in shared data - but will share personal data in the interests of protecting fairness.

‘Identifiability’ is the most important consideration in an individual’s willingness to share data. UK adults express a clear preference not to be personally identifiable in data that is shared. Clearly, insurers want to be able to identify people in the data they bring in. The value of such data is realised as it passes through the underwriting, counter fraud and claims decision systems. What this points to then is the need for balance, between value and trust. So far, insurers seem to have focussed on value. The survey points to this needing to change.

While identifiability is the most important criterion driving willingness to share data, the wider UK adult population is willing to share demographic data for the purpose of evaluating systems for fairness towards all groups. The proportion of people who would be comfortable sharing information about their ethnicity, gender, or the region they live in, to enable testing of systems for fairness is 69%, 68% and 61% respectively.

That’s strong support for making digital decision systems fair. Yet it is also a level of support that must be seen in the context of Finding 3 (trust) above, and of Finding 5 (governance) below.

5. The UK adult population prefers experts to be involved in the review process for how their data is managed.

The public values the involvement of experts in any process of reviewing how their data is used. They prefer to delegate decisions around data management to professionals skilled in this field, rather than get involved themselves, have firms assess themselves, or there be no review process at all. Note then how this finding is structured. The public don’t mean company experts; they mean independent experts.

6. People are positive about the added conveniences of AI, but expect strong governance in higher risk scenarios.

The UK adult population has a positive expectation that AI will improve the efficiency and effectiveness of regular tasks. However, the public have concerns about the fairness of the impact of AI on society, and on the effect AI might have on job opportunities. People have higher demands for governance in AI use cases that are deemed higher risk or more complex, such as in healthcare, policing and banking/finance. Healthcare is the area in which UK adults expect there to be the biggest changes as a result of AI.

Together with Finding 5 above, this points to the need for the insurance sector to devise some form of independent governance arrangements for how insurers use data. And it also looks like the governance of health data in insurance settings will be of significant interest to the public.

7. Those with very low digital familiarity express concerns about the control and security of their data, but are positive about its potential for society.

Few people with a low familiarity with things digital feel that their data is stored safely and securely (28%). They are therefore less likely than the wider UK adult population to feel they personally benefit from technology. However, they are reasonably open to sharing data to benefit society, and the majority consider collecting and analysing data to be good for society (57%).

Insurers, and especially those providing mandatory covers like motor third party, need to recognise that the UK public is pretty diverse in its digital understanding and capabilities. And that recognition needs to be factored into everything from product design to claims settlement. If you recall Finding 5 above, this reinforces the need for good independent governance arrangements.

The Overall Message

What I take from these findings is a UK public that is supportive of the digital initiatives going on across business and government, but also concerned that it be done fairly and inclusively. The role of independent expertise in the governance of data management was another important point, especially in the context of low public trust in the sector.

What this means for insurers is that the public will look positively upon the digital transformation of insurance, so long as it’s done with honesty, fairness and integrity. That shouldn’t be much of a surprise, but what this survey does do is evidence those sentiments. What insurers need to think about then is how they can deliver that.

Duncan Minty
Duncan has been researching and writing about ethics in insurance for over 20 years. As a Chartered Insurance Practitioner, he combines market knowledge with a strong and independent radar on ethics.