Managers want to improve data quality

A new report shows that IT decision-makers at organisations in banking, insurance, and the public sector across the US, Canada, and the UK are largely clueless about the accuracy of the data they store.

The research, commissioned by Quantexa, found that 42 per cent of data and IT leaders believe that all business units trust the accuracy of the data available to them. In addition, more than one in five leaders do not think their organisation is maximising the value of its data.

About half of the managers surveyed want to improve their data quality, a marked shift from 2021, when improving data quality was a top priority for only 26 per cent of organisations.

A vital component of improving data quality is removing duplicates from data sets, so that all functions in an organisation have a single, accurate view from which to make confident, trusted decisions. Traditional, inaccurate data-matching approaches are becoming obsolete as a result. The most effective way of building an accurate, continuously maintained data foundation is Entity Resolution, which connects data points at scale across internal and external data sources in real time to give decision-makers single, enterprise-wide views of people, organisations, and places.
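At its core, Entity Resolution is a matching and clustering problem. As a rough, hypothetical illustration of the idea rather than Quantexa's own method, the Python sketch below scores record pairs on name and address similarity and groups matching records into a single entity; the sample records, field names, and 0.8 threshold are assumptions made purely for the example.

```python
# A minimal sketch of entity resolution: cluster records that appear to refer
# to the same real-world person, then treat each cluster as one entity.
# The sample records, field names, and 0.8 threshold are illustrative assumptions.
from difflib import SequenceMatcher


def similarity(a: str, b: str) -> float:
    """Normalised string similarity in the range [0, 1]."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()


def same_entity(r1: dict, r2: dict, threshold: float = 0.8) -> bool:
    """Treat two records as the same entity if name and address are close enough."""
    name_score = similarity(r1["name"], r2["name"])
    addr_score = similarity(r1["address"], r2["address"])
    return (name_score + addr_score) / 2 >= threshold


def resolve(records: list[dict]) -> list[list[dict]]:
    """Greedy clustering: add each record to the first cluster it matches."""
    clusters: list[list[dict]] = []
    for record in records:
        for cluster in clusters:
            if any(same_entity(record, member) for member in cluster):
                cluster.append(record)
                break
        else:
            clusters.append([record])
    return clusters


records = [
    {"name": "Jon Smith", "address": "1 High Street, Leeds"},
    {"name": "Jonathan Smith", "address": "1 High St, Leeds"},
    {"name": "Priya Patel", "address": "22 Queens Road, Bristol"},
]

for cluster in resolve(records):
    # One line per resolved entity: the duplicate records collapse into a single view.
    print([r["name"] for r in cluster])
```

In a production system this pairwise comparison would be replaced by blocking and scoring techniques that scale to millions of records and many more attributes, but the principle is the same: variant spellings of the same name and address resolve to one entity.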

Fewer than a quarter of respondents use this type of sophisticated technology to master their data and create the analytical context needed for timely, effective decision-making.

On average, respondents say 12 per cent of all data records across their organisation are duplicated. That duplicated data lurks in data lakes, warehouses, and databases, preventing data and IT leaders from maximising the value of data across their organisations.

Respondents revealed that only 38 per cent of organisations have implemented automation and are able to trust the outcomes, while 23 per cent have some automation of operational decision-making but say its accuracy needs improvement.

Quantexa Chief Product Officer Dan Higgins said: “Today, organisations are facing aggressive headwinds: global economic volatility, rising regulation, rapid disruption in technology, and changing customer and citizen expectations. This cocktail is making it tougher than ever before to create resilient organisations.”

Organisations know that to understand and trust their data, they need a strong, accurate data foundation and single views of data that become their most trusted and reusable resource across the business. One of the biggest causes of untrusted data is duplicated records, part of the wider, growing problem of data quality, he said.

“It’s easy for duplicate customer records to appear because different business units are working with siloed data in different systems. Different iterations of a name, changes in address or multiple phone numbers can all harmlessly create these replications. But these customer clones can cause a domino effect in inefficiency and confidence in decision making. This becomes a costly waste of resources across data, IT, and business teams and it stops businesses from creating the necessary agility and resilience to being able to identify risk fast and serve customers at the highest levels,” Higgins said.