The importance of data quality technology in enhancing the user experience
Today, people expect nothing less than a standout customer experience when purchasing goods and services via apps and websites, an expectation shaped by many years of online shopping. It's therefore imperative that those in the public sector deliver an equally strong web-based experience to taxpayers.
The role of clean data
When it comes to enhancing the user experience, the best place for the public sector to start is with clean, verified data on those accessing its services for the first time, as well as on ongoing users.
If the data is clean and up to date, it can be analysed to gain valuable insight that informs the delivery of personalised communications. Where applicable, it can also help to target and enhance services more accurately, and inform improvements in the efficiency of service delivery for users.
Data decay is an ongoing concern
A big issue for the public sector is that data decays, on average, at two per cent a month and roughly 25 per cent a year, as people move home, divorce or pass away. With data continually degrading, it's vital to have data cleaning processes in place, not only at the onboarding stage but also to clean held data in batch. The good news is that such an approach usually involves simple and cost-effective changes to the data quality regime.
Use address lookup / autocomplete technology
To obtain accurate contact data at the onboarding stage, use an address lookup or autocomplete service. This delivers accurate address data in real-time by providing a properly formatted, correct address as the user starts to type theirs. At the same time, the number of keystrokes required to enter an address is cut by up to 81 per cent. This speeds up the onboarding process and improves the whole experience, making it much more likely that the user will complete an application or purchase.
Such a service is very important because about 20 per cent of addresses entered online contain errors, including spelling mistakes, wrong house numbers and incorrect postcodes, mostly introduced as people type their contact information.
The good news is that first point of contact verification can be extended to email, phone and name, so this valuable contact data can also be verified in real-time.
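As an illustration only, the sketch below shows how a front end might call an address autocomplete service as the user types. The endpoint URL, parameters and response fields are hypothetical placeholders rather than any specific vendor's API; a real service will document its own interface.

```python
import requests

# Hypothetical autocomplete endpoint - substitute your provider's documented URL and key.
AUTOCOMPLETE_URL = "https://api.example.com/address/autocomplete"
API_KEY = "your-api-key"

def suggest_addresses(partial_input: str, country: str = "GB") -> list[str]:
    """Return formatted address suggestions for a partially typed address."""
    response = requests.get(
        AUTOCOMPLETE_URL,
        params={"key": API_KEY, "query": partial_input, "country": country},
        timeout=5,
    )
    response.raise_for_status()
    # Assumed response shape: {"suggestions": [{"formatted": "..."}, ...]}
    return [item["formatted"] for item in response.json().get("suggestions", [])]

if __name__ == "__main__":
    # The user has typed only part of their address; the service offers complete,
    # correctly formatted suggestions, cutting keystrokes and avoiding typos.
    for address in suggest_addresses("10 Downing St"):
        print(address)
```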
Deduplicate data
Data duplication is a significant and common issue, with the average database containing 8-10 per cent duplicate records. It occurs for a variety of reasons, for example when two departments merge their data or when errors in contact data collection occur at different touchpoints. It adds cost in terms of time and money, particularly with printed communications and online outreach campaigns, and it can adversely impact the sender's reputation.
Using an advanced fuzzy matching tool to merge and purge the most challenging records is the best way to create a 'single user record' and build an optimum single citizen view (SCV), the insight from which can be used to improve communications.
Importantly, organising contact data in this way will increase efficiency and reduce costs, because multiple communications will not be sent to the same person. Finally, the potential for fraud is lessened once a unified record is established for each user.
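As a minimal sketch of the fuzzy matching idea, the snippet below uses Python's standard-library difflib to score how similar two contact records are and flag likely duplicates. The records, field names and threshold are illustrative; a production merge/purge tool would apply far more sophisticated matching rules, weights and survivorship logic.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Return a 0-1 similarity score between two strings, ignoring case."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def is_likely_duplicate(rec1: dict, rec2: dict, threshold: float = 0.85) -> bool:
    """Compare name and address fields; flag the pair if both are close matches."""
    name_score = similarity(rec1["name"], rec2["name"])
    address_score = similarity(rec1["address"], rec2["address"])
    return name_score >= threshold and address_score >= threshold

records = [
    {"name": "Jonathan Smith", "address": "12 High Street, Leeds LS1 4AB"},
    {"name": "Jonathon Smith", "address": "12 High St, Leeds, LS1 4AB"},
    {"name": "Priya Patel", "address": "4 Mill Lane, York YO1 7HT"},
]

# Pairwise comparison is fine for a demo; real tools block or index records first
# so they never compare every record against every other one.
for i in range(len(records)):
    for j in range(i + 1, len(records)):
        if is_likely_duplicate(records[i], records[j]):
            print("Possible duplicate:", records[i]["name"], "<->", records[j]["name"])
```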
Data cleansing
A vital part of the data cleansing or suppression process is using appropriate technology to identify people who have moved or are otherwise no longer at the address on file. As well as removing incorrect addresses, these services can include deceased flagging to prevent mail and other communications being sent to those who have passed away, which can cause distress to their friends and relatives. By employing suppression strategies, the public sector can save money, protect its reputation, reduce fraud and improve its targeting, all of which improves the overall user experience.
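Purely as an illustration of the screening step, the sketch below checks held records against mover and deceased suppression lists before any outreach. The lists themselves would come from a licensed suppression data provider; the file names and record layout here are assumptions.

```python
import csv

def load_suppression_ids(path: str) -> set[str]:
    """Load record IDs from a suppression file supplied by a data provider (assumed layout)."""
    with open(path, newline="") as f:
        return {row["record_id"] for row in csv.DictReader(f)}

def screen_records(records: list[dict], movers: set[str], deceased: set[str]) -> list[dict]:
    """Flag records that should be suppressed before any mailing or outreach."""
    for record in records:
        if record["record_id"] in deceased:
            record["status"] = "suppress: deceased"
        elif record["record_id"] in movers:
            record["status"] = "suppress: moved away"
        else:
            record["status"] = "ok to contact"
    return records

# Example usage with hypothetical file names:
# movers = load_suppression_ids("movers_suppression.csv")
# deceased = load_suppression_ids("deceased_suppression.csv")
# screened = screen_records(held_records, movers, deceased)
```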
Straightforward integration of data quality technology
With cloud-based APIs it's easy to inject data quality capabilities into any existing platform, such as a master data management (MDM) system, regardless of what software or systems the organisation might already be using, and the appropriate technology can be delivered quickly online.
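To make the integration point concrete, here is a rough sketch of posting held records in batches to a cloud data quality API from within an existing pipeline, then using the cleaned output downstream. The endpoint, payload and response fields are illustrative placeholders; any real vendor API will define its own.

```python
import requests

# Placeholder endpoint and key - substitute the provider's documented values.
CLEANSE_URL = "https://api.example.com/contacts/cleanse-batch"
API_KEY = "your-api-key"

def cleanse_batch(records: list[dict]) -> list[dict]:
    """Send a batch of contact records for cleansing and return the corrected versions."""
    response = requests.post(
        CLEANSE_URL,
        json={"key": API_KEY, "records": records},
        timeout=30,
    )
    response.raise_for_status()
    # Assumed response shape: {"records": [{"id": ..., "address": ..., "email": ...}, ...]}
    return response.json()["records"]

def clean_in_batches(all_records: list[dict], batch_size: int = 500) -> list[dict]:
    """Clean held data in manageable batches, ready to write back to the MDM platform."""
    cleaned = []
    for start in range(0, len(all_records), batch_size):
        cleaned.extend(cleanse_batch(all_records[start:start + batch_size]))
    return cleaned
```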
Microsoft SQL Server
Many in the public sector, particularly the NHS, use SQL Server, Microsoft's database management system. It offers a convenient route to data quality tools via SQL Server Integration Services (SSIS): the components can be accessed without further integration work. Simply drag, drop and start using them.
Use a data cleaning platform
Today, with evolving technology, it has never been easier or more cost-effective to deliver data quality in real-time to support a better user experience and wider organisational efficiencies. One option that stands out is a scalable data cleaning software-as-a-service (SaaS) platform that can be accessed in a matter of hours and doesn't require coding, integration or training. This technology can cleanse and correct names, addresses, email addresses and telephone numbers worldwide. It also matches records to prevent duplication, and provides data profiling to help identify issues for further action.

A single, intuitive interface offers data standardisation, validation and enrichment, ensuring high-quality contact information across multiple databases. It can do this with held data in batch and as new data is gathered. Such a platform can additionally be deployed as a cloud-based API or on-premises.
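As a rough idea of what the data profiling step surfaces, the snippet below uses pandas to count missing values, exact duplicates and obviously malformed emails in a small contact table. A full platform would go much further, but the output illustrates the kind of issues flagged for further action.

```python
import pandas as pd

def profile_contacts(df: pd.DataFrame) -> dict:
    """Produce a simple data quality profile for a contact table."""
    return {
        "rows": len(df),
        "missing_per_column": df.isna().sum().to_dict(),
        "exact_duplicate_rows": int(df.duplicated().sum()),
        # Very rough email shape check for illustration only - not full validation.
        "suspect_emails": int((~df["email"].str.contains("@", na=False)).sum()),
    }

df = pd.DataFrame({
    "name": ["Amina Khan", "Amina Khan", "Tom Reid", None],
    "email": ["amina@example.com", "amina@example.com", "tom.reid", "j.doe@example.org"],
    "postcode": ["M1 2AB", "M1 2AB", None, "EH1 1YZ"],
})

print(profile_contacts(df))
```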
Identity verification
As part of best practice know your citizen (KYC) screening, undertaking real-time ID verification checks at the onboarding stage, using a SaaS or cloud-based API electronic identity verification (eIDV) tool, is just as important for delivering a good user experience while helping to prevent fraud.
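For illustration, an eIDV check at onboarding might look like the sketch below: submit the applicant's details and act on the returned match level. The endpoint, field names and match codes are hypothetical, not a specific vendor's API.

```python
import requests

# Hypothetical eIDV endpoint - a real service will document its own URL, fields and match codes.
EIDV_URL = "https://api.example.com/identity/verify"
API_KEY = "your-api-key"

def verify_identity(full_name: str, date_of_birth: str, address: str) -> bool:
    """Run a real-time identity check at onboarding and return whether it passed."""
    response = requests.post(
        EIDV_URL,
        json={
            "key": API_KEY,
            "name": full_name,
            "date_of_birth": date_of_birth,  # ISO format, e.g. "1985-04-23"
            "address": address,
        },
        timeout=10,
    )
    response.raise_for_status()
    # Assumed response shape: {"match_level": "full" | "partial" | "none"}
    return response.json().get("match_level") == "full"
```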
Support and training
It's vital to ensure that providers of data quality and ID verification tools and platforms can offer support and training, both on the routes required to integrate the services into your systems and on using the tools themselves when needed. This will help to allay any concerns among those who may lack technical knowledge.
In summary
Implementing best practice data quality technology and procedures is a must for all those in the public sector. Doing so not only ensures the delivery of a consistent, positive online user experience, but also delivers efficiency savings and maximises value, something that's vital given the huge pressure on public sector budgets.
This guest blog was written by Barley Laing, UK Managing Director at Melissa. To learn more about Melissa, please visit their LinkedIn and Twitter pages.