Dusting Off Your Data in 2020: Why “Data Quality” Matters
by Devin Cook, on December 16, 2019
Credit unions and banks across the country have been inundated with vendors pushing the "digital transformation" mantra, but which of these technologies bring the most value to these businesses?
Better Data = Better Results
Fresh off the plane from the latest and greatest analytics or industry conference, your mind is racing with new concepts filled with potential. Machine learning, Artificial Intelligence, Predictive Analytics… all promising success at the turn of a key.
What do these buzzwords all have in common? First, they’re driven by data. Second, they (ideally) will create a useful and informative output which positively impacts a particular business metric to produce a positive return on investment.
So why is it that only one-third of decision makers trust the analytics generated from their very own business operations? The answer lies in the quality of the data fed into the algorithms and functions that create these outputs. When poor quality data goes in, poor quality results will always come out.
The first step that banks and credit unions can take to address this issue is to evaluate and improve the underlying data layer that feeds these analytical tools. The skills and expertise necessary to build a robust solution may not exist within the organization, so teams try slightly different approaches, hoping that sheer effort and willpower will carry them through. Due to budget constraints or competing priorities, it's typical for these efforts to be de-prioritized, losing most of the value that had previously been gained.
A systematic, daily process for improving data quality is therefore the best approach. But when is the best time to get started on a project like this?
Financial institution leaders with their eyes set on a core conversion, acquisition, or merger in the near to mid term are prime candidates for a data management strategy overhaul. The best time to start was five years ago; the second best time is now!
Transactional vs. Analytical Databases
There is an important distinction between the two types of systems used by banks and credit unions. Transactional systems write new data and are used for operations such as processing transactions. Analytical systems, by contrast, are built to organize, access, and read data for analysis.
A core banking system like Fiserv DNA or Symitar's Episys, for example, is a transactional system because it is built to insert and update records for deposits, loans, and their corresponding transactions. Querying data for simple or advanced reporting, and even machine learning, is best suited to an analytical system such as a data warehouse or data lake, built for the sole purpose of retrieving and presenting information for analysis.
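To make the distinction concrete, here is a minimal sketch in Python using an in-memory SQLite database. The table and account IDs are hypothetical, invented purely for illustration; a real core such as Fiserv DNA exposes a far richer data model. The first half shows the transactional pattern (many small writes), the second the analytical pattern (a read-heavy aggregation):

```python
import sqlite3

# Hypothetical schema for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE transactions (account_id TEXT, amount REAL, posted_date TEXT)"
)

# Transactional workload: insert records one at a time as activity occurs.
conn.execute(
    "INSERT INTO transactions (account_id, amount, posted_date) VALUES (?, ?, ?)",
    ("A-1001", 250.00, "2019-12-16"),
)
conn.execute(
    "INSERT INTO transactions (account_id, amount, posted_date) VALUES (?, ?, ?)",
    ("A-1001", -40.00, "2019-12-16"),
)
conn.commit()

# Analytical workload: aggregate across many records for reporting.
totals = conn.execute(
    "SELECT account_id, SUM(amount) FROM transactions GROUP BY account_id"
).fetchall()
print(totals)  # [('A-1001', 210.0)]
```

The same SQL engine can serve both patterns at small scale, but at production scale the schemas, indexing, and hardware are tuned very differently for each workload, which is why the two are kept in separate systems.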
A well-designed modern analytical system serves as a powerful tool for ongoing data quality cleanup efforts, core conversions, and even acquisitions and mergers. In the case of a core conversion, a properly set up data warehouse can stage data from both cores, old and new, in parallel to validate data accuracy during the transition.
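The parallel-staging idea can be sketched in a few lines. The account IDs and balances below are hypothetical, and a real reconciliation would compare full record sets with tolerances rather than exact dictionary lookups, but the shape of the check is the same:

```python
# Hypothetical balances staged from each core for side-by-side validation.
old_core = {"A-1001": 210.00, "A-1002": 5000.00, "A-1003": 75.50}
new_core = {"A-1001": 210.00, "A-1002": 4999.00, "A-1004": 12.00}

def reconcile(old, new):
    """Flag accounts whose balances differ or that exist in only one core."""
    issues = {}
    for acct in sorted(set(old) | set(new)):
        if old.get(acct) != new.get(acct):
            issues[acct] = (old.get(acct), new.get(acct))
    return issues

print(reconcile(old_core, new_core))
# {'A-1002': (5000.0, 4999.0), 'A-1003': (75.5, None), 'A-1004': (None, 12.0)}
```

Running a check like this daily during the transition surfaces mismatches while both cores are still live, rather than after cutover when the old system is gone.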
After a successful conversion or merger, the system continues to serve as the organization’s single source of truth for business analysis which supports reporting and analytics. Installing a data quality rules engine to identify and track cleanup success is much easier with a data warehouse such as Arkalytics in place.
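A rules engine of this kind can be sketched as a set of named checks run over every record, with violation counts tracked so cleanup progress is measurable over time. The rule names, fields, and sample records below are hypothetical and are not taken from Arkalytics or any other product:

```python
# Hypothetical member records with common quality problems.
members = [
    {"member_id": "M-1", "email": "a@example.com", "zip": "30301"},
    {"member_id": "M-2", "email": "", "zip": "3030"},
    {"member_id": "M-3", "email": "c@example.com", "zip": None},
]

# Each rule is a predicate that returns True when a record violates it.
rules = {
    "missing_email": lambda r: not r.get("email"),
    "bad_zip": lambda r: not (isinstance(r.get("zip"), str) and len(r["zip"]) == 5),
}

def run_rules(records, rules):
    """Count violations per rule so cleanup success can be tracked daily."""
    counts = {name: 0 for name in rules}
    for rec in records:
        for name, check in rules.items():
            if check(rec):
                counts[name] += 1
    return counts

print(run_rules(members, rules))  # {'missing_email': 1, 'bad_zip': 2}
```

Persisting these counts each day turns "clean up the data" from a vague aspiration into a trend line the organization can watch move toward zero.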
Planning for Success
It’s hard to build a house without a foundation, and it’s doubly hard to build analytics and machine learning without a clean, conformed data set. The data journey can be challenging, but with the right partner on your team, the sky’s the limit.
So ask yourself, do you trust your data enough to feed it into a machine learning algorithm or analytical model?