All that glitters is not gold: Is a golden source of data truly the way forward?
When seeking perfect data quality, firms often look to a golden source of truth – but what are the pitfalls of this approach? The fact that a golden source of data has historically been celebrated doesn’t make it right. How can firms successfully lay the foundations for true data integrity?
by Neil Vernon, CTO, Gresham Technologies
A golden source of data – a single source of truth supplying a business with all the information it needs to rely on – has historically been seen as the pinnacle of data quality. But while financial institutions have long strived towards this utopia of data perfection, the approach presents some real drawbacks. What’s more, the fact that a golden source of data has long been sought after does not necessarily make it the best option.
Today, in an era of waning tolerance for poor data integrity, more complex regulatory reporting requirements, and increasingly tight margins, we ask: how beneficial is a golden source of data in the current environment, and is there another path to take us to the top?
Enhancing reporting across multiple counterparties
Golden sources of data have traditionally focused on a financial institution’s internal data repositories. However, a firm’s ability to report accurately, on time, and in full is often bound up with that of its counterparties – as is the case with the recently introduced Consolidated Audit Trail regulation.

This has led to questions over whether golden sources should be developed across the industry, which each firm could access as needed for regulatory reporting and other requirements. It is easy to see the appeal: one single source of truth, properly managed, would dramatically reduce error rates in transaction reporting and cut – perhaps even eliminate – the time spent resolving discrepancies with counterparties. If you and your counterpart report from the same repository, how could you possibly report differently?
An innovation roadblock
However, creating such a golden source has a major drawback: it would significantly limit financial institutions’ ability to innovate – another current ‘hot topic’. Because it would come with strict usage and management requirements, a true golden source would inhibit the kind of ‘fail fast’ experimentation with processes and products that the industry has been at such pains to encourage.
And of course, there’s the unavoidable fact that everyone’s truth is different. The questions you are using your data to answer determine the lens through which you should view it. For example, analysing data for product purposes requires an accurate product hierarchy, whereas the same data used for risk or regulatory analysis may need to be organised along entirely different lines.
Practical realities: Why accuracy holds the key
But if creating a true golden source is neither practical nor desirable, what should firms do instead?
As far as internal data goes, firms should certainly still strive towards a reliable data source – but they should also recognise that a general-purpose solution is not always possible.
Rather, appropriate control processes should be applied to ensure that there is no slippage in data quality and that the organisation fully understands how its data is used. Managing data lineage through systems that allow complete visibility of the data lifecycle makes the source of any item easily traceable, further enhancing understanding and ensuring that any issues can be quickly fixed.
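To make lineage tangible, here is a minimal sketch of what a lineage-aware record might look like in practice. The record structure, field names, and system names are illustrative assumptions, not any particular product’s API – the point is simply that every step in a record’s lifecycle leaves a trace that can later be audited.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Illustrative sketch only: a minimal lineage trail attached to a data record,
# so any downstream value can be traced back to its original source.

@dataclass
class LineageEvent:
    step: str          # e.g. "ingested", "normalised", "enriched"
    system: str        # the system that performed the step (hypothetical names)
    timestamp: datetime

@dataclass
class TradeRecord:
    trade_id: str
    notional: float
    currency: str
    lineage: list[LineageEvent] = field(default_factory=list)

    def record_step(self, step: str, system: str) -> None:
        """Append a lineage event each time the record is touched."""
        self.lineage.append(LineageEvent(step, system, datetime.now(timezone.utc)))

# Usage: every hop in the data lifecycle leaves an auditable trace.
trade = TradeRecord("T-1001", 5_000_000.0, "GBP")
trade.record_step("ingested", "front-office-gateway")
trade.record_step("normalised", "trade-store")
for event in trade.lineage:
    print(f"{event.timestamp.isoformat()} {event.system}: {event.step}")
```

With a trail like this attached to every record, the source of a questionable value is a lookup rather than an investigation.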
In addition, education is key: organisations should ensure that data consumers have sufficient understanding and knowledge to apply the right ‘lens’ for the situation whenever they use the data.
These steps will also help to resolve many of the issues that banks experience when dealing with their counterparties, since each side will have improved the accuracy of its own data. Resolving reporting discrepancies with counterparties consumes valuable resources, particularly where escalation is required. But by giving themselves maximum visibility and control over their data, financial institutions can stop many of these issues before they start.
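As a simple illustration of stopping issues before they start, the sketch below pre-reconciles a firm’s view of its trades against a counterparty’s view before anything is submitted. The field names, tolerance, and matching rules here are assumptions chosen for clarity, not a real reporting format.

```python
# Illustrative sketch: pre-reporting reconciliation of two trade views.
# Field names and tolerance are assumptions, not a specific vendor's format.

firm_view = {
    "T-1001": {"notional": 5_000_000.0, "currency": "GBP"},
    "T-1002": {"notional": 250_000.0, "currency": "USD"},
}
counterparty_view = {
    "T-1001": {"notional": 5_000_000.0, "currency": "GBP"},
    "T-1002": {"notional": 255_000.0, "currency": "USD"},  # a break
}

def reconcile(ours: dict, theirs: dict, tolerance: float = 0.01) -> list[str]:
    """Return human-readable breaks between the two views."""
    breaks = []
    for trade_id, record in ours.items():
        other = theirs.get(trade_id)
        if other is None:
            breaks.append(f"{trade_id}: missing from counterparty view")
            continue
        if record["currency"] != other["currency"]:
            breaks.append(f"{trade_id}: currency mismatch")
        if abs(record["notional"] - other["notional"]) > tolerance:
            breaks.append(f"{trade_id}: notional mismatch "
                          f"({record['notional']} vs {other['notional']})")
    for trade_id in theirs.keys() - ours.keys():
        breaks.append(f"{trade_id}: missing from our view")
    return breaks

for issue in reconcile(firm_view, counterparty_view):
    print(issue)  # flags the T-1002 notional break before it reaches a regulator
```

Catching the break here, rather than after both sides have reported, turns an escalation into a routine correction.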
Financial institutions should not abandon their ambition to create a strong data source, but a golden source is a naïve objective: it is too simplistic, and it harms more than it helps. True data integrity does not come from such a one-size-fits-all approach. Full control and knowledge of your data’s lifecycle, together with the upgrading of legacy systems, is the only way financial institutions will be able to build the strong foundations of true data quality.