Creating Value with Data Currency

November 19, 2019

As traditional banking focuses on customer experiences, financial institutions must evolve into more data-driven cultures. Banks generate substantial data trails revolving around their customers. To add significant value to their relationships, they must stay abreast of changes to their customers’ personas on a near-real-time basis.

Data as an asset

Data is the new air, and banks that breathe the best will win. Banks are the epicenter of all types of data – from psychographic to contextual. Organizations that obtain meaningful information the fastest can apply deeper customer insights and take proactive measures to give their customers an ultra-personalized experience. Banks that cannot adequately analyze their data in a timely manner risk becoming less relevant over time. Gartner terms the value organizations gain from data as Infonomics. It is the emerging discipline of managing and accounting for information with virtually the same rigor and formality as other traditional assets (e.g., financial, physical, intangible, human capital).

Infonomics suggests that information itself meets all the criteria of formal company assets. While not yet recognized by Generally Accepted Accounting Principles (GAAP), it is increasingly incumbent on organizations to behave as if it were – to optimize information’s ability to generate business value.

Making contextual sense

The key to making real-time, selective data feeds a reality is to offer banks a platform enabling them to listen for and select meaningful changes occurring within their banking ecosystems. This can be accomplished with technology that first “listens” and records transactions from the bank’s various systems (deposits, lending, originations, card payments and other critical bank applications). With the proper technology tools, this information can be relocated into a central data hub that filters the identified changes and adds supplemental, contextual data to them. Because a bank can administer the filters and dictate the nature and frequency of the changes it wants to review, it will not “drown in its own data.”
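The listen–filter–enrich flow above can be sketched in a few lines. This is a minimal illustration, not an FIS implementation: the event fields, the `SUBSCRIBED_TYPES` filter and the `data_hub` function are all hypothetical names chosen for the example.

```python
from datetime import datetime, timezone

# Hypothetical raw events recorded from core banking systems.
# The field names are illustrative, not an actual FIS schema.
EVENTS = [
    {"system": "deposits", "type": "address_change", "customer_id": "C100"},
    {"system": "cards", "type": "card_swipe", "customer_id": "C100"},
    {"system": "lending", "type": "loan_paid_off", "customer_id": "C200"},
]

# A bank-administered filter: only these change types are deemed meaningful.
SUBSCRIBED_TYPES = {"address_change", "loan_paid_off"}

def enrich(event):
    """Add supplemental, contextual data to an identified change."""
    return {
        **event,
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "source": "central_data_hub",
    }

def data_hub(events, subscribed_types):
    """'Listen' to raw events, keep only subscribed changes, add context."""
    return [enrich(e) for e in events if e["type"] in subscribed_types]

meaningful = data_hub(EVENTS, SUBSCRIBED_TYPES)
# Only the two subscribed changes survive; the routine card swipe is
# filtered out, so the bank reviews changes rather than raw volume.
```

Because the bank controls the subscription set, the same stream can feed different consumers with different filters.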

Potential uses

The potential applications of a data-gathering and contextual solution within a financial institution are unlimited. With FIS Code Connect, banks can subscribe to events ranging from something as simple as confirming a customer’s email change to a more complicated product recommendation based on changes in customer attributes. Near-real-time data can provide current intelligence for an important sales call, help reconcile accounts and improve the immediacy of financial reporting.

The data collected can drive many systems and functions such as:

  • Enhancing fraud detection based on an understanding of certain events in a payments platform that trigger speedy notification when authorization is lacking. This can be especially valuable in situations involving multiple players, such as Apple Wallet, the bank and the cardholder
  • Feeding operational data stores to fuel faster decision-making processes
  • Driving data analytics rules engines that easily create actions such as rounding up payment transactions for immediate deposit into a savings account
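The round-up rule in the last bullet is simple enough to show directly. The sketch below assumes a hypothetical rules engine calls a function like this for each card purchase; the function name and the use of `Decimal` for currency are the author’s illustrative choices, not a specific product API.

```python
from decimal import Decimal, ROUND_CEILING

def round_up_transfer(purchase_amount: Decimal) -> Decimal:
    """Round a purchase up to the next whole dollar and return the
    difference, which a rules engine would sweep into savings."""
    rounded = purchase_amount.to_integral_value(rounding=ROUND_CEILING)
    return rounded - purchase_amount

# A $4.35 coffee purchase sweeps $0.65 into the savings account;
# a whole-dollar purchase sweeps nothing.
savings = round_up_transfer(Decimal("4.35"))
```

Using `Decimal` rather than floating point avoids rounding surprises with currency amounts.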

The cost of bad data

According to a recent American Banker article, one of the most difficult hurdles to adopting artificial intelligence software is having a data environment that's not ready for it. A prediction or pattern recognition engine that is applied to information that is out of date, for example, could lead to disastrous consequences (e.g., an investor being given bad advice, or a borrower being approved for a loan who clearly shouldn’t be).

Without timely, reliable data, meaningful event notifications and alerts do not create value, because the ability to process and analyze data depends on the timeliness of the data itself.

The acceleration of data analytics requires real-time information to support real-time notifications. Timely data also fuels fintech experiences like Apple Pay, which depend on up-to-the-moment information to support customer interactions.

The cost of bad or missing data can be severe, and it is becoming more apparent throughout the U.S. Most companies responding to a recent data quality survey report that poor data quality destroys business value, and Gartner research has found that organizations attribute losses averaging $15 million per year to poor data quality.

With high-quality, timely data demonstrating such primary importance, the question bankers ask themselves becomes: If many of the services banks provide no longer make a profit, where is the money to be made in the future? …

About the Author
Connie Davis, SVP, Digital Commerce, FIS

Tom Ruppel, FIS IT Group Executive
