Payments Leader

Artificial Intelligence – The New Frontier in Banking, Part 2: Extending Through the Payment Ecosystem – by Maria Schuld

December 12, 2017

Maria Schuld, Group Executive – Financial Services Group

This is the second article in a two-part series on the deployment of artificial intelligence in banking, and it focuses on applications beyond fighting fraud. Part 1 of Artificial Intelligence – The New Frontier in Banking discussed how important machine learning has become to financial institutions in the fight against fraud.

Artificial intelligence is poised to revolutionize payments for many of the same reasons it has become so important in other areas of the financial industry. To stay current, businesses must understand the capabilities of AI.

Increasing relevancy of offers – a win for loyalty

Artificial intelligence applications can conquer the challenge of rewarding loyal customers for their business in ways that are directly relevant to those individuals. For instance, analysis of behavioral patterns and other customer data through AI can drive more personalized and relevant recommendations – from helping consumers improve their financial health to offering more targeted rewards that drive sales.

The 2017 FIS PACE survey found that banks worldwide have fallen short of consumers’ expectations around being rewarded for their business, especially among the youngest generation of customers. Fortunately, AI applications that increase the relevancy of rewards can help close that gap. For example, a bank could give a millennial cardholder a meaningful discount for dining at a restaurant adjacent to an arena where the cardholder has just bought tickets to a sporting event. That not only strengthens the cardholder’s loyalty to the bank’s card, but also gives businesses an incentive to partner with the bank.
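To make that concrete, here is a minimal, purely illustrative sketch of how recent card activity could be matched to offers. The merchant categories, offer rules and data structures are hypothetical assumptions for illustration, not a description of any bank’s or FIS’s actual system.

    # Illustrative sketch: matching card offers to recent spending activity.
    # Merchant categories, cities and offer rules below are hypothetical.
    from dataclasses import dataclass
    from datetime import datetime, timedelta

    @dataclass
    class Transaction:
        merchant_category: str   # e.g., "event_tickets", "dining"
        merchant_city: str
        timestamp: datetime

    @dataclass
    class Offer:
        description: str
        trigger_category: str    # recent activity that makes the offer relevant
        city: str

    def relevant_offers(history, offers, lookback_days=7):
        """Return offers whose trigger category and city match recent spend."""
        cutoff = datetime.now() - timedelta(days=lookback_days)
        recent = [t for t in history if t.timestamp >= cutoff]
        return [
            o for o in offers
            if any(t.merchant_category == o.trigger_category
                   and t.merchant_city == o.city for t in recent)
        ]

    if __name__ == "__main__":
        history = [Transaction("event_tickets", "Milwaukee", datetime.now())]
        offers = [Offer("20% off dinner near the arena", "event_tickets", "Milwaukee")]
        for offer in relevant_offers(history, offers):
            print("Push offer:", offer.description)

In practice, a machine-learning model would replace the hard-coded rule, ranking candidate offers by predicted relevance rather than by simple category and city matching.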

More frictionless communications – moving from typing to talking

According to TechCrunch, AI-powered voice recognition is reshaping consumer interfaces, and ComScore predicts that 50 percent of searches will be voice searches by 2020.

Many financial institutions have debuted “skills” for Amazon’s Alexa that allow account holders to check balances, review charges and make payments.
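As a rough illustration of how such a skill works, the backend is typically an AWS Lambda function that inspects the intent in Alexa’s request and returns spoken text. The sketch below assumes a hypothetical “CheckBalance” intent and a placeholder get_balance lookup; a real banking skill would also require account linking, authentication and proper error handling.

    # Minimal sketch of a Lambda handler for a hypothetical "CheckBalance"
    # Alexa intent. The intent name and get_balance() are placeholders; a real
    # skill also needs account linking, authentication and error handling.

    def get_balance(account_id):
        # Hypothetical core-banking lookup; hard-coded for illustration.
        return 1234.56

    def lambda_handler(event, context):
        request = event.get("request", {})
        if (request.get("type") == "IntentRequest"
                and request.get("intent", {}).get("name") == "CheckBalance"):
            balance = get_balance(account_id="demo-account")
            speech = f"Your checking balance is {balance:.2f} dollars."
        else:
            speech = "You can ask me to check your balance."
        return {
            "version": "1.0",
            "response": {
                "outputSpeech": {"type": "PlainText", "text": speech},
                "shouldEndSession": True,
            },
        }

    if __name__ == "__main__":
        demo_event = {"request": {"type": "IntentRequest",
                                  "intent": {"name": "CheckBalance"}}}
        print(lambda_handler(demo_event, None))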

Assessing credit risk – a thorny issue

Artificial intelligence holds great promise for financial institutions, but questions persist around the fair use of AI in assessing credit risk. Bankers racing to deploy AI applications for assessing credit should let up on the gas long enough to reflect on this thorny issue.

On the plus side, AI can help “credit invisibles” – consumers with little or no recent credit history – gain access to credit through alternative data, such as telecom or utility payments. In fact, FICO sees considerable promise in using machine learning to analyze alternative data as a way of assessing the creditworthiness of such applicants.
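As a simplified illustration of the idea, the sketch below scores a thin-file applicant using only alternative-data features. The feature names and the tiny synthetic dataset are assumptions made for illustration; a production model would need real data, rigorous validation and regulatory review.

    # Illustrative sketch: scoring a "credit invisible" applicant with
    # alternative data (on-time utility and telecom payments). Features and
    # data are synthetic; this is not a production scoring model.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Columns: months of on-time utility payments, months of on-time telecom
    # payments, missed payments in the last year.
    X_train = np.array([
        [24, 24, 0],
        [18, 12, 1],
        [ 6,  3, 4],
        [ 2,  1, 6],
        [30, 28, 0],
        [ 4,  5, 5],
    ])
    y_train = np.array([1, 1, 0, 0, 1, 0])  # 1 = repaid comparable obligations

    model = LogisticRegression().fit(X_train, y_train)

    applicant = np.array([[20, 18, 1]])  # a thin-file applicant
    print("Estimated repayment probability:",
          model.predict_proba(applicant)[0, 1])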

But there’s a problem. What if the data the alternative algorithms use to judge creditworthiness isn’t accurate? Or what if the machine’s decision based on alternative data falls outside of compliance? Using factual inputs is easy: checking whether someone is in the military (covered by the Military Lending Act, which applies to active-duty service members and covered dependents) or appears on the FBI’s terrorist watch list (the Terrorist Screening Database). Using alternative inputs, however, can create problems down the road. Here’s why.

AI-based lending platforms use hundreds of data points to determine creditworthiness, propensity to default on loans and the likelihood of fraud. Even if protected characteristics are never fed into the model directly, data that is highly correlated with factors that bankers cannot legally consider can lead AI to make a discriminatory – and therefore illegal – decision.
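One practical way to start catching these proxies is to test how strongly each candidate input correlates with a protected attribute that is held out purely for auditing. The sketch below is illustrative only: the features, threshold and synthetic data are assumptions, and genuine fair-lending testing (for example, disparate-impact analysis under the Equal Credit Opportunity Act) goes well beyond a simple correlation check.

    # Illustrative fairness check: flag model inputs that correlate strongly
    # with a protected attribute held out for auditing. Data, feature names
    # and the threshold are hypothetical.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 1000

    protected = rng.integers(0, 2, n)                    # audit-only attribute
    features = {
        "utility_payment_months": rng.normal(18, 6, n),  # unrelated to it
        "zip_code_income_index": protected * 1.5 + rng.normal(0, 1, n),  # proxy
    }

    THRESHOLD = 0.4  # hypothetical cutoff for "needs review before use"
    for name, values in features.items():
        corr = abs(np.corrcoef(values, protected)[0, 1])
        status = "review as possible proxy" if corr > THRESHOLD else "ok"
        print(f"{name}: |corr| = {corr:.2f} -> {status}")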

Tremendous opportunity resides in artificial intelligence and machine learning. Financial institutions and other highly regulated users, however, must assess the risk factors associated with AI, particularly in the areas of credit, debit, and fraud:

  • Work with your processor to mine the consumer data that will be used in decision-making and ensure its accuracy.
  • Use caution when creating the if/then analyses (e.g., If consumer A has consistently made car loan payments, then consumer A will consistently make a mortgage payment) that underpin AI algorithms. Ensure that the if/then assumptions are fair to consumers.
  • Make sure the criteria by which consumers are turned down for credit fall within compliance guidelines. Decisions must be explainable and justifiable; a brief sketch of logging decline reasons follows this list.
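As a brief sketch of that last point, the code below records human-readable reasons alongside every decline so the decision can be justified to the applicant and to examiners. The reason wording, thresholds and inputs are hypothetical and purely illustrative.

    # Minimal sketch: record explainable reasons with each credit decision.
    # Thresholds, reason wording and inputs are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class Decision:
        approved: bool
        reasons: list = field(default_factory=list)  # adverse-action style reasons

    def decide(score, on_time_ratio, min_score=0.6, min_on_time=0.8):
        reasons = []
        if score < min_score:
            reasons.append(f"Model score {score:.2f} below cutoff {min_score}")
        if on_time_ratio < min_on_time:
            reasons.append(f"On-time payment ratio {on_time_ratio:.0%} "
                           f"below {min_on_time:.0%}")
        return Decision(approved=not reasons, reasons=reasons)

    print(decide(score=0.55, on_time_ratio=0.9))
    # Decision(approved=False, reasons=['Model score 0.55 below cutoff 0.6'])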
 
 

Maria Schuld

Group Executive – Financial Services Group

With over 20 years of experience in the financial and payments industry, Maria is the Group Executive for debit, credit, fraud operations and business management. She was a member of Metavante’s senior management team before its 2009 acquisition by FIS. Other areas of expertise include implementation management, account management, and professional services management.