Teaching AI to have a heart – FinTech Futures
I asked Kareem what the best approach was to solving the problem. He responded that “the first thing to do is a diagnosis”. Kareem told me that Fairplay has a tool that analyses a bank’s existing lending software for signs of discrimination. It tries to answer the following questions:
Is the algorithm fair?
If not, why not?
How could it be fairer?
What’s the economic impact on the business of being fair?
Do applicants who are rejected get a second look to see if they might resemble favoured borrowers?
Answering these questions forces institutions to look at their decision engines and find ways to re-train them.
Declined loan applications are re-evaluated using more complete information about the borrowers and different modelling techniques, to see whether the applicants resemble creditworthy people. For example, women tend to have inconsistent employment between the ages of 25 and 45. An employment gap would be a creditworthiness red flag for male borrowers, but not necessarily for women taking career breaks to raise children.
Kareem believes that lenders will increase their approval rates for women, Black people, and other non-White applicants by re-training algorithms and taking a second look at rejected customers, particularly those just below the approval threshold. By his estimates, the increase can be 10-30%, which is huge.
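The second-look process described above can be sketched in a few lines of Python. This is only an illustration of the idea, not FairPlay’s actual system: every threshold, score, and function name below is a hypothetical assumption, and a real decision engine would use trained models rather than pre-computed scores.

```python
# Sketch of a "second look" review: applicants declined by the primary
# model, but only just below the approval threshold, are re-scored with
# an alternative model built on more complete information (one that,
# for instance, does not penalise career-break employment gaps).
# All names and numbers are illustrative assumptions.

APPROVAL_THRESHOLD = 0.70   # primary model's cut-off (hypothetical)
SECOND_LOOK_BAND = 0.10     # how far below the cut-off we re-review

def primary_score(applicant):
    # Stand-in for the bank's existing decision engine.
    return applicant["primary_score"]

def second_look_score(applicant):
    # Stand-in for a re-trained model with richer features.
    return applicant["second_look_score"]

def decide(applicant):
    score = primary_score(applicant)
    if score >= APPROVAL_THRESHOLD:
        return "approve"
    # Near misses just below the threshold get a second look.
    if score >= APPROVAL_THRESHOLD - SECOND_LOOK_BAND:
        if second_look_score(applicant) >= APPROVAL_THRESHOLD:
            return "approve (second look)"
    return "decline"

applicants = [
    {"id": 1, "primary_score": 0.75, "second_look_score": 0.72},
    {"id": 2, "primary_score": 0.65, "second_look_score": 0.74},  # near miss, rescued
    {"id": 3, "primary_score": 0.65, "second_look_score": 0.60},  # near miss, still declined
    {"id": 4, "primary_score": 0.40, "second_look_score": 0.90},  # too far below the band
]
decisions = {a["id"]: decide(a) for a in applicants}
```

In this toy run, applicant 2 is the kind of borrower Kareem describes: rejected by the original model, but approved once richer information is considered.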