It’s a fait accompli that fintechs will incorporate “big data” into many aspects of their decision-making. While we cannot stop the development of technical innovation, we should pay attention to how it affects our interests. In my opinion, we need to make sure that those algorithms are used in ways that are fair for everyone. This is certainly true in the financial space, where financial institutions are deploying analytical systems that tilt the balance of power in their favor and against the interests of consumers.
We have fair lending laws because many banks refused to extend credit to applicants from protected classes for reasons that had nothing to do with their creditworthiness. Red-lining, the practice of drawing red lines across maps to indicate where loans could and could not be made, began in the 1930s as an opaque means of limiting mortgage lending to “desirable” (not black) neighborhoods in cities.
“The real lesson we learn over and over again in banking,” wrote Karen Shaw Petrou in an American Banker editorial, “is that retroactive consumer protection leaves a lot of badly hurt, vulnerable households in the ditch. It needs to be thought of now before these products become even larger and more dominant in the financial system.”
Sometimes a company with the best of intentions unwittingly creates a product with side effects that harm people. In my opinion, our technology community places too much faith in the power of data to enhance our communities. The truth is more complicated. Data can be a force for good, but without considering the context of where it comes from and how it will be applied, it is neither good nor bad.
About ten years ago, during a trip to the Bay Area, I had the opportunity to share a meal with a group of youngish start-up types. Most worked at a new company that wanted to aggregate data on local schools and then white label it to consumer-facing real estate brokers. The idea was to make a one-stop method of grading school quality so that people could quickly evaluate the quality of nearby schools when they were contemplating the purchase or rental of a home.
They succeeded in their goal. The company built a system that could capture education data, feed it into an algorithm, and produce a single 1–10 numerical grade for any school. There was no human judgment – on its face, it was an entirely objective process.
Unfortunately, they created a system whose outputs correlated strongly with local socioeconomics. Statistics on student performance correlate highly with the socioeconomic status of the pupils. In the abstract to its hosted discussion of the question, the American Psychological Association comments:
Research indicates that children from low-SES households and communities develop academic skills slower than children from higher SES groups. For instance, low SES in childhood is related to poor cognitive development, language, memory, socioemotional processing, and consequently poor income and health in adulthood. The school systems in low-SES communities are often under-resourced, negatively affecting students’ academic progress and outcomes. Inadequate education and increased dropout rates affect children’s academic achievement, perpetuating the low-SES status of the community.
By blindly collecting inputs out of context and then amplifying them to realtors, an effort at informing consumers had the unintended effect of reinforcing neighborhood inequality.
The use of data can sharpen our thinking, clarify our decision-making, and expose close-mindedness for its shortcomings. Nonetheless, everyone should be at least partially skeptical of data’s benefits, because data is only as good as the process that produced it. The garbage-in, garbage-out principle, whose underlying idea dates back to 19th-century England, remains as accurate today as it was then.
In a fintech culture where companies succeed or perish in a matter of years, collateral consequences often take a back seat. The truism “do it now and ask forgiveness later” applies to much decision-making. It may not be a stretch to pin that tendency on the captains of the game – the venture capitalists who fund most fintech enterprises.
It seems like a safe assertion that any data collected by a fintech will survive beyond the lifespan of the company that initially received it.
We already know that a mistake can have systemic repercussions. Equifax failed to protect the data it collected, and as a result, hackers stole the personally-identifying information of 148 million consumers. The Federal Trade Commission fined the company, but I do not believe consumers were made whole. While the company did have to agree to pay for one year of credit monitoring for each victim, consumers will remain vulnerable to fraud forever.
Political leaders in Washington want to reduce our protections. The Bureau of Consumer Financial Protection, acting under the non-leadership of non-director Mick Mulvaney, chose not to take any enforcement action against Equifax. Representative Patrick McHenry (R-NC) subsequently introduced a “regulatory sandbox” proposal that would significantly reduce the regulatory supervision of fintech startups below a certain size (i.e., 10,000 users).
Never mind that, by its very nature, regulation will always be catching up to innovation. Now the entities charged with protecting consumers have decided to wear blindfolds.
They need to rethink their laissez-faire instincts. Your financial information is a high-value target; it is nothing like the number on your grocery store discount card. Your Social Security number, coupled with information available to anyone from legitimate data vendors, can unlock your assets to an enterprising criminal.
Fair commerce depends on both parties making decisions with full information. That should be the case in any context. Where search costs make full information infeasible – as with food or prescription drugs – we need regulatory intervention to create a safe marketplace. We do not have that in fintech.
We need to address the process of disclosure. When is the last time that you read the full terms and conditions associated with an app? I bet the answer is “never.” Do you realize that your Bitmoji app is collecting all of the data that you have ever entered into your phone’s keyboard? Not just the messages you sent using Bitmoji, but everything from text messages to search engine requests. You have given Bitmoji full access to all of your sensitive information – your passwords, your emails, and to the point of this blog – your banking details.
We need to pay more attention to APIs. The integration of apps to fulfill complicated requests leads to cross-sharing of data. When you integrate your GPS with your search engine to facilitate a mapping request, you have shared location data with your search engine provider. We should see the moment when two apps co-mingle their data storehouses as an essential moment for policy considerations.
Nonetheless, APIs do have their virtues. APIs reduce friction. APIs make things work.
Consider the differences between the two main ways that consumers can authorize a bank-to-bank transfer. Anyone with several bank accounts has probably had this experience. Some financial institutions send two micro-deposits before making the first external transfer. The transmission goes over the ACH rails, and as a result, it usually takes several days to go through. In the meantime, the consumer has to wait. Once the deposits finally hit, the account holder has to sign back into their account and verify the amount of each one. The process is friction-filled, slow, and cumbersome enough to thwart efforts to initiate a transfer.
On the other hand, with an API in place, the consumer can authorize the sending bank to access the account at the receiving bank. Authorization is seamless. The process can be completed in minutes.
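The contrast between the two flows can be sketched in simplified Python. This is an illustrative model, not any real bank’s API: the class names, the token string, and the settlement timing are all assumptions made for the sake of the example.

```python
import random

class MicroDepositVerification:
    """Legacy flow: two small ACH deposits, verified days later by the consumer."""
    def __init__(self):
        # The bank sends two random deposits under $1.00 over the ACH rails.
        self.deposits = (random.randint(1, 99), random.randint(1, 99))  # cents
        self.settlement_days = 2  # ACH typically settles in 1-3 business days

    def verify(self, amount1, amount2):
        # The consumer must log back in and re-enter both amounts
        # to prove ownership of the external account.
        return (amount1, amount2) == self.deposits


class ApiAccountLink:
    """API flow: the consumer signs in once and the bank receives an access token."""
    def __init__(self, credentials_ok):
        # In a real integration, the token would come from an OAuth-style
        # exchange; here it is a hypothetical placeholder.
        self.token = "tok_demo_123" if credentials_ok else None

    def verify(self):
        # Authorization is immediate: no multi-day wait, no manual re-entry.
        return self.token is not None


micro = MicroDepositVerification()
a, b = micro.deposits  # in practice, read off a bank statement days later
print(micro.verify(a, b))                          # True, after settlement_days

link = ApiAccountLink(credentials_ok=True)
print(link.verify())                               # True, within one session
```

The point of the comparison is structural: the legacy flow requires a round trip through the payment rails plus a second consumer action, while the API flow collapses verification into a single authenticated session.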
Moreover, once a bank has access to an account via an API-authorized sign-on, it can stay current with your account information. The Varo Money Card taps this functionality to create global money management tools for its account holders.
We will walk a difficult path. Consumers derive great utility from their technology. The arrival of the smartphone has advanced our quality of life in many ways. I believe that consumers will want to integrate their banking across all of their devices, and when they do, they will gain more control over their finances. The opportunity for fintech to improve financial health is real. However, we cannot ignore the downsides. The incursion into our privacy continues at a blistering pace, and for many young people, privacy has ceased to be a feasible expectation for their future. The right answer will find a way of letting people get the benefits they want without giving up the privacy they deserve.