Getting it Right on Fintech Means Balancing the Utility of Products with Strong Consumer Protections

by
Adam Rust

Why, at a time when Mark Zuckerberg is being pilloried by members of Congress for putting his company’s interests ahead of his users’, are policymakers creating an opening for financial companies to try the same?

People frame the question in a zero-sum context: either we allow companies to innovate in the spirit of bringing new solutions to the marketplace, or we handcuff progress in the name of protecting the rights of consumers.

The idea of a regulatory sandbox, currently being advanced in Washington, sides with the proponents of innovation. Unfortunately, its backers see the question as one or the other. Their support for innovation consists entirely of reducing protections. On other approaches, they are silent. The best tools of economic development, such as a subsidized loan fund, workforce training, or technical advice, are missing. The answer, it seems, is to take from consumers in order to give to businesses.

I am not sure why we need government to do that, and equally, I do not understand why policymakers believe the public benefits when government eliminates protections.

Sometimes a company with the best of intentions unwittingly creates a product with side effects that harm people. In my opinion, our technology community places too much faith in the power of data to enhance our communities. The truth is more complicated. Data can be a force for good, but it is neither good nor bad on its own; everything depends on the context of where it comes from and how it is applied.

“The real lesson we learn over and over again in banking,” wrote Karen Shaw Petrou in an American Banker editorial, “is that retroactive consumer protection leaves a lot of badly hurt, vulnerable households in the ditch. It needs to be thought of now before these products become even larger and more dominant in the financial system.”

About ten years ago, during a trip to the Bay Area, I had the opportunity to share a meal with a group of youngish start-up types. Most worked at a new company that wanted to aggregate data on local schools and then white-label it for consumer-facing real estate brokers. The idea was to create a one-stop method of grading school quality so that people contemplating the purchase or rental of a home could quickly evaluate nearby schools.

They succeeded in their goal. The company built a system that could capture education data, feed it into an algorithm, and produce a single 1-10 numerical grade for any school. There was no human judgment; the process was entirely objective.

Unfortunately, they created a system whose outputs correlated strongly with local socioeconomics. Statistics on student performance correlate highly with the socioeconomic status of the pupils. In its summary of the research on education and socioeconomic status, the American Psychological Association comments:

Research indicates that children from low-SES households and communities develop academic skills slower than children from higher SES groups. For instance, low SES in childhood is related to poor cognitive development, language, memory, socioemotional processing, and consequently poor income and health in adulthood. The school systems in low-SES communities are often under-resourced, negatively affecting students’ academic progress and outcomes. Inadequate education and increased dropout rates affect children’s academic achievement, perpetuating the low-SES status of the community.

By blindly collecting inputs out of context and then amplifying them to realtors, an effort to inform consumers had the unintended effect of reinforcing neighborhood inequality.
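To make the mechanism concrete, here is a minimal sketch of how such a grade might be computed. The field names, weights, and example figures are my own illustrative assumptions, not the company’s actual model; the point is that a score built only from proficiency rates inherits whatever those rates already correlate with.

from dataclasses import dataclass

@dataclass
class SchoolRecord:
    name: str
    math_proficiency: float     # share of students passing the state math test (0 to 1)
    reading_proficiency: float  # share passing the state reading test (0 to 1)
    pct_free_lunch: float       # share on free/reduced lunch (0 to 1), a common SES proxy

def grade(school: SchoolRecord) -> int:
    """Collapse test results into a 1-10 grade with no human judgment."""
    composite = 0.5 * school.math_proficiency + 0.5 * school.reading_proficiency
    return max(1, min(10, round(composite * 10)))

schools = [
    SchoolRecord("Hillside Elementary", 0.88, 0.91, 0.08),   # hypothetical figures
    SchoolRecord("Riverside Elementary", 0.52, 0.49, 0.81),  # hypothetical figures
]

for school in schools:
    # Nothing in grade() looks at income, yet the ranking tracks pct_free_lunch
    # almost exactly, because proficiency rates and SES are themselves correlated.
    print(school.name, grade(school), f"free/reduced lunch: {school.pct_free_lunch:.0%}")

Remove the free-lunch column entirely and the grades do not change; the bias rides in on the test scores themselves.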

The use of data can sharpen our thinking, clarify our decision-making, and expose close-mindedness for its shortcomings. Nonetheless, everyone should be at least partially skeptical of data’s benefits, because all data is vulnerable to the flaws of its sources. The garbage-in, garbage-out principle, an idea often traced back to Charles Babbage in 19th-century England, remains as accurate today as it was then.

Fintechs will not be the first players in banking to make mistakes in how they use data to make decisions. We have fair lending laws because many banks refused to extend credit to applicants from protected classes for reasons that had nothing to do with their creditworthiness. Redlining, the practice of drawing red lines on maps to indicate where loans could and could not be made, began in the 1930s as an opaque means of limiting mortgage lending to “desirable” (that is, not Black) neighborhoods. We addressed those mistakes, but it took time. Can we be more responsive now? I believe we must, if only because the pace of change is far greater.


In a fintech culture where companies succeed or perish in a matter of years, collateral consequences often take a back seat. The truism “do it now and ask forgiveness later” applies to much of the decision-making. It may not be a stretch to pin that tendency on the captains of the game: the venture capitalists who fund most fintech enterprises.

It seems like a safe assertion that any data collected by a fintech will survive beyond the lifespan of the company that initially received it.

We already know that a mistake can have systemic repercussions. Equifax failed to protect the data it collected, and as a result, hackers stole the personally identifying information of 148 million consumers. The Federal Trade Commission fined the company, but I do not believe consumers were made whole. While the company did agree to pay for one year of credit monitoring for each victim, consumers will remain vulnerable to fraud forever.

Some of our leaders in Washington appear to be working to reduce our protections. The Bureau of Consumer Financial Protection, under the non-leadership of Acting Director Mick Mulvaney, chose not to pursue any enforcement action against Equifax. Representative Patrick McHenry (R-NC) subsequently introduced a “regulatory sandbox” proposal that would significantly reduce regulatory supervision of fintech startups below a certain size (i.e., 10,000 users).

Never mind that, by its very nature, regulation will always be catching up to innovation. Now the entities charged with protecting consumers have decided to wear blindfolds.

They need to rethink their laissez-faire instincts. Your financial information is a high-value target; it is not like the number on your grocery store discount club card. Your Social Security number, when coupled with information available from legitimate data vendors, can unlock your assets to an enterprising criminal.

Fair commerce works in an atmosphere where both parties to a transaction make decisions with full information. That should be the case in any context. In situations where search costs make full information infeasible, as with food or prescription drugs, we need regulatory intervention to create a safe marketplace. We do not have that in fintech.

We need to address the process of disclosure. When is the last time you read the full terms and conditions associated with an app? I bet the answer is “never.” Do you realize that when you grant the Bitmoji keyboard “full access,” it can collect what you type on your phone’s keyboard? Not just the messages you send using Bitmoji, but everything from text messages to search engine requests. You have given Bitmoji access to your sensitive information: your passwords, your emails, and, to the point of this blog, your banking details.

We need to pay more attention to APIs. Integrating apps to fulfill complicated requests leads to cross-sharing of data. When you integrate your GPS with your search engine to facilitate a mapping request, you have shared location data with your search engine provider; a simple sketch of that hand-off follows below. We should treat the moment when two apps co-mingle their data storehouses as a critical point for policy.
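Here is a minimal sketch of that hand-off, assuming a hypothetical search endpoint. The URL, parameters, and coordinates are placeholders invented for illustration, not any real provider’s API; the point is simply to show what leaves the device when one app asks another for help.

import requests

def current_location() -> tuple[float, float]:
    # Stand-in for a device GPS reading (hypothetical coordinates).
    return (35.78, -78.64)

def build_search_request(query: str) -> str:
    """Show exactly what a 'nearby' search hands to the search provider."""
    lat, lon = current_location()
    # The moment this request is assembled, the device's coordinates become part
    # of the payload the search provider will receive and can retain.
    prepared = requests.Request(
        "GET",
        "https://search.example.com/v1/places",  # hypothetical endpoint
        params={"q": query, "lat": lat, "lon": lon},
    ).prepare()
    return prepared.url

print(build_search_request("coffee near me"))
# e.g. https://search.example.com/v1/places?q=coffee+near+me&lat=35.78&lon=-78.64

The user asked one app a question; a second company now holds a timestamped record of where the user was standing when they asked it.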

We will walk a difficult path. Consumers derive great utility from their technology. The arrival of the smartphone has advanced our quality of life in many ways. I believe that consumers will want to integrate their banking across all of their devices, and when they do, they will gain more control over their finances. The opportunity for fintech to improve financial health is real. However, the downsides cannot be ignored. The incursion of technology into our private lives continues at a blistering pace, and for many young people, privacy has ceased to be a feasible expectation for their future. The right answer will find a way to let people get the benefits they want without giving up the privacy they deserve.
