Big data – Promises and Pitfalls for Consumers and the Finance Industry

Think Forward Initiative-CEPR Session
Posted on May 10, 2018

Nature has its own way of encoding information and creating triggers that generate complex responses to changes in the status quo. The simple shift in weather patterns from winter to spring sets a complex chain of events in motion: flowers bloom, insects and bees thrive, and every plant and animal species incorporates the signal from the changing weather to determine its own efficient survival path.

For centuries, human beings have used similar information signals, gleaned from observing their fellow humans, to do precisely the same. Extracting a signal from observed actions took a long time, and the actions one could observe were typically limited to one's own experience or to whatever fell within the physical reach of the human eye.

In the past decade, however, this process of learning from observed action has been put on a digital highway. Billions of individuals today have a digital footprint. Powerful technological tools allow machines to mine this ever-growing footprint of fellow human beings efficiently and to extract signals that help people lead more efficient lives. The speed, accuracy and sheer volume of information used to extract these signals have generated excitement across all walks of life: from the simple and mundane, such as finding the fastest route from home to work, to complex choices such as financial investments, housing and employment.

This revolution rests on two key aspects of individuals' digital footprints. Institutions – public and private – can aggregate personal data across millions of individuals, and they can view such information across seemingly unconnected domains of an individual's personal life. The ability to build a unified picture across different dimensions of digital life, and to aggregate such data across individuals, creates endless opportunities for substantial efficiency gains for every individual.

At the same time, this ability brings to the fore the concern that such tools can be used for subversive purposes that individuals may never have signed up for in the first place. The European answer to this challenge is the General Data Protection Regulation (GDPR), which comes into force on May 25, 2018.

In this setting, the Think Forward Initiative and CEPR invited a leading thinker from the industry – Suzanne Frey, Director of Google Apps Security, Trust, Compliance and Privacy, Google – to discuss the promises and pitfalls of big data for consumers and the finance industry at large. This was followed by comments and a response from Chris Bannocks, Chief Data Management Officer at ING.

Enabling Trust

The ultimate gain from the data revolution, according to Suzanne Frey, is that machines get it right most of the time, generating large savings in human effort. Diving deeper into how to deliver on this promise, however, Suzanne pointed to trust as the foundation of the gains to be reaped from this new world. Returning to the familiar framework of Maslow's hierarchy of needs, she noted how technology touches almost every basic level of that hierarchy. Reaching back centuries into the anthropology of enabling trust, she drew parallels between age-old practices and their digital counterparts: clinking beer mugs to signal that no drink has been poisoned, and the technology world's foundations of sandboxing, anti-virus scanning and internet security through HTTPS; handshakes to show that no weapons are involved, and TLS, TCP/IP and Security Keys; and the ancient Mongolian practice of tying flags to an outpost to signal passage through a property, and user-flagging, notice and consent, and the right to be forgotten. These foundations, she argued, have built an environment in which individuals trust the digital world more than ever before, and this forms the basis of enabling trust.

Equally important for creating a trusting environment, she argued, is recognizing the limits of what machines can do. When a machine has no benchmark for the right answer – say, a voice-enabled computation in Google Sheets over data the user has provided in the spreadsheet – it is critical to be transparent about how the machine arrived at its answer. She homed in on the need, in such situations, to place the agency and choice of accepting the machine's output squarely with the user it serves. Transparency about how the answer was reached, combined with the user's agency to accept or modify it, is what enables trust, she emphasized.
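The transparency-plus-agency pattern she describes can be illustrated with a small hypothetical sketch (the names and the Sheets-like averaging function below are illustrative assumptions, not anything from Google's actual products): the machine's answer travels with an explanation of how it was derived, and takes effect only after the user explicitly accepts it.

```python
from dataclasses import dataclass

@dataclass
class Suggestion:
    value: float       # the machine's proposed answer
    explanation: str   # how it was derived (transparency)

def suggest_average(cells):
    """Hypothetical assistant answering 'what is the average of these cells?'"""
    value = sum(cells) / len(cells)
    explanation = f"AVERAGE of {len(cells)} cells: sum={sum(cells)}, n={len(cells)}"
    return Suggestion(value, explanation)

def apply_if_accepted(suggestion, user_accepts):
    # Agency: the answer is written back only after explicit user consent.
    return suggestion.value if user_accepts else None

s = suggest_average([10, 20, 30])
print(s.explanation)               # the user sees how the answer was produced
print(apply_if_accepted(s, True))  # 20.0, but only once the user accepts
```

The point of the sketch is the shape, not the arithmetic: the explanation is a first-class part of the result, and nothing is applied without an affirmative user choice.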

Critically, her view of the promises and pitfalls of big data hinged on the notion that trust determines how institutions and governments deal with the pitfalls, and shapes the delivery of the promised gains from using big data. Consent is a critical element of this trust ecosystem – the first step before any adoption and use of such technology – which suggests a few ways to measure the ecosystem's health: the number of individuals who consent to and adopt the technology; the explicit adoption of machine-learning solutions; and, intriguingly, the black-market price of hacking a particular environment for its data (the higher the price, the better the technology defending the system, and the stronger the basis for individuals' trust).

Going beyond rules: Enable an ethics-based approach to data within institutions

While broadly welcoming the GDPR in Europe, both Suzanne Frey and Chris Bannocks agreed that it enables a more trusting environment by giving individuals more control over their own personal data. In his response to Suzanne's presentation, however, Chris Bannocks added that institutions have a critical role to play by going above and beyond the rules-based environment now in place. The finance industry, he argued, is built entirely on trust, and even a single breach can have cascading effects on the public's perception of its institutions. To remain vigilant and ensure that households and counterparties in financial transactions keep trusting technology-driven decisions, Chris suggested that data quality, transparency about how data is used and how results are arrived at, and control over what data is used for are only the foundation of the work involved; institutions, especially in finance, must adopt an ethics-based approach to the problem, as he has set in motion within ING.

Concluding his thoughts on how to address the pitfalls and deliver the promises of big data, he added that consumers and the industry do not care whether the law allowed a particular use of data, only whether that use was socially and ethically palatable. In other words, the bar for accepting big-data-driven decisions and solutions in each use case is set not where the law is, but where society is – and that bar is often much higher than the law's.