Big data, finance and inequality
… Financial companies have the option of using data-guzzling technologies that make the observation of shopping habits look downright primitive. A plethora of information gathered from social media, digital data brokers and online trails can be used to mathematically determine the creditworthiness of individuals, or to market products specifically targeted to them.
The degree to which such algorithms are utilised by mainstream banks and credit card companies is unclear, as are their inputs, calculations and the resulting scores. While many types of data-driven algorithms have been criticised for opacity and intrusiveness, the use of digital scorecards in finance raises additional issues of fairness. Using such information to make predictions about borrowers can, critics say, become self-fulfilling, hardening the lines between the wealthy and poor by denying credit to those who are already associated with not having access to it.
“You can get in a death spiral simply by making one wrong move, when algorithms amplify a bad data point and cause cascading effects,” says Frank Pasquale, a professor of law at University of Maryland and author of a book on algorithms called The Black Box Society.
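The feedback loop Pasquale describes can be sketched in a few lines. This is a purely hypothetical toy model, not anything a real lender discloses: every number, penalty, and rate formula below is invented for illustration of how one negative data point can compound.

```python
# Hypothetical sketch of the "death spiral": one negative data point lowers
# a score, which worsens the rate offered, which makes the next negative
# mark more likely. All rules and numbers are invented for illustration.

def score(base, negative_marks):
    # Each recorded negative mark knocks a fixed penalty off the score.
    return max(300, base - 60 * negative_marks)

def offered_rate(s):
    # Worse scores get steeper rates (an illustrative linear mapping).
    return 0.05 + (850 - s) / 850 * 0.25

def simulate(base=700, rounds=4):
    marks, history = 1, []  # start with a single "wrong move"
    for _ in range(rounds):
        s = score(base, marks)
        r = offered_rate(s)
        history.append((marks, s, round(r, 3)))
        # A steep rate makes a missed payment (another mark) more likely;
        # here, deterministically, any rate over 10% adds a mark.
        if r > 0.10:
            marks += 1
    return history

for marks, s, r in simulate():
    print(f"marks={marks} score={s} rate={r:.1%}")
```

Each round the borrower's score falls and the offered rate rises, even though only one genuine "wrong move" ever happened: the rest is the model amplifying its own earlier output.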
I’ve said before that I am incredibly proud of this Financial Times piece exploring the impact of big data on finance and equality. Researching this kind of topic is challenging because details on the use of big data remain murky – even more so when it comes to banks and financial companies. For that reason, much of the discussion remains theoretical, although it’s hard not to believe that this is the direction we are heading when you read that Google – a company notorious for using big data to personalise ads and search engine results in the name of advertising dollars – is now trialling money transfers. The British bank Barclays has reportedly also begun selling aggregated customer data to third-party companies.
The FT article focuses on two main uses for big data in finance: to augment financial decisions and processes such as loan underwriting, or to analyse customers’ behaviour and status and then push specific products and services to them. It’s the second use that many find to be the most insidious:
… Big data and algorithm-driven consumer scoring may find its greatest use in marketing financial products, rather than underwriting them. Nielsen Prizm looks at postcodes to classify customers into categories ranging from “upper crust” to “low-rise living” — a group it describes as a “transient” world of “ethnically diverse” singles and single parents. The use of such precise targeting means that those lumped in with an undesirable group may never get a chance at certain loans or financial products simply because they never see them advertised.
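The gating mechanism the excerpt describes is simple to sketch. The segment names echo the article's examples, but the postcodes, the segment lookup, and the offer table below are all hypothetical; real tools like Nielsen Prizm do not publish their mappings.

```python
# Minimal sketch of postcode-based marketing segmentation. The postcodes
# and offer table are invented; only the segment labels come from the
# article's description of Nielsen Prizm.

SEGMENT_BY_POSTCODE = {
    "SW1A": "upper crust",
    "BD5": "low-rise living",
}

# Offers are gated per segment: customers outside a segment never
# even see the products it unlocks.
OFFERS_BY_SEGMENT = {
    "upper crust": ["premium card", "wealth management", "low-rate loan"],
    "low-rise living": ["prepaid card", "high-rate loan"],
}

def offers_shown(postcode):
    segment = SEGMENT_BY_POSTCODE.get(postcode, "unclassified")
    return OFFERS_BY_SEGMENT.get(segment, ["basic account"])

print(offers_shown("SW1A"))
print(offers_shown("BD5"))
```

The point of the sketch is what is absent: there is no path by which someone in the "low-rise living" bucket is ever shown the low-rate loan, regardless of their individual circumstances.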
Such targeted advertising and products are particularly worrying in the realm of financial services because limiting access to credit (or offering different types or rates of credit to different groups of people) can have obvious and sinister implications. The opportunities presented to you become defined by the opportunities already associated with your socioeconomic peers. It’s a digital caste system defined by data, and one which may be hardening while remaining under the radar of consumers, regulators and politicians.
Speaking of which, I was amazed to read the excerpt below from Henry Kissinger’s new book, taken from Izabella Kaminska’s excellent FT Alphaville post on banking, technology and targeted advertising.
Here’s Kissinger:
Nearly every website contains some kind of customisation function based on Internet tracing codes designed to ascertain a user’s background and preferences.
These methods are intended to encourage users to ‘consume other content’ and, in so doing, be exposed to more advertising, which ultimately drives the Internet economy. These subtle directions are in accordance with a broader trend to manage the traditional understanding of human choice. Goods are sorted and prioritised to present those ‘which you would like’, and online news is presented as ‘news which will best suit you’.
Two different people appealing to a search engine with the same question do not necessarily receive the same answers. The concept of truth is being relativized and individualized – losing its universal character.
Who knew a 91-year-old political scientist had such a brilliant grasp of the implications of big data?