What should a bank know about us?

De Nederlandsche Bank N.V.

Technology is everywhere these days. As in many other sectors, the financial sector sees many decisions now being made by computers based on big data and artificial intelligence. What does that mean for us? Will algorithms determine the level of our mortgage? And is financial technology discriminatory, or does this ‘fintech’ actually contribute to inclusion? A conversation with Emanuel van Praag, professor of Financial Technology and Law, at Erasmus School of Law.

Big data, machine learning, increasingly powerful computers. Can we still keep pace with the developments?

“Even as a professor of Financial Technology and Law, I have to admit that it is a challenge. However, the core business of the financial industry remains the same as before, namely lending, saving, insuring and investing. The unique thing about the sector is the bond with the customer. If you sell a jar of peanut butter, you don't care who buys it or what they do with it. But as a financial institution, you give money to someone in the hope that it will be returned. It is in your own interest to get to know the customer well: you want to know whether someone applying for a loan is creditworthy. The classic data used to demonstrate this include income, any debts and – in the case of a company – the financial statements. New technology allows you to find out much more about a potential customer. Computers can process all this data and decide whether or not someone gets a loan.”

Isn’t there a risk of discrimination in this automated decision-making process?

“That’s a fair question. When you apply for a mortgage, the bank will consult the Dutch Credit Registration Office (BKR); that is regulated by law. The credit registration will show whether you failed to repay a previous loan as a private individual, or went bankrupt as a company. This information is public, as it serves a legitimate purpose. But how does the bank tailor its financial products to the customer? You aren’t allowed to say: women live longer on average, so we will make their life insurance more expensive. That’s discrimination. But as a bank you can also discriminate implicitly, for example by determining the amount of the premium based on a person’s surfing behaviour on social media, which often differs between men and women, or by taking the behaviour of someone’s Facebook friends into account in a decision. Although this is not allowed under European regulations, it is difficult to monitor.”

“We’re perfectly okay with price discrimination on a plane, but at a bank?”

There's an ethical component there.

“Indeed. As an academic, you ask yourself the following kinds of questions: as a society, do we feel that this should be allowed? And do we accept the social consequences? This is often subtle. We think it is legitimate for a bank to take the likelihood of repayment into account when granting a loan, for example. So if Mr A's risks are higher than Mr B's, Mr A will pay a higher interest rate. But what if the bank knows from internet data that Mr B has also requested information from other banks and thinks: let’s offer this man a lower interest rate, otherwise he might go to a competitor? That's price discrimination. It’s not prohibited, but is it desirable?

“We're fine with two seats next to each other on a plane having different prices, but at a bank? I think we need to be transparent about this. Another question is: who is entitled to what information? Tesla offers insurance packages in which the amount of the premium is based on a person’s driving behaviour. In itself, that’s logical. They don't just look at claims. They also measure how quickly the driver accelerates and check whether she or he changes lanes frequently. Other insurers would also like this information. That’s understandable, but should we allow it? I am a member of the European Commission's Expert Group on European Financial Data Space. In this group, we advise directors on legislative proposals and initiatives relating to the exchange of financial data within the EU.”

What is the added value of the law?

“As a lawyer, you do two things. On the one hand, you help to make innovations possible through policy and regulations. On the other, you want to keep these new opportunities on the right track. As I said, it’s quite complicated to find out exactly what is happening. Which data sources do financial institutions employ and how do they use artificial intelligence (AI), for example? Little has been published about this. I therefore talk to the sector, but of course I don’t hear any trade secrets. I’ve written articles about the fact that judges and regulatory authorities have difficulty with the use of AI.

“There is currently a lawsuit between the online bank Bunq and De Nederlandsche Bank [DNB], regarding compliance with anti-money laundering regulations. Bunq says: ‘We combat money laundering with artificial intelligence; we no longer feel that your form of supervision is in line with the times.’ DNB says: ‘We don't understand your method, keep to our classic rules!’ A difficult case. In general, as a financial institution, you must be able to explain which choices you are making. But the technology is incredibly complex.”

“Fintech contributes to social inclusion, which we at Erasmus University are also committed to”

As citizens, are we benefiting from fintech?

“Yes, certainly. New technology and regulations make it possible for consumers or entrepreneurs to have more choice, greater ease of use and better prices. Take PSD2 [the European Payment Services Directive 2], for example, which among other things ensures that an SME entrepreneur is no longer tied to their own bank but can more easily obtain a loan elsewhere. There are also digital wallets: apps that allow you to store all the important documents you need to provide to companies you want to do business with. Fintech therefore contributes to social inclusion, which we at Erasmus University are also committed to.”

Are you optimistic about the future?

“Definitely. Fintech is not about making a lot of money quickly. The goal is to estimate risks so that you can calculate exactly what a customer is worth. The basis of a financial relationship is still trust between the parties. Financial institutions faced a great deal of scorn after the banking crisis of 2008, as you will probably know. But I speak to a lot of people who work in the sector. Most of them have their customers' best interests at heart. People tend to forget that. The structures can lead to people doing the wrong things, however, and we are working on that. I just read Rutger Bregman's book Humankind: A Hopeful History. Like him, I have a positive view of humanity, especially the employees in the financial sector.”

Professor

Prof. mr. dr. E.J. (Emanuel) van Praag

More information

Emanuel van Praag is professor of Financial Technology and Law at Erasmus School of Law (Erasmus University Rotterdam). In June 2022, he gave his inaugural lecture, entitled Open finance: the essence. He also works as a lawyer at Kennedy Van der Laan. Prof. Van Praag regularly publishes in academic and general media. In 2020, he wrote the book PSD2: naar open banking en bankieren in een ecosystem (PSD2: towards open banking and banking in an ecosystem) [Deventer: Wolters Kluwer 2020].

Related content

Emanuel van Praag appointed as member of Expert group 'European Financial Data Space'

The Expert Group advises the European Commission, in the area of financial stability, financial services and the capital markets union, on data exchange in the financial sector.

Is the Netherlands ready for Big Data?

Emanuel van Praag, professor of Financial Technology and Law at Erasmus School of Law, explains the legal challenges on 10 June 2021.
