Why are we being saddled with racist, sexist, or discriminating IT systems?

Joris Krijger being interviewed during Studio Erasmus
Photo: Arie Kers

The algorithms that governments and large companies use in their IT systems are not nearly as value-free as we once thought. Scandals, such as the tax authorities' childcare benefits affair in the Netherlands, have painfully demonstrated this. Why are we saddled with racist, sexist, or discriminating IT systems? And how do we get rid of them? Joris Krijger, philosopher and banker, discusses this during Studio Erasmus.

Krijger, PhD student at Erasmus School of Philosophy, explains that there are several reasons why these mistakes are made: "People who build the systems are often white men of a certain age. These people look at problems in a certain way." Another cause is that systems use data from society. "Women are structurally paid less than men. That is reflected in the data and in how the systems give certain recommendations," says Krijger.

"Organisations need to take responsibility themselves in the first place."

Is a fair system using algorithms even possible? Krijger cannot give a definitive answer to this question. He does, however, think that organisations can make conscious decisions when designing these systems. They shouldn't leave this to the technicians who build the system — who, incidentally, often don't want to be given this responsibility at all. The question then becomes: "How can companies record decisions in the best possible way and be held accountable for their value considerations?"

Watch the entire interview:

Philosopher Joris Krijger on ethics at banks

Related content
BNNVARA's programme Reference Man investigates the consequences of a world attuned to the 'standard man'.
During Studio Erasmus media scientist Simone Driessen explains how people may remain fans of their idol, even if that person has been convicted.