The algorithms that governments and large companies use in their IT systems are not nearly as value-free as we once thought. Scandals, such as the Dutch tax authorities' benefits affair, have demonstrated this painfully. Why are we saddled with racist, sexist, or otherwise discriminatory IT systems? And how do we get rid of them? Joris Krijger, philosopher and banker, talks about this during Studio Erasmus.
Krijger, PhD student at Erasmus School of Philosophy, explains that these mistakes are made for several reasons: "People who build the systems are often white men of a certain age. These people look at problems in a certain way." Another cause is that systems are trained on data drawn from society. "Women are structurally paid less than men. That is reflected in the data, and thus in the recommendations the systems produce," says Krijger.
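The mechanism Krijger describes can be made concrete with a minimal sketch. The data, group labels, and salary figures below are entirely invented for illustration: a naive recommendation model trained on historical pay data that underpaid one group will simply reproduce that gap, without any explicitly discriminatory rule in the code.

```python
# Hypothetical historical salaries for the same role and seniority;
# group B was structurally paid less in the past (invented numbers).
historical = [
    ("A", 52000), ("A", 50000), ("A", 51000),
    ("B", 46000), ("B", 45000), ("B", 44000),
]

def recommend_salary(group, data):
    """Recommend the average past salary observed for the candidate's group."""
    salaries = [s for g, s in data if g == group]
    return sum(salaries) / len(salaries)

rec_a = recommend_salary("A", historical)
rec_b = recommend_salary("B", historical)

# The "neutral" model inherits the historical pay gap.
print(rec_a)  # 51000.0
print(rec_b)  # 45000.0
```

No one wrote "pay group B less" into this model; the disparity comes entirely from the data it learned from, which is exactly why Krijger argues the design choices around such systems are value-laden.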
"Organisations need to take responsibility themselves in the first place."
Is a fair algorithmic system even possible? Krijger cannot give a definitive answer to that question. He does think, however, that organisations can make conscious decisions when designing these systems. They should not leave this to the engineers who build the system, who, incidentally, often do not want to be given this responsibility at all. The question then becomes: "How can companies best record their decisions and be held accountable for the value judgements behind them?"