NOS Nieuws • today, 20:28
The Dutch Data Protection Authority has started monitoring algorithms. Such algorithms can, for example, determine whether or not people are invited to a job interview, or whether someone is singled out for an extra fraud check.
The extra supervision was promised in the coalition agreement. The cabinet has set aside one million euros for this year, an amount that will grow in the coming years to a structural 3.6 million euros in 2026. Both government agencies and businesses will fall under the new supervision. According to chairman Aleid Wolfsen, there is a great need for more oversight of how algorithms operate, because all kinds of things can go wrong.
There have been plenty of examples of the incorrect use of algorithms in recent years. One is the childcare benefits scandal, in which all kinds of indicators were used to determine who would be checked manually. Those indicators included a person's nationality, family composition and salary. Another example is the controversial anti-fraud program SyRI.
“Algorithms are increasingly deployed and used in selecting people,” says Wolfsen. “Who may or may not become a customer of a company, who is or is not invited for a job interview, and who is or is not subject to extra checks for fraud.”
Monitoring of all algorithms
According to Wolfsen, an algorithm can be “life-threatening” if it goes wrong. “It could be that an algorithm is programmed incorrectly or trained incorrectly,” says Wolfsen. “It can contain discriminatory elements and that can affect a lot of people at once. We must all try to prevent that.”
The new supervision should allow the privacy watchdog to take measures more quickly when algorithms go wrong. "If we keep a closer eye on this, we can intervene faster, warn faster and put a stop to things faster," he says. "We will do that together with other regulators." Citizens can also submit complaints to the Dutch Data Protection Authority, which may then decide to open an investigation.
Wolfsen's warning
Wolfsen warns companies that the Dutch Data Protection Authority can supervise all algorithms. "We can access everything, everywhere. There are no secrets for us in that area. We may not always disclose what we find, because we do not want to reveal trade secrets. But that does not mean we may not, cannot, should not and do not want to know how an algorithm works."
Whether the supervision will be sufficient remains to be seen. Amnesty International says there are too few resources to effectively tackle discriminatory algorithms. The human rights organization believes, among other things, that extra money should be made available for these supervisory tasks.