Irregularities identified in college allowances control process of Dutch public sector organization DUO

THE HAGUE – In its process for controlling misuse of the allowance for students living away from home, Dutch public sector organization DUO selected students who lived close to their parent(s) significantly more often. The risk assessment algorithm, which was used as an assisting tool for selecting students, worked as expected. However, the combination of the algorithm and manual selection resulted in a large overrepresentation of certain groups. Selected students were visited at home to inspect whether allowances had been unduly granted. This is the main conclusion of research carried out by NGO Algorithm Audit on behalf of DUO. DUO’s control process was discredited in 2023 after media attention, which reported that students with a migration background were accused of misuse more often than other students.

Students who were registered as living close to their parent(s) were significantly more likely to be selected for a home visit. On the basis of such visits, DUO decides whether the allowance for students living away from home has been misused. This deviation would not have occurred if the output of a rule-based algorithm had been followed. This algorithm was used by the Dutch Education Executive Agency (‘DUO’) to assign a risk score to all students (more than 500,000) living away from their parent(s) in the period 2012-2022. Specific work instructions – which encouraged the manual selection of students registered near the parental address – evidently resulted in this disparity. That is the main conclusion of the study Bias prevented by NGO Algorithm Audit, which was sent to the Dutch Parliament on March 1, 2024.
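
The report describes the rule-based algorithm only at this level of detail, so the sketch below is purely illustrative: it shows how a risk score might be composed from criteria such as distance to parent(s), type of education, and age. Every threshold, weight, and category name is a hypothetical placeholder, not DUO’s actual rule set.

```python
# Purely illustrative rule-based risk score; all thresholds, weights,
# and categories below are hypothetical assumptions, not DUO's rules.

def risk_score(distance_km: float, education_type: str, age: int) -> int:
    """Return a hypothetical risk score; higher means higher review priority."""
    score = 0
    if distance_km < 2.0:          # registered close to the parental address
        score += 2
    if education_type == "mbo":    # hypothetical: weight one education type higher
        score += 1
    if age < 21:                   # hypothetical age criterion
        score += 1
    return score

# Students would then be ranked by score, and the highest-scoring
# cases selected for a home visit.
students = [
    {"distance_km": 0.8, "education_type": "mbo", "age": 19},
    {"distance_km": 45.0, "education_type": "wo", "age": 24},
]
ranked = sorted(students, key=lambda s: risk_score(**s), reverse=True)
```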

Furthermore, the study found no statistical evidence of a link between several selection criteria used in the risk assessment algorithm (i.e., type of education and age) and unduly granted allowances. For the selection criterion distance to parent(s), statistical evidence for such a link was found. These results are based on a statistical analysis of risk distributions in random samples drawn from the student population in 2014 and 2017. NGO Algorithm Audit also found that the algorithm does not adhere to current standards for algorithms used in decision-making processes by the Dutch government. At the request of DUO, standards from 2023 were used to assess the decision-making process deployed in the period 2012-2022. For instance, no rationale was documented for important choices made during the design phase of the algorithm. Furthermore, the control process was not scrutinized for indirect bias, either during the design phase or during deployment.
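
The report’s exact statistical procedure is not reproduced here; the sketch below merely illustrates one standard way to test for such a link, a chi-squared test on a 2x2 contingency table of a selection criterion against the control outcome. The counts are invented for the example.

```python
# Illustrative test of a link between one selection criterion and
# unduly granted allowances; the counts below are invented.
from scipy.stats import chi2_contingency

#                      unduly granted   correctly granted
table = [[34, 166],  # registered close to parent(s)
         [21, 279]]  # registered farther away

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}")
# A small p-value indicates evidence of a link between the criterion
# and the outcome; for criteria such as type of education and age,
# the study reports no such evidence.
```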

Commissioned by DUO, NGO Algorithm Audit is auditing the control process for college allowances. The reason for this is that students with a migration background were accused of misuse of allowances more often. Whether a relationship exists between the deviating group, which was selected significantly more often for a home visit, and students with a migration background will become clear from further research that is currently being undertaken. Data requested from Statistics Netherlands (CBS) will be analyzed to measure the percentage of students per country of origin at each step of the allowances control process. According to NGO Algorithm Audit, publicly accessible data – such as aggregated migration statistics per ZIP code area or the average distance that certain demographic groups live from their parent(s) – is too inaccurate to be used for this sensitive research. It is therefore too early to conclude whether students with a migration background were actually disadvantaged, although the study has not demonstrated the opposite either. NGO Algorithm Audit does note that there is no evidence of direct discrimination on the basis of migration background in the algorithm.
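
The CBS analysis itself is still pending; the minimal sketch below shows the kind of per-step breakdown it would produce. The column names and records are assumptions for illustration, not the actual CBS data model.

```python
# Hypothetical per-step breakdown of students by country of origin;
# column names and records are illustrative, not the CBS schema.
import pandas as pd

df = pd.DataFrame({
    "step":   ["risk_scored", "risk_scored", "selected", "selected", "visited"],
    "origin": ["NL", "other", "NL", "other", "other"],
})

# Share of each origin group within every step of the control process.
shares = (df.groupby("step")["origin"]
            .value_counts(normalize=True)
            .rename("share")
            .reset_index())
print(shares)
# Comparing these shares across steps shows where a group becomes
# over- or underrepresented relative to the overall student population.
```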

The results of the studies will be used to determine whether risk profiling can be used responsibly in the future to detect misuse of college allowances. This is relevant now that these allowances have been reintroduced, as of this academic year, for students in higher vocational education (hbo) and academic education (wo). DUO and NGO Algorithm Audit will continue to work together in 2024 to interpret the results of the studies together with various societal stakeholders. Currently, DUO selects students for fraud checks solely at random.

The full report Bias prevented can be found here.

01-03-2024

Independent commission publishes advice to Dutch municipalities on risk profiling for social welfare re-examination

THE HAGUE – On November 29, Dutch Minister of Digitalization Alexandra van Huffelen accepted the advisory report containing specific norms for the responsible use of algorithms. Algorithm Audit convened an independent commission of experts to provide advice aimed at preventing unfair treatment in social welfare re-examinations. The impetus for drafting this advice was the controversial risk model used by the Municipality of Rotterdam until 2021.

When municipalities investigate whether social assistance has been duly granted, algorithmic profiling is sometimes employed. The advisory report, prepared by representatives of the ombudsmen of Amsterdam and Rotterdam, various academics, and an alderman from Tilburg, advises against using characteristics such as ZIP code, the number of children, and literacy as criteria for profiling. It has previously been demonstrated that profiling in the context of social assistance can lead to discrimination. The independent commission asserts that there are also other ethical concerns associated with algorithmic profiling, as some self-learning algorithms are too complex to effectively explain decisions to citizens. The commission therefore recommends discontinuing the use of certain machine learning algorithms.

The advice is directed at all 340 Dutch municipalities. The guidance is part of the ‘algoprudence’ developed by Algorithm Audit in collaboration with experts and various interest groups to establish concrete norms for the responsible deployment of algorithms. NGO Algorithm Audit is an independent knowledge platform for ethical algorithms and AI, supported by the European AI&Society Fund, the SIDN Fund, and the Ministry of the Interior and Kingdom Relations.

The full report Risk profiling for Social Welfare Re-examination can be found here.

Presentation Dutch Minister of Digitalization

29-11-2023


