Audit of 9 government algorithms finds 6 do not meet basic requirements
Responsible use of algorithms by government agencies is possible but not always achieved in practice. The Netherlands Court of Audit found that only 3 of the 9 algorithms it audited met all the basic requirements; the other 6 did not, exposing the government to a variety of risks: from inadequate control over an algorithm’s performance and impact to bias, data leaks and unauthorised access.
These audit findings prompted the State Secretary for Digitisation to take immediate action, demonstrating how seriously she takes the Court’s audit and her responsibility for the digitisation of government. She will work with the implementing organisations to tackle the shortcomings and intends to inform the House of Representatives of any follow-up measures before the summer recess. The Court of Audit said the responses of the police and the Migration Service to its findings were particularly worrying because they seemed to be at odds with the state secretary’s intentions. The responsible minister should ask both organisations to address the risks attaching to algorithms, because those risks can disadvantage the public. Citizens and businesses must be confident that the government uses algorithms responsibly.
Turning off algorithms not without risk
Algorithms are useful and helpful but not without risk. One of our recommendations is that the ministers concerned prevent undesirable systematic deviations by checking algorithms for bias. This will reduce the risk of undesirable outcomes. Without algorithms, however, government policy could not be implemented: every month they take millions of decisions, solve millions of problems and make millions of forecasts quickly and automatically. Our audit found that taking an algorithm out of use, as the Social Insurance Bank did with its algorithm to assess personal budget applications, can also harbour risks. Civil servants then have to perform more work manually, and fewer warnings are received of the potential misuse of a public service.
Simple and complex algorithms audited
In 2021, the Court of Audit developed an assessment framework in consultation with many government organisations to prepare for this audit; the government itself did not have such a framework of basic requirements. The Court then audited 9 algorithms used by implementing organisations, ranging from simple to complex. Some played a supporting role, such as those that send traffic fines to the right address or check whether aliens have already registered in the Netherlands. Others took decisions partly automatically, for instance to award housing benefit, to decide whether a business was eligible for financial support under the TVL scheme to combat the impact of the COVID-19 pandemic, and to decide whether an applicant was medically fit to drive a motor vehicle.
The algorithms we audited that were used by the police, the Ministry of Justice and Security’s Directorate-General for Migration and the National Office for Identity Data did not meet the basic requirements on several counts. The latter 2 organisations had outsourced the development and management of their algorithms but had not made agreements on who was responsible for what. The National Office for Identity Data could not independently verify that its algorithm correctly assessed the quality of passport photographs, which could lead to discrimination. Furthermore, it had not assessed the consequences for data protection. The Criminality Anticipation System, used by the police to forecast where and when there is a high risk of incidents, does not check for bias.
IT general controls often ineffective
Six of the 9 organisations we audited did not know which members of staff had access to the algorithm and its data. Unauthorised access to systems can lead to data being corrupted or lost, with potentially dire consequences for citizens and businesses. The Netherlands Enterprise Agency (which provides financial support under the government’s TVL scheme), the autonomised Benefits department of the Tax and Customs Administration (which awards housing benefits) and the Social Insurance Bank (which assesses state pension applications) are all exposed to risks because their IT general controls are not in order.
The automated system used by the Central Office for Motor Vehicle Driver Testing to decide whether someone is medically fit to drive a motor vehicle meets all the basic requirements, as do the Central Judicial Collection Agency’s algorithm to match traffic fines to vehicle registration numbers and the Benefits Intelligence Agency’s algorithm to check the regularity of benefit payments.