Conference ‘Better Algorithms for Better Policies’

Speech by Ewout Irrgang, Vice-President of the Netherlands Court of Audit, at the conference ‘Better Algorithms for Better Policies’
The Hague, 30 November 2021
 

I would first like to introduce ourselves. I am a board member and Vice-President of the Netherlands Court of Audit, and I would like to share our audit findings and our thoughts about algorithms. The Dutch central government uses algorithms to implement its policies. Before taking you through our audit findings, I would like to explain why we think that algorithms will become more and more a part of our day-to-day work.

Every year, we audit the annual reports produced by the Dutch ministries, with a view to assessing each minister’s accountability for expenditure, operational management and policy results.

Our annual accountability audit is designed to answer the following three questions:

  1. Was public money spent in accordance with the rules in the past year?
  2. Were matters at the ministry properly organised?
  3. Did the policy pursued produce the desired results?

Building on this statutory task, we express opinions on the quality of the financial information in the annual reports, the preparation of policy and the quality of operational management itself. 

  • Central government uses algorithms to implement its policy. One of our goals in our audit published last year was to help demystify the use of algorithms. 
  • Concerns about discrimination and biases have been expressed regularly, in both the Dutch media and the Dutch parliament. Only last week, Dutch media reported that from 2013 to 2016, people with lower incomes were checked for fraud more often by an algorithm of the Dutch tax office that is no longer in use. 
  • We performed our recent audit for three reasons. Firstly, it is our duty to audit primary processes. Algorithms are now replacing certain parts of these processes, which means there is a risk of these processes turning into a black box. Secondly, as an independent auditor, we wanted to make a valuable contribution to the performance of the organisations involved. Finally, we want to contribute to a fact-based discussion on algorithms, both in parliament and in Dutch society at large. 
  • Our audit was the first government-wide audit of algorithms used by central government. We took the opportunity to develop a unique audit framework as a practical tool that we intend to use in future audits.
[Photo caption: Board member Ewout Irrgang was one of the speakers at the conference, which was held largely online.]

Let’s first focus on what an algorithm entails. Both central government and businesses have been using algorithms for many decades. Algorithms are not new in themselves. What is new is the amount of data involved, the computer processing power and their broad range of applications. 

An algorithm is a set of rules and instructions that a computer follows automatically when performing calculations to solve a problem or answer a question. Algorithms come in many different forms, ranging from computational models and decision trees to complex data processing models and ‘self-learning’ applications. In their most sophisticated form, algorithms can exhibit ‘intelligent’ behaviour. This is often referred to as ‘artificial intelligence’ (AI).

Algorithms support and, in many cases, improve operational management and service delivery. 
However, we do see some risks:

  1. There is a risk that the (learning) algorithm or the data (set) used by the algorithm may contain certain biases that lead to discrimination. 
  2. The way in which an algorithm works in central government and its impact on government action may not be sufficiently clear to the general public.
  3. Lastly, many data sets and algorithms used by central government have been obtained from external suppliers. This makes them more difficult to monitor and control.  

 
Our audit approach consisted of three parts. We began by asking the ministries to identify relevant applications of predictive and prescriptive algorithms. We looked at the purposes for which these algorithms are used, their impact on citizens, and how they are managed and documented. This inventory took the form of a questionnaire that the ministries used to perform a self-assessment.

The second part was the audit framework.

I should point out that, at the time, there was no framework available in the Netherlands for auditing algorithms. The audit framework that we used for this audit is based on various types of existing information, parameters and standards. Our audit framework is a practical tool that we intend to use in future audits. 

In the third part of our study, we selected three algorithms from our list and tested them with the help of our audit framework. The algorithms in question involved:

  • a decision tree designed to make recommendations for standard or extra checks of citizens’ applications for social benefits;
  • an assessment system for schools that detects non-standard objects and generates information for regulators and inspectors;
  • a government facial recognition system for granting individuals physical access to a site or building.


We did not find any fully self-learning algorithms in central government. Most of the algorithms we found performed administrative tasks. These are simple algorithms, which does not mean that they cannot have a substantial impact. A simple algorithm might, for example, consist of a decision tree used to calculate the amount and duration of a benefit payment. 
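
To make this concrete, below is a minimal sketch of what such a decision tree might look like in code. It is purely illustrative: the thresholds, amounts and field names are invented for this example and do not reflect the rules of any actual benefit scheme.

```python
# Minimal sketch of a 'simple algorithm': a hand-written decision tree that
# derives the monthly amount and duration of a hypothetical benefit payment.
# All thresholds, amounts and field names are invented for illustration only;
# they do not reflect the rules of any actual Dutch scheme.

from dataclasses import dataclass


@dataclass
class Application:
    monthly_income: float   # applicant's income in euros
    household_size: int     # number of people in the household
    months_employed: int    # employment history in months


def assess_benefit(app: Application) -> tuple[float, int]:
    """Return (monthly amount in euros, duration in months) for an application."""
    # Branch 1: income above the (illustrative) threshold -> no benefit.
    if app.monthly_income > 1_500:
        return 0.0, 0

    # Branch 2: base amount depends on household size.
    base = 600.0 if app.household_size == 1 else 900.0

    # Branch 3: duration depends on employment history.
    duration = 12 if app.months_employed >= 24 else 6

    return base, duration


print(assess_benefit(Application(monthly_income=1_200, household_size=2, months_employed=30)))
# -> (900.0, 12)
```

The point of the sketch is that every outcome can be traced back to an explicit branch, which is exactly what makes this kind of simple algorithm auditable.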

We found that one third of the 86 algorithms listed by the ministries use automated decision-making; the other two thirds do not. Automated decision-making was used only by algorithms that perform simple administrative tasks.


The second part of the audit concerned the audit framework that we developed. The wide public interest in algorithms has prompted a plethora of initiatives, standards and guidelines, developed by different stakeholders. We brought these together in our framework, which contains five different perspectives for investigating algorithms: governance and accountability; model and data; privacy; IT general controls; and ethics. Rather than constituting a separate aspect, ethics is interwoven with the other four aspects of the framework. 
This audit framework is intended primarily for auditors, but it can also be used right from the start of building an algorithm, as input for quality requirements. The framework is available online and accessible to anyone via the link at the end of my presentation. Do let me know if you would like further information on the framework.
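
Purely to illustrate the structure, the sketch below organises the five perspectives as a simple auditor's checklist. The perspective names come from our framework; the example questions are invented here for illustration and are not quoted from the published document.

```python
# Illustrative sketch of the framework's structure. The five perspective names
# come from the audit framework; the example questions are invented here and
# are not quoted from the published document.

FRAMEWORK = {
    "governance and accountability": ["Is ownership of the algorithm clearly assigned?"],
    "model and data": ["Is the input data checked for quality and bias?"],
    "privacy": ["Is the processing of personal data lawful and documented?"],
    "IT general controls": ["Are access rights and change management in place?"],
}

# Ethics is not a separate perspective: it is interwoven with the other four,
# so every question carries an ethics dimension alongside its own perspective.
CROSS_CUTTING = "ethics"


def audit_checklist() -> list[tuple[str, str, str]]:
    """Flatten the framework into (perspective, cross-cutting aspect, question) triples."""
    return [
        (perspective, CROSS_CUTTING, question)
        for perspective, questions in FRAMEWORK.items()
        for question in questions
    ]


for perspective, cross_cutting, question in audit_checklist():
    print(f"[{perspective} / {cross_cutting}] {question}")
```

Representing the framework this way also shows why ethics is not a separate column: it travels with every question, whichever perspective that question belongs to.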


Let me now discuss our conclusions and recommendations. 

First, citizens do not know enough about who to contact with questions about algorithms, how to notify the government about data errors, and how to object to the use of data or the outcome of algorithms.

Second, improvements are needed for the responsible use and further development of algorithms. For example, no system of life cycle management has been designed for algorithms. This means that there is no regular maintenance of algorithms during their entire life cycle. 

Third, we found automated decision-making only in algorithms performing simple administrative activities. We also found that the complex algorithms that we analysed do not take independent decisions. Government officials play a prominent role in the use of these algorithms, which assist them in performing analyses and taking decisions.

Fourth, algorithms are often developed on the basis of day-to-day working practices. Senior ministry officials do not have any information on this development process and on the use of algorithms. As a result, ministers are unable to mitigate the potential adverse effects of algorithms on government service delivery. There is a need for specific tools to build up a clear picture of the use of algorithms and their risks. 

Finally, the algorithms we assessed were indeed verifiable. We were able to examine the three specific algorithms in our case studies and therefore concluded that algorithms are not black boxes for us. 


We urged the Dutch government to adopt a clear, uniform set of terms and specific quality requirements for algorithms. 

The government should enable citizens to access information on which data is used in which algorithms, how these algorithms work, and what impact their outcomes may have. The government should set up a desk for handling questions, rectifying errors and dealing with objections. 

The government should also ensure that agreements on the use of algorithms are adequately documented. Effective arrangements should be made for monitoring life cycle management, maintenance and compliance with current legislation, and for evaluating algorithms. 

Lastly, we urged the cabinet to see to it that officials working with algorithms have information on the quality of the IT general controls for algorithms.


The publication of our report on algorithms prompted many responses, both in the Netherlands and around the world. For example, in my meetings with the Supreme Audit Institutions of South Korea and India, they were very interested in this particular audit and, more particularly, in the audit framework we developed. We recently launched a follow-up audit. 

This audit has two components:

First, using our audit framework to audit other algorithms used by government. We would like to find out how the risks associated with certain selected algorithms are managed, and how they score on our framework. 

Second, the interaction between algorithms and people. How does the ‘human-in-the-loop’ aspect operate in practice? What impact do algorithms have on citizens?

To conclude, we are also trying to integrate auditing algorithms into the financial audits that form part of the accountability audits to which I referred at the start of this presentation. This is because algorithms often play a role in major financial flows. Housing benefits are one example and this is an application we are looking into at the moment. 

In the end, we hope to fully integrate the auditing of algorithms into our continuous audit programmes, making sure they are not black boxes. To help ensure that public money is spent correctly. And, more importantly, that it is spent wisely. Which for us means not only spending public money as efficiently as possible, but, just as important, to the benefit of Dutch citizens.