Algorithmic transparency template – GOV.UK

Use this form to provide information on how you use algorithmic tools to support decisions.

You must complete both sections of the template. For the:

  • level 1 information, provide a brief non-technical description of your algorithmic tool, including an overview of what the tool is and why you use it
  • level 2 information, provide more detailed technical information, such as specific details about how your tool works and what data it uses

Within each level, the numbers next to each section correspond to the numbers in the Algorithmic Transparency Data Standard.

Level 1 information

How you use the algorithmic tool

Explain:

  • how your tool works
  • how your tool is integrated into your decision-making process

Why you use the algorithmic tool

Explain:

  • what problem you aim to solve using the tool, and how it solves that problem
  • your justification or rationale for using the tool
  • how people can find out more about the tool or ask a question – including offline options and a contact email address for the organisation, team or responsible contact person

Level 2 information

Who owns and is responsible for the algorithmic tool

Indicate who is responsible for deploying your tool, including:

  • your organisation (1.1)
  • the team responsible for the tool (1.2)
  • the senior responsible owner (1.3)
  • your external supplier or any third party involved, if the tool was developed externally (1.4)
  • the Companies House number of your external supplier (1.5)
  • the role of the external supplier (1.6)
  • the terms of their access to any government data (1.7)

What is the tool for

Describe the scope of the tool, what it was designed for and what it is not intended for.

Detail your rationale for using the tool, for example:

  • describe what the tool was designed and not designed for, including the purposes for which people may mistakenly think the tool will be used (2.1)
  • provide a list of benefits, such as value for money, effectiveness or ease of use for the individual (2.2)
  • list the non-algorithmic alternatives you have considered, if this applies to your project, or a description of your decision-making process before introducing the tool (2.3)

List the technical specifications of the tool, including:

  • the type of model, for example an expert system or a deep neural network (2.4)
  • how often the tool is used – for example the number of decisions made per month, or the number of citizens interacting with the tool (2.5)
  • phase – whether the tool is in the idea, design, development, production or decommissioning stage, including the date of its creation and of any updates (2.6)
  • the maintenance and review schedule, for example specific details of when and how a human reviews or verifies the automated decision (2.7)
  • system architecture (2.8)

How the tool affects decision making

Explain how the tool is integrated into the process and what influence the tool has on the decision-making process. (3.1)

Explain how humans control the tool, including:

  • how much information the tool provides to the decision maker, and what information (3.2)
  • decisions humans make in the overall process, including options for humans reviewing the tool (3.3)
  • training for people deploying and using the tool, if applicable to your project (3.4)

Explain your appeal and review process. Describe how you allow members of the public to review or appeal a decision. (3.5)

Data

List and describe:

  • the datasets you used to train the model
  • the datasets on which the model is or will be deployed

Add links to datasets if you can.

Include:

  • the name of the datasets you used, if applicable (4.1)
  • an overview of the data used to train and run the tool, including a description of the categories used for training, testing or operating the model – for example “age” or “address” (4.2)
  • the URL of the datasets you used, if available (4.3)
  • how and why you collect data, or how and why the data was originally collected by someone else (4.4)
  • the data sharing agreements you have in place (4.5)
  • details of who has or will have access to this data, how long it is stored, and under what circumstances (4.6)

Impact assessments

List the impact assessments you have carried out, for example:

  • data protection impact assessment
  • algorithmic impact assessment
  • ethics review
  • equality impact assessment

For each assessment, add:

  • the name of the assessment (5.1)
  • a description of the impact assessment carried out (5.2)
  • the date you completed the assessment (5.3)
  • a link to the assessment or summary of the assessment, if available (5.4)

Risks

Provide a detailed description of the common risks associated with your tool, including:

  • the names of the common risks (5.5)
  • a description of each identified risk (5.6)

For example:

  • potential harm from the tool being used in a way for which it was not designed or built
  • creation of biased results, including through training data that is unrepresentative or contains bias
  • unfairness or malfunction, such as the tool producing unfair or incorrect decisions

Mitigations

Provide a detailed description of the measures you have taken to mitigate the risks in your “Risks” section. (5.7)

