Government issues policy note on transparency of AI use in procurement
The Cabinet Office has issued a procurement policy note on improving transparency of AI use in procurement, setting out how to manage risks and opportunities associated with such systems.
The note (PPN 02/24) says: “AI systems, tools and products are part of a rapidly growing and evolving market, and as such, there may be increased risks associated with their adoption. Care should be taken to ensure that AI is used appropriately, and with due regard to risks and opportunities. As the Government increases its adoption of AI, it is essential to take steps to identify and manage associated risks and opportunities, as part of the Government’s commercial activities.”
The PPN applies to all Central Government Departments, their Executive Agencies and Non-Departmental Public Bodies (the ‘In-Scope Organisations’). Other public sector contracting authorities may wish to apply the approach set out in the PPN, the Cabinet Office says.
The policy note includes an annex signposting some of the guidance available to support commercial teams in understanding AI as a subject area, the appropriate use of AI within public services, and how AI products should be procured.
It also notes that there are potential benefits to suppliers using AI to develop their bids, enabling them to bid for a greater number of public contracts.
“It is important to note that suppliers’ use of AI is not prohibited during the commercial process but steps should be taken to understand the risks associated with the use of AI tools in this context, as would be the case if a bid writer has been used by the bidder,” the Cabinet Office says.
These steps may include asking suppliers to disclose their use of AI in the creation of their tender.
The PPN also recommends putting in place proportionate controls to ensure bidders do not use confidential contracting authority information, or information not already in the public domain, as training data for AI systems “e.g. using confidential Government tender documents to train AI or Large Language Models to create future tender responses”.
In addition, the Cabinet Office suggests undertaking appropriate and proportionate due diligence, planning for a general increase in activity, and potentially building more time into the procurement to accommodate due diligence and higher volumes of responses.
In certain procurements where there are national security concerns around suppliers’ use of AI, additional considerations and risk mitigations may be required, the policy note warns.
The PPN calls on commercial teams to take note of existing guidance when purchasing AI services, and to be aware that AI and Machine Learning are becoming increasingly prevalent in the delivery of “non-AI” services.
“Where AI is likely to be used in the delivery of a service, commercial teams may wish to require suppliers to declare this, and provide further details,” it says.
The Cabinet Office suggests care should be taken to ensure that the use of AI is restricted to use cases where risks can be effectively understood and managed.
The PPN continues: “AI has potential to accelerate and support decision making processes, especially through the use of large data sets. It is essential to ensure that decisions are made with the support of AI systems, not a reliance upon them, in accordance with the principles outlined in the Government’s Data Ethics Framework and guidance on Understanding Artificial Intelligence Ethics and Safety.
“Content created with the support of Large Language Models (LLMs) may include inaccurate or misleading statements; where statements, facts or references appear plausible, but are in fact false. LLMs are trained to predict a 'statistically plausible' string of text, however statistical plausibility does not necessarily mean that the statements are factually accurate.
“As LLMs do not have a contextual understanding of the question they are being asked, or the answer they are proposing, they are unable to identify or correct any errors they make in their response. Care must be taken both in the use of LLMs, and in assessing returns that have used LLMs, in the form of additional due diligence.”