DTA: Greater guidance needed for AI procurement contracts


Procurement and contracting of artificial intelligence (AI) for public sector use remain critical areas of concern for the government.

This is according to the Joint Committee of Public Accounts and Audit (JCPAA) inquiry into the use and governance of artificial intelligence systems by public sector entities.

Appearing before the committee on 15 November, the Digital Transformation Agency (DTA) recognised the need for greater guidance for agencies when it comes to procurement of AI, while also acknowledging the need to work in accordance with the Department of Industry, Science, and Resources’ AI safety standards.

Currently, the DTA provides standardised legal contract terms, backed by specialised resources including legal firms. These terms address matters such as data management, liability caps, indemnities, and assurance requirements.

However, as digital providers evolve into AI providers, risks increase, particularly as agencies and providers navigate the boundaries of these technologies, DTA chief executive Chris Fechner said.

“What we’re seeing is digital providers are now rapidly evolving into AI providers by augmenting their products or bringing new products to market,” he said, pointing to the many legal conditions already in place around where data is managed and hosted, the controls, the liability caps and the indemnity obligations associated with them.

The requirement for certain services within the Commonwealth to have higher levels of assurance behind them is actually already in place, Fechner said.

“I think what we see with the rapid evolution of new companies and new services through AI is an amplification of some of those risks, and that those companies and the Commonwealth agencies that are using those companies are still very much finding out what the limits of these things are,” he said.

For example, Microsoft Copilot is powered by OpenAI’s GPT models, but Fechner said that other control functions and downstream supplier relationships were less transparent, which presents challenges for the visibility of contractual arrangements.

“What we’re starting to look at is things like the supply chain and how we can start to control those things, to understand where those risks are,” said Fechner. “But it’s also about whether we start putting information into procurement that says you need to fully inform the Commonwealth on any downstream suppliers that you use.”

An example of this, he said, is the hosting certification framework that Home Affairs now operates, which applies to data centre providers and cloud providers.

“This tells the agency about ownership, supply chain, their security controls and other parties that were involved in their environment,” Fechner said.

He believed there needed to be a “new mechanism within the core procurement rules to provide more guidance or direction”, not just for the initial procurement but for project management throughout a life cycle that is now moving very quickly.

The DTA is currently developing an AI assurance framework that takes a life-cycle approach to procurement, on the basis that AI carries risks and that what was purchased and tested before deployment can change over time.

“Due to the way data feeds into it and updates to the performance of algorithms,” he said. “Our aim is to ensure we are managing it throughout its life cycle. It’s also important to remember that some costs associated with the use of AI can be incredibly low and may not reach the thresholds for procurement spend analysis.

“These can often be easily accessed from the web and integrated via application programming interfaces, which make it straightforward to use them directly or embed them into systems.”

However, the agency needs guidance on providing training to ensure people are held accountable and understand what AI should and shouldn’t do across its life cycle.

“Within the DTA, we also want to look at model contracts and clauses, similar to how we handle other things, to ensure we have protections, not just during the purchase but also through contract management,” he said. “This would require continuous disclosure on matters such as ethical requirements, technical changes affecting system performance, outage notifications, and the like, with significant penalties for breaches.”

In an interview with ARN, Yule Guttenbeil, principal at Attune Legal, said that with organisations in the early stages of AI adoption, there is a big gap in understanding what procurement teams are buying, and that it is important for service providers to be open and transparent now.

“I deal with software contracts every day, and I see many that don’t disclose enough information,” he said. “It’s often not clear until something goes wrong, especially for non-technical clients who don’t know what to ask. [Customers] need this information to be disclosed and explained because it’s not their area of expertise.

“I tend to think that being upfront and transparent with this kind of information builds more trust, even if you’re a little soft in some areas.”