Chapter 5 explores Contestability, the IOF state most closely integrated into the Budget process through the assessment of proposals being brought forward to Cabinet. It analyses the state’s strengths: a robust process of providing advice to the Government, clear and connected processes for engaging agencies, and the use of safeguards to allow proposals to proceed with strong oversight. It also analyses opportunities to refine processes for urgent and unforeseen proposals, improve the objective assessment of policy requirements, and uplift digital talent and skills for deeper analysis of proposals.
5. Contestability
Abstract
As the state primarily focused on the budget process, Contestability’s strengths include its robust process of providing advice to the Government as part of the budget process, a clear and connected process when engaging with agencies, and its use of safeguards, including quarantined funding, to allow proposals to proceed with strong oversight. As the DTA looks to further develop the IOF, there is an opportunity to further mature its Contestability state by refining processes for urgent and unforeseen proposals, improving objective assessment of policy requirements, and by uplifting digital talent and skills for deeper analysis.
Overview of the Contestability state
Moving beyond the early planning phases of an investment, Contestability is the IOF state that is closest to the point at which decisions are funded in the budget process. It is during this state that agencies engage the DTA as part of developing robust proposals for digital and ICT investments being proposed to Cabinet. Building on the work done with agencies in Prioritisation, the DTA reviews the proposals that have been prepared for Cabinet consideration, assessing them based on alignment with policies and standards, providing advice to agencies on how to strengthen proposals, and providing advice to the Government to help inform its decisions on whether a proposal should receive funding in the budget (DTA, 2024[1]).
In Contestability, there are two classes of assessment used to evaluate proposals being brought forward for approval and funding. The first applies where a proposal for new funding is below the high-risk and cost thresholds (i.e. below AUD 30 million whole-of-life cost or AUD 10 million for a digital or ICT component). In this case, agencies complete the Digital Capability Assessment Process (DCAP), which assesses proposal alignment with the digital and ICT policies and standards described in Box 5.1 below:
Box 5.1. IOF Toolkit: Digital and ICT policy and standards
The DCAP includes several whole-of-government digital and ICT policies and standards, including the:
Digital Experience Policy: to improve the experience of users and the quality, accessibility, and consistency of digital government services. It includes the Digital Service Standard, Digital Inclusion Standard, Digital Performance Standard, and the Digital Access Standard.
Digital and ICT Reuse Policy: which encourages agencies to identify existing solutions to reuse (instead of building new) to minimise duplication, reduce costs, and increase delivery speed.
Digital Sourcing Framework: sourcing strategies must consider cloud first, fair access to procurements, contract limits on size and terms, and the use of procurement arrangements.
Secure Cloud Strategy: to guide agencies on migrating to the cloud, including principles, certification and responsibility models, assessment framework, platforms, and skills uplift.
Hosting Certification Framework: to ensure that agencies choosing to host their data in vetted cloud environments have the right privacy, sovereignty, and security controls.
Cyber Security Guidelines: with practical guidance on what agencies should do to protect their IT systems, applications and data from cyber threats.
Assurance Framework: that outlines several governance mechanisms for robust assurance of digital or ICT investments. This is described further in Chapter 2.
Benefits Management Policy: to guide agencies on how to identify, measure, plan, and realise benefits over the life of a digital or ICT investment. This is described further in Chapter 2.
As part of the DCAP, the DTA supports agencies to develop an Assurance Plan in accordance with the Assurance Framework for Digital and ICT Investments (discussed further in Chapter 6). This provides an indication of the readiness of agencies to deliver digital and ICT investments, which, together with the assessment of alignment with the digital and ICT policies and standards, supports the DTA in providing clear advice to the Expenditure Review Committee (ERC) on whether a proposal should receive approval and funding (DTA, 2023[9]). Further, the inclusion of the Benefits Management Policy ensures that agencies understand the requirements on them to successfully deliver the outcomes agreed for their digital and ICT investments.
An overview of the DCAP is provided in Box 5.2 below:
Box 5.2. IOF Toolkit: Digital Capability Assessment Process (DCAP)
The DCAP is used by the DTA when assessing digital and ICT investment proposals to determine how well aligned they are to the set of policies and standards outlined in Box 5.1.
To support this, agencies are required to complete a ‘DCAP Assessment Tool’ before submitting a proposal for consideration to demonstrate how these policies and standards have been addressed when designing their digital and ICT investment proposals.
This DCAP Assessment Tool is currently an Excel workbook that includes worksheets for guidance, details of the proposal, agency responses on the policies and standards, and DTA’s DCAP assessment.
The evidence against each criterion is assessed and needs to be sufficiently strong for the DTA to recommend to the ERC that an investment proposal progress. However, some criteria may not be applicable to all investments.
For example, if an agency is seeking funding to upgrade its hardware to a more scalable mainframe platform, the applicability of the Digital Experience Policy – with its focus on improving user experience of digital services – is likely to be less relevant in a DCAP assessment. In these cases, agencies will need to provide justification for agreement by the DTA.
Source: (DTA, 2023[10]; DTA, 2024[11])
The second class of assessment applies when proposals are seeking investment above the thresholds for high cost (above AUD 30 million whole-of-life cost or AUD 10 million for a digital or ICT component) and high risk under the ICT Investment Approval Process (IIAP) (DTA, 2023[9]). To determine the risk level, the DTA conducts a risk assessment using a ‘Risk Potential Assessment Tool’ based on 21 mandatory questions – 7 relating to Strategic Context and 14 relating to Implementation Complexity (Department of Finance, 2023[12]).
For this class of investments, agencies are required to develop, in addition to the DCAP, an initial business case for Cabinet to evaluate the proposal’s value and feasibility (the first-pass business case). This is followed by a much more detailed business case (the second-pass business case) that includes more rigorous planning and risk assessment (DTA, 2024[13]). An overview of these is provided in Box 5.3.
Box 5.3. IOF Toolkit: First, Second and Combined-Pass Business Cases
For proposals above the high-cost threshold, with a whole-of-life cost above AUD 30 million or a digital or ICT component with an estimated value above AUD 10 million, agencies will be required to follow the IIAP’s business case process instead of the DCAP.
First-pass: an initial overview of the benefits, costs, risks and range of digital and ICT implementation options for a proposal to inform in-principle decisions.
Second-pass: builds on the first-pass with risk mitigation strategies, detailed cost estimates, and appropriate funding options, and provides assurance that the planning, consideration, and consultation required for successful implementation has been undertaken.
Combined-pass: consolidates the two and is used only in urgent cases, or where there is only one workable implementation option or several highly developed implementation options.
Source: (DTA, 2024[13])
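The threshold rule that routes a proposal to the DCAP or to the IIAP business-case pathway can be sketched in a few lines. This is a minimal illustration, not a DTA tool: the AUD 30 million and AUD 10 million thresholds come from the text above, while the `Proposal` structure and function names are hypothetical.

```python
from dataclasses import dataclass

# Thresholds from the text, in AUD millions; the structures below are hypothetical.
WHOLE_OF_LIFE_THRESHOLD = 30      # whole-of-life cost threshold
DIGITAL_COMPONENT_THRESHOLD = 10  # digital or ICT component threshold

@dataclass
class Proposal:
    whole_of_life_cost: float         # estimated whole-of-life cost, AUD millions
    digital_component_cost: float     # estimated digital/ICT component, AUD millions
    assessed_high_risk: bool = False  # outcome of the Risk Potential Assessment Tool

def assessment_pathway(p: Proposal) -> str:
    """Return which of the two assessment classes a proposal falls under."""
    above_threshold = (
        p.whole_of_life_cost > WHOLE_OF_LIFE_THRESHOLD
        or p.digital_component_cost > DIGITAL_COMPONENT_THRESHOLD
    )
    if above_threshold or p.assessed_high_risk:
        # IIAP pathway: first-pass then second-pass business case
        # (or a combined pass in urgent or single-option cases), plus the DCAP.
        return "IIAP business case"
    return "DCAP"
```

For example, a proposal with an AUD 25 million whole-of-life cost and an AUD 5 million digital component would complete the DCAP only, while one exceeding either threshold (or assessed as high risk) would also follow the business-case process.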
Strengths of the Contestability state
The strengths of the Contestability state include its robust process of providing advice to the Government as part of the budget process, a clear and connected process when engaging with agencies, and its recommendations around the use of safeguards, such as quarantined funding, to allow proposals to proceed with stronger oversight.
Robust process of providing advice to the Government
Both external and internal stakeholders highlighted that Contestability is robust, embedded into the budget process, and respected by the Government. It has a clear value statement as part of the budget process that it delivers on with well-defined processes, transparent and evidence-based advice, and strong engagement with agencies. In this state, the DTA also integrates advice on digital and digitally-enabled proposals into the general budget advice coordinated by central agencies to the members of the ERC. In this way, the DTA has become a trusted source of investment advice to both agencies and the ERC.
Clear and connected processes when engaging with agencies
While there is work underway to modernise and better integrate the data collection and flows at this state, there are still clear and connected processes when engaging with agencies. This includes the work done to assess proposals, identify areas of concern, and to work with agencies to address them. It also has connections into the pipeline work done in Strategic Planning and Prioritisation, as well as the development of Assurance Plans for when proposals move into delivery and within the scope of Assurance.
Tools to derisk investment allowing proposals to proceed with strong oversight
The Contestability state allows the DTA to leverage and recommend several tools to derisk digital and ICT proposals where further work or oversight is required. One of these tools is the option of ‘quarantining’ funding (see Box 5.4) during the budget process to ensure the proper use of public resources in the delivery of digital and ICT projects. This option is used when a proposal has insufficient planning, when further development of ICT system requirements is needed to give confidence that the proposal can be delivered, or to break larger projects into smaller tranches or phases. Quarantining funds also requires strong consultation with the Department of Finance, which is ultimately responsible for fiscal matters within the Australian Government.
Tying project funding to the delivery of key milestones allows certain critical proposals to proceed with safeguards while further work is undertaken on the development of ICT system requirements. For example, this approach was used for a proposal from the Australian Bureau of Statistics for Phase 2 of its project to continue its cloud migration and improve the production and quality of key statistics. As Phase 2 was critical to a larger 10-year plan to modernise and replace legacy ICT systems, the DTA and the Bureau linked funding to delivery milestones that aligned to identified and measurable benefits (DTA, 2023[14]).
The DTA is considering how this funding mechanism could also be leveraged to support agencies to demonstrate the feasibility of their proposals, especially for smaller agencies that might not have the resources to demonstrate this fully in their inputs into the budget process. However, external stakeholders raised that this should be a process to facilitate more agile or staged delivery approaches and should not create undue burden in needing to repeatedly advocate for or justify the project at each funding gate.
Box 5.4. IOF Toolkit: Quarantined funding to tie payments to the delivery of milestones
In Contestability, the DTA can make a recommendation to the Government to tie the funding of proposals to the delivery of key milestones (instead of the proposal being fully funded upfront). This ‘quarantined funding’ approach is used when a digital or ICT investment is deemed to be critical but lacks a well-formed business case.
In making the recommendation, the DTA collaborates with central agencies to recommend to Cabinet which aspects of the proposal’s funding should be ‘quarantined’ and what the agency must deliver to receive the funding. This funding can be withheld for short periods (up to 3 months) or longer (up to 18 months) to allow for planning and consultation.
Source: Information provided by the DTA.
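The milestone-gating mechanism in Box 5.4 can be illustrated with a short sketch. The data structures and field names here are hypothetical – no such API exists in the IOF – and only the idea of tranches released on delivery, within the 3- or 18-month withholding bounds, comes from the box.

```python
from dataclasses import dataclass, field

@dataclass
class Milestone:
    description: str
    tranche: float          # funding released on delivery, AUD millions
    delivered: bool = False

@dataclass
class QuarantinedProposal:
    upfront: float                             # funding provided immediately, AUD millions
    milestones: list[Milestone] = field(default_factory=list)
    withholding_months: int = 3                # short (up to 3) or longer (up to 18)

    def released_funding(self) -> float:
        """Upfront funding plus every tranche whose milestone has been delivered."""
        return self.upfront + sum(m.tranche for m in self.milestones if m.delivered)
```

In this sketch, funding is released incrementally as the agency demonstrates delivery, rather than being appropriated in full at the point of the budget decision.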
Where to focus next for Contestability
As the DTA looks to further develop the IOF, there is an opportunity to further mature its Contestability state by refining processes for urgent and unforeseen proposals, improving objective assessment of policy requirements, and by uplifting digital talent and skills for deeper analysis.
Refining processes for urgent and unforeseen proposals
While the work being done in the Strategic Planning and Prioritisation states (and across the IOF generally) is helping to get an earlier view of agencies’ requirements and proposals in the investment pipeline, urgent and unforeseen proposals still present a significant challenge for the Contestability state – usually arriving during peak assessment periods, which limits the capacity of teams to give proposals sufficient analysis. There could be efforts to develop different assessment pathways based on different classes of investment, allowing agency and DTA teams to focus on what is necessary within the timeframes available. There could also be opportunity to bring internally funded projects (i.e. projects not going to Cabinet for funding) into these pathways, as there are investments not currently captured in the IOF that could pose significant risk. Finally, dedicated funds administered by the Australian Government (see Chapter 6) could provide another option for handling these proposals.
Internal stakeholders also raised that the Contestability state should be supported with additional inputs on agencies’ past delivery performance and clear direction on when and how to ‘off-ramp’ investments that require re-work before they can be brought forward to Cabinet. External stakeholders added that additional focus could be given to the risk presented by proposals that do not receive funding in a particular budget cycle: these remain important, and delays in delivery can increase their risk profile and place additional pressure on internal funding.
Therefore, the Australian Government could consider refining processes for genuinely urgent or unforeseen proposals that arise due to circumstances beyond an agency’s control, by developing alternate assessment pathways based on different classes of investment (including internally-funded projects), allowing agency and DTA teams to focus on what is necessary within the timeframes available.
Improving objective assessment of policy requirements
There could be efforts to improve the assessment of new policy requirements before adding them to the DCAP, to define clear metrics and scoring guidelines for more objective assessments, and to undertake deeper analysis to ensure proposals are meeting the intent of the policies and standards. Internal stakeholders raised that:
policies are sometimes developed without sufficient consideration of their integration into the DCAP, including implications for organisational change management and capacity to support them.
clearly-defined scoring metrics and guidance could help assessors more objectively determine the extent to which proposals are genuinely complying with the policies and standards.
agencies often use the same ‘motherhood’ statements to get through parts of the DCAP (perhaps where proposals are not as developed), so additional rigour would be beneficial.
An example of this was the implementation of the Digital Experience Policy, where internal stakeholders reported uncertainty about how to fully integrate its requirements into Contestability and the DCAP. While some policies – such as the Benefits Management Policy (described in Chapter 2) – were subject to an extended development and pilot period before becoming a compliance requirement across the DTA’s suite of policies, it was not evident that a proportionate approach was applied to the implementation of other policies integrated into the DCAP.
Depending on the scope and potential impact of a policy, different levels of consultation and piloting may be appropriate. There could therefore be benefit in including stakeholders from Contestability (and other states) to test and pilot the policy with the DCAP process in mind. The DTA could improve the objective assessment of policy requirements by consulting different IOF stakeholders on the development of policies to be integrated into the framework, with more clearly defined metrics and more detailed requirements for agencies to demonstrate their alignment with them.
Box 5.5 below offers some examples from the UK that could support the DTA in doing this:
Box 5.5. Country practice: The UK’s evaluation methods and capabilities
The UK Government’s Magenta Book is a comprehensive guide issued by the Treasury on how to conduct evaluations in government. It covers the scoping, design, management, use, and dissemination of evaluations. It emphasises the importance of robust evaluation to ensure that public spending delivers maximum value and improves outcomes for citizens. It provides detailed guidance on various evaluation methods, ethical considerations, and the importance of transparency and openness.
Additionally, there is the Aqua Book, which the Treasury developed to ensure high-quality analysis. It provides principles and best practices for producing reliable and transparent analytical work. Key areas covered include decision-making and analysis, quality assurance, managing uncertainty, and verification and validation of analytical models. The guide emphasises the importance of robust analysis to support effective decision-making and maintain public trust in government processes.
The UK’s analytic methods for evaluation
As an annex to the Magenta Book, the Treasury also developed a detailed overview of various analytical methods used in government evaluations. It includes theory-based methods like Qualitative Comparative Analysis and Realist Evaluation, experimental and quasi-experimental methods such as Randomised Controlled Trials and Propensity Score Matching, and methods for value-for-money evaluations like Cost Benefit Analysis. Additionally, it covers methods for synthesising existing evidence, including systematic reviews and meta-analysis.
The UK’s Government Analytical Evaluation Capabilities Framework
The Government Analytical Evaluation Capabilities Framework is a supplementary guide to the Magenta Book, designed to enhance the skills and practices of government evaluators. It outlines the necessary capabilities for conducting high-quality evaluations, including:
Scoping: including to define evaluation questions and construct a theory of change.
Leading and Managing: to plan and manage evaluations, ensuring ethical standards, and effective engagement of stakeholders.
Methods: to select appropriate methods for evaluation, data collection, and analysis.
Use and Dissemination: to communicate findings effectively, ensuring evaluations inform decision-making, and promoting transparency.
The framework also provides resources for improving specific capabilities, such as leading and managing evaluations.
Annex to the Magenta Book: Guidance on the Impact Evaluation of AI interventions
The UK's Evaluation Task Force recently published a new annex to the Magenta Book, setting out best practice for evaluating the impact of AI tools and technologies across government and the wider public sector. The guidance emphasises embedding evaluation from the outset of AI interventions, supported by a robust theory of change and clear, measurable outcomes. Given the iterative nature of AI, evaluations must be continuous and responsive, assessing both effectiveness and equity. While the guidance encourages experimentation, it also emphasises that this should be done with theory-based approaches and the right stakeholder engagement to build public trust. In exploring this, the guidance covers the key challenges and opportunities for evaluating the impact of these types of intervention.
Uplifting digital talent and skills for deeper analysis
Internal stakeholders also raised that the Contestability process would benefit from an uplift in digital skills and talent to complement the state’s ability to provide robust technical analysis of solutions. Additional capability in the areas of technology risk, data, and project delivery could be beneficial for the earlier identification of risks or of areas where proposals are not yet fully developed. This technical expertise could be developed within the DTA or sourced from agencies across the public administration with policy domain expertise, such as the Department of Home Affairs for cybersecurity or the Department of Finance for government data.
Taking this approach could enable existing generalist policy resources within the DTA to focus on the DCAP assessments of lower-risk investments. Better targeting the generalist resources could also help the state respond to peak periods ahead of the budget. Therefore, the Australian Government could work to uplift digital talent and skills in the IOF for deeper and more expert technical analysis in the areas of technology risk, data, and project delivery.
Addressing environmental considerations
According to the 2023 OECD DGI, there is opportunity for the Australian Government to further embed environmental considerations of the risks and impacts specific to digital and ICT projects into the Prioritisation state to improve its overall maturity on digital government investments. The findings indicate that there could be opportunities to further integrate these considerations into different parts of the assessment process specific to digital and ICT investments, including:
the use of the value proposition method (e.g. business case) in the development of digital/ICT projects to assess the environmental impact of these investments.
dedicated risk assessments to consider environmental risks specific to digital and ICT projects.
the development of a common methodology or tool to evaluate the impact of digital and ICT projects, leveraging the DTA’s expertise as the leading digital government institution, in relation to the environmental impacts on government, citizens, and businesses.
Therefore, an indicator could be included in advice to the Government on how proposals have considered environmental impact, and environmental impacts could also be factored into the indicators on implementation risk. These indicators could then be used as projects move into implementation to monitor how agencies have addressed them in their sourcing strategies and project delivery.
In addition to the examples from other countries provided in Box 5.6, the development of such criteria could be informed by the Green Architected Framework, which was developed by the Singapore Government in collaboration with the DTA as part of the Digital Government Exchange. This Framework considers environmental, economic, and social sustainability across the areas of:
data and privacy, including data minimisation and efficiency, privacy by design, and data sovereignty.
infrastructure, including energy-efficient data centres, server virtualisation, and edge computing.
solutions and architecture, including microservices architecture, containerisation, and serverless.
procurement, including sustainable vendor criteria, lifecycle assessment, and collaborative and shared procurement.
The framework also presents a Risk-Value-Cost-Effort (RVCE) Prioritisation Tool to rate each of these areas on a 5-point Likert scale (very-low, low, medium, high, very-high), and then to categorise initiatives based on:
Ease to implement the initiative + Value to sustainability
Risk to sustainability of not doing the initiative + Cost to implement the initiative
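The RVCE categorisation above can be sketched as a small function. The numeric mapping of the Likert scale and the assumption that ‘ease to implement’ is the inverse of effort are ours for illustration; the framework itself does not specify them.

```python
# Likert ratings mapped to 1-5 for illustration (not specified by the framework).
LIKERT = {"very-low": 1, "low": 2, "medium": 3, "high": 4, "very-high": 5}

def rvce_axes(risk: str, value: str, cost: str, effort: str) -> dict:
    """Compute the two composite axes used to categorise an initiative."""
    r, v, c, e = (LIKERT[x] for x in (risk, value, cost, effort))
    ease = 6 - e  # assumed: ease to implement is the inverse of effort
    return {
        # Axis 1: ease to implement the initiative + value to sustainability
        "ease_plus_value": ease + v,
        # Axis 2: risk to sustainability of not doing it + cost to implement it
        "risk_plus_cost": r + c,
    }
```

An initiative rated high value and low effort would score strongly on the first axis, marking it as a quick win; a high risk-of-inaction, high-cost initiative would score strongly on the second, flagging it for deliberate planning.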
This framework demonstrates the ways in which the DTA could apply its domain expertise in assessing digital and ICT proposals to consider the environmental risks and impacts that are specific to the architecture, delivery, and ongoing management of these solutions, together with other considerations around social and economic impacts.
Box 5.6. Country practice: Assessing environmental impact of digital investments
The UK’s Green Book and its supplementary guidance
The UK’s Treasury developed its ‘Green Book’ to offer agencies a comprehensive guide on how to appraise and evaluate policies, programmes, and projects. It provides a framework for assessing the costs, benefits, and trade-offs of different options to ensure that public resources are used effectively.
The Green Book emphasises the importance of incorporating environmental considerations into the appraisal and evaluation of projects. It includes extensive guidance on how to consider the environmental impacts and risks of projects, including supplementary advice on:
Accounting for the effects of climate change; and
Valuation of energy use and greenhouse gas (GHG) emissions.
Separately, the Treasury has guidance on how to develop business cases for projects and programmes, including on how to consider environmental risks and impacts at this stage of the process.
France’s guidance on eco-design of digital services
The Référentiel général d'écoconception de services numériques (RGESN) is a framework developed by the French Government to promote the eco-design of digital services. It aims to reduce the environmental impact of digital services by focusing on sustainable practices throughout their lifecycle. The 2024 version builds on previous efforts and includes guidelines for evaluating the utility of digital services, minimising resource consumption, and avoiding the obsolescence of equipment. The framework covers various aspects such as strategy, specifications, architecture, UX/UI, content, frontend, backend, hosting, and algorithms. It encourages the use of renewable energy, efficient resource management, and compliance with environmental standards.
Source: (DINUM, 2024[25])
References
[6] ASD (2024), Cyber security guidelines, https://www.cyber.gov.au/resources-business-and-government/essential-cyber-security/ism/cyber-security-guidelines.
[22] Defra (2024), Accounting for the effects of climate change, https://assets.publishing.service.gov.uk/media/6645e47e993111924d9d3655/Accounting_for_the_effects_of_climate_change.pdf.
[12] Department of Finance (2023), Completing the Risk Potential Assessment Tool, https://www.finance.gov.au/government/managing-commonwealth-resources/risk-potential-assessment-tool-rmg-107/completing-risk-potential-assessment-tool.
[5] Department of Home Affairs (2024), Hosting Certification Framework, https://www.hostingcertification.gov.au/.
[23] DESNZ (2023), Valuation of energy use and greenhouse gas (GHG) emissions, https://assets.publishing.service.gov.uk/media/65aadd020ff90c000f955f17/valuation-of-energy-use-and-greenhouse-gas-emissions-for-appraisal.pdf.
[25] DINUM (2024), Référentiel général d’écoconception de services numériques (RGESN) - 2024, https://ecoresponsable.numerique.gouv.fr/publications/referentiel-general-ecoconception/.
[8] DTA (2025), Digital Experience Policy, https://www.digital.gov.au/policy/digital-experience.
[7] DTA (2024), Benefits Management Policy, https://www.dta.gov.au/advice/benefits-management-policy.
[1] DTA (2024), Contestability, https://www.dta.gov.au/advice/digital-and-ict-investments/contestability.
[2] DTA (2024), Digital and ICT Reuse Policy, https://architecture.digital.gov.au/digital-and-ict-reuse-policy.
[11] DTA (2024), Digital Capability Assessment Process (DCAP), https://www.dta.gov.au/advice/digital-capability-assessment-process-dcap.
[3] DTA (2024), Digital Sourcing Policy, https://www.dta.gov.au/government-architecture/strategies-policies-standards/policies#digital_sourcing_policy.
[13] DTA (2024), ICT Investment Approval Process, https://www.dta.gov.au/advice/digital-and-ict-investments/ict-investment-approval-process.
[4] DTA (2024), Secure Cloud Policy, https://www.dta.gov.au/government-architecture/strategies-policies-standards/policies#secure_cloud_policy.
[14] DTA (2023), 2022–23 Annual Report, https://www.dta.gov.au/about-us/reporting-and-plans/annual-reports/annual-report-2022-23.
[9] DTA (2023), Digital and ICT Investment Oversight Framework, https://www.dta.gov.au/blogs/digital-and-ict-investment-oversight-framework.
[10] DTA (2023), Digital Capability Assessment Process (DCAP): A guide for agencies, https://www.dta.gov.au/sites/default/files/2023-02/DCAP_Agency%20Guide_DTA_v4_050123_Acc.pdf.
[19] Frontier Economics (2024), Guidance on the Impact Evaluations of AI Interventions, https://assets.publishing.service.gov.uk/media/672c84ebbd79990dfa67cab4/2024-11-05_Guidance_on_the_impact_evaluation_of_AI_interventions_FINAL_PDF_WITH_ACCESSIBILITY_CHANGES.pdf.
[20] Government Analysis Function (2025), New Guidance on Evaluating the Impact of AI Interventions, https://analysisfunction.civilservice.gov.uk/news/new-guidance-on-evaluating-the-impact-of-ai-interventions/.
[24] HM Treasury (2024), Business case guidance for projects and programmes, https://www.gov.uk/government/publications/business-case-guidance-for-projects-and-programmes.
[21] HM Treasury (2022), The Green Book, https://www.gov.uk/government/publications/the-green-book-appraisal-and-evaluation-in-central-government/the-green-book-2020.
[17] HM Treasury (2020), Analytical methods for use within an evaluation, https://assets.publishing.service.gov.uk/media/5e96c41a86650c2dd9e792ea/Magenta_Book_Annex_A._Analytical_methods_for_use_within_an_evaluation.pdf.
[15] HM Treasury (2020), The Magenta Book, https://assets.publishing.service.gov.uk/media/5e96cab9d3bf7f412b2264b1/HMT_Magenta_Book.pdf.
[18] HM Treasury (2015), The Aqua Book, https://assets.publishing.service.gov.uk/media/5a7f3bb8e5274a2e87db49be/aqua_book_final_web.pdf.
[16] HM Treasury (2011), The Magenta Book, https://www.gov.uk/government/publications/the-magenta-book.