In-flight assessments

These assessments focus on continuous improvement and learning at regular intervals.

All services within the scope of the Digital Service Standard are assessed against the Standard.

Depending on the service, this will follow either the in-flight assessment process or the staged assessment process.

The in-flight assessment process focuses on continuous improvement and learning. It is less about regular tests and more about a mentoring relationship between the assessment team and the delivery team.

Both teams should celebrate moving from one stage to the next.

Assessment check-ins

When the service team can demonstrate that the service meets the Standard, they should arrange an internal agency assessment or consider engaging an external expert to perform an assessment, including against accessibility requirements.

Your delivery team will meet with the assessment team regularly during the Discovery, Alpha and Beta stages, until the service is launched as a public beta.

As the service moves towards the Live stage, the assessment and delivery teams can move these check-ins to longer intervals. However, check-ins should happen at least monthly.

If you are moving from staged assessment to in-flight, you might spend more time on the first check-in meeting to help the assessment team understand the product.

How the check-ins work

Your check-in meetings should focus on the progress you’ve made against the Digital Service Standard. You should document this using a red, amber, green (RAG) kanban. You can use the Digital Service Standard kanban poster (PDF 117 KB) to help with this.

Both teams work together to make sure the service meets all of the requirements for each criterion in the current stage.

The assessors document what the delivery team has done well since the last meeting and make recommendations for the next period of work. The assessors will also recommend ways the delivery team can align more closely with the Digital Service Standard and create a successful service.

The teams should complete their comments and RAG ratings by the end of each in-flight session and make them available for weekly reporting to stakeholders.

Assessors will use a spreadsheet to track the delivery team’s progress and record recommendations.

Moving to the next stage

The assessors and the delivery team agree on when the delivery team has met the requirements of the stage and when the service can move to the next stage. They demonstrate this by showing that every criterion being assessed is rated green in the RAG rating.
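If your team also tracks the RAG ratings digitally, the short Python sketch below shows one way the "all criteria green" check could be recorded. It is an illustrative example only, assuming placeholder criterion names and a simple dictionary of ratings; it is not a DTA tool or template.

  # Illustrative sketch: record a RAG rating for each criterion in the
  # current stage and check whether the service is ready to move on.
  RATINGS = ("red", "amber", "green")

  def ready_to_move_on(rag_ratings):
      """Return True when every assessed criterion is rated green."""
      for criterion, rating in rag_ratings.items():
          if rating not in RATINGS:
              raise ValueError(f"Unknown rating for {criterion}: {rating}")
      return all(rating == "green" for rating in rag_ratings.values())

  # Example ratings for an Alpha check-in (criterion names are placeholders).
  alpha_ratings = {
      "Understand user needs": "green",
      "Have a multidisciplinary team": "green",
      "Agile and user-centred process": "amber",
  }
  print(ready_to_move_on(alpha_ratings))  # False until every criterion is green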

At each of the 3 stages (Alpha, Beta and Live), the assessment team produces a brief report. This explains the achievements and learnings of the previous stage. We encourage agencies to share these assessment reports with the DTA so they can be published on our website.

Maintaining your service

Launching your service is only the beginning. Once a service goes live, you will need to keep listening to your users to make sure it continues to meet their needs and that you are still following the Digital Service Standard. You will need to continue to update and improve the service on the basis of user feedback, performance data, changes to best practice and service demand.

You will also need to check that:

  • you maintain high levels of user satisfaction and transaction completion

  • cost per transaction goes down (see the worked sketch after this list)

  • digital take-up goes up

  • you provide assisted digital support to people who need it

  • there is progress against other measurements you identified when you were developing the service
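Two of these measures, cost per transaction and digital take-up, are usually simple ratios. The sketch below shows a common way to calculate them; the figures are made-up examples, not published benchmarks, and the exact definitions your agency uses may differ.

  # Illustrative only: common calculations for two of the measures above.
  def cost_per_transaction(total_cost, completed_transactions):
      """Total cost of running the service divided by completed transactions."""
      return total_cost / completed_transactions

  def digital_take_up(digital_transactions, all_channel_transactions):
      """Share of transactions completed through the digital channel."""
      return digital_transactions / all_channel_transactions

  # Example: $180,000 running cost, 30,000 completed digital transactions,
  # 40,000 transactions across all channels (digital, phone, paper).
  print(cost_per_transaction(180_000, 30_000))  # 6.0 dollars per transaction
  print(digital_take_up(30_000, 40_000))        # 0.75, or 75% take-up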

Get in touch

If you have any questions, you can get in touch with us at digitalpolicy@dta.gov.au.