Performance dashboard — Alpha assessment
The performance dashboard will measure the performance of government services against the key performance indicators (KPIs) defined in the Digital Service Standard, along with other service-related metrics, and will report the results publicly.
Areas of good performance
We have assessed the performance dashboard and agreed that it is ready to proceed to the Beta phase.
Criterion 1 - Understand user needs
A wide range of user research and usability testing took place.
Criterion 2 - Have a multidisciplinary team
Up to this point of the Alpha assessment the team have been working well together: all members participate and are involved in decision making. A data visualisation specialist has been added to the standard team structure.
Criterion 3 - Agile and user-centred process
The team are using agile ceremonies and artefacts, and are integrating user-centred design practices into their day-to-day work. They have demonstrated a strong ability to pivot as requirements for the dashboard have changed over time.
Criterion 6 - Consistent and responsive design
The team refactored their code to use the frameworks specified by the team developing the common design patterns. Their code was then contributed back for integration into the design guide, forming both the foundational style for representing graphical data and a graphing library.
Criterion 8 - Make source code open
Because the dashboard required specific data visualisation, the team brought in a specialist data visualiser as well as a developer with specialist skills in representing this in code. The graphing code library has been fully open-sourced; it is supported by its developer and is receiving community contributions, which is to be applauded.
The Alpha prototype was made available to the public according to the agreed and publicised schedule. This enables three services to begin publishing their performance data, as required in the Beta phase.
On the path to Beta
During Alpha and the weekly in-flight check-ins, the assessors provided recommendations for the team to consider during the Beta phase:
Criterion 1 - Understand user needs
The onboarding process for agencies needs some refinement. From an agency perspective, there are issues around gaining a clear understanding of which KPIs are required and how to calculate them (a hypothetical calculation is sketched below).
This process needs streamlining, with clear onboarding materials produced for agencies, whilst planning for automation once all parties are able to support it.
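To illustrate the kind of guidance such onboarding materials might contain, the sketch below shows one plausible way a service could derive the four KPIs (assumed here to be user satisfaction, digital take-up, completion rate and cost per transaction). All figures and formulas are hypothetical examples for illustration, not the Standard's published definitions.

```python
# Illustrative sketch only: every figure below is made up, and the KPI
# formulas are assumptions for demonstration, not the Standard's definitions.

transactions_started = 12_000          # hypothetical: digital transactions begun
transactions_completed = 10_200        # hypothetical: digital transactions finished
all_channel_transactions = 15_000      # hypothetical: digital + phone + paper + counter
total_operating_cost = 51_000.00       # hypothetical monthly service cost (AUD)
satisfaction_scores = [4, 5, 3, 5, 4]  # hypothetical 1-5 survey responses

# Completion rate: share of started digital transactions that were finished.
completion_rate = transactions_completed / transactions_started

# Digital take-up: share of all transactions handled via the digital channel.
digital_take_up = transactions_completed / all_channel_transactions

# Cost per transaction: operating cost spread over completed transactions.
cost_per_transaction = total_operating_cost / transactions_completed

# User satisfaction: mean survey score.
user_satisfaction = sum(satisfaction_scores) / len(satisfaction_scores)

print(f"Completion rate:      {completion_rate:.1%}")
print(f"Digital take-up:      {digital_take_up:.1%}")
print(f"Cost per transaction: ${cost_per_transaction:.2f}")
print(f"User satisfaction:    {user_satisfaction:.1f} out of 5")
```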
The team have demonstrated some successes in working with partner agencies to free up the required data; however, more work needs to be done from a communications and senior-engagement perspective to unblock relationships with agency partners.
Criterion 2 - Have a multidisciplinary team
The resource profile of the team has varied over time, through no fault of their own. A period of full resource availability and stability would serve the team well as they move into the Beta phase.
Criterion 13 - Encourage everyone to use the service
Now that the caretaker period is over, we recommend expanding user testing to a wider cohort, including more non-government and media users.
What we’ve learnt
Assessment team
Initially, there were indications that the team were being reactive to the requirements of the Digital Service Standard. However, aligning with the Standard became second nature, with the team working cooperatively alongside the assessors and employing an easy, conversational approach.
It has been a pleasure to work with this team and watch the product develop. We have been particularly impressed with their response to the usability testing, such as offering alternate views to assist with accessibility.
The open-sourcing of the graphing library, and the re-use and iteration the code is already seeing in the community, is very impressive.
The team's ability to pivot and keep delivering despite a changing resource profile has been impressive.
Work on the onboarding process has made good progress, and we look forward to seeing it iterated, refined and codified in artefacts.
Delivery team
The performance dashboard has been a great opportunity for expanded learning around data, data presentation and the visual representation of information in a way that aligns with the Standard.
Throughout the Alpha phase, the team have been presented with some real challenges. Unlike other projects, the dashboard has mandated outcomes that must be delivered: for example, the Standard requires that the four KPIs are reported for each service, alongside outcomes that are driven by user needs.
Asking organisations to bring performance information into the public domain has presented some challenges, and it has required a high degree of stakeholder engagement to give transformation teams confidence in the benefits of the dashboard. It will be a rewarding outcome to publish the dashboards and begin receiving feedback from our users.
Assessment against the Digital Service Standard
| Criterion | Result |
| --- | --- |
| 1 | Pass |
| 2 | Pass |
| 3 | Pass |
| 4 | On track |
| 5 | On track |
| 6 | On track |
| 7 | On track |
| 8 | On track |
| 9 | On track |
| 10 | On track |
| 11 | On track |
| 12 | On track |
| 13 | On track |