ABS Census Digital Service – Alpha assessment
The purpose of the Census Digital Service is to support the Australian public in participating in the 2021 Census. The Census Digital Service will include the online form as well as informational and self-service options, giving the public the ability to complete their Census online.
The ABS expects that the 2021 Census Digital Service will be the default and most convenient channel for the public to participate in the Census. The ABS is building on its learnings from the 2006 to 2016 censuses to deliver a 2021 Census that is easy and secure, produces high-quality data and meets user needs.
The Census Digital Service team (team) has met criteria 1 to 3 of the Digital Service Standard and shown sufficient progress towards meeting the remaining criteria. This report recommends the service proceed from Alpha to Beta stage, taking into account the recommendations outlined in this report.
Throughout Discovery and Alpha, the team has involved a wide range of participants in ongoing user research and has sought to make the service easy to use for the widest possible audience, while balancing the need to maintain data consistency with previous censuses. The entire team, and stakeholders, have been involved in the research process to help build empathy for the end user.
Criterion 1: Understand user needs
The team demonstrated a thorough understanding of their users. Their approach to learning about user needs built on research conducted during the Discovery phase as well as experiences from the 2016 Census. The team focused on creating an experience that places the least burden on users, lets them easily see where they are in the process, and helps them complete the Census successfully.
During Alpha, the team explored their users further and used the opportunity to step back and look at the holistic experience. They explored how a question designed for presentation on a physical paper form could be translated for the digital channel, for example by breaking one question into four separate questions to reduce the cognitive burden on the user.
While the user base is all Australians, the team identified a number of user groups to help focus the user research, including culturally and linguistically diverse (CALD) users, different household types (for example, a 12-person share house or families), geographical groups (urban and regional/rural), and vulnerable user groups (such as people experiencing homelessness and Indigenous community groups). There were some user groups the team was unable to reach within the timeframe, including people with low literacy, Aboriginal and Torres Strait Islander peoples, and assisted digital users. The team worked with other programs within the ABS to access insights on the user groups they could not reach during Alpha. The team developed a number of hypotheses and prototypes to test with their users and iterated on a regular basis.
The entire team was involved in the user research process, from observation sessions through to synthesis of the findings.
The team has worked through their story map to identify how features stack up against the feasibility, desirability and viability lenses. This forms the foundation of their minimum viable product (MVP) and what is needed for the 2021 Census. The story map is also being tested against the Alpha findings to clearly delineate what will, and will not, be addressed moving forward, and the reasoning for those decisions.
Criterion 2: Have a multidisciplinary team
The team was made up of staff from the ABS, supplemented with external technical and design specialists who joined near the start of Alpha. Roles included a product manager, a delivery manager, user researchers, service designers and subject matter experts.
There were no technology experts embedded in the team at the start of Alpha, which meant the team was initially unable to explore the technical feasibility of some concepts in depth; this gap was addressed later in Alpha.
Although the team lacked a content designer, they worked with other content subject matter experts to address some of the capability gap. The team noted that some of the core team will move into Beta along with the implementation vendor. The following roles are currently planned to move into Beta: user researcher, business analyst, technical experts, product manager, and a quality assurance/paper form expert.
The team has developed an effective onboarding process for new team members. New team members were provided with a Discovery Playbook that documented findings, principles and quotes from users. Throughout Alpha, the team continued to document hypotheses, research findings and decisions for each sprint, to help onboard and support new team members in Beta.
The team learnt a substantial amount about making decisions at pace, and about how decisions are affected and influenced by the overall Census program and its supporting governance. This affected the team's ability to clearly define an MVP to take into Beta, although this was rectified at the very end of Alpha.
Criterion 3: Agile and user-centred process
The team is effectively using agile/scrum processes. They initially used 2-week sprints but shifted to a 3-week cadence to give the team capacity to conduct more research. The team demonstrated the use of agile methods such as backlog grooming, sprint planning, daily stand-ups, sprint reviews and sprint retrospectives. As there was no scrum master, the team shared the responsibilities of this role and was self-managing by around sprint 7.
We note that team members who were not ongoing ABS employees did not have access to the ABS environment, so the team used online tools such as Slack and Trello as part of their agile tooling to enable greater collaboration.
At the start of each sprint, the team reviewed the previous sprint and the learnings from the research. They looked at the hypotheses, the findings and the synthesis board to determine next steps such as whether a hypothesis needed further investigation.
The team initially had difficulty defining the MVP as they wrapped up Alpha, due to challenges around decision-making processes and limited access to technical experts who could assess technical feasibility. This was later rectified, with the team defining the MVP and identifying 'must', 'should' and 'could' user stories.
Criterion 4: Understand tools and systems
The team has begun to identify and understand the systems and tools they need to integrate with to deliver their private Beta and final product. The team has a deep understanding of the technologies that will be needed to deliver the service effectively to their broad range of users. The work conducted within the team and the broader Census program will aim to use common project management tools and software development approaches such as test-driven development (TDD).
Criterion 5: Make it secure
The team has an understanding of privacy by design principles to ensure that the solution is appropriate for a broad range of users. The team has committed to engage further with privacy and IT security experts throughout the development of the service.
Criterion 6: Consistent and responsive design
The team has identified the need for consistent and responsive design to deliver the online service to a large audience, including taking a mobile-first and whole-of-government approach to design. They utilised the DTA's Design System during Alpha to quickly create and iterate on prototypes.
Criterion 7: Use open standards and common platforms
The team has considered reuse and consistency with other government services including similar services being delivered by other governments internationally. They have also used these engagements as an opportunity to share what they are doing and to learn from the experiences of their international counterparts.
Criterion 8: Make source code open
The team has identified a variety of new technology components that could be produced in delivering the service. When these are completed, the team will look for opportunities to release them for reuse under an open-source licence.
Criterion 9: Make it accessible
The team is targeting WCAG 2.1 AA compliance. They have defined 'online design principles' that demonstrate their focus on 10 areas, including plain English and accessibility.
During the Alpha phase, the team had difficulty accessing a range of users with accessibility needs. They aimed to address this by working with accessibility industry associations to help determine the needs of those user groups.
The team is looking to bring together a panel of accessibility experts to act as a consultation group throughout the Beta phase.
Criterion 10: Test the service
The team has taken a test-driven development approach to their Alpha products and intends to continue this into Beta.
The team has been talking to the wider ABS about how to build pieces of the final solution concurrently, so that the team can continue trialling the service in a test environment that accurately reflects the user experience. The team notes that they are committed to working with the vendor to implement testing strategies that reflect this.
They also plan to engage a test manager within the team to provide additional assurance, and are looking to establish an online panel for unmoderated testing.
Criterion 11: Measure performance
The team is currently using real-time metrics to monitor the performance of the service and its channels. This includes analytics data such as the time users spend on each page and how long the whole form takes to complete. The team has been advised to plan for measuring and reporting on mandated KPIs, including response rate and take-up of different service channels.
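For illustration only, per-page timing of the kind described above can be captured in the browser and reported before the user navigates away. This is a minimal sketch, assuming a hypothetical `/analytics/timing` collection endpoint on the service side:

```javascript
// Record how long the user spends on the current form page.
const pageStart = performance.now();

document.addEventListener('visibilitychange', () => {
  if (document.visibilityState === 'hidden') {
    const payload = JSON.stringify({
      page: location.pathname,
      millisecondsOnPage: Math.round(performance.now() - pageStart),
    });
    // sendBeacon delivers reliably during page unload, so timings
    // are not lost when the user moves to the next page.
    // The endpoint below is hypothetical.
    navigator.sendBeacon('/analytics/timing', payload);
  }
});
```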
Criterion 12: Don’t forget the non-digital experience
The team has done a lot of work to understand the user journey and the Census service as a whole. They have brought people together from other parts of the organisation that support the other channels to help them understand the impact on the user and what it means for their component of the Census.
The team also worked with Census field staff and teams looking at the inclusive strategy of the Census program, to help them understand what works well and identify areas of improvement.
Criterion 13: Encourage everyone to use the digital service
The team is working towards an approach of 'enabling participation', which is about creating a new pathway for people to complete the Census digitally. They have been working closely with the ABS Communications team to help support the call to action to use the digital service.
The following are suggestions to consider when moving into Beta:
- Continue user research with groups the team was unable to access during Alpha, and ensure that user needs continue to be met through Beta, particularly with a new vendor coming on board.
- Track any potential issues the team might be unable to test or resolve in Beta, and plan how the team will handle them moving into public Beta.
- Consider the team construct that will be needed to achieve the right outcomes for the service and its users. This includes a service manager, delivery manager and product manager, as well as bringing together technical, business and design skill sets.
- The team also needs to consider how they will keep the Beta vendor engaged as part of the multidisciplinary team, as opposed to shifting into a waterfall-like model where the vendor is distanced from the user (we note that the team indicated this is not the intent moving into Beta). This is particularly important to ensure that empathy for the user and user needs are kept at the forefront of Beta, and that the team continues to solve the right problems in the right way.
- Keep ensuring that stakeholders are fully across decision-making processes and that they continue to be across the journey.
- Provide the team with the appropriate governance structures to ensure they have autonomy and are able to quickly move forward with the continuous improvement of the service based on learnings.
- Ensure timely access for new team members (and new vendor) to the ABS tooling environment.
- Keep the product manager and delivery manager roles separate, as there is meant to be a healthy tension between the roles.
- Technical load and performance testing should be conducted in a continuous, automated manner for each component. Performance budgets such as maximum response time and maximum download size should be defined early in development, with automated warnings whenever they are exceeded; a minimal load-test sketch follows this list.
- Development should take a mobile-connection-first approach to design, to account for low maximum bandwidth and potentially failing requests. The team should consider regularly running the Google Lighthouse audit tool for Progressive Web Apps; a Lighthouse CI sketch also follows this list.
- Consider the use of existing government and commercial cloud platforms such as cloud.gov.au and notify.gov.au to enable early testing of new functionality.
- Think about what components can be made open source to enable reuse across government to provide similar functionality and ensure intellectual property arrangements will support this reuse.
- As it can be challenging going from Alpha prototyping to building “the thing”, ensure that available and appropriate development/testing environments can support multiple uses such as user research and end-to-end testing.
- In terms of accessibility and inclusivity, continue to focus on making the service accessible to all users regardless of their ability, language and environment (for example, intermittent access to the internet). Ensure that user research and usability testing in the Beta phase is conducted with actual users with inclusive needs (not just with the accessibility consultation group).
- Connect with the DTA Dashboard team to get a service dashboard up and running on dashboard.gov.au prior to public Beta. Also consider metrics from other areas such as accessibility.
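On the load and performance testing suggestion above: one way to express continuous, automated performance budgets is a k6 script whose thresholds fail the run when a budget is breached. This is a sketch only; the URL, load profile and budget values are illustrative assumptions, not the team's actual targets.

```javascript
import http from 'k6/http';
import { check, sleep } from 'k6';

export const options = {
  // Illustrative load profile: ramp to 100 virtual users, hold, ramp down.
  stages: [
    { duration: '2m', target: 100 },
    { duration: '5m', target: 100 },
    { duration: '1m', target: 0 },
  ],
  // Budgets defined up front; k6 exits non-zero in CI when one is crossed.
  thresholds: {
    http_req_duration: ['p(95)<500'], // 95% of requests under 500 ms (assumed)
    http_req_failed: ['rate<0.01'],   // under 1% of requests may fail (assumed)
  },
};

export default function () {
  // Hypothetical endpoint for illustration only.
  const res = http.get('https://census.example.gov.au/form');
  check(res, {
    'status is 200': (r) => r.status === 200,
    // Guard against page-weight creep (assumed 500 kB budget).
    'body under 500 kB': (r) => String(r.body).length < 500 * 1024,
  });
  sleep(1);
}
```

Running this on every build (`k6 run script.js`) surfaces regressions as soon as they are introduced, rather than in a late testing phase.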
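On the Lighthouse suggestion: audits can be run continuously rather than ad hoc using Lighthouse CI, which applies Lighthouse's default mobile emulation and network throttling. The staging URL and score budgets in this sketch are assumptions for illustration.

```javascript
// lighthouserc.js — consumed by `lhci autorun` in the build pipeline.
module.exports = {
  ci: {
    collect: {
      // Hypothetical staging URL; replace with the real environment.
      url: ['https://staging.census.example.gov.au/'],
      numberOfRuns: 3, // median of 3 runs smooths out variance
    },
    assert: {
      assertions: {
        // Fail the build if scores drop below the assumed budgets.
        'categories:performance': ['error', { minScore: 0.9 }],
        'categories:accessibility': ['error', { minScore: 0.95 }],
        'categories:pwa': ['warn', { minScore: 0.9 }],
      },
    },
  },
};
```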
Assessment against the Digital Service Standard
Criterion | Result |
---|---|
1 | Pass |
2 | Pass |
3 | Pass |
4 | On track |
5 | On track |
6 | On track |
7 | On track |
8 | On track |
9 | On track |
10 | On track |
11 | On track |
12 | On track |
13 | On track |