myGov — Alpha assessment

A recently formed partnership between the DTA and DHS provided an opportunity to review the entire myGov experience and reimagine it with users’ needs at the forefront. The focus is on designing a prototype that demonstrates creative ways to remediate known issues (login, lockout, usability, switching between services, and customer support), address user needs and showcase ‘What could good look like?’.

The service passed the assessment because:

  • The myGov team has met criteria 1 to 3 of the Digital Service Standard and shown sufficient progress towards meeting the remaining criteria. This report recommends the service proceed from Alpha to Beta stage.

  • The team has conducted extensive user research throughout the Discovery and Alpha stages, in line with the diversity and scale of the myGov user base. The technical and security design constraints for future work have been investigated, and an improved product is planned. A number of key issues and improvements have been identified, and a way of working for Beta is beginning to take shape.

Criterion 1: Understand user needs

The team has demonstrated a good understanding of why users engage with the service and what their core needs are when interacting with the product. This informed which areas of myGov require attention and which changes would have a positive impact for the largest audience.

A wide range of users and demographics was covered during research, spanning varied locations (including rural areas), levels of internet access, age, digital ability, cultural backgrounds (including Indigenous Australians), frequency of service usage, and physical abilities.

This research was done through contextual in-home interviews, usability testing, shopfront site visits, support staff interviews, and testing with users of assistive technology, all of which provided a direct means of understanding how the service should be designed.

All members of the team participated in research, and throughout the assessment they referenced stories and learnings from real users they had met and tested their prototypes with. As they demonstrated the product and its design decisions, these stories surfaced naturally, making it clear that the product and the team had been directly driven by the needs of users.

Criterion 2: Have a multidisciplinary team

The team included members from both DHS and the DTA, providing an opportunity to strengthen the partnership between the two agencies and enable capability uplift back into DHS as team members rolled back into their originating teams.

The team composition was well rounded, and the core of the team remained consistent throughout Alpha. One exception was the late introduction of a Content Designer, which affected the efficiency of the content process.

The team benefited from the inclusion of two Subject Matter Experts, sourced from DHS and the ATO, who provided valuable input on call centre scenarios.

Product management, and therefore decision making, was embedded within the team. When support or escalation was needed, the Product Manager had regular access to the Service Owner and stakeholders.

At the start of Alpha the team held a kickoff workshop, allowing them to re-form as a team and iterate on their team charter and expectations of each other. The team reflected on this charter throughout Alpha.

Criterion 3: Agile and user-centred process

The team successfully established and maintained an agile, user-centred design process. Throughout their research and prototyping they developed a regular cycle of communicating outcomes and iterating on designs based on user behaviour.

Throughout Alpha the team surfaced opportunities for improvement via their fortnightly retrospectives. Examples include the impact of not having a content designer within the team, and opportunities to streamline their approach to capturing and iterating on feedback gathered during research.

Midway through Alpha the team recognised the need to refocus their problem statement and prioritise their backlog.

Criterion 4: Understand tools and systems

The team has established an effective technology stack that served them well in Alpha and lays the foundation for the delivery of a Beta product. Their use of open source technologies and adoption of the GOV.AU Design Guide has resulted in a prototype that will work for the diverse users of myGov across a wide range of devices.

Criterion 5: Make it secure

The team consulted with both DTA and DHS information security experts to ensure concepts were both user friendly and appropriate to users’ security and privacy concerns. They have a good understanding of the design constraints relevant to the myGov service, particularly in relation to authentication and profile features.

Recommendations

Criterion 1: Understand user needs

Research should continue. The assessment team strongly recommends that, particularly as the team scales up in size, effort be made to ensure all members of the team can engage with, and build empathy for, their end users.

The team should continue to test with, and engage with, users from culturally and linguistically diverse backgrounds.

Criterion 2: Have a multidisciplinary team

The team should consider an ongoing content design role to help communicate how myGov works with linked services, and to provide consistency across the breadth of edge cases and other channels such as call-centre support and the shopfront experience.

Criterion 3: Agile and user-centred process

Given the breadth and complexity of the scope currently represented in the prototype, the team needs to ensure they have an approach to aid prioritisation throughout Beta, balancing user needs, technical dependencies and government commitments.

Criterion 4: Understand tools and systems

The team should ensure they have the tools and systems in place, including easy access to development and deployment environments, to maximise their ability to rapidly iterate the service in response to user feedback.

Criterion 8: Make source code open

The team should continue to identify and prepare components that can be released as open source to benefit other teams across both agencies and the wider government technology community.

Criterion 11: Measure performance

The team needs to consider the key performance indicators they will baseline and track, allowing them to measure the impact of the improvements the project will deliver. Understanding these metrics could also assist the team in their prioritisation process.

Assessment against the Digital Service Standard

Criterion   Result
1           Pass
2           Pass
3           Pass
4           On track
5           On track
6           On track
7           On track
8           On track
9           On track
10          On track
11          On track
12          Not assessed
13          Not assessed