Lighthouse — Alpha assessment
The Veteran Centric Reform (VCR) Lighthouse Project has been approved to proceed with deep user engagement. The Lighthouse is a 20-week project to address a significant pain point for Department of Veterans’ Affairs (DVA) clients. The service is being developed in line with the Digital Service Standard, with a view to addressing the pain points through an improved digital experience.
Areas of good performance
The Lighthouse project has met the first three criteria of the Digital Service Standard. The Delivery Team is on track for the remaining criteria and should proceed to the Beta stage, taking account of the recommendations outlined in this report.
The Assessment Panel recognises that the user engagement performed by the team has helped them understand the service requirements. The key requirement identified for users is quick access to DVA services through easier and more convenient systems.
Each team member is involved in the wider project activities and shares research so that insights are spread across the team. The team has a diverse skill set and has embedded agile techniques into the processes it follows, including decision-making. Specialists have been included to ensure connections with business processes and privacy requirements.
Criterion 1: Understand user needs
The team is able to articulate who the users of the service are and what their needs are. This is evident from the journey mapping and the five representative user profiles (personas) developed during the Discovery and Alpha stages.
The team has applied a broad range of strategies to identify users and their advocates. This ensured users are well represented within the project considerations, and where appropriate the team has re-engaged users for Alpha testing. Throughout the Alpha stage, user awareness has grown into a partnership, evidenced by people volunteering to participate. Partner agencies have supported additional opportunities for the VCR Lighthouse team to visit bases throughout Australia.
A staff workshop is planned to help validate Alpha concepts. Research showed that very few clients were aware of, or understood, the DVA Statements of Principles (SoPs). A change to how the rules in the SoPs are applied has been proposed to senior DVA decision makers; if adopted, it will enable acceptance of claims on a basis that can be easily understood by staff and clients.
Service maps were developed and discussed with users to gain their perspective of the service, including pain points. The information gathered highlighted a number of problems the users experienced and revealed similarities in their behaviours, circumstances and shared experiences.
These findings informed a range of prototype concepts designed to answer questions about what users expect and what the service means to them. The hypothesis was adapted from ‘sure and simple claiming’ to ‘how might we help those who served be healthy and productive?’ This enabled the team to identify and address areas of complexity, and to narrow their focus to define the Minimum Viable Product (MVP).
Criterion 2: Have a multidisciplinary team
The multi-disciplinary team draws on staff across the DVA and the Department of Human Services (DHS) with backgrounds in people management, design, development, project management, legal affairs, policy writing and service delivery.
Each team member’s duties and distinct skills were evident through the conversations about research and product development.
The product manager has the authority required to make daily decisions and keep work progressing. The DVA Executive appears engaged in the new way of working, with key decisions made at showcases and outside formal governance meetings, which allows the team to pivot and progress as needed.
Lessons learnt in building the team have highlighted the importance of on-boarding new staff. Education and awareness processes have been developed to bring new team members up to speed quickly.
Criterion 3: Agile and user-centred process
The panel was satisfied, from the evidence provided, that an agile method is being used to manage the project. The product manager (and the whole team) spoke confidently about the processes and agile practices in place. A strong agile rhythm was established early in the project, including sprints, daily stand-ups and a Kanban system.
User research and prototype design were clearly integrated, as the team provided evidence of how the design was iterated over time in response to user feedback.
A hypothesis was developed in Discovery and updated during Alpha, which demonstrates that the team continued to learn about their users and the real problem to be solved. Demographics and research from Discovery have also been used to identify which ideas resonated with particular personas. The personas have been socialised with users.
A key outcome has been the proposed transformation of policy language so that it can be understood by staff and users without the need to understand government language and process. Major and transformative policy changes are also being considered to allow for straight-through automated processing, which demonstrates that the team is thinking beyond Beta and into the long-term future state.
Areas for improvement
Criterion 1: Understand user needs
The prototypes are based on integration with Defence and have been widely shared with users. The team recognised that more work with DVA staff on internal business processes will need to be explored during Beta. The panel recommends the team engage with DVA staff to obtain feedback on the prototypes.
The team has created five personas based on the users engaged, taking into account behavioural and attitudinal patterns. There are plans to extend the use of these profiles into the Beta stage by compiling and overlaying user-specific statistical data.
Criterion 2: Have a multidisciplinary team
The team has identified additional resource requirements for Beta, which will need to be addressed in the sprints leading into that stage. The need for additional staff, along with usability testing and accessibility considerations, has been identified, and the panel agrees this is on track.
What we’ve learnt
Delivery team
At the start of the project there was a perception that DVA’s conservatism would be a constraint on using the Digital Service Standard. Within weeks of demonstrating how user-centred design can inform service delivery, the team received overwhelming support from DVA Executives. The lesson learnt is that showing why a change is being made breaks down traditional ‘perceived’ barriers.
Generally conservative staff will accept new ways of working if they can see tangible results. The team needs to set aside external scepticism and maintain a focus on results at all times. The lesson here is to invite staff at other levels to showcases, opening up the possibilities for them too. Having a multidisciplinary team co-located enables the team to respond and make decisions quickly.
Developing an MVP does not mean thinking small; it means accepting that not everything can be fixed at once, but a lot can.
Sourcing a Delivery Manager experienced in the DTA’s service design and delivery process, and sourcing external user research, user interaction and ICT skills, have contributed significantly to the project’s success so far.
Assessment team
The panel has identified the importance of the user journey: Discovery through Alpha builds the evidence base for the best possible MVP. More importantly, the key to discovery is not a high number of users but the diversity of the group, which allows similarities in their problems to be identified.
Establishing a team with clear roles and decision-making responsibilities has been key to the ongoing success of the project. Traditional governance structures (of which this team has none) are not the key to decision-making. Governance is about ensuring the champion and executives are present, driven and working to an agreed objective.
The team’s commitment to, and adoption of, the DTA’s Digital Service Standard processes has demonstrated the value and benefits the Standard provides to investment decisions.
Assessment against the Digital Service Standard
| Criterion | Result |
|---|---|
| 1 | Pass |
| 2 | Pass |
| 3 | Pass |
| 4 | On track |
| 5 | On track |
| 6 | On track |
| 7 | On track |
| 8 | On track |
| 9 | On track |
| 10 | On track |
| 11 | Not assessed |
| 12 | Not assessed |
| 13 | Not assessed |