Digital Marketplace – Live assessment

The Digital Marketplace simplifies the process of government procurement and makes it easier for businesses of all sizes to access government contracts.

The service passed the assessment because:

The Digital Marketplace team have made significant improvements and increased engagement with their users since their Beta assessment. The team have a clear remit, are working collaboratively in an agile environment and are regularly reflecting on how they can continue to manage and grow their product.

Criterion 1: Understand user needs

The Digital Marketplace team have worked towards a deeper understanding of user needs and have continued to iterate the product through Beta.

An Engagement Lead role was introduced within the team during Beta, and this person has been engaging extensively with stakeholders, which helps surface issues for research. To support feature releases, the team work with active sellers, piloting work and conducting user interviews. The team hold weekly support meetings to track correspondence and review pain points identified through analytics. Feedback from these channels informs and prioritises the product backlog.

The team have prioritised user needs into two categories: deliberate scope (priorities set for the team) and emerging scope (priorities decided by the team). They have demonstrated a good understanding of the wide user base and how to balance a number of user and organisational needs. The team will continue user research as the product moves into Live, including with potential users, to increase adoption of the Marketplace.

Criterion 2: Have a multi-disciplinary team

The team have demonstrated a good understanding of the roles and capabilities required to support the Marketplace. They have documented roles and responsibilities for each team member, which are continuously monitored and responsive to the needs of the team.

Over the last few months the team structure has settled with the onboarding of dedicated Delivery Manager and Product Manager roles. These roles were previously undertaken by the Service Designer, which the assessor panel noted was an unreasonable burden on a single officer. The team are confident they have found new stability, with their Service Manager setting a clear remit and helping the team move forward successfully.

The team are autonomous and feel empowered to make decisions based on the information at hand.

Criterion 3: Agile and user-centred process

Following the onboarding of the new Product and Delivery Managers, the team shifted from a Kanban approach to a monthly sprint cadence. With clear goals set, regular agile ceremonies and common tools to manage workload, the team are confident in their ability to estimate effort and deliver effectively against a refined product backlog and roadmap.

Criterion 4: Understand tools and systems

The team demonstrated a good understanding of the tools and systems needed to run their product effectively. The team benefit from cloud and open technologies and are able to deploy as needed. They use peer review for code changes, live monitoring, and notifications embedded into their standard operating environment, which limits the need for bespoke or redundant systems.
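As an illustration of the kind of lightweight live monitoring the team described, a minimal health-check script might look like the sketch below. The endpoints and notification webhook are hypothetical placeholders, not the Marketplace's actual configuration.

```python
# Minimal live-monitoring sketch: poll health endpoints and push a
# notification on failure. All URLs here are hypothetical placeholders.
import json
import urllib.error
import urllib.request

HEALTH_CHECKS = [
    "https://example.gov.au/_status",      # hypothetical app health endpoint
    "https://example.gov.au/api/_status",  # hypothetical API health endpoint
]
WEBHOOK_URL = "https://chat.example.gov.au/notify"  # hypothetical channel

def notify(message: str) -> None:
    """Post an alert to the team's notification channel."""
    payload = json.dumps({"text": message}).encode("utf-8")
    request = urllib.request.Request(
        WEBHOOK_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    urllib.request.urlopen(request, timeout=10)

def main() -> None:
    for url in HEALTH_CHECKS:
        try:
            # urlopen raises URLError/HTTPError on connection or HTTP failure
            with urllib.request.urlopen(url, timeout=10):
                pass
        except urllib.error.URLError as exc:
            notify(f"Health check failed for {url}: {exc}")

if __name__ == "__main__":
    main()
```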

The team were able to discuss their technological processes at length, but are encouraged to ensure production release notes are properly documented and shared across the team for business continuity. The team are also reminded to keep patching and upgrading components to maintain the health and security of the system.

Criterion 5: Make it secure

The team demonstrated robust processes for verifying suppliers and assessing their value for money prior to inclusion on the Marketplace (and the associated Digital Marketplace Panel). Roles and responsibilities, including access controls, are monitored.

The team consulted with legal and privacy experts to ensure data is collected and managed appropriately, and updated their privacy statement to clarify the process. The team have developed a data breach response plan and have communicated it across the team. Other product teams within the DTA have been able to build on this work.

The team monitor their software dependencies and receive warnings when a security vulnerability is found. The developers advise on priority when action is needed. The release pipeline is kept in a deployable state so fixes can be shipped quickly.
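By way of illustration only, a scheduled dependency check of this kind could be built around pip-audit (a PyPA tool); the JSON handling below reflects its documented report shape, and routing findings to the team is left as a placeholder.

```python
# Sketch of a scheduled dependency-vulnerability check using pip-audit.
# pip-audit exits non-zero when vulnerabilities (or errors) are found.
import json
import subprocess

def audit_dependencies(requirements: str = "requirements.txt") -> list:
    """Run pip-audit against a requirements file and return vulnerable deps."""
    result = subprocess.run(
        ["pip-audit", "--requirement", requirements, "--format", "json"],
        capture_output=True,
        text=True,
    )
    if result.returncode == 0:
        return []  # no known vulnerabilities
    # Assumes the tool ran successfully; a production script would also
    # distinguish tool errors from genuine findings before parsing.
    report = json.loads(result.stdout)
    return [dep for dep in report.get("dependencies", []) if dep.get("vulns")]

if __name__ == "__main__":
    for dep in audit_dependencies():
        ids = ", ".join(vuln["id"] for vuln in dep["vulns"])
        # In practice this would feed the team's notification channel.
        print(f"{dep['name']} {dep['version']}: {ids}")
```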

As the product moves into Live, the team are encouraged to continue monitoring and managing data closely, including maintaining records on access controls and on roles and responsibilities across the team. The team will need to continue to actively monitor for security patches and ensure they are applied quickly.

Criterion 6: Consistent and responsive design

The team use the DTA Design System to leverage common components and design patterns, and have contributed to its code base. The product works on mobile and desktop devices.

Criterion 7: Use open standards and common platforms

The Digital Marketplace is itself a platform and has been used by other organisations and teams within the DTA. As a product, it uses open, standards-based code and is hosted on cloud.gov.au.

Content shared across the platform is in mixed formats; however, the team are exploring a move to primarily HTML output, which is encouraging. The team should also continue exploring how data from the product might integrate with data.gov.au, should a good business case emerge.

Criterion 8: Make source code open

The Marketplace uses open source code released under an MIT licence. The team are collaborating with government teams in the United Kingdom and New Zealand on sharing and improving the code base. The team have also contributed to new features within the DTA Design System.

Criterion 9: Make it accessible

After not passing their Beta assessment, the team have undertaken extensive work to improve their processes, design and code in support of better accessibility. The team meet regularly with the DTA accessibility lead and have completed numerous rounds of WCAG 2.0 Level AA and inclusive usability testing. The product continues to use JavaScript for certain functions, but now includes graceful degradation and back-up phone support, which has not been used to date.

The team consider the experience of improving accessibility to be one of learning and reflection, and now capture accessibility requirements as business as usual. Improvements include rewriting contracts in plain English and minimising use of inaccessible (e.g. scanned) PDFs. A team member noted: “making it accessible makes it better for everyone”.  

The assessor panel is pleased with the significant progress the team has made since Beta. The panel encourages the team to continue engaging with the accessibility community, especially with culturally and linguistically diverse users and those with low literacy.

Criterion 10: Test the service

The team shared evidence of testing strategies covering over 70 user journeys. Though testing is currently manual, the team will soon be assisted by a seconded staff member to help automate these processes. They have also made significant improvements to the speed of the search feature. The team are using tools to identify bottlenecks in the application and to find scaling issues.
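As a sketch of what automating one of those journeys could look like, the smoke test below uses pytest and requests; the base URL and paths are hypothetical, not the team's actual suite.

```python
# Sketch of an automated user-journey smoke test (pytest + requests).
# The base URL and paths are hypothetical placeholders.
import pytest
import requests

BASE_URL = "https://example.gov.au"  # hypothetical test environment

@pytest.fixture(scope="session")
def http():
    """One shared session per test run, as a real browser session would be."""
    with requests.Session() as session:
        yield session

def test_buyer_can_browse_open_opportunities(http):
    """Journey: a buyer lands on the home page, then browses opportunities."""
    home = http.get(f"{BASE_URL}/", timeout=10)
    assert home.status_code == 200

    listing = http.get(f"{BASE_URL}/opportunities", timeout=10)
    assert listing.status_code == 200
    assert "opportunit" in listing.text.lower()  # matches "opportunity/-ies"
```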

The team are encouraged to share their testing approaches with other teams in a showcase. Additional load testing is recommended as the product continues to scale.
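For the recommended load testing, a tool such as Locust keeps journey definitions in plain Python; the sketch below is illustrative only, with hypothetical routes and request weights.

```python
# Minimal Locust load-test sketch (run with: locust -f locustfile.py).
# Routes and task weights are hypothetical placeholders.
from locust import HttpUser, task, between

class MarketplaceUser(HttpUser):
    wait_time = between(1, 5)  # simulated think time between requests

    @task(3)  # browsing is weighted as the most common action
    def browse_opportunities(self):
        self.client.get("/opportunities")

    @task(1)
    def search_sellers(self):
        self.client.get("/search/sellers")
```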

Criterion 11: Measure performance

The team regularly maintain their performance dashboard and collect additional metrics that demonstrate use and satisfaction, such as the number of participating agencies, transaction volumes, and completed contracts versus briefs published.

The team are planning an additional discovery to better understand sellers' experience and needs regarding take-up of the Marketplace, which will factor into ongoing feature releases and measurements.

Criterion 12: Don’t forget the non-digital experience

The team have recently employed an Engagement Lead to help users take up the service and to support non-digital access. A telephone line and offline support are available.

Criterion 13: Encourage everyone to use the digital service

The Service Manager has specified a clear remit to engage more buyers. The team are working to meet this priority through engagement with agencies, a better understanding of users and ongoing site improvements.

The assessor panel note the success of the Marketplace team to date, with over $200 million worth of contracts awarded, 40% of which by volume have gone to small and medium enterprises. The panel wishes the team ongoing success and is pleased to award them a pass for their Live assessment.

Recommendations

The team are performing well in all areas. In moving into Live, the team are encouraged to continue to:

  • Keep records on access controls, production release notes, and roles and responsibilities up to date and shared across the team.

  • Monitor and manage data, exploring how this might be shared across government or used to demonstrate the value of the product.

  • Engage with the accessibility community, and seek additional advice on making the product usable for culturally and linguistically diverse communities.

  • Keep up to date with patching, component upgrades and load testing to ensure the health and security of the system.

Assessment against the Digital Service Standard

Criterion Result
1 Pass
2 Pass
3 Pass
4 Pass
5 Pass
6 Pass
7 Pass
8 Pass
9 Pass
10 Pass
11 Pass
12 Pass
13 Pass