Bonded Return of Service — BRoSS
Background
The Bonded Medical Program provides students a Commonwealth supported place in a medical course at an Australian university in return for a commitment to work in regional, rural or remote areas for a specified period.
The Bonded Medical Program was established in 2001. There are two schemes: the Medical Rural Bonded Scholarship (MRBS) Scheme and the Bonded Medical Places (BMP) Scheme. The MRBS Scheme closed to new participants after the 2015 academic year.
To date, more than 10,000 people have participated in the program. Over 4,500 of these participants are currently completing their medical course at one of the 20 Australian universities participating in the program. From 1 January 2020, applicants will enter the program under a modernised BMP Scheme.
The Rural Access Branch (RAB) is working in collaboration with the Data Services Branch (DSB) on the Enhancements to Bonded Programs project. RAB has engaged DSB to develop and implement a solution to support enhancements to the Bonded Medical Places (BMP) Scheme and Medical Rural Bonded Scholarship (MRBS) Scheme. These enhancements will support the provision of vocationally recognised (VR) Australian trained doctors in rural and remote areas and areas of workforce shortage.
The project is running over three years, commencing in July 2018 and due to finish in June 2021.
The IT component of the Enhancements to Bonded Programs project will define and deliver an application (BRoSS) that provides:
- Data from existing data sources
- Secure access for users, including internal Health users and external authenticated users (using either AUSkey or myGov, depending on the type of external user)
- An intuitive browser-based interface for use by scheme recipients, university administrators and Health administrators
- Pre-defined reports for Health administrators of the bonded programs
Criterion 1: Understand user needs
In advance of the project, a major initiative was undertaken to improve the user experience by consolidating a process previously split across two departments, DHS and Health, so that it is now centralised under Health. The existing process was also considered outdated and no longer fit for purpose.
The team demonstrated a strong understanding of their external user groups and in-depth knowledge of their users' journey throughout the life cycle of the bonded medical process.
The identified pain points for resolution have been well documented:
- The department is not able to effectively manage and monitor bonded students and doctors to ensure they are on track to meet their Return of Service Obligation (RoSO), as they move through the different phases of the 18-year long program
- Bonded students and doctors are not able to self-manage their information
- Bonded students and doctors are not able to effectively plan their RoSO
- Support organisations are not able to self-serve information on bonded students and doctors in their location
A detailed explanation was provided of the main user groups and how the new system will address their pain points:
- BRoSS will provide the Department with the ability to review and monitor RoSO plans registered by participants
- BRoSS will provide the participants the ability to self-manage their information, including update contact and work details
- Participants will be able to create and save multiple different RoSO plans (scenarios) of how they might be able to meet their obligations
- Universities will be able to register applicants directly into BRoSS and access a full list of bonded students for their university
- BRoSS will provide Rural Workforce Agencies with access to contact information for bonded students in their state
Each of the nominated groups, including internal Health users, will benefit from the introduction of the new system and portal. The team was able to articulate the pain points felt by their external user groups under both the old and new bonded medical processes. One focus point was users' capacity to self-serve, including monitoring the status of an application and the processing time after submission to the Department of Health.
Throughout the discovery and requirements gathering phase of the project, the team engaged with users in several workshops to identify high-level requirements. User research interviews were conducted with the following groups:
- Members of the Rural Support Section
- Members of the Call and Information Centre (CIC)
- Admissions officers from universities offering both undergraduate and postgraduate courses
- Rural Workforce Agencies
Focus groups have been established for the different user groups:
- University focus group
- Participants focus group
- Rural Workforce Agency (RWA) focus group
Prior to the Beta assessment, two focus groups met: Universities and Participants.
The focus group meetings with Universities and Participants provided an opportunity for further user research to be conducted.
A system demonstration of functionality specific to each of these user groups was provided at the end of the focus group meeting. Both were very well received.
User Research Overview
- Several workshops were held with the Business as Usual (BAU) support team (Rural Support Section) to identify high level requirements
- Ten user research interviews were conducted with:
- Members of the Rural Support Section
- Members of the Call and Information Centre (CIC)
- Admissions officers from universities offering both undergraduate and postgraduate courses
- Rural Workforce Agencies
- Three months of CIC reports were reviewed, detailing the number of calls, the main call subjects for each month and trends
- Five user personas were created based on the information gathered
Focus Group User Research
As part of the focus group meetings with Participants and Universities, additional user research was conducted.
University focus group
- The University focus group was attended by representatives of 20 universities
- Pain points with the current state were discussed and documented
- A questionnaire was provided to Universities to gather information about their current business processes and timeframes
Participant focus group
- The Participant focus group was attended by 22 participants, a mix of bonded scholars and junior doctors
- Pain points with the current state were discussed and documented
Research Findings
The findings from user research were incorporated into the high-level requirements for BRoSS.
The team also noted that consultations were ongoing throughout the requirements gathering phase and solution design via interactions with the Implementation Working Group (IWG), comprising internal and external user representatives.
In the Alpha assessment, the assessment panel recommended that the team consider approaching a select group of external stakeholders directly, outside of workshops, to further test the prototypes in a one-on-one setting. The panel also recommended engaging more widely across the full spectrum of impacted users where possible and feasible.
The assessment panel have noted that the BRoSS team have taken this advice and have conducted further workshops to better understand the needs of their users.
Criterion 2: Have a multidisciplinary team
The multidisciplinary team responsible for design and delivery of the project is made up of the following roles:
- Product Owner
- Project Manager
- Delivery Manager
- Organisational Change Manager
- Business Subject Matter Experts (SMEs)
- Solution Architect
- Business Analysts
- UX Designer
- Developers
- Testers
A supportive management structure has had a positive impact and helped form a successful multidisciplinary team.
The product owner has empowered the SMEs, and decisions on the project scope and direction have been made at the appropriate level. Senior executive support is available when required and at the appropriate times.
Further to the above, members of the technical development team were also able to articulate the business’s end user groups and have a sound understanding of the current business scenarios as well as the future state implementation of the solution.
The technical development team have continued working in a transparent environment well into the Beta stages of the project.
Criterion 3: Agile and user-centered process
The team is effectively using agile (Scrum) processes and is aligning work appropriately against the Digital Transformation Agency’s (DTA) Digital Service Standard.
The project first targeted applicants with the initial MVP and release. Further releases will target the other main user groups and build on the existing MVP as the project continues through the delivery and implementation phases.
The panel discussed the lack of a full private beta in the production environment. This is due to the time constraints between students obtaining their marks and needing to apply for available places (of which roughly 28% are bonded places).
The panel recommended that the team continue to develop alpha based prototypes as additional functionality becomes available relevant to an expanded user group, and consult externally for feedback on those prototypes.
During the Beta phase, the BRoSS team have continued to demonstrate further development of prototypes, which align with the user stories and the Beta product.
Criterion 4: Understand tools and systems
The team have demonstrated a very strong understanding of the technology chosen for the BRoSS system. The team have implemented a custom Java system that will integrate with myGov in the near future. The BRoSS system also uses existing infrastructure and middleware: WebSphere, Oracle, the TRIM Attachment Service and XCOM. The system is not dependent on future vendor-specific upgrades, which gives developers full control of the product without vendor-specific impacts.
Identified TRIM documents will be linked to a separate instance of TRIM and then linked to the participant's BRoSS record in the new system.
Criterion 5: Make it Secure
The team have also demonstrated a very strong understanding of how to keep the system secure.
- External users of BRoSS must be authenticated using either AUSkey or username/password prior to gaining access to BRoSS (an illustrative sketch of this authentication gate follows this list)
- Information security measures for BRoSS will conform to Australian Government standards as set out in the Australian Government Information Security Manual (ISM)
- Security accreditation process in accordance with Health’s IT Security Accreditation framework commenced in March 2019
- An IT Risk Register and Security Risk Management Plan were drafted to support approval for an Interim Authority to Operate (IATO), valid to 31 January 2020
- A Privacy Impact Assessment (PIA) has been completed
- It is expected that security accreditation of BRoSS will be achieved before Release 1 Go Live in December 2019 and that the solution will be subject to penetration testing prior to its release
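As a hedged illustration of the authentication requirement noted in the first bullet above, the following is a minimal sketch of a servlet filter that blocks unauthenticated external requests. It is not the team's actual implementation: the "authenticatedUser" session attribute and the /login redirect path are assumptions made for illustration only.

```java
import javax.servlet.*;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.HttpSession;
import java.io.IOException;

// Hypothetical sketch only: rejects requests that have not yet been authenticated
// (for example via AUSkey or username/password) before they reach protected pages.
public class ExternalAuthenticationFilter implements Filter {

    @Override
    public void init(FilterConfig filterConfig) { }

    @Override
    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest httpRequest = (HttpServletRequest) request;
        HttpServletResponse httpResponse = (HttpServletResponse) response;

        // "authenticatedUser" is an assumed session attribute set by the login step.
        HttpSession session = httpRequest.getSession(false);
        boolean authenticated = session != null
                && session.getAttribute("authenticatedUser") != null;

        if (authenticated) {
            chain.doFilter(request, response); // allow the request through
        } else {
            // Redirect unauthenticated users to an assumed login page.
            httpResponse.sendRedirect(httpRequest.getContextPath() + "/login");
        }
    }

    @Override
    public void destroy() { }
}
```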
The assessment panel recommended that the BRoSS team continue using their internal security processes. The BRoSS team have continued to follow their internal security policies well into the Beta phase of the project.
Criterion 6: Consistent and responsive design
The team have engaged UX/UI capability to ensure that the system complies with modern web standards. UX/UI techniques such as responsive stylesheets have been used so that users can operate the system on a range of hand-held devices.
The design layout has a user-friendly feel, and the UI is logical, easy to navigate and follows WCAG standards to give users a pleasing experience while navigating the application.
- The User Interface responds to all screen sizes
- It is mobile responsive; however, BRoSS will support access to maps, which may impact its use on mobile platforms
- It is functional in all supported web browsers across desktops
- BRoSS has been designed to work effectively with accessibility hardware and/or software, including screen-reading software
Criterion 7: Use open standards and common platforms
The BRoSS team have advised that in accordance with DTA guidelines, the Project will apply open standards and common platforms where this delivers a secure and acceptable solution.
The team have utilised the Java Enterprise Edition (JEE) standards for this project.
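As a hedged illustration of the JEE standards mentioned above, the sketch below shows what a standards-based JAX-RS resource for retrieving a participant record might look like. The ParticipantResource class, Participant type and /participants path are hypothetical and are not taken from the BRoSS codebase.

```java
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;

// Hypothetical JAX-RS resource, shown only to illustrate the JEE standards approach.
@Path("/participants")
public class ParticipantResource {

    @GET
    @Path("/{id}")
    @Produces(MediaType.APPLICATION_JSON)
    public Response getParticipant(@PathParam("id") long id) {
        // In a real service this would be retrieved from the database;
        // a placeholder object is returned here for illustration.
        Participant participant = new Participant(id, "Example Participant");
        return Response.ok(participant).build();
    }

    // Minimal placeholder type so the sketch is self-contained.
    public static class Participant {
        public long id;
        public String name;

        public Participant(long id, String name) {
            this.id = id;
            this.name = name;
        }
    }
}
```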
AUSkey has been utilised for organisational user authentication via the Health Data Portal.
myGov will be utilised for individual user authentication via the Health Data Portal once it is implemented within the Health environment. This is forecast for the first quarter of 2020.
Criterion 8: Make source code open
The code is currently stored in a Bitbucket repository, which gives teams one place to plan projects, collaborate on code, test and deploy. The repository can be accessed by Health single sign-on users once an account has been set up for them in Bitbucket. The application code can be supplied to other government departments upon request.
Criterion 9: Make it accessible
The team have designed the BRoSS application to meet the WCAG 2.0 level AA standard but have yet to have a WCAG report run across the application.
The assessment panel has recommended that the BRoSS team engage the Department of Health testing team to run a WCAG report across the application to verify that it meets the WCAG 2.0 level AA standard.
The BRoSS team have ensured that the UI responds to all desktop screen sizes and is also mobile responsive.
The design layout has also been adapted to support keyboard navigation techniques.
The UI incorporates usability and accessibility standards set by the Digital Service Standard by utilising the Australian Government Design System.
For each of the three releases of BRoSS (Alpha, Beta and Go-live), User Acceptance Testing (UAT) has been undertaken, with representatives of the user groups involved in reviewing and testing BRoSS process flows from a user's perspective.
A key element of UAT is for users to review system content for accuracy, simple English expression and consistency of user experience.
Feedback logs were used to consolidate feedback from users. The feedback was then triaged, with defects and enhancements identified for inclusion in subsequent sprints.
Criterion 10: Test the service
The team have been testing the service throughout the lifecycle of each user story, starting with acceptance criteria, which are used to derive test cases.
Developer unit testing and peer review has been incorporated in the Development environment. Functional testing occurred in a designated Test environment. UAT has also been conducted in the Acceptance environment.
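To illustrate how an acceptance criterion can be turned into derived test cases, the following is a minimal JUnit 5 sketch based on a hypothetical criterion ("a participant can update their contact details"). The ParticipantService class and its methods are invented for illustration and do not reflect the actual BRoSS test suite.

```java
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertThrows;

// Hypothetical test sketch: test cases derived from the acceptance criterion
// "a participant can update their contact details". Names are illustrative only.
class ParticipantContactDetailsTest {

    // Minimal in-memory stand-in for the service under test.
    static class ParticipantService {
        private String email;

        void updateEmail(String newEmail) {
            if (newEmail == null || !newEmail.contains("@")) {
                throw new IllegalArgumentException("Invalid email address");
            }
            this.email = newEmail;
        }

        String getEmail() {
            return email;
        }
    }

    @Test
    void participantCanUpdateContactEmail() {
        ParticipantService service = new ParticipantService();
        service.updateEmail("student@example.edu.au");
        assertEquals("student@example.edu.au", service.getEmail());
    }

    @Test
    void invalidEmailIsRejected() {
        ParticipantService service = new ParticipantService();
        assertThrows(IllegalArgumentException.class, () -> service.updateEmail("not-an-email"));
    }
}
```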
User Acceptance Testing for the Private Beta release has been conducted with members of the BAU Support team and representatives of 10 Universities. Feedback from this round of UAT has been documented and will be assessed by the business.
Further rounds of UAT are scheduled for the future releases. The same approach will be used for all releases.
- There will be targeted involvement of user groups for the different releases depending on the functionality being implemented
- User test exercises will be created to support formal user acceptance
- Testing results and general feedback will be captured and categorised. These will be discussed with the project team and prioritised within the backlog for resolution
Criterion 11: Measure performance
User satisfaction — to help continually improve the user experience of our service
User satisfaction will be measured by periodic qualitative surveys facilitated by the 'Citizen Space' platform. Improvement will be measured by comparing responses against the baseline survey completed prior to BRoSS development.
Digital take-up — to show how many people are using the service and to help encourage users to choose the digital service
The BRoSS team will measure the number of bonded applicants and participants who log in to the application to edit or confirm their contact details, as a percentage of the total number of bonded applicants and participants.
Completion rate — to show which parts of the service we need to fix
The primary measure is the number of existing bonded participants who log in to the BRoSS system and accept the terms and conditions of the new arrangements.
A secondary method is to measure the participant completion rate, specifically around breaches in meeting Return of Service Obligations (RoSO). This will be baselined in July 2019, measured again at the mid-way point (1 July 2020) and again at evaluation on 1 February 2021.
Cost per transaction — to make our service more cost efficient
This will be measured as cost per session, calculated by dividing the monthly operating cost by the total number of sessions in that month.
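As a worked illustration of the take-up and cost measures described above, the sketch below computes a take-up percentage and a cost per session from invented figures. The class name and all numbers are hypothetical and are not program data.

```java
// Hypothetical worked example of the performance measures described above.
public class PerformanceMeasuresExample {

    // Digital take-up: participants who logged in, as a percentage of all participants.
    static double takeUpPercentage(int loggedInParticipants, int totalParticipants) {
        return 100.0 * loggedInParticipants / totalParticipants;
    }

    // Cost per transaction: monthly operating cost divided by sessions in that month.
    static double costPerSession(double monthlyOperatingCost, int sessionsPerMonth) {
        return monthlyOperatingCost / sessionsPerMonth;
    }

    public static void main(String[] args) {
        // Invented figures, used only to show the arithmetic.
        System.out.printf("Take-up: %.1f%%%n", takeUpPercentage(1800, 4500));            // 40.0%
        System.out.printf("Cost per session: $%.2f%n", costPerSession(20000.0, 5000));   // $4.00
    }
}
```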
Reporting
Reporting is internal to the Health Workforce Division Program Management Board at this time.
Criterion 12: Don't forget the non-digital experience
The current state requires participants to contact the Call and Information Centre or the BAU support team by phone or email. This process will still be available.
The BAU support team will have full access to update details in BRoSS for participants, at their request.
Criterion 13: Encourage everyone to use the digital service
No digital equivalent has existed before, so there is no baseline to reference. The Alpha and Private Beta functionality has been popular with stakeholders.
Digital take-up is almost guaranteed, since only a digital service can provide the functions identified as required for this complex program.
The panel congratulates the team on their performance and provides the following recommendations as feedback:
- Ensure that a WCAG 2.0 accessibility report has been organised. Try to remedy any issues that may arise.
Assessment against the Digital Service Standard
| Criterion | Result |
| --- | --- |
| 1 | Pass |
| 2 | Pass |
| 3 | Pass |
| 4 | Pass |
| 5 | Pass |
| 6 | Pass |
| 7 | Pass |
| 8 | Pass |
| 9 | Pass |
| 10 | Pass |
| 11 | Pass |
| 12 | Pass |
| 13 | Pass |