Introduction
User Acceptance Testing (UAT) for SELV is intended as an opportunity for HSG to ensure the system (including both the website and reporting) is ready to be used by VillageReach and PAV staff. Prior to UAT, ISG has conducted functional testing of the individual components of SELV, as well as basic end-to-end tests from data entry through reporting. UAT moves beyond basic functional validation to focus on actual user behavior in the field, ensuring that the overall system meets user needs and works as intended.
Roles & Responsibilities
HSG & ISG are jointly responsible for UAT testing, however ultimately HSG is responsible for approving the results of UAT and deciding to roll-out to production. Roles & responsibilities are as follows:
Task | Responsible | Approver | Consulted | Informed
---|---|---|---|---
UAT Plan | Sarah | Timoteo | Wendy, Vidya, Ron, Josh | n/a
UAT Execution | Timoteo, Vidya, Sarah | Timoteo | n/a | Wendy, Ron, Josh, Allen
Defect Triage | Sarah | Wendy | Timoteo, Vidya, Ron, Josh, Allen | n/a
UAT Coordination | Sarah | n/a | n/a | Wendy, Ron, Josh, Vidya, Allen
UAT Approval | Timoteo | Wendy | Vidya | Ron, Josh, Sarah, Allen
RACI Definitions:
Responsible: Those who do the work to achieve the task. There is at least one role with a participation type of Responsible, although others can be delegated to assist in the work required.
Accountable (also Approver or final approving authority): The one ultimately answerable for the correct and thorough completion of the deliverable or task, and the one who delegates the work to those responsible. In other words, the Accountable must sign off (approve) the work that the Responsible provides. There must be only one Accountable specified for each task or deliverable.
Consulted (sometimes Counsel): Those whose opinions are sought, typically subject matter experts, and with whom there is two-way communication.
Informed: Those who are kept up to date on progress, often only on completion of the task or deliverable, and with whom there is just one-way communication.
Location
All UAT will be performed in the VillageReach Mozambique office. A test may also be executed in Seattle, but it should not be considered "passed" until it has been verified locally.
UAT Testing Strategy
Validations fall into six major categories. The testing strategy for each category is listed below.
ID | Category | Test Strategies
---|---|---
1 | SELV Website for Field Coordinators | A. Volume - Argentina to enter Maputo Jan 2014 data. B. "Real-World" Scenarios - Mimic actual field coordinator data entry conditions and scenarios, including online and offline usage. C. UI Coverage - Exercise each option available in the UI to verify functionality and translation. D. Facility Special Cases - Enter distribution data into SELV for special cases such as No Visit / Not in DLS, routine data collection only, or a visit where all fields are NRd.
2 | SELV Website for Administrators | A. System Configuration - Configure the system for production readiness, with the byproduct of testing admin functionality. B. UI Coverage - Exercise each option available in the UI to verify functionality and translations. C. "Real-World" Scenarios - Execute typical administrator functions as defined by Timoteo, including the Data Edit Tool. D. Monthly Reporting Scenario - Work through the business process required to verify data completeness prior to distributing monthly reports.
3 | vrMIS Data Migration | A. Compare vrMIS and SELV Reports - Randomly select a period and confirm vrMIS reports and SELV reports match (allowing for known exceptions); see the comparison sketch after this table. B. Spot Check - Two-hour time box to explore comparing vrMIS and SELV data. C. Data Set Completeness - Check that historical data for all periods and facilities has been migrated as expected (check for newly added facilities, vaccines, etc.).
4 | VillageReach Analytical (English) Report Workbook | A. Facility Special Cases - For cases entered in 1D, confirm that facility-level reports include or exclude the data as appropriate. B. Provincial-Level Data Check - For data entered by Argentina in activity 1A, confirm Analytical Workbook reports match vrMIS for the same period (with known exceptions). C. HSG Review - Check that the information presented on reports meets HSG expectations and needs.
5 | Provincial (Portuguese) Report Workbooks | A. Compare Provincial Reports with Analytical Report - For a selected month, spot check to verify that data in each provincial workbook matches data in the analytical report. B. Translation - Select a particular provincial workbook and verify all translations are correct. C. HSG Review - Check that the information presented on reports and PDFs meets HSG expectations and needs. Confirm HSG understands how to use Tableau.
6 | SELV on Tablet | A. Connectivity in Mozambique - Verify the connectivity solution works in-country. B. UI Coverage - Exercise each option available in the UI to verify functionality and translation. C. "Real-World" Scenarios - Mimic actual field coordinator data entry conditions and scenarios, including online and offline usage. D. Break In - Attempt to access functionality other than OpenLMIS. E. End-to-End Field Test - Field Coordinators to bring the tablet into the field and confirm usage of hardware, software, and connectivity in an actual vaccine distribution setting.
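For the data-comparison activities in categories 3 and 4 (vrMIS vs. SELV reports, and the Analytical Workbook cross-check), a small script can make the spot checks repeatable and keep a record of every discrepancy found. The sketch below is only illustrative: it assumes both systems can export facility-level monthly figures to CSV, and the file paths and column names (`facility`, `period`, `vaccine`, `doses`) are placeholders rather than the actual vrMIS or SELV export formats. The use of pandas is also just one convenient option, not a prescribed tool.

```python
"""Illustrative spot-check: compare a vrMIS export with a SELV export
for one reporting period. File paths and column names are placeholders;
adjust them to the actual export formats before use."""
import pandas as pd

KEY_COLUMNS = ["facility", "period", "vaccine"]   # assumed identifying columns
VALUE_COLUMN = "doses"                            # assumed value being compared

def compare_exports(vrmis_csv: str, selv_csv: str, period: str) -> pd.DataFrame:
    """Return rows where the two systems disagree for the given period."""
    vrmis = pd.read_csv(vrmis_csv)
    selv = pd.read_csv(selv_csv)

    # Restrict both extracts to the period being spot-checked.
    vrmis = vrmis[vrmis["period"] == period]
    selv = selv[selv["period"] == period]

    # Outer merge so records present in only one system also surface.
    merged = vrmis.merge(selv, on=KEY_COLUMNS, how="outer",
                         suffixes=("_vrmis", "_selv"), indicator=True)

    # Flag rows missing from either system or with differing values.
    mismatched = merged[
        (merged["_merge"] != "both")
        | (merged[f"{VALUE_COLUMN}_vrmis"] != merged[f"{VALUE_COLUMN}_selv"])
    ]
    return mismatched

if __name__ == "__main__":
    differences = compare_exports("vrmis_export.csv", "selv_export.csv", "2014-01")
    print(f"{len(differences)} discrepancies found")
    print(differences.to_string(index=False))
```

Known exceptions (for example, facilities deliberately excluded from migration) would still need to be filtered out, either by hand or via an exclusion list, before the remaining discrepancies are logged as defects for triage.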