When all 3.17 tickets have been completed, we can begin testing, and we will follow the test plan detailed below.
Before we start the Release Candidate testing:
If there are any blocker or critical bugs related to the features in the 3.17 release, they must be fixed and tested before the RC testing can begin;
If there are any open tickets related to the features in the 3.17 release, they must be completed and marked as Done before any RC testing begins;
Regular regression testing will already have been completed in previous sprints before the release candidate testing starts.
Starting the Release Candidate testing:
If there are no open blocker or critical bugs, then move to the release candidate's deployment and testing:
RC1 (three full days of testing, and 1 day for bug fixes and bug-fix testing) for each release candidate as needed;
The testing will be done in the following environments:
Team Mind the Parrot's UAT environment: https://uat.openlmis.org;
Performance testing: https://perftest.openlmis.org.
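Before testing starts, it is worth confirming that both environments actually respond. A minimal standard-library sketch (the five-second timeout is an arbitrary choice, not part of any OpenLMIS tooling):

```python
import urllib.request
import urllib.error

ENVIRONMENTS = [
    "https://uat.openlmis.org",       # Team Mind the Parrot's UAT environment
    "https://perftest.openlmis.org",  # performance testing
]

def is_reachable(url, timeout=5):
    """Return True if the URL answers any HTTP response within the timeout."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        return True   # the server answered, even if with an error status
    except (urllib.error.URLError, OSError):
        return False  # DNS failure, refused connection, timeout, ...

if __name__ == "__main__":
    for env in ENVIRONMENTS:
        print(f"{env}: {'up' if is_reachable(env) else 'DOWN'}")
```

A failing environment should be raised with the team before any test cycle is started against it.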
How to enter bugs: Testing Process & Test Plans # Enter Defects during sprint testing;
Prioritization of bugs: http://docs.openlmis.org/en/latest/contribute/contributionGuide.html#reporting-bugs.
| Test Phase | Components/Features | Assigned in Test Cycle | Dates | QA lead responsibilities |
|---|---|---|---|---|
| RC 1 | | 3.17 RC1 | 3 full days (as needed) | Team Mind the Parrot |
| Bug triage | Bug triage for RC1 | Bug fixes test cycle for a given Release Candidate, e.g. 3.17 RC1 Bug fixes | 2 full days (as needed) | |
Suggested schedule, assuming we start testing on April 15:
Week One:
| Mon April 08 | Tue April 09 | Wed April 10 | Thu April 11 | Fri April 12 |
|---|---|---|---|---|
| | | | | |
Week Two:
| Mon April 15 | Tue April 16 | Wed April 17 | Thu April 18 | Fri April 19 |
|---|---|---|---|---|
| | | | | |
Week Three:
Week Four:
Aleksandra Hinc will be the QA lead for the Mind the Parrot team.
| Owner | Responsibilities | Questions/Tasks before release begins |
|---|---|---|
| Team leads | | |
| Team responsibilities | | |
| Members | Responsibilities | Questions |
|---|---|---|
| | | |
What needs to be communicated daily?
The test cycle execution status (including the test cycle's name and % of completion);
# of test cases that were executed, passed and failed;
The QA lead will post any test cases or items that need attention and review by the end of her day on the QA Slack channel;
The QA lead will communicate with the Malawi team (via the Malawi Slack channel) and notify them about the release candidate testing:
Announcing the scheduled start date before the release candidate testing begins;
Announcing when the release candidate testing actually starts (Malawi then has one week to test);
The Malawi team is included in the daily bug triage meetings.
The best time of day (for each team to communicate the morning status and the end of day status & share blockers):
The beginning of the day: Post what we are doing today;
The end of day: Post the status of what we have done, and anything pending.
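The daily status items above (test cycle name, completion percentage, pass/fail counts) can be assembled into one message automatically. A minimal sketch, assuming test case results are available as a simple list of status strings (the status values and message format are illustrative, not taken from any OpenLMIS tool):

```python
from collections import Counter

def daily_status(cycle_name, results):
    """Summarize a test cycle for the daily Slack update.

    `results` is a list of status strings such as "passed", "failed",
    or "unexecuted" (illustrative values, not tied to any tool).
    """
    counts = Counter(results)
    executed = counts["passed"] + counts["failed"]
    total = len(results)
    pct = round(100 * executed / total) if total else 0
    return (f"{cycle_name}: {pct}% complete "
            f"({executed} executed, {counts['passed']} passed, "
            f"{counts['failed']} failed, {total - executed} remaining)")

# Example: 12 passed, 3 failed, 5 not yet executed.
print(daily_status("3.17 RC1", ["passed"] * 12 + ["failed"] * 3 + ["unexecuted"] * 5))
# → 3.17 RC1: 75% complete (15 executed, 12 passed, 3 failed, 5 remaining)
```

In practice the same numbers can be read off the test cycle in the test management tool; the point is that the daily post should always carry all four figures.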
Environments: uat.openlmis.org.
Refer to the demo data readme for more details about user permissions: https://github.com/OpenLMIS/openlmis-referencedata/blob/master/src/main/resources/db/demo-data/README.md.
ONLY test with the users "admin" or "administrator" when executing test cases related to administrative activities.
| Component | Username | Program | Concerns |
|---|---|---|---|
| Requisition | srmanager1, smanager1, psupervisor, wclerk1; srmanager2, smanager2, psupervisor, wclerk1; srmanager4 (for the second approval), smanager4, dsrmanager, psupervisor; chaz (for the second approval); administrator (testing requisition template updates/changes or program settings changes) | Family Planning; Essential Meds; ARV; Essential Meds and Family Planning; ARV; Family Planning; Essential Meds; EPI; ARV | |
| Stock Management | srmanager2; divo1, rivo | Family Planning, Essential Meds; EPI | |
| Fulfillment | vsrmanager1, vsrmanager2, divo1, divo2, rivo, vwclerk1, wclerk1 | EPI, ARV | |
| Administration (Reference Data) | admin, administrator | All programs | |
| CCE | divo1, divo2, vsrmanager1 (supervises Cuamba), vsrmanager2 (one facility) | EPI | |
| Reporting | administrator | All programs | |
Exploratory testing will also cover:
Translations;
Edge case scenarios.
More details concerning this kind of testing can be found in the section on exploratory testing in the Testing Guide: https://openlmis.readthedocs.io/en/latest/conventions/testing.html#exploratory-testing.
The performance testing scenarios are located here: Performance Metrics.
Enter the performance metrics here: https://docs.google.com/spreadsheets/d/1z1D4EUHsE-R_bUTt4HYcWiVDEy_UX50lZhdiyiC4bYg/edit#gid=0.
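Before entering numbers into the spreadsheet, raw response-time samples from a performance run can be reduced to the usual summary figures. A minimal sketch, assuming timings are collected in milliseconds (the metric names and the nearest-rank percentile method are choices made here, not requirements of the test plan):

```python
import statistics

def summarize_timings(samples_ms):
    """Reduce raw response-time samples (milliseconds) to summary metrics."""
    ordered = sorted(samples_ms)
    # Nearest-rank 95th percentile: the sample below which ~95% of values fall.
    p95_index = max(0, int(0.95 * len(ordered)) - 1)
    return {
        "min": ordered[0],
        "max": ordered[-1],
        "mean": round(statistics.mean(ordered), 1),
        "p95": ordered[p95_index],
    }

# Example run with one outlier (400 ms) among otherwise steady responses.
print(summarize_timings([120, 135, 128, 400, 150, 142, 138, 131, 129, 145]))
# → {'min': 120, 'max': 400, 'mean': 161.8, 'p95': 150}
```

Reporting the p95 alongside the mean keeps a single outlier (like the 400 ms sample above) from hiding in the average.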
Deploying the release: Release Checklist 3.4.