The 3.6 Regression and Release Candidate Test Plan

The Release Candidate Test Plan

Once all 3.6 tickets have been completed, testing can begin, following the test plan detailed below.

Before we start the Release Candidate testing:

  • If there are any blocker or critical bugs open when Sprint 123 starts, they must be fixed and tested before the RC testing can begin;
  • If there are any open tickets related to the features in the 3.6 release, they must be completed and marked as Done before any RC testing begins;
  • Regular regression testing must have been completed in the previous sprints.

Starting the Release Candidate testing:

Each phase below lists its components/features, assigned test cycle, dates, and QA lead responsibilities.

Phase 1

  Components/Features: New features for the 3.6 release
    • Reporting: https://docs.google.com/document/d/1pVlgGXV9nb-nYlrTCRHS7Sor8ZrcnL59ch_Lnn2whZ0/edit?ts=5c864704
  Test cycle: 3.6 RC1 Phase 1
  Dates: 1 full day each team (as needed)
  QA lead responsibilities:

  Team Parrot:

    • Joanna will provide the status of testing by the end of the day;
    • Joanna will triage and prioritize any bugs found in Phase 1;
    • If there are blocker or critical bugs, they will be assigned to a 3.6Phase1Bug test cycle;
    • Bugs are labeled 3.6Phase1bug.

  Team Mind the Gap: Sam Im

    • Sam will provide the status of testing by the end of the day;
    • Sam will triage and prioritize any bugs found in Phase 1.

Bug triage for Phase 1

  Test cycle: Phase 1 Bug test cycle
  Dates: 1 full day (as needed)

Phase 2

  Components/Features: Regression testing, exploratory testing, translations, performance testing
  Test cycle: 3.6 RC1 Phase 2
  Dates: 1 full day each team (as needed)
  QA lead responsibilities:

    • Anyone not participating in the bug fixes will complete edge case, exploratory or translation testing;
    • For all bug fixes, we will require pre-push reviews via pull requests, or additional reviewers for any change. If reviews are still pending at the end of the day, please mention the reviewer on Slack;
    • Bugs are labeled 3.6Phase2bug.

Bug triage for Phase 2

  Test cycle: Phase 2 Bug test cycle
  Dates: 1 full day (as needed)

Suggested schedule, assuming we start testing on April 11:

Week One (Mon Apr 08 - Fri Apr 12):

  • Send Malawi a notification that we expect to start testing on Thursday, April 11, if we decide at the Showcase meeting to start the 3.6 release testing;
  • End of Sprint 122;
  • Sprint 122 Showcase:
    • Go/No-go decision to start 3.6 release testing;
    • Overview of the 3.6 test plan/schedule;
  • Thursday, April 11: deploy the 3.6 RC to Malawi and all other environments, and start testing:
    • Testing focuses on the new features in Phase 1 of testing, then regression testing in Phase 2;
  • Daily bug triage at 6 AM PST once testing begins (Thursday and Friday).

Week Two (Mon Apr 15 - Fri Apr 19):

  • Daily bug triage at 6 AM PST every day.

Week Three (Mon Apr 22 - Fri Apr 26):

  • Daily bug triage at 6 AM PST (Monday through Wednesday);
  • Thursday, April 25: the 3.6 Release (hopefully).

Week Four (Mon Apr 29 - Fri May 03):

  • No activities currently scheduled.

Roles & Responsibilities

QA Team Leads

Sam Im and Joanna Bebak will be the QA leads for each team. Joanna Szymańska will be assisting.

Responsibilities:

  • Create test cycles;
  • Create missing test cases and assign them to test cycles;
  • Act as the point person for any questions about test cases (from both the QA and the dev team);
  • Review execution of the test cycles (there should be positive progress during the day) and communicate the status at the end of the day;
  • Triage bugs before the scheduled bug triage;
  • Prioritize bugs per test cycle and check that developers have detailed proposed solutions (if time or the developer's experience allows). When a bug is created during the day, it is triaged before the end of the day and detailed in the daily QA Slack communication;
  • Report the status of each test cycle, including defects reported, before the end of the day;
  • After a bug fix test cycle, review automated testing and provide its status before the end of the day (go to http://build.openlmis.org/view/all/builds).

Questions/Tasks before the release begins:

  • It would be nice to have a review of the new features with the QA team before the release (preferably at the QA meeting);
  • For 3.5, it worked well to include a dev from each team at the bug triage calls.
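The automated-build review above can be partly scripted. The sketch below summarizes a Jenkins-style `api/json` payload into a one-line status suitable for the daily QA Slack post; the sample response and job names here are hypothetical, and the actual fields exposed by build.openlmis.org may differ:

```python
import json

# Hypothetical sample of a Jenkins "api/json" response; in practice this
# would be fetched from the build server, and the real jobs/fields may differ.
SAMPLE = json.dumps({
    "jobs": [
        {"name": "OpenLMIS-requisition", "lastBuild": {"result": "SUCCESS"}},
        {"name": "OpenLMIS-referencedata", "lastBuild": {"result": "FAILURE"}},
        {"name": "OpenLMIS-stockmanagement", "lastBuild": {"result": "SUCCESS"}},
    ]
})

def summarize(payload: str) -> str:
    """Return a one-line build status summary for the daily QA communication."""
    jobs = json.loads(payload)["jobs"]
    failing = [j["name"] for j in jobs if j["lastBuild"]["result"] != "SUCCESS"]
    if failing:
        return "FAILING: " + ", ".join(sorted(failing))
    return "All %d builds green" % len(jobs)

print(summarize(SAMPLE))
```

This only reads the last build result per job; a real check would also want to link the failing builds in the Slack message.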

Team leads:

  • Attend and make a Go/No-Go decision before we start the release process:
    • Agree that all features are done and ready to be tested;
    • Confirm that all teams are ready to start testing;
    • Review roles & responsibilities.
  • Coordinate manual testing with team members;
  • Coordinate manual performance testing with team members;
  • Assign bugs to developers after they have been prioritized at the bug triage;
  • Refresh test environments as needed (see the instructions here: /wiki/spaces/OP/pages/112106340).

Notes:

  • Team Parrot focuses on manual performance testing (not all team leads);
  • Team Parrot typically refreshes all test environments during the release testing.
Team responsibilities
  • Execute the test cases assigned to you;
  • Record the test execution by following these steps: Testing Process & Test Plans#ExecutingTestCase;
  • Enter bugs as encountered:
    • If there are any blocker bugs, try to spend some time completing a root cause analysis and add the details to the bug ticket for ease at the bug triage;
    • When a bug is found, research and provide proposals in the ticket for review by Sam & Joanna (as time allows);
    • Check the bugs for completeness, proposed solutions, priorities and labels;
    • All bugs are labeled with their release candidate: RC1, RC2, etc.
  • Assist other testers as needed;
  • Josh Zamor will set up the environment(s) with the Release Candidate before testing begins;
  • Joanna Bebak will create a bug test cycle when bugs are found and assign the test cases to it. For each phase, there will be a separate bug fix test cycle.

The Bug Triage team

Responsibilities:

  • Review the list of bugs provided by Sam & Joanna on the QA Slack channel (Joanna Bebak to link the bug query here);
    • Only bugs that have been entered and labeled Phase1bug or Phase2bug will be reviewed during the triage;
  • Prioritize bugs;
  • Communicate which bugs need to be fixed and who on each team is assigned to fix them;
  • Sam & Joanna to create test cycles for retesting if needed;
  • Sam & Joanna to provide status updates on the bug fixes on the QA Slack channel as needed;
  • Sam & Joanna to communicate on Slack the bugs that have been added to the board for the day.

Questions:

  • When should we meet every day? Once in the morning (6 AM PST), with a communication on the bug triage status at the end of the day;
  • Guidance on bug prioritization is located here: http://docs.openlmis.org/en/latest/contribute/contributionGuide.html#reporting-bugs;
  • If there are bugs, then testing for this phase will be done in UAT only;
  • If critical/blocker bugs are found, we must retest (see the workflow diagram above for testing after the bug fix phase). If a bug concerns the Reference Data service, all test cases related to the phase during which the bug was found have to be executed again. If the bug concerns any other service, only the test cases from that phase which concern the affected service have to be executed again;
  • If bugs are prioritized as major or lower, they will be triaged in the Phase 2 bug triage.

Communication on the Test Plan and the daily testing status:

  • What needs to be communicated daily?
    • The test cycle execution status (including the test cycle's name and % of completion);
    • # of test cases that were executed, passed and failed;
    • Joanna Bebak will post any test cases or items that need attention and review by the end of her day on the QA Slack channel;
    • Sam Im will post any test cases or items that need attention and review by the end of her day on the QA Slack channel;
    • Sam Im will communicate with the Malawi team (via the Malawi Slack channel) and notify them about the release candidate testing:
      • The communication of the scheduled start date before the release candidate testing begins;
      • The communication of when we start the release candidate testing (Malawi has one week to test);
      • The Malawi team is included in the daily bug triage meetings.
  • The best time of day (for each team to communicate the morning status and the end of day status & share blockers): 
    • The beginning of the day: Post what we are doing today;
    • The end of day: Post the status of what we have done, and anything pending.

Environments and the demo data for testing

Environments: uat.openlmis.org, uat3.openlmis.org and uat4.openlmis.org.

Refer to the demo data readme for more details about user permissions: https://github.com/OpenLMIS/openlmis-referencedata/blob/master/src/main/resources/db/demo-data/README.md.

ONLY test with the users "admin" or "administrator" when executing test cases related to administrative activities.

Test Data

Requisition

  Usernames:
    • srmanager1, smanager1, psupervisor, wclerk1;
    • srmanager2, smanager2, psupervisor, wclerk1;
    • srmanager4 (for the second approval), smanager4, dsrmanager, psupervisor;
    • chaz (for the second approval);
    • administrator (testing requisition template updates/changes or program settings changes).
  Programs: Family Planning, Essential Meds, ARV, Essential Meds and Family Planning, EPI
  Concerns:
    • Demo data restriction: the environment may need to be refreshed if all current requisition periods are processed (request and post the status on the QA Slack channel).

Stock Management

  Usernames and programs:
    • srmanager2: Family Planning, Essential Meds;
    • divo1, rivo: EPI.

Fulfillment

  Usernames and programs:
    • vsrmanager1, vsrmanager2, divo1, divo2, rivo, vwclerk1: EPI;
    • wclerk1: ARV.

Administration (Reference Data)

  Usernames: admin, administrator
  Programs: All programs

CCE

  Usernames: divo1, divo2, vsrmanager1 (supervises Cuamba), vsrmanager2 (one facility)
  Programs: EPI

Reporting

  Usernames: reporter1, reporter2, reporter3
  Programs: All programs

Exploratory Tests

  • Translations;
  • Edge case scenarios.

More details concerning this kind of testing can be found in the section on exploratory testing in the Testing Guide: https://openlmis.readthedocs.io/en/latest/conventions/testing.html#exploratory-testing.

Performance Tests

The performance testing scenarios are located here: Performance Metrics.

Enter the performance metrics here: https://docs.google.com/spreadsheets/d/1z1D4EUHsE-R_bUTt4HYcWiVDEy_UX50lZhdiyiC4bYg/edit#gid=0.


Deploying the release: Release Checklist 3.4.
