The 3.18 Regression and Release Candidate Test Plan

The Release Candidate Test Plan

Once all 3.18 tickets have been completed, testing can begin, following the test plan detailed below.

Before we start the Release Candidate testing:

  • If there are any blocker or critical bugs related to the features in the 3.18 release, they must be fixed and tested before the RC testing can begin;

  • If there are any open tickets related to the features in the 3.18 release, they must be completed and marked as Done before any RC testing begins;

  • Regular regression testing must have been completed in previous sprints before the release candidate testing starts.

Starting the Release Candidate testing:



Test Phase: RC 1

  • Components/Features:

    • Execution of the 3.18 RC1 test cases on Firefox (regression testing covering all components/features);

    • Performance testing on Chrome;

    • Exploratory and translation testing on Chrome.

  • Assigned in Test Cycle: 3.18 RC1

  • Dates: 5 full days (as needed)

  • QA lead responsibilities (Team Mind the Parrot):

    • @Szymon Rujner will provide the status of testing by the end of the day;

    • @Szymon Rujner will triage and prioritize any bugs found during RC testing;

    • If there are blocker or critical bugs, they will be assigned to a 3.18RC1Bug test cycle;

    • Bugs are labeled 3.18RC1bug.

Test Phase: Bug triage

  • Components/Features: Bug triage for RC1

  • Assigned in Test Cycle: Bug fixes test cycle for a given Release Candidate, e.g. 3.18 RC1 Bug fixes

  • Dates: 2 full days (as needed)

  • QA lead responsibilities:

    • @Szymon Rujner to lead the Bug Triage meetings.

Suggested schedule, assuming we start testing on October 23:

Week One:

  • Tue October 22: Deploy 3.18 RC

  • Wed October 23 to Fri October 25: Testing

Week Two:

  • Mon October 28 and Tue October 29: Testing

  • Wed October 30: Release 3.18 RC


Roles & Responsibilities

QA Team Leads

@Szymon Rujner will be the QA lead for the Mind the Parrot team.

Owner: @Szymon Rujner

Responsibilities:

  • Create test cycles;

  • Create missing test cases and assign them to test cycles;

  • Act as the point person for any questions about test cases from the team;

  • Review execution of the test cycles (there should be progress during the day) and communicate the status at the end of the day;

  • Triage bugs before the scheduled bug triage;

  • Prioritize bugs per test cycle and check that developers have detailed proposed solutions (if time and the developer's experience allow); bugs created during the day are triaged before the end of the day and detailed in the daily QA Slack communication;

  • Report the status of each test cycle, including defects reported, before the end of the day.


Team leads:

  • Attend the Go/No-Go meeting and make the decision before we start the release process:

    • Agree that all features are done and ready to be tested;

    • Confirm that the team is ready to start testing;

    • Review roles & responsibilities.

  • Coordinate manual testing with team members;

  • Coordinate manual performance testing with team members;

  • Assign bugs to developers after they have been prioritized at the bug triage;

  • Refresh test environments as needed (see the instructions here: /wiki/spaces/OP/pages/112106340);

  • Team Mind the Parrot focuses on manual performance testing;

  • Team Mind the Parrot typically refreshes all test environments during the release testing.

Team responsibilities

  • Execute the test cases assigned to you;

  • Record the test execution by following these steps: Testing Process & Test Plans#ExecutingTestCase;

  • Enter bugs as they are encountered:

    • If there are any blocker bugs, try to spend some time completing a root cause analysis and add the details to the bug ticket to ease the bug triage;

    • When a bug is found, research it and provide proposals in the ticket for review by @Szymon Rujner (as time allows);

    • Check the bugs for completeness, proposed solutions, priorities and labels;

    • All bugs are labeled with their release candidate: RC1, RC2, etc.

  • Assist other testers as needed;

  • @Szymon Rujner will create a bug test cycle when bugs are found and assign the relevant test cases to it. For each RC, there will be a separate bug-fix test cycle.



The Bug Triage team

Members:

  • @Szymon Rujner

  • @Artur Lebiedziński

  • @Maciej Grochalski

Responsibilities:

  • Review the list of bugs provided on the QA Slack channel;

    • Only bugs that have been entered and labeled RC1bug will be reviewed during the triage;

  • Prioritize bugs;

  • Communicate which bugs need to be fixed;

  • @Szymon Rujner to create test cycles for retesting if needed;

  • @Szymon Rujner to provide status updates on the bug fixes on the QA Slack channel as needed;

  • @Szymon Rujner to communicate on Slack the bugs that have been added to the board for the day.

Questions:

  • When should we meet every day? Meetings will be held every two days (6 am PST), with a communication on the bug triage status at the end of each day;

  • Guidance on bug prioritization is located here: http://docs.openlmis.org/en/latest/contribute/contributionGuide.html#reporting-bugs;

  • If there are bugs, then testing for the RC will be done in UAT only;

  • If critical/blocker bugs are found, we must retest. If a bug concerns the Reference Data service, all test cases related to the RC in which it was found have to be executed again; if it concerns any other service, only the test cases for that service have to be executed again;

  • Communication of the test plan before we start testing for the release: @Szymon Rujner is responsible for communicating this test plan on the QA Slack channel.
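As an illustrative sketch only (not part of the OpenLMIS tooling), the retest-scope rule above can be expressed in Python; the function name, the `"referencedata"` service key, and the test-case dictionaries are assumptions for the example:

```python
def retest_scope(bug_service: str, all_rc_cases: list) -> list:
    """Select which test cases to re-execute after a critical/blocker bug.

    Per the plan: a bug in the Reference Data service invalidates the whole
    RC run, so every test case is re-executed; a bug in any other service
    only requires re-running the test cases that concern that service.
    """
    if bug_service == "referencedata":
        return list(all_rc_cases)  # re-execute the full RC test cycle
    return [c for c in all_rc_cases if c["service"] == bug_service]

# Hypothetical RC test cases, tagged with the service they cover
cases = [
    {"id": "TC-1", "service": "requisition"},
    {"id": "TC-2", "service": "referencedata"},
    {"id": "TC-3", "service": "fulfillment"},
]

print([c["id"] for c in retest_scope("fulfillment", cases)])  # ['TC-3']
print(len(retest_scope("referencedata", cases)))              # 3
```

A bug in fulfillment triggers only the fulfillment cases, while a Reference Data bug triggers the whole cycle again.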

Communication on the Test Plan and the daily testing status:

  • What needs to be communicated daily?

    • The test cycle execution status (including the test cycle's name and % of completion);

    • The number of test cases that were executed, passed and failed;

    • @Szymon Rujner will post any test cases or items that need attention and review by the end of the day on the QA Slack channel.

  • The best times of day for each team to communicate status and share blockers:

    • Beginning of the day: post what we are doing today;

    • End of the day: post the status of what we have done, and anything pending.
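The daily numbers above could be assembled into a status line with a small helper; this is a hypothetical Python sketch (the function name and message format are assumptions, not an OpenLMIS convention):

```python
def cycle_status(name: str, executed: int, passed: int, failed: int, total: int) -> str:
    """Format a daily test-cycle status line for the QA Slack channel."""
    pct = round(100 * executed / total) if total else 0  # % of completion
    return (f"{name}: {pct}% complete "
            f"({executed}/{total} executed, {passed} passed, {failed} failed)")

print(cycle_status("3.18 RC1", executed=40, passed=36, failed=4, total=80))
# 3.18 RC1: 50% complete (40/80 executed, 36 passed, 4 failed)
```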

Environments and the demo data for testing

Environments: uat.openlmis.org.

Refer to the demo data readme for more details about user permissions: https://github.com/OpenLMIS/openlmis-referencedata/blob/master/src/main/resources/db/demo-data/README.md .

ONLY test with the users "admin" or "administrator" when executing test cases related to administrative activities.

Test Data

Component: Requisition

  • Usernames:

    • srmanager1, smanager1, psupervisor, wclerk1

    • srmanager2, smanager2, psupervisor, wclerk1

    • srmanager4 (for the second approval), smanager4, dsrmanager, psupervisor

    • chaz (for the second approval)

    • administrator (testing requisition template updates/changes or program settings changes)

  • Programs: Family Planning; Essential Meds; ARV; Essential Meds and Family Planning; ARV; Family Planning; Essential Meds; EPI; ARV

  • Concerns: Demo data restriction: may need to refresh the environment if all current requisition periods are processed (request and post the status on the QA Slack channel).

Component: Stock Management

  • Usernames: srmanager2; divo1, rivo

  • Programs: Family Planning, Essential Meds; EPI

Component: Fulfillment

  • Usernames: vsrmanager1, vsrmanager2, divo1, divo2, rivo, vwclerk1, wclerk1

  • Programs: EPI, ARV

Component: Administration (Reference Data)

  • Usernames: admin, administrator

  • Programs: All programs

Component: CCE

  • Usernames: divo1, divo2, vsrmanager1 (supervises Cuamba), vsrmanager2 (one facility)

  • Programs: EPI

Component: Reporting

  • Usernames: administrator

  • Programs: All programs

Exploratory Tests

  • Translations;

  • Edge case scenarios.

More details concerning this kind of testing can be found in the section on exploratory testing in the Testing Guide: https://openlmis.readthedocs.io/en/latest/conventions/testing.html#exploratory-testing.

Performance Tests

The performance testing scenarios are located here: Performance Metrics.

Enter the performance metrics here: https://docs.google.com/spreadsheets/d/1z1D4EUHsE-R_bUTt4HYcWiVDEy_UX50lZhdiyiC4bYg/edit#gid=0 .



Deploying the release: Release Checklist 3.18.
