3.5 Regression and Release Candidate Test Plan


Release Candidate Test Plan

When all 3.5 tickets have been completed, testing can begin, following the test plan detailed below.

Before we start Release Candidate testing:

  • If there are Blocker or Critical bugs open when we start Sprint 113, they must be fixed and verified before testing can begin.
  • If there are any open tickets related to the features in the 3.5 release, these tickets must be completed and marked as Done before any testing begins.
  • Regression testing must have been completed in previous sprints, before release candidate testing starts.

Starting Release Candidate testing:

Test Phase: Phase 1

Components/Features:
  • New features for the 3.5 release
  • Reporting: https://docs.google.com/document/d/1pVlgGXV9nb-nYlrTCRHS7Sor8ZrcnL59ch_Lnn2whZ0/edit
  • New feature test cases will have the 3.5 fix version

Test Cycle: 3.5 RC1 Test cycle

Dates: 1 full day each team

QA lead responsibilities:

Team Fast Parrot:
  • Joanna will provide the status of testing by end of day.
  • Joanna will triage and prioritize any bugs found in Phase 1.
  • If there are Blocker or Critical bugs, they will be assigned to a 3.5Phase1Bug test cycle.
  • Bugs are labeled 3.5Phase1bug.

Team Mind the Gap: Sam Im
  • Sam will provide the status of testing by end of day.
  • Sam will triage and prioritize any bugs found in Phase 1.

Team JSI:

Team Ona Gap:

Test Phase: Bug triage for Phase 1

Test Cycle: Phase 1 Bug test cycle

Dates: 1 full day

This bug test cycle is run in parallel (on the same day) with Phase 2 below.

Test Phase: Phase 2

Components/Features:
  • Regression
  • Exploratory Testing
  • Translations
  • Performance Testing

Test Cycle: 3.5 RC2 test cycle

Dates: 1 full day each team

QA lead responsibilities:
  • Anyone not participating in bug fixes will complete edge case, exploratory, or translation testing.
  • For all bug fixes, we will require pre-push reviews via pull requests, or have more reviewers for any change. If reviews are still pending at the end of the day, please mention the reviewer in Slack.
  • Mateusz Kwiatkowski will lead performance testing.
  • Bugs are labeled 3.5Phase2bug.

Test Phase: Bug triage for Phase 2

Test Cycle: Phase 2 Bug test cycle (Phase 2 bug fix cycle)

Dates: 1 full day (as needed)

After these test phases are complete, and all tests have passed with no Blocker or Critical bugs, we can deploy the release candidate and begin the next phases of testing that are defined below.


Suggested Schedule, assuming we start testing on Nov 22 (the actual testing start date was Nov 30):

Week One (Mon Nov 19 – Fri Nov 23):

  • Sent Malawi notification that we expect to start testing on Thursday
  • End of Sprint 112
  • Overview of the 3.5 test plan/schedule in the Product Committee meeting
  • Sprint 112 Showcase
  • Go/No-go decision to start 3.5 release testing
  • Teams worked on finishing 3.5 features until Nov 30



Week Two (Mon Nov 26 – Fri Nov 30):

  • Teams worked on finishing 3.5 features until Nov 30
  • Start testing on Nov 30; testing focuses on the new features in Phase 1, then regression testing in Phase 2


Week Three (Mon Dec 3 – Fri Dec 7):

  • Daily bug triage at 6am PST each day
  • Deployed 3.5 RC to Malawi to begin testing

Week Four (Mon Dec 10 – Fri Dec 14):

  • Daily bug triage at 6am PST each day
  • No bugs found
  • All testing completed
  • Release deployed




Roles & Responsibilities

QA Team Leads

Sam Im and Joanna Bebak will be the QA leads for each team.

Responsibilities:
  • Create test cycles
  • Create missing test cases and assign them to test cycles
  • Assign test cases to team members
  • Act as the point person for any questions about test cases from the team
  • Review execution of the test cycles; there should be positive progress during the day
  • Prioritize bugs per test cycle and check that developers have detailed proposed solutions (if time or the developer's experience allows); when a bug is created during the day, it is triaged before end of day and detailed in the daily QA Slack communication
  • Report the status of each test cycle, including defects reported, before end of day
  • After a bug fix test cycle, review automated testing and provide status before end of day (go to http://build.openlmis.org/view/all/builds)

Team leads: Mateusz Kwiatkowski, Muhammad Ahmed, Sam Im, Craig Appl

Responsibilities:
  • Coordinate manual testing with team members
  • Coordinate manual performance testing with team members
  • Attend bug triage and assign bugs to team members
  • Review bugs created by the team and check for completeness, proposed solutions, priorities, and labels
  • Refresh testing environments as needed; see instructions here: /wiki/spaces/OP/pages/112106340

Team responsibilities:
  • Execute test cases assigned to you by your team lead
  • Record the test execution by following these steps: Testing Process & Test Plans#ExecutingTestCase
  • Enter defects as encountered:
    • If there are any Blocker bugs, try to spend time completing a root cause analysis and add details to the bug ticket for ease in bug triage
    • When a defect is found, research and provide proposals in the ticket for review by Sam and Joanna (as time allows)
    • All bugs are labeled with their release candidate: RC1, RC2, etc.
  • Assist other testers as needed
  • Josh Zamor to set up and update environment(s) with the release candidate before testing begins
  • Team leads to provide updates on automated performance testing results
  • Team leads to coordinate manual UI performance testing
  • Joanna Bebak will create a bug test cycle when bugs are found and assign the test cases to it; for each phase there will be a separate bug fix test cycle

Bug Triage team

Responsibilities:
  • Review the list of bugs provided by Sam and Joanna in the QA Slack channel (Sam Im to link the bug query here)
    • Only bugs that have been entered and labeled Phase1bug or Phase2bug will be reviewed during triage
  • Prioritize bugs
  • Communicate which bugs need to be fixed and who is assigned to fix them for each team
  • Sam and Joanna create test cycles for retesting if needed
  • Sam and Joanna provide status updates on bug fixes in the QA Slack channel as needed
  • Sam and Joanna communicate in Slack the bugs that have been added to the board for the day

Questions:
  • When should we meet every day? Once in the morning (7am PST), with a communication on bug triage status at the end of the day
  • Guidance on bug prioritization is located here: http://docs.openlmis.org/en/latest/contribute/contributionGuide.html#reporting-bugs
  • If there are bugs, then testing for this phase will be done in UAT only
  • If bugs are found, then we must retest (see the workflow diagram above for testing after the bug fix phase)
  • Bugs prioritized as Major or lower will be triaged in the Phase 2 bug triage
  • Communicate the test plan before we start testing for the release
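The daily triage list described above can be expressed as a saved Jira filter. This is a sketch only: the label and priority values come from this plan, while the project key OLMIS is inferred from the ticket numbers on this page and should be verified before use.

```
project = OLMIS AND labels in (3.5Phase1bug, 3.5Phase2bug)
  AND priority in (Blocker, Critical)
ORDER BY priority DESC, created ASC
```

Linking a filter like this in the QA Slack channel gives every triage participant the same live view of the bugs under review.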

Communication on the test plan and the daily testing status:

  • What needs to be communicated daily?
    • Test cycle execution status (including test cycle name and % completion)
    • Number of test cases that were executed, passed, and failed
    • Joanna Bebak will post any test cases or items that need attention and review by the end of her day in the QA Slack channel
    • Sam Im will post any test cases or items that need attention and review by the end of her day in the QA Slack channel
    • Sam Im will communicate with the Malawi team (via the Malawi Slack channel) and notify them about release candidate testing:
      • Communication of the scheduled start date before release candidate testing begins
      • Communication of when we start release candidate testing (Malawi has one week to test)
      • The Malawi team is included in the daily bug triage meetings
  • Best time of day (for each team to communicate morning status, end of day status, and blockers):
    • Beginning of day: post what we are doing today
    • End of day: post the status of what we have done and anything pending
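The daily status post above boils down to a cycle name plus executed/passed/failed counts and a completion percentage. A minimal, hypothetical helper for formatting that one-line summary (the function name and message format are illustrative, not part of any OpenLMIS tooling):

```python
# Hypothetical helper for the daily QA status post: formats the test cycle
# execution summary (name, % completion, pass/fail counts) for Slack.

def cycle_status(name: str, total: int, passed: int, failed: int) -> str:
    executed = passed + failed                       # cases with a recorded result
    pct = round(100 * executed / total) if total else 0
    return (f"{name}: {executed}/{total} executed ({pct}%), "
            f"{passed} passed, {failed} failed")

print(cycle_status("3.5 RC1 Test cycle", 40, 30, 4))
# → 3.5 RC1 Test cycle: 34/40 executed (85%), 30 passed, 4 failed
```

Keeping the format fixed makes the end-of-day posts easy to scan across teams and days.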

Environments and Demo Data for testing

Environments: test.openlmis.org, uat.openlmis.org, and two additional environments once ticket OLMIS-5603 is completed.

Refer to the demo data README for more details about user permissions: https://github.com/OpenLMIS/openlmis-referencedata/blob/master/src/main/resources/db/demo-data/README.md

ONLY test with the users "Admin" or "Administrator" when executing test cases related to Administration activities.
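Before starting a test cycle, it can save time to confirm that a demo user can actually authenticate against the target environment. The command below is a sketch: it assumes the standard OpenLMIS OAuth2 token endpoint and the default demo client credentials (`user-client:changeme`), both of which should be verified against your environment's configuration.

```
# Sanity-check a demo login against UAT (endpoint and client credentials
# are assumptions from the standard OpenLMIS demo setup -- verify first).
curl -s -X POST \
  "https://uat.openlmis.org/api/oauth/token?grant_type=password&username=administrator&password=password" \
  -u user-client:changeme
# A JSON response containing "access_token" means the user can log in.
```

If the call fails for all demo users, the environment likely needs a refresh (see the team lead responsibilities above) before any test cases are executed.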

Testing Data

Component: Requisitions
Usernames and programs:
  • srmanager1, smanager1, psupervisor, wclerk1 (Family Planning)
  • srmanager2, smanager2, psupervisor, wclerk1 (Essential Meds)
  • srmanager4 (for second approval), smanager4, dsrmanager, psupervisor (Essential Meds and Family Planning)
  • administrator (testing requisition template updates/changes or program settings changes)
Concerns:
  • Demo data restriction: may need to refresh the environment if all current period requisitions are processed (request and post status in the QA Slack channel)

Component: Stock Management
Usernames: srmanager2, divo1

Component: Fulfillment
Usernames: vsrmanager1, vsrmanager2, divo1, divo2, rivo, vwclerk1

Component: Administration (Reference Data)
Usernames: admin, administrator
Programs: All programs

Component: CCE
Usernames: divo1, divo2, vsrmanager1 (supervises Cuamba), vsrmanager2 (one facility)

Component: Reporting
Username: admin
Password: password



Exploratory Tests

  • Translations
  • Edge case scenarios

More details concerning this kind of testing can be found in the section on exploratory testing in the Testing Guide: https://openlmis.readthedocs.io/en/latest/conventions/testing.html#exploratory-testing.

Performance Tests

Performance testing scenarios are located here: Performance Metrics

Enter Performance Metrics here: https://docs.google.com/spreadsheets/d/1z1D4EUHsE-R_bUTt4HYcWiVDEy_UX50lZhdiyiC4bYg/edit#gid=0


Deploying the release: Release Checklist 3.5

