3.4 Regression and Release Candidate Test Plan

Sam Im and Joanna Bebak to review the 3.3 plan and create a 3.4 test plan here.

For the 3.4 release, the performance testing plan and status are tracked on this page.

Release Candidate Test Plan

During Sprint 104 we will follow the test plan detailed below: 

The plan is organized by test phase, with the test cycles assigned in each phase and the QA lead responsibilities for each.

Phase 1

Components under test:

  • Requisitions (Team JSI)
  • Fulfillment (Team Parrot)
  • Stock Management (Team Parrot)
  • CCE (Team Mind the Gap)
  • Administration/Reference-data UI (Team Mind the Gap)
  • Reporting (Team Ona)

Test cycles (1 full day each team):

  • Team Mind the Gap: 3.4 Administration (Reference Data) RC1, 3.4 CCE RC1
  • Team JSI: Testing Requisitions
  • Team Ona: Testing Reporting (Testing the OpenLMIS Reporting Stack)

QA lead responsibilities:

  • Team Parrot (Joanna):
    • Joanna will provide status of testing by end of day.
    • Joanna will triage and prioritize any bugs found in Phase 1.
    • If there are Blocker or Critical bugs, they will be assigned to a 3.4 Phase 1 Bug test cycle.
    • Bugs are labeled 3.4Phase1bug.
  • Team Mind the Gap (Sam):
    • Sam will provide status of testing by end of day.
    • Sam will triage and prioritize any bugs found in Phase 1.

Bug triage for Phase 1

Phase 1 Bug test cycle: 1 full day. This bug test cycle runs in parallel (on the same day) with Phase 2 below.

Phase 2

Scope:

  • Bug fixes & edge cases
  • Exploratory testing
  • Performance testing for regression and new features

Test cycles (1 full day each team):

  • 3.4 RC2
  • Team Mind the Gap: 3.4 ReferenceData RC2, 3.4 CCE RC2

QA lead responsibilities:

  • Anyone not participating in bug fixes will complete edge-case, exploratory, or translation testing.
  • For all bug fixes, we will require re-push reviews via pull requests or add more reviewers for any change. If reviews are still pending at the end of the day, please mention the reviewer in Slack.
  • Brandon and one other team member will complete performance testing.
  • Bugs are labeled 3.4Phase2bug.

Bug triage for Phase 2

Phase 2 Bug test cycle: 1 full day (as needed).

After these test phases are complete, and all tests have passed with no Blocker or Critical bugs, we can deploy the release candidate and begin the next phases of testing that are defined below.

Roles & Responsibilities

QA Team Leads

Sam Im and Joanna Bebak will be the QA leads for each team. Their responsibilities:

  • Create Test Cycles 
  • Create missing test cases and assign to test cycles
  • Assign Test Cases to team members
  • Point person for any questions about test cases from the team
  • Review execution of the test cycles, there should be positive progress during the day
  • Prioritize bugs per test cycle and check that developers have detailed proposed solutions (if time allows or the developer's experience allows)
  • When a bug is created during the day, triage it before end of day and detail it in the daily QA slack communication
  • Report the status of each test cycle, including defects reported, before end of day 
  • After a bug fix test cycle, review automated testing and provide status before end of day (go to http://build.openlmis.org/view/all/builds)

Team leads:
  • Coordinate manual testing with team members
  • Coordinate manual performance testing with team members
  • Attend bug triage & assign bugs to team members
  • Review bugs created by team and check for completeness, proposed solutions, priorities, and labels

Team responsibilities
  • Execute test cases assigned to you by your team lead
  • Record the test execution by following these steps: Testing Process & Test Plans#ExecutingTestCase
  • Enter Defects as encountered:
    • If there are any Blocker bugs, try to spend time completing a root cause analysis and add details in the bug ticket for ease in bug triage
    • When a defect is found, research and provide proposals in the ticket for review by Sam & Joanna (as time allows)
    • All bugs are labeled with their release candidate: RC1, RC2, etc.
  • Assist other testers as needed
  • Chongsun Ahn or Josh Zamor to set up/update environment(s) with the Release Candidate before testing begins
  • Team leads to provide updates on automated performance testing results
  • Team leads to coordinate manual UI Performance Testing

Bug Triage team
  • Review list of bugs provided by Sam & Joanna in QA slack channel
  • Prioritize bugs
  • Provide priority to Sam & Joanna to update in QA slack channel (via filter or dashboard?)
  • Sam & Joanna create test cycles for retesting if needed
  • Sam & Joanna provide status update on bug fixes in QA slack channel as needed
  • Sam & Joanna communicate in slack the bugs that have been added to the board for the day.

Communication on the Test Plan and the daily testing status:

  • What needs to be communicated daily?
    • Test Cycle execution status (including test cycle name and % of completion)
    • # of test cases that were executed, passed, and failed
    • Joanna Bebak will post any test cases or items that need attention and review by Team ILL at the end of her day in the QA slack channel
    • Sam Im will post any test cases or items that need attention and review by Team Parrot at the end of her day in the QA slack channel
  • Best time of day (for each team to communicate morning status and end of day status & share blockers)
    • Beginning of day: post what we are doing today
    • End of day: post the status of what we have done and anything pending

Environments and Demo Data for testing

Environments: test.openlmis.org, uat.openlmis.org, and two additional environments once ticket OLMIS-5066 is completed.

Refer to demo data readme for more details about user permissions: https://github.com/OpenLMIS/openlmis-referencedata/blob/master/demo-data/README.md

ONLY test with the users "Admin" or "Administrator" when executing test cases related to Administration activities.

Testing Data

Requisitions:

  • Team Parrot: srmanager1, smanager1, psupervisor, wclerk1 (program: Family Planning)
  • Team Mind the Gap: srmanager2, smanager2, psupervisor, wclerk1 (program: Essential Meds)
  • Team JSI: srmanager4 (for second approval), smanager4, dsrmanager, psupervisor (programs: Essential Meds and Family Planning)
  • administrator (testing requisition template updates/changes or program settings changes)
  • Demo data restriction: may need to refresh the environment if all current period requisitions are processed (request and post status in QA slack channel)

Stock Management:

  • divo1, divo2

Administration (Reference Data):

  • All programs

CCE:

  • divo1, divo2
  • vsrmanager1 (supervises Cuamba)
  • vsrmanager2 (one facility)

Release Candidate Tests

Exploratory Tests

Performance Tests

Performance testing scenarios are located here: Performance Metrics

Deploying the release: Release Checklist 3.4

Release Candidate Testing Concerns:

  • Release testing should include all teams testing during the same sprint period
  • Concern: Multiple teams have been working on one component (requisitions), so there isn't a clear way to break out the release of features by team. Teams will therefore release components. Who releases what?
    • The team that has made the most changes may take ownership of releasing the component.
    • This includes manual performance testing
      • Team Ona owns reporting
      • Team Mind the Gap owns Administration (reference data) 
      • Team Parrot 
      • Team JSI owns Requisitions 
  • Concern: Managing multiple teams, teams testing in parallel
    • End of day reports of testing status
    • Automate testing status reports (when possible)
    • More servers for testing (might need 2 new servers)
  • Concern: Who fixes bugs found during regression & release testing
    • During Bug triage, identify and assign bugs to teams
    • Some team members may be fixing bugs while others don't have any release candidate activities to do (no additional testing needed)
    • Representatives from each team should be included in the bug triage to estimate and discuss capacity to fix
  • Concern: Teams don't have work to do during testing (blocked by waiting for bug fixes)
    • Plan work for each team in case they are done early
      • Updating documentation, prepping release documentation
      • Potentially start on new work as time permits
      • Exploratory testing
  • Concern: setting up perftest for manual performance testing (matching baseline data)
