3.4 Regression and Release Candidate Test Plan

@Sam Im (Deactivated) and @Joanna Bebak (Deactivated) to review 3.3 plan and create a 3.4 test plan here.

For the 3.4 release, the performance testing plan and status are tracked on this page.

Release Candidate Test Plan

During Sprint 104 we will follow the test plan detailed below: 

  • If there are Blocker or Critical bugs open when Sprint 104 starts, they must be fixed and verified before regression testing can begin

  • The regression test plan will only include regression testing for the Requisitions component (2 days of testing for both teams, and 1 day for bug fixes and bug fix testing)

  • Please execute the test cases in the order that they are listed in the test cycle, as assigned to you by the QA lead

  • If there are no Blocker or Critical bugs, move on to release candidate deployment and testing

    • RC Phase 1 & 2 (2 full days of testing for both teams, and 2 days for bug fixes and bug fix testing) for each release candidate as needed

    • Testing will be done in test and uat environments (@Sam Im (Deactivated) add links to new environments here)

  • Performance testing (data set and environment set up)

  • How to enter bugs: Testing Process & Test Plans#EnterDefectsduringsprinttesting

  • Prioritization of bugs: http://docs.openlmis.org/en/latest/contribute/contributionGuide.html#reporting-bugs
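For anyone scripting bug entry against Jira, the sketch below builds a standard create-issue payload. The "OLMIS" project key matches the tickets referenced in this plan, and the labels follow the conventions above, but the exact fields your Jira instance requires are an assumption.

```python
# Sketch only: build a Jira create-issue payload for a QA bug report.
# Field names follow Jira's documented REST create-issue format; the
# required custom fields on a given instance may differ.

def build_bug_payload(summary, description, priority, labels):
    """Return a payload suitable for POST to <jira-base-url>/rest/api/2/issue."""
    return {
        "fields": {
            "project": {"key": "OLMIS"},
            "issuetype": {"name": "Bug"},
            "summary": summary,
            "description": description,
            "priority": {"name": priority},  # e.g. "Blocker", "Critical", "Major"
            "labels": labels,                # e.g. ["3.4Phase1bug", "RC1"]
        }
    }
```

Labeling each bug with both its phase label and its release candidate (RC1, RC2, ...) keeps the triage filters described later in this plan simple.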

Test Phase

Components/Features

Assigned in Test Cycle

Dates

QA lead responsibilities

Phase 1

Requisitions (Team JSI)

Fulfillment (Team Parrot)

Stock Management (Team Parrot)

CCE (Team Mind the Gap)

Administration/Reference-data UI (Team Mind the Gap)

Reporting (Team Ona)

Team JSI: 3.4 Requisitions RC1

Team Parrot: 3.4 Fulfillment RC1

3.4 Stock Management RC1

Team Mind the Gap: 3.4 Administration (Reference Data) RC1

3.4 CCE RC1





Reporting: Testing the OpenLMIS Reporting Stack

1 full day each team

Team Parrot: 

  • Joanna will provide status of testing by end of day. 

  • Joanna will triage and prioritize any bugs found in Phase 1

  • If there are Blocker or Critical bugs, they will be assigned to a 3.4 Phase 1 Bug test cycle

  • Bugs are labeled 3.4Phase1bug

Team Mind the Gap: @Sam Im (Deactivated)

  • Sam will provide status of testing by end of day.

  • Sam will triage and prioritize any bugs found in Phase 1

Team JSI: Testing Requisitions

Team Ona: Testing Reporting

Bug triage

Bug triage for Phase 1

Phase 1 Bug test cycle

1 full day 

This bug test cycle is done in parallel (on the same day) with Phase 2 below



Phase 2

Bug Fixes & edge cases

Exploratory Testing

Translations

Performance Testing for:

  • Regression

  • New features

3.4 RC2

Team JSI: 3.4 Requisitions RC2

Team Parrot: 3.4 Fulfillment RC2

3.4 Stock Management RC2

Team Mind the Gap: 3.4 ReferenceData RC2

3.4 CCE RC2







1 full day each team

  • Anyone not participating in bug fixes will complete edge case, exploratory, or translation testing

  • For all bug fixes, we will require reviews via pull requests, or additional reviewers for any change. If reviews are still pending at the end of the day, please mention the reviewer in Slack.

  • Brandon and one other team member will complete performance testing

  • Bugs are labeled 3.4Phase2bug

Bug triage

Bug triage for Phase 2

Phase 2 Bug test cycle

1 full day  (as needed)



After these test phases are complete, and all tests have passed with no Blocker or Critical bugs, we can deploy the release candidate and begin the next phases of testing that are defined below.





Roles & Responsibilities

QA Team Leads

Sam Im and Joanna Bebak will be the QA leads for each team

Owner

Responsibilities

Questions

@Sam Im (Deactivated)

@Joanna Bebak (Deactivated)

  • Create Test Cycles 

  • Create missing test cases and assign to test cycles

  • Assign Test Cases to team members

  • Point person for any questions about test cases from the team

  • Review execution of the test cycles; there should be positive progress during the day

  • Prioritize bugs per test cycle; check that developers have detailed proposed solutions (if time or the developer's experience allows)

  • When a bug is created during the day, triage it before end of day and detail it in the daily QA Slack communication

  • Report the status of each test cycle, including defects reported, before end of day 

  • After a bug fix test cycle, review automated testing and provide status before end of day (go to http://build.openlmis.org/view/all/builds)
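The automated-testing review step above can be scripted against the Jenkins JSON API that build.openlmis.org exposes (assuming a standard Jenkins setup; the job names in the test data are illustrative, not a list of real jobs).

```python
# Sketch: summarize job results from a Jenkins /api/json?tree=jobs[name,color]
# response. Jenkins reports "color" as blue (passing), red (failing), or a
# *_anime variant while a build is running; we group by the base color.

def summarize_jobs(api_json):
    """Count passing/failing jobs from a parsed Jenkins API response."""
    failing = [j["name"] for j in api_json["jobs"] if j["color"].startswith("red")]
    passing = [j["name"] for j in api_json["jobs"] if j["color"].startswith("blue")]
    return {"passing": len(passing), "failing": len(failing), "failing_jobs": failing}
```

A summary like this can be pasted directly into the end-of-day QA status message.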



Team leads:

Sebastian

Ashraf

Sam

Craig



  • Coordinate manual testing with team members

  • Coordinate manual performance testing with team members

  • Attend bug triage & assign bugs to team members

  • Review bugs created by team and check for completeness, proposed solutions, priorities, and labels



Team responsibilities

  • Execute test cases assigned to you by your team lead

  • Record the test execution by following these steps: Testing Process & Test Plans#ExecutingTestCase

  • Enter Defects as encountered:

    • If there are any Blocker bugs, try to spend time completing a root cause analysis and add details in the bug ticket for ease in bug triage

    • When a defect is found, research and provide proposals in the ticket for review by Sam & Joanna (as time allows)

    • All bugs are labeled with their release candidate: RC1, RC2, etc.

  • Assist other testers as needed

  • @Chongsun Ahn (Unlicensed) or @Josh Zamor (Deactivated) to set up/update environment(s) with the Release Candidate before testing begins

  • Team leads to provide updates on automated performance testing results

  • Team leads to coordinate manual UI Performance Testing



  • @Sam Im (Deactivated) & @Joanna Bebak (Deactivated) will create a bug test cycle when bugs are found and assign the test cases to it; each phase will have a separate bug fix test cycle

Bug Triage team

Members

Responsibilities

Questions

Sam

Joanna

Ashraf

Craig

Sebastian

Chongsun

  • Review list of bugs provided by Sam & Joanna in QA slack channel

  • Prioritize bugs

  • Provide priority to Sam & Joanna to update in QA slack channel (via filter or dashboard?)

  • Sam & Joanna create test cycles for retesting if needed

  • Sam & Joanna provide status update on bug fixes in QA slack channel as needed

  • Sam & Joanna communicate in slack the bugs that have been added to the board for the day.

  • When should we meet every day? Twice a day: 9am and 4pm

  • Guidance on bug prioritization is located here: http://docs.openlmis.org/en/latest/contribute/contributionGuide.html#reporting-bugs

  • If there are bugs, then testing for this phase will be done in uat only.

  • If there are bugs found, then we must retest 

  • If bugs are prioritized as Major or lower, they will be triaged in the Phase 2 bug triage

Communication of the test plan before we start testing for the release

    • @Joanna Bebak (Deactivated) and @Sam Im (Deactivated) are responsible for communicating this test plan in the QA slack channel

Communication on the Test Plan and the daily testing status:

  • What needs to be communicated daily?

    • Test Cycle execution status (including test cycle name and % of completion)

    • # of test cases that were executed, passed, and failed

    • @Joanna Bebak (Deactivated) will post any test cases or items that need attention and review by Team ILL at the end of her day in the QA slack channel

    • @Sam Im (Deactivated) will post any test cases or items that need attention and review by Team Parrot at the end of her day in the QA slack channel

  • Best time of day (for each team to communicate morning status and end of day status & share blockers)

    • Beginning of day, post what we are doing today

    • End of day, post status of what we have done and anything pending
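The daily status numbers listed above (test cycle name, % completion, and counts of executed/passed/failed cases) can be assembled with a small helper like this sketch; it is illustrative only, not part of any existing tooling.

```python
# Sketch: format the daily test cycle status line this plan asks QA leads
# to post in the QA Slack channel.

def cycle_status(name, total, passed, failed):
    """Build a one-line status report for a test cycle."""
    executed = passed + failed
    pct = round(100.0 * executed / total, 1) if total else 0.0
    return (f"{name}: {pct}% complete "
            f"({executed} executed, {passed} passed, {failed} failed)")
```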

Environments and Demo Data for testing

Environments: test.openlmis.org, uat.openlmis.org, and two additional environments once this ticket is completed https://openlmis.atlassian.net/browse/OLMIS-5066

Refer to demo data readme for more details about user permissions: https://github.com/OpenLMIS/openlmis-referencedata/blob/master/demo-data/README.md

ONLY test with the users "Admin" or "Administrator" when executing test cases related to Administration activities.

Testing Data

Columns: Component, Username, Program, Team Parrot, Team Mind the Gap, Team JSI, Team Ona, Concerns

Requisitions

  • Usernames: srmanager1, smanager1, psupervisor, wclerk1; srmanager2, smanager2, psupervisor, wclerk1; srmanager4 (for second approval), smanager4, dsrmanager, psupervisor; administrator (testing requisition template updates/changes or program settings changes)

  • Programs: Family Planning; Essential Meds; Essential Meds and Family Planning

  • Concerns: Demo data restriction: the environment may need a refresh if all current-period requisitions are processed (request and post status in the QA Slack channel)

Stock Management

  • Usernames: srmanager2, divo1

Fulfillment

  • Usernames: vsrmanager1, vsrmanager2, divo1, divo2, rivo, vwclerk1

Administration (Reference Data)

  • Usernames: admin, administrator

  • Programs: All programs

CCE

  • Usernames: divo1, divo2, vsrmanager1 (supervises Cuamba), vsrmanager2 (one facility)

Reporting


Release Candidate Tests

Exploratory Tests

Performance Tests

Performance testing scenarios are located here: Performance Metrics
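For manual UI performance checks, a hypothetical helper like the one below can summarize measured page-load times. The percentile method and any thresholds are assumptions, not values taken from the Performance Metrics page.

```python
import math

# Hypothetical helper for manual performance testing: given page-load
# samples in milliseconds, report the mean and 95th percentile so runs
# can be compared against the baseline environment.

def latency_summary(samples_ms):
    """Summarize a list of latency samples (milliseconds)."""
    ordered = sorted(samples_ms)
    # Nearest-rank 95th percentile (an assumed convention).
    p95_index = min(len(ordered) - 1, math.ceil(0.95 * len(ordered)) - 1)
    return {
        "mean_ms": sum(ordered) / len(ordered),
        "p95_ms": ordered[p95_index],
    }
```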



Deploying the release: Release Checklist 3.4





Release Candidate Testing Concerns:

  • Release testing should include all teams testing during the same sprint period

  • Concern: Multiple teams have been working on one component (Requisitions), so there isn't a clear way to break out the release of features by team. Teams will therefore release components. Who releases what?

    • The team that has made the most changes may take ownership of releasing the component.

    • This includes manual performance testing

      • Team Ona owns reporting

      • Team Mind the Gap owns Administration (reference data) 

      • Team Parrot owns Fulfillment and Stock Management

      • Team JSI owns Requisitions 

  • Concern: Managing multiple teams, teams testing in parallel

    • End of day reports of testing status

    • Automate testing status reports (when possible)

    • More servers for testing (might need 2 new servers)

  • Concern: Who fixes bugs found during regression & release testing

    • During Bug triage, identify and assign bugs to teams

    • Some team members may be fixing bugs while others don't have any release candidate activities to do (no additional testing needed)

    • Representatives from each team should be included in the bug triage to estimate and discuss capacity to fix

  • Concern: Teams don't have work to do during testing (blocked by waiting for bug fixes)

    • Plan work for each team in case they are done early

      • Updating documentation, prepping release documentation

      • Potentially start on new work as time permits

      • Exploratory testing

  • Concern: setting up perftest for manual performance testing (matching baseline data)
