7am PST
| Time | Item | Who | Notes |
|---|---|---|---|
| 30min | Discussion of QA testing workflow and expectations for each sprint; presentation of test cycles and test case execution | Sam Im (Deactivated) | |
| 10m | Migrating data cross-service | Paweł Gesek | https://groups.google.com/forum/#!topic/openlmis-dev/JPUUrsY5y70 |
| 15m | Performance | Josh Zamor | Performance Data; Performance Tips (WIP). A call to add at least the data and tests with new endpoints (CCE?) |
| 5m | Close up / action items | Josh Zamor | July 25th action items |
Not everyone is aware of the QA test cycles and how they should fit into the sprint. https://openlmis.atlassian.net/wiki/x/BYBUBQ
Types:
- Sam and/or the QA lead will create the test cycle.
- For bug testing, we all add new tests for bugs as we fix them.
- For regression testing, the QA leads, Sam and Mary Jo, will define which to-be-released features will have a regression test.
In grooming, QA is going to be more a part of the planning process, to identify where test cases are missing. At the end of the sprint, QA will showcase the test plan, test cycle, new test cases, and defect tracking. If there was any regression testing, that will also be showcased.
OLMIS-2797 has an example of testing criteria and a related test case ticket. It's important that we all ensure there is a test case for each ticket, so we can validate that the ticket is complete.
Each sprint has a test cycle or a regression cycle. When you create a new test, add it to the existing test cycle.
As each step is done, update its status and add comments if needed.
Updating test execution is somewhat manual: work through each step, and the test itself may need to be updated by hand if it doesn't prompt you.
Provide feedback in the Monday QA meeting (the next one is on Friday) and then in the Slack QA channel.
Option 2, a cross-service script, sounds like the path forward. Malawi had no objection.