2017-08-08 Meeting notes

Date

08 Aug 2017, 7am PST


https://meetings.webex.com/collabs/#/meetings/detail?uuid=MC4MI8V4AU87PSQRPHTS1BHJF1-3O29&rnd=240422.52612


Attendees

Goals

  • QA Testing Workflow
  • Review recent dev forum posts: migrating data cross-service
  • Review where we are with performance
  • Review previous action items

Discussion items

30 min | QA Testing Workflow | Sam Im

Discussion of QA Testing Workflow and expectations for each sprint.

Presentation of Test Cycles and Test Case execution:

  • Test Strategy
  • Explain the purpose of Test Cycles and reporting of Test Metrics
  • Walk through Test Case creation & ownership

10 min | Migrating data cross-service | Paweł Gesek

https://groups.google.com/forum/#!topic/openlmis-dev/JPUUrsY5y70

15 min | Performance | Josh Zamor

  • Performance Data
  • Performance Tips (WIP)
  • A call to add at least the data and tests along with new endpoints (CCE?)

5 min | Close up / action items | Josh Zamor

July 25th action items

Notes


QA Testing Workflow


Not everyone is aware of the QA test cycles and how they should fit into the sprint.  https://openlmis.atlassian.net/wiki/x/BYBUBQ


Types:

  • manual
  • automated
  • regression - we haven't been keeping up with these; moving to one regression cycle in each sprint


Sam and/or the QA lead will create the test cycle.

For testing, we all add new tests for bugs as we fix them.

For regression testing, the QA leads, Sam and Mary Jo, will define which to-be-released features will have a regression test.


In grooming, QA is going to be more a part of the planning process to identify where test cases are missing.  At the end of the sprint, QA will showcase the test plan, test cycle, new test cases, and defect tracking.  If there's any regression testing, that will also be showcased.


OLMIS-2797 has an example of testing criteria and a test case ticket that relates to it.  It's important that we all ensure there's a test case so we can validate that a ticket is complete.


Each sprint has a test cycle or a regression cycle.  When you create a new test, you should add it to the existing test cycle.


Test execution statuses:

  1. Unexecuted
  2. WIP
    1. Pass (only if ALL pass)
    2. Fail
    3. Blocked

As each step is done, update the status and add any comments if needed.


Updating test execution is somewhat manual: complete each step, and the test itself may need to be updated manually if it doesn't prompt you.


Feedback goes in the Monday QA meeting (the next one is on Friday) and then in the Slack QA channel.


Migrating Data Cross-Service


Option 2, a cross-service script, sounds like the path forward.  Malawi didn't have any objections.

Open questions:

  • Release notes?
  • How should it be run?
  • Should it support multiple data sources? (Malawi only has one; is there ever any reason for this to work across more than one?)
  • Is there any sort of framework or container that's re-usable? (See the sketch after this list.)
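As a starting point for that discussion, here is a minimal sketch of what a reusable cross-service migration script could look like, assuming each service exposes a REST API reachable from a single base URL with a bearer token. The endpoint paths, field names, and CLI flags are illustrative placeholders, not the actual OpenLMIS APIs.

#!/usr/bin/env python3
"""Sketch only: copy records from one service's REST API to another's.

All URLs, paths, and field names here are placeholders for discussion,
not the real OpenLMIS endpoints.
"""
import argparse
import requests


def fetch_records(base_url, endpoint, token):
    # Read all records from the source service (assumes a plain JSON list).
    response = requests.get(f"{base_url}/{endpoint}",
                            headers={"Authorization": f"Bearer {token}"})
    response.raise_for_status()
    return response.json()


def push_record(base_url, endpoint, token, record):
    # Create one record in the destination service.
    response = requests.post(f"{base_url}/{endpoint}", json=record,
                             headers={"Authorization": f"Bearer {token}"})
    response.raise_for_status()


def main():
    parser = argparse.ArgumentParser(description="Cross-service migration sketch")
    parser.add_argument("--source-url", required=True)
    parser.add_argument("--dest-url", required=True)
    parser.add_argument("--endpoint", required=True,
                        help="resource path, e.g. api/someResource (placeholder)")
    parser.add_argument("--token", required=True)
    args = parser.parse_args()

    records = fetch_records(args.source_url, args.endpoint, args.token)
    for record in records:
        record.pop("id", None)  # let the destination assign its own ids
        push_record(args.dest_url, args.endpoint, args.token, record)
    print(f"Migrated {len(records)} records")


if __name__ == "__main__":
    main()

Run once per resource (e.g. python migrate.py --source-url ... --dest-url ... --endpoint api/someResource --token ...). Supporting multiple data sources would just mean accepting more than one --source-url, which speaks to the "multiple data sources" question above; this is only a sketch for discussion, not a proposed implementation.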


Performance


Action items
