Testing Conversations


 Attendees: Anna Czyrko, Brandon Bowersox-Johnson, Paweł Gesek, Mary Jo Kochendorfer, Sebastian Brudziński



 Attendees: Anna Czyrko, Brandon Bowersox-Johnson, Paweł Gesek and Mary Jo Kochendorfer

  • Anna Czyrko created three contract tests: skipping periods, and skipping lineItems for both regular and emergency requisitions
  • Paweł Gesek added that non-full-supply products should be included in the contract tests
    • This would need two tests: one for a program with the setting on and one with it off.
    • We think the setting is off for Essential Meds and on for Family Planning (non-full supply)
  • Anna Czyrko will look at acceptance criteria for Sprint 19 when she has time.
  • Brandon Bowersox-Johnson brought up contract tests for the calculations recently added to the requisition template (adjusted consumption, average consumption, max stock, calculated order quantity); see the calculation sketch after this list.
  • Paweł Gesek would like to know the relative priority of the various tests (e.g. non-full supply)
  • Priorities are driven by exposure to end users, plus areas that are invisible to them (like calculations and non-full supply), where breakage would otherwise go unnoticed
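A minimal sketch of what those calculations might look like as plain, unit-testable functions, assuming the commonly cited LMIS formulas; the exact formulas, field names, and rounding used in the real requisition template are assumptions here, not confirmed by these notes:

```java
import java.util.List;

/**
 * Illustrative only: plain-Java versions of the requisition template
 * calculations discussed above, using commonly cited LMIS formulas.
 * Formulas, names and rounding are assumptions, not the confirmed
 * OpenLMIS implementation.
 */
public final class RequisitionCalculationsSketch {

  private RequisitionCalculationsSketch() {}

  /** Adjusted consumption: scale consumption up for days stocked out (assumed formula). */
  public static int adjustedConsumption(int totalConsumedQuantity,
                                        int daysInPeriod,
                                        int totalStockoutDays) {
    if (daysInPeriod <= totalStockoutDays) {
      return totalConsumedQuantity; // avoid division by zero; behaviour assumed
    }
    double factor = (double) daysInPeriod / (daysInPeriod - totalStockoutDays);
    return (int) Math.ceil(totalConsumedQuantity * factor);
  }

  /** Average consumption: mean of adjusted consumption over recent periods. */
  public static int averageConsumption(List<Integer> adjustedConsumptions) {
    return (int) Math.ceil(
        adjustedConsumptions.stream().mapToInt(Integer::intValue).average().orElse(0));
  }

  /** Maximum stock: average consumption times the configured max periods of stock. */
  public static int maximumStockQuantity(int averageConsumption, double maxPeriodsOfStock) {
    return (int) Math.ceil(averageConsumption * maxPeriodsOfStock);
  }

  /** Calculated order quantity: top up from stock on hand to maximum stock. */
  public static int calculatedOrderQuantity(int maximumStockQuantity, int stockOnHand) {
    return Math.max(0, maximumStockQuantity - stockOnHand);
  }
}
```

Pure functions like these are straightforward to unit test; a contract test would additionally confirm that the service returns the same values through the API, which is what makes silent breakage visible.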



 Attendees: Anna Czyrko, Brandon Bowersox-Johnson and Mary Jo Kochendorfer

  • Raised OLMIS-1606 to understand why contract tests are failing.
  • Anna Czyrko sent an email listing the tests.
  • Review Backlog Grooming for Sprint 18
    • Prioritize the tests related to the end-user workflow (de-prioritize the configuration ones until later)
  • Look into creating contract tests for the calculations since users will not know if they are broken.
  • Anna Czyrko will look into how permissions are handled in contract tests. It would be great if each test created its own user and rights/roles; first, let's figure out how it is currently handled (see the sketch after this list).
  • Anna Czyrko will look into contract tests for skipping products, skipping periods, and requisition groups next.
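One possible shape for the "test creates its own user" idea, sketched with REST Assured; the endpoint paths, payload fields, right names, and token handling below are illustrative assumptions for discussion, not the documented OpenLMIS API:

```java
import static io.restassured.RestAssured.given;

import org.junit.Before;
import org.junit.Test;

/**
 * Sketch of a contract test that provisions its own user instead of relying
 * on seed data. All endpoints, payloads and tokens are assumptions.
 */
public class RequisitionPermissionsContractTestSketch {

  private static final String BASE_URL = "http://localhost";       // assumption
  private final String adminToken = System.getenv("ADMIN_TOKEN");  // assumption

  @Before
  public void createTestUserWithRole() {
    // Create a role holding the right the test needs (hypothetical payload).
    given()
        .baseUri(BASE_URL)
        .header("Authorization", "Bearer " + adminToken)
        .contentType("application/json")
        .body("{\"name\":\"contract-test-approver\",\"rights\":[{\"name\":\"REQUISITION_APPROVE\"}]}")
        .when().post("/api/roles")
        .then().statusCode(201);

    // Create the user that will exercise the workflow (hypothetical payload).
    given()
        .baseUri(BASE_URL)
        .header("Authorization", "Bearer " + adminToken)
        .contentType("application/json")
        .body("{\"username\":\"contract-test-user\",\"roles\":[\"contract-test-approver\"]}")
        .when().post("/api/users")
        .then().statusCode(201);
  }

  @Test
  public void userWithoutRightIsForbidden() {
    // A token without the right should be rejected, not silently accepted.
    given()
        .baseUri(BASE_URL)
        .header("Authorization", "Bearer token-without-approve-right") // assumption
        .when().post("/api/requisitions/some-id/approve")
        .then().statusCode(403);
  }
}
```

The advantage of this shape is that the test no longer depends on demo data staying in a particular state; the cost is extra setup calls per run, which is part of what needs to be weighed.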

 Attendees: Anna Czyrko, Brandon Bowersox-Johnson, Paweł Gesek and Mary Jo Kochendorfer

  • OLMIS-1638 was created
  • Contract test proposal: the approach should be mixed, using both robust and granular tests depending on what is being tested. Robust contract tests cover workflows; granular contract tests cover validations (illustrated in the sketch after this list).
    • Background: the emergency requisition contract tests were broken down into small pieces (unlike the regular requisition tests) and required many repetitions of the same steps for each test.
  • Outline which contract tests we want for 3.0. Currently created:
    • Regular requisitions workflow 
    • Emergency requisition workflow
    • Convert an order
    • User login
    • Create a user
  • Contract tests are still needed for the following:
    • Adding delete to regular requisitions contract test
    • Supervisory nodes, requisition groups should be tested (either in a new contract test or rewrite the current ones to use multiple users/roles)
    • Skipping products
    • Skipping periods
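A rough illustration of the two styles under discussion, again with hypothetical endpoints, payloads and statuses rather than the confirmed API: the robust style drives one requisition through several status changes in a single test, while the granular style checks a single validation in isolation.

```java
import static io.restassured.RestAssured.given;
import static org.hamcrest.Matchers.equalTo;

import org.junit.Test;

/** Contrast between the proposed styles; endpoints, payloads and statuses are assumptions. */
public class ContractTestStylesSketch {

  private static final String BASE = "http://localhost";           // assumption
  private static final String TOKEN = System.getenv("API_TOKEN");  // assumption

  /** Robust style: walk a regular requisition through the workflow end to end. */
  @Test
  public void regularRequisitionWorkflow() {
    String id = given().baseUri(BASE).header("Authorization", "Bearer " + TOKEN)
        .when().post("/api/requisitions/initiate")
        .then().statusCode(201).extract().path("id");

    given().baseUri(BASE).header("Authorization", "Bearer " + TOKEN)
        .when().post("/api/requisitions/" + id + "/submit")
        .then().statusCode(200).body("status", equalTo("SUBMITTED"));

    given().baseUri(BASE).header("Authorization", "Bearer " + TOKEN)
        .when().post("/api/requisitions/" + id + "/approve")
        .then().statusCode(200).body("status", equalTo("APPROVED"));
  }

  /** Granular style: a single validation rule, checked in isolation. */
  @Test
  public void rejectsNegativeRequestedQuantity() {
    given().baseUri(BASE).header("Authorization", "Bearer " + TOKEN)
        .contentType("application/json")
        .body("{\"requestedQuantity\": -1}") // hypothetical invalid payload
        .when().put("/api/requisitions/some-id")
        .then().statusCode(400);
  }
}
```

This makes the trade-off in the proposal concrete: the robust test covers the whole workflow in one place but repeats setup for every variation, while the granular test pinpoints one validation and fails with an obvious cause.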


 Attendees: Anna Czyrko and Mary Jo Kochendorfer

  • Discussed the OLMIS-924 calculation and removed the n-1 denominator.
  • OLMIS-1409 is scheduled for Sprint 17 and Mary Jo Kochendorfer is available for questions/review. Very important!
  • External folks will look at the UAT server to do a walk-through and log bugs.
  • Anna Czyrko to create a stub ticket for updating passwords for users (OLMIS-1638).
  • Anna Czyrko to think about how the contract tests should be structured (granular or more robust); currently regular is robust and emergency is granular. We would like to harmonize these eventually. Once the approach is finalized, we will create a task to update the testing strategy doc.


 Attendees: Anna Czyrko and Mary Jo Kochendorfer

Discussed issues with test and UAT servers.

Next sprint (17) we will aim to complete OLMIS-1409 and the prioritized contract tests.

  • Anna Czyrko to break apart the regular requisition creation and status changes into smaller tests to help identify where things have gone wrong
  • Anna Czyrko to add a wiki page to the Testing Strategy about setting up and implementing contract tests. This isn't a top priority, but she will complete it when she has time. Depending on the content, it will go either in readthedocs or the wiki.
  • Anna Czyrko will suggest a couple of contract tests on the backlog grooming page, perhaps OLMIS-1403




 

Attendees: Paweł Gesek, Nick Reid, Brandon Bowersox-Johnson

Testing strategy is near completion.  Folks should review and raise their final comments.

Anna will talk with her team about testing across browsers and screen sizes

Test Coverage

  • We are looking into Sonar to monitor our test coverage
  • Unit tests: what should unit tests test? Do we need further clarification?

Monitor Tests:

  • Are there more contract tests we should be writing to test the requisition service?
  • What is the best way to support the process of writing tests? Thinking about adding acceptance criteria within tickets around contract, component, and integration tests.
    • Paweł Gesek would like separate tickets for contract tests. Unit tests are now a focus during code reviews. Could Sonar flag a commit that would bring test coverage down?
    • Brandon Bowersox-Johnson agrees; having Sonar monitor coverage would be great.

Tickets

  • What should be in each ticket around testing?
  • Should it be a checklist that developers reference, or should the specifics live in the ticket itself?

Discussion: Have a general checklist on test coverage and link to it within the ticket. Specific test coverage instructions should be in acceptance criteria. Unit test each acceptance criterion. Team ILL doesn't need to put this into every ticket; rather, it should be captured during the review process, and all team members are responsible.

Definitions of Done

We need to update the "definition of done" documentation. We could also define the life-cycle/process of a ticket and clean up the steps a ticket (and its sub-tasks) goes through.

What about rollover? What about sub-tasks? We'll cross that bridge when we get there (smile)



Past conversations

Items to be completed before the next call.

  • Move the test strategy page to Developer Guide
  • Anna to incorporate the following into the Test Strategy page
    • please make sure to incorporate all topics addressed on https://openlmis.atlassian.net/wiki/display/OP/Quality+Assurance page
    • add who is responsible for writing each type of test
    • please link to the Testing Guide https://github.com/OpenLMIS/openlmis-template-service/blob/master/README.md
    • the QA workflow needs to be finished
    • Przemysław Studziński (Semik) is helping with contract tests: fixing the current ones, creating facility types, and creating users.
    • Anna to create the JIRA story for the facilities contract test
    • Anna and Pawel to define which contract tests we will have for Beta. New test: Initiate Requisition.
    • When supporting the acceptance criteria please remember to
      • Test a failure path (try to break the code)
      • Review the configuration guide
      • Ask for help if needed
    • Mary Jo to add instructions to the Story Definitions and Template describing the testing comments/updates we'd like to see on stories when they are moved to Done
    • Brandon and Mary Jo to speak with the team about goals for monitoring unit, integration, and component testing


High-level: What we discussed

1. Introductions

  I would like to hear about the SolDevelo OpenLMIS QA team. Describe what each person does and give an example of what they have been working on lately.

2. Review Previous Meeting Notes

  Review "Notes from the call" email below. Talk through what has been done for each item.

3. Process

  a. New Test Plans: Mary Jo and I reviewed them. We’d like to hear how you are planning to use them.
  b. Test Cases: Explain how you are making those and how you are using them.
  c. Testing Strategy: Look at the wiki section by section: https://openlmis.atlassian.net/wiki/display/OP/Quality+Assurance
  d. How has SolDevelo handled tests for a similar project?

4. NEXT STEPS

  Discuss next steps from here. I do have a few ideas for discussion:

  • In every ticket, the Definition of Done should include automated and manual test expectations for that ticket.
  • In every ticket, the QA team should write a comment explaining their work when they review/QA the ticket. Do not move a ticket to Done without an explanation.
  • Create additional tickets specifically for contract or E2E testing.
  • VillageReach: create expectations about automated unit test coverage
