Working Draft

This is a working draft of a new release process that OpenLMIS is considering adopting in Fall 2017 for future releases. Once this is reviewed by stakeholders, this information will be incorporated into the official release documentation on docs.openlmis.org and this wiki page will be archived.

Reference Distribution Release Process

Active Development

  • Multiple agile teams develop OpenLMIS services/components and review and incorporate Pull Request contributions
  • Microservices architecture provides separation between the numerous components (illustrated in the sketch after this list)
  • Automated test coverage prevents regressions and gives the team the safety net to release often
  • Continuous Integration and Deployment (CI/CD) ensures developers get immediate feedback and QA activities can catch issues quickly
  • Code is peer reviewed during Jira ticket workflow and in Pull Requests
  • Documentation, CHANGELOGs and demo data are kept up-to-date as code development happens in each service/component
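
As a rough illustration of the component separation, the sketch below assumes Docker Hub-style image names; the openlmis/... organization and the auth version shown are assumptions for illustration, not authoritative names:

    # Each service ships as its own independently versioned Docker image,
    # so components can be pulled and upgraded in isolation.
    docker pull openlmis/requisition:6.3.4   # version cited in this document
    docker pull openlmis/auth:5.0.1          # hypothetical component version

    # The Reference Distribution composes the individual images; from a
    # checkout of openlmis-ref-distro (assumed layout):
    docker-compose up -d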

Do Release Preparation & Code Freeze

  • Verify the prerequisites, including that all automated tests pass and all CHANGELOGs are up to date; see Release Prerequisites (a verification sketch follows this list)
  • Conduct a manual regression test cycle 1-2 weeks before the release, if possible
  • Begin a Code Freeze: shift agile teams' workloads to bugs and clean-up, rather than committing large new features or breaking changes ("slow down" 1-2 weeks before release)
    • Note: Branching is not part of the current process, but may be adopted in the future along with CI/CD changes to support more teams working in parallel.
  • Write draft Release Notes including sections on 'Compatibility', 'Changes to Existing Functionality', and 'New Features'
  • The release schedule and timing are documented and may be discussed and revised by the community
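
As a minimal sketch of the prerequisite checks, the script below walks sibling component checkouts and flags any whose CHANGELOG has not changed since its last release tag; the component list and repository layout are assumptions for illustration:

    #!/usr/bin/env bash
    # Hedged pre-release check: repo names and layout are illustrative.
    set -euo pipefail

    for repo in openlmis-requisition openlmis-auth openlmis-referencedata; do
      # Most recent release tag reachable from HEAD in this component.
      last_tag=$(git -C "$repo" describe --tags --abbrev=0)
      # Warn if CHANGELOG.md is untouched since that tag.
      if git -C "$repo" diff --quiet "$last_tag"..HEAD -- CHANGELOG.md; then
        echo "WARNING: $repo CHANGELOG.md unchanged since $last_tag"
      fi
    done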

Publish a Release Candidate

  • Each component that has any changes since the last release is released and semantically versioned (e.g., openlmis-requisition:6.3.4 or openlmis-newthing:1.0.0-beta); see the tagging sketch after this list
    • Note: Usually, all components are released with the Reference Distribution. Sometimes, due to exceptional requests, the team may release a service/component at another time even when there is not another Reference Distribution release.
  • Reference Distribution Release Candidate is released with these components (e.g., openlmis-ref-distro:3.7.0-rc1)
    • Note: We archive permanent documentation for every release, but not for every release candidate.
  • Share Release Candidate with the OpenLMIS community along with the draft Release Notes and invite testing and feedback
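
A minimal tagging sketch, assuming components are released from annotated git tags and that repository names match the component names above (the exact release tooling is not specified in this draft):

    # Tag a changed component with its new semantic version.
    git -C openlmis-requisition tag -a 6.3.4 -m "Release 6.3.4"
    git -C openlmis-requisition push origin 6.3.4

    # Tag the Reference Distribution itself as a candidate.
    git -C openlmis-ref-distro tag -a 3.7.0-rc1 -m "Release candidate 1 for 3.7.0"
    git -C openlmis-ref-distro push origin 3.7.0-rc1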

Review Period

  • Active Development is paused; the only development work during this period is release-critical bug fixes or work on branches (note: branches are not yet recommended and are not supported by CI/CD)
  • Conduct a full manual regression test cycle (including having developers conduct testing)
  • Run automated performance testing and review results
  • Collect all bug reports in Jira, including those from community early adopters and covering bugs in code, documentation, and translations; tag each report with the RC AffectsVersion and triage which are critical for the release (an example query follows this list)
  • The review period starts when the first Release Candidate is shared and should last at least one week; subsequent Release Candidates may be published during this time
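
For example, bug reports tagged with the RC AffectsVersion can be pulled from Jira's REST search API; the Jira URL, project key, and version name below are assumptions for illustration:

    # Query bugs reported against the release candidate, ordered by priority.
    curl -s -u "$JIRA_USER:$JIRA_TOKEN" -G \
      "https://openlmis.atlassian.net/rest/api/2/search" \
      --data-urlencode 'jql=project = OLMIS AND issuetype = Bug AND affectedVersion = "3.7.0-rc1" ORDER BY priority DESC'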

Fix Critical Issues

Are there critical release issues? If not, after the first Release Candidate (RC1) we may move directly to a release. Otherwise, we will fix critical issues and publish a new Release Candidate (e.g. RC2).

  • Developers fix critical issues in code, documentation, and translations. Only commits for critical issues are accepted; all other commits are rejected.
  • Every commit is reviewed to determine whether portions or all of the full regression test cycle must be repeated
  • Every ticket continues to be held to our ongoing guidelines and expectations:
    • Every commit is peer reviewed and manually tested, and should include automated test coverage to meet guidelines
    • Every commit must correspond to a Jira ticket, have gone through review and QA steps, and have Zephyr test cases in Jira (a sample enforcement hook follows this list)
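
One way to enforce the Jira-ticket rule mechanically is a commit-msg hook; this is a sketch assuming the OLMIS project key, not a documented part of the process:

    #!/usr/bin/env bash
    # .git/hooks/commit-msg: reject commits that do not reference a Jira ticket.
    msg_file="$1"
    if ! grep -qE '\bOLMIS-[0-9]+\b' "$msg_file"; then
      echo "Commit message must reference a Jira ticket (e.g. OLMIS-1234)." >&2
      exit 1
    fi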

Once critical issues are fixed, publish a new Release Candidate and conduct another Review Period.

Publish the Release

When a Release Candidate goes through a Review Period without any critical issues being found, that candidate is promoted to General Availability (GA) as an official release of OpenLMIS.

  • Update the Release Notes to state that this is the official GA release and include the date
  • Release the Reference Distribution; this GA release contains the same components as the accepted Release Candidate and has an official version number tag (e.g. openlmis-ref-distro:3.7.0), as sketched after this list
  • Share the Release with the OpenLMIS community along with the final Release Notes
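
A minimal promotion sketch, assuming the git tagging approach from the Release Candidate step: the GA tag points at the exact commit that was reviewed, so nothing is rebuilt between review and release:

    # Promote the accepted candidate: GA tag on the same commit as the RC.
    git -C openlmis-ref-distro tag -a 3.7.0 3.7.0-rc1^{} -m "OpenLMIS 3.7.0 GA"
    git -C openlmis-ref-distro push origin 3.7.0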

After publishing the release, Active Development can resume.

Implementation Release Process

Implementations are typically composed of multiple core OpenLMIS components plus custom components or extensions, translations, and integrations. It is recommended that OpenLMIS implementations follow a process similar to the one above to receive, review, and verify that updates of OpenLMIS v3 perform correctly with their customizations and configuration.

Key differences for implementation releases:

  • Upstream Components: Implementations treat the OpenLMIS core product as an "upstream" vendor distribution. When a new core Release Candidate or Release is available, implementations are encouraged to pull the new upstream OpenLMIS components into their CI/CD pipeline and conduct testing and review (a version-pinning sketch follows this list).
  • Independent Review: It is critical for the implementation to conduct its own Review Period. It may follow a process similar to the one described above, with multiple Release Candidates for that implementation and rounds of manual regression testing to ensure that all the components (core + custom) work together correctly.
  • Conduct Testing/UAT on Staging: Implementations should apply Release Candidates and Releases to testing/staging environments before production environments. Testing should be conducted on an environment that mirrors production (a recent copy of production data, the same server hardware, the same networks, etc.). Applying a new version to production may involve a full manual regression test cycle or a shorter smoke test. There should also be a set of automated tests and performance tests, similar to the core release process above, but run with production data in place to verify performance against the full data set.
  • Follow Best Practices: When working with a production environment, follow all best practices: schedule a downtime/maintenance window before making any changes; take a full backup of code, configuration, and data at the start of the deployment process; test the new version before re-opening it to production traffic; and always have a roll-back plan in case issues arise in production that were not caught in earlier testing.
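
As a sketch of the version-pinning idea from the Upstream Components point, an implementation might keep upstream and custom component versions in one deployment config; the file name, variable names, and database name here are assumptions (OpenLMIS runs on PostgreSQL, but deployment layouts vary):

    # Pin upstream core versions and custom extensions in one place so
    # upgrades happen deliberately after review (names are illustrative).
    printf '%s\n' \
      'OL_REQUISITION_VERSION=6.3.4' \
      'OL_AUTH_VERSION=5.0.1' \
      'MYIMPL_EXTENSION_VERSION=1.2.0' > .env

    # Before a production deployment: maintenance window, then a full backup
    # (assumes a PostgreSQL database named "openlmis").
    pg_dump -Fc openlmis > "backup-$(date +%F).dump"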




