
Working Draft

This is a working draft of a new release process that OpenLMIS is considering adopting in Fall 2017 for future releases. Once this is reviewed, including by Technical and Product committee stakeholders, this information will be incorporated into the official release documentation on docs.openlmis.org and this wiki page will be archived.

Core Release Process

Active Development

  • Multiple agile teams develop OpenLMIS, and the project also receives Pull Request contributions
  • Microservices architecture provides separation between the numerous components
  • Automated test coverage prevents regressions and gives the team the safety net to release often
  • Continuous Integration and Deployment (CI/CD) ensures developers get immediate feedback and QA activities can catch issues quickly
  • Code is peer reviewed during the Jira ticket workflow and in Pull Requests
  • Documentation and CHANGELOGs are kept up-to-date along with the corresponding code and demo data

Do Release Preparation

  • Verify the prerequisites, including that all automated tests are passing and all CHANGELOGs are up-to-date; see Release Prerequisites (a checklist sketch follows this list)
  • Conduct a manual regression test cycle 1-2 weeks before the release, if possible
  • Write draft Release Notes including sections on 'Compatibility', 'Changes to Existing Functionality', and 'New Features'
  • Shift agile teams' workloads to bugs and clean-up, rather than committing large new features or breaking changes ("slow down" 1-2 weeks before release)
  • The current release schedule is monthly patch releases and quarterly minor releases (as of 3.2.0; this has been the schedule since 3.0.0)
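
As an illustration only, the sketch below shows how part of the prerequisite check could be automated. The component list and the assumption that each repository keeps its notes in CHANGELOG.md are hypothetical, not part of the official OpenLMIS tooling.

```python
#!/usr/bin/env python3
"""Sketch of an automated pre-release checklist (illustrative only)."""
from pathlib import Path

# Hypothetical subset of components included in the Reference Distribution.
COMPONENTS = ["openlmis-requisition", "openlmis-referencedata", "openlmis-ui"]
TARGET_VERSION = "3.7.0"  # version being prepared


def changelog_mentions(component: str, version: str) -> bool:
    """Return True if the component's CHANGELOG.md has an entry for `version`."""
    changelog = Path(component) / "CHANGELOG.md"
    return changelog.exists() and version in changelog.read_text()


def main() -> None:
    missing = [c for c in COMPONENTS if not changelog_mentions(c, TARGET_VERSION)]
    if missing:
        print("CHANGELOG entries missing for:", ", ".join(missing))
        raise SystemExit(1)
    print("All CHANGELOGs mention", TARGET_VERSION, "- ready to continue release prep.")


if __name__ == "__main__":
    main()
```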

Publish a Release Candidate

  • Each component that has any changes since the last release is released and semantically versioned (e.g., openlmis-requisition v6.3.4 or openlmis-newthing v1.0.0-beta); a versioning sketch follows this list
  • Reference Distribution Release Candidate is released with these components (e.g., openlmis-ref-distro v3.7.0-rc1)
  • Share the Release Candidate with the OpenLMIS community along with the draft Release Notes, and invite testing and feedback
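
The sketch below illustrates how a Reference Distribution candidate version could be derived and bumped alongside the independently versioned components. The function and the manifest layout are illustrative assumptions, not an official release script.

```python
import re
from typing import Optional


def next_release_candidate(target: str, previous_rc: Optional[str] = None) -> str:
    """Return the next candidate tag for `target`, e.g. 3.7.0 -> 3.7.0-rc1,
    or bump 3.7.0-rc1 -> 3.7.0-rc2 when a candidate already exists."""
    if previous_rc is None:
        return f"{target}-rc1"
    match = re.fullmatch(rf"{re.escape(target)}-rc(\d+)", previous_rc)
    if not match:
        raise ValueError(f"{previous_rc} is not a candidate of {target}")
    return f"{target}-rc{int(match.group(1)) + 1}"


# Pin the independently versioned components into a candidate manifest (hypothetical layout).
manifest = {
    "openlmis-ref-distro": next_release_candidate("3.7.0"),   # 3.7.0-rc1
    "openlmis-requisition": "6.3.4",
    "openlmis-newthing": "1.0.0-beta",
}
print(manifest)
print(next_release_candidate("3.7.0", "3.7.0-rc1"))           # 3.7.0-rc2
```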

Review Period

  • Active Development is paused; the only development work that happens is release-critical bug fixes or work on branches (note: branches are not yet recommended and are not supported by CI/CD)
  • Conduct a full manual regression test cycle (including having developers conduct testing)
  • Run automated performance testing and review results
  • Collect all bug reports in Jira, including those from community early adopters and covering bugs in code, documentation, and translations; tag each with the RC AffectsVersion and triage which are critical for the release (a triage-query sketch follows this list)
  • The review period should last at least one week, during which subsequent Release Candidates may be published
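
Below is a sketch of pulling candidate bug reports out of Jira for triage. It uses the standard Jira REST search endpoint; the instance URL, credentials, project key, and AffectsVersion name are assumptions that would need to match the project's actual conventions.

```python
"""Sketch of collecting release-candidate bug reports from Jira (illustrative only)."""
import requests

JIRA_URL = "https://openlmis.atlassian.net"  # assumed instance URL
JQL = ('project = OLMIS AND issuetype = Bug AND affectedVersion = "3.7.0-rc1" '
       'ORDER BY priority DESC')


def fetch_rc_bugs(auth) -> list:
    """Query the Jira REST search endpoint and return the matching issues."""
    response = requests.get(
        f"{JIRA_URL}/rest/api/2/search",
        params={"jql": JQL, "fields": "summary,priority", "maxResults": 100},
        auth=auth,
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["issues"]


if __name__ == "__main__":
    # Placeholder credentials; substitute a real user and API token.
    for issue in fetch_rc_bugs(auth=("user@example.org", "api-token")):
        fields = issue["fields"]
        print(issue["key"], fields["priority"]["name"], "-", fields["summary"])
```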

Fix Critical Issues

Are there critical release issues? If not, after the first Release Candidate (RC1) we may move directly to a release. Otherwise, we will fix critical issues and publish a new Release Candidate (e.g. RC2).

  • Developers fix critical issues in code, documentation, and translations. Only commits for critical issues will be accepted; other commits will be rejected (a commit-gate sketch follows this list)
  • Every commit is peer reviewed and manually QA'd
  • Every commit is reviewed to determine whether portions or all of the full regression test cycle must be repeated
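
Below is a sketch of how such a commit gate might be wired into CI. The allow-list of release-critical tickets and the OLMIS-#### commit-message convention are assumptions for illustration, not existing tooling.

```python
"""Sketch of a commit gate for the critical-fix window (illustrative only)."""
import re
import sys

# Tickets triaged as release-critical during the Review Period (hypothetical).
RELEASE_CRITICAL = {"OLMIS-1234", "OLMIS-1299"}
TICKET_PATTERN = re.compile(r"\bOLMIS-\d+\b")


def commit_allowed(message: str) -> bool:
    """Accept a commit only if it references at least one release-critical ticket."""
    tickets = set(TICKET_PATTERN.findall(message))
    return bool(tickets & RELEASE_CRITICAL)


if __name__ == "__main__":
    # Read the commit message from stdin, e.g. piped from a CI hook.
    if not commit_allowed(sys.stdin.read()):
        sys.exit("Rejected: commit does not reference a release-critical ticket.")
```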

Once critical issues are fixed, publish a new Release Candidate and conduct another Review Period.

Publish the Release

When a Release Candidate has gone through a Review Period without any critical issues found, then this candidate becomes the released version of OpenLMIS.

  • Release Notes are updated to state that this is the official release and include the date of the release
  • Reference Distribution is released! It contains the same components as the accepted Release Candidate and has its official version (e.g. openlmis-ref-distro v3.7.0); a promotion sketch follows this list
  • Share the Release with the OpenLMIS community along with the final Release Notes
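
A minimal sketch of the version promotion step is below; it simply strips the -rcN suffix from the accepted candidate. This is an illustration of the idea rather than the official release script.

```python
import re


def promote(rc_version: str) -> str:
    """Strip the -rcN suffix so the accepted candidate becomes the official
    version, e.g. 3.7.0-rc2 -> 3.7.0; the bundled components keep the
    versions they already carried in the candidate."""
    final = re.sub(r"-rc\d+$", "", rc_version)
    if final == rc_version:
        raise ValueError(f"{rc_version} does not look like a release candidate")
    return final


print(promote("3.7.0-rc2"))  # prints 3.7.0
```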

After publishing the release, Active Development can resume.

Implementation Release Process

Implementations are typically composed of multiple core OpenLMIS components plus some custom components or extensions, translations, and integrations. It is recommended that OpenLMIS implementations follow a process similar to the one above to receive, review, and verify updates of OpenLMIS v3.

Key differences for implementation releases:

  • Upstream Components: Implementations treat the OpenLMIS core product as an "upstream" vendor distribution. When a new core Release Candidate or Release is available, implementations are encouraged to pull the new upstream OpenLMIS components into their CI/CD pipeline and conduct testing and review.
  • Independent Review: It is critical for the implementation to conduct its own Review Period. It may be a process similar to the diagram above, with multiple Release Candidates for that implementation and with rounds of manual regression testing to ensure that all the components (core + custom) work together correctly.
  • Conduct Testing/UAT on Staging: Implementations should apply Release Candidates and Releases to testing/staging environments before production. Applying a new version to production may involve a full manual regression test cycle or a shorter smoke test (a smoke-test sketch follows this list). There should also be a full set of automated tests and performance tests, similar to the core release process above, but run with production data in place to verify performance against the full data set. The testing/staging environment should have a recent copy of production data and a configuration identical to production.
  • Follow Best Practices: When working with a production environment, follow all best practices: schedule a downtime/maintenance window before making any changes; take a full backup of code, configuration and data at the start of the deployment process; test the new version before re-opening it to production traffic; always have a roll-back plan if issues arise in production that were not caught in previous testing.
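
A minimal smoke-test sketch is below; the staging host name and health endpoints are placeholders that an implementation would replace with its own services and URLs.

```python
"""Sketch of a post-deployment smoke test for a testing/staging environment."""
import requests

STAGING = "https://staging.example-implementation.org"  # placeholder host
HEALTH_ENDPOINTS = [
    "/api/requisitions/health",   # hypothetical per-service health checks
    "/api/referencedata/health",
    "/",                          # UI reachable
]


def smoke_test() -> bool:
    """Hit each endpoint and report whether every one returned HTTP 200."""
    ok = True
    for path in HEALTH_ENDPOINTS:
        try:
            response = requests.get(STAGING + path, timeout=10)
            ok &= response.status_code == 200
            print(path, "->", response.status_code)
        except requests.RequestException as err:
            print(path, "-> ERROR:", err)
            ok = False
    return ok


if __name__ == "__main__":
    raise SystemExit(0 if smoke_test() else 1)
```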




