Note: Working Draft

This is a working draft of a new release process that OpenLMIS is considering adopting in Fall 2017 for future releases. Once this is reviewed by stakeholders, this information will be incorporated into the official release documentation on docs.openlmis.org and this wiki page will be archived.

...

When a Release Candidate has gone through a Review Period without any critical issues found, that Release Candidate becomes the Golden Master to be published as an official release of OpenLMIS.

  • Update the Release Notes to state that this is the official release and include the release date
  • Release the Reference Distribution; the exact code and components in the Golden Master Release Candidate are tagged as the OpenLMIS Reference Distribution release with a version number tag (e.g. openlmis-ref-distro:3.7.0); a sketch of this re-tagging step follows this list
  • Share the Release with the OpenLMIS community along with the final Release Notes
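
To make the re-tagging step concrete, below is a minimal sketch, assuming the Reference Distribution components are published as Docker images under a release-candidate tag. The image names, candidate tag, and release version here are illustrative assumptions, not the official OpenLMIS release tooling.

```python
"""Sketch: promote a Golden Master Release Candidate to an official release tag.

Assumes each component image was already published under a release-candidate
tag (e.g. 3.7.0-rc3). The component list and tags below are hypothetical.
"""
import subprocess

CANDIDATE_TAG = "3.7.0-rc3"  # the accepted Golden Master candidate (assumed)
RELEASE_TAG = "3.7.0"        # the official release version (assumed)

# Hypothetical component images making up the Reference Distribution.
COMPONENTS = [
    "openlmis/ref-distro",
    "openlmis/auth",
    "openlmis/requisition",
]

def run(cmd):
    """Echo a shell command, then run it, failing fast on any error."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

for image in COMPONENTS:
    candidate = f"{image}:{CANDIDATE_TAG}"
    release = f"{image}:{RELEASE_TAG}"
    run(["docker", "pull", candidate])          # fetch the exact candidate bits
    run(["docker", "tag", candidate, release])  # re-tag without rebuilding
    run(["docker", "push", release])            # publish the official release tag
```

Because the release tag points at the same image contents as the accepted candidate, the published release stays byte-for-byte identical to the Golden Master that passed the Review Period.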

...

  • Upstream Components: Implementations treat the OpenLMIS core product as an "upstream" vendor distribution. When a new core Release Candidate or Release is available, implementations are encouraged to pull the new upstream OpenLMIS components into their CI/CD pipelines and conduct testing and review.
  • Independent Review: It is critical for the implementation to conduct its own Review Period. It may be a process similar to the diagram above, with multiple Release Candidates for that implementation and with rounds of manual regression testing to ensure that all the components (core + custom) work together correctly.
  • Conduct Testing/UAT on Staging: Implementations should apply Release Candidates and Releases to testing/staging environments before production environments. Testing should be conducted on an environment that mirrors production (a recent copy of production data, the same server hardware, the same networks, etc.). Applying a new version to the production environment may involve a full manual regression test cycle or a shorter smoke test. There should also be a set of automated tests and performance tests, similar to the core release process above, but run against production data to verify performance with the full data set (see the smoke-test sketch after this list).
  • Follow Best Practices: When working with a production environment, follow all best practices: schedule a downtime/maintenance window before making any changes; take a full backup of code, configuration, and data at the start of the deployment process; test the new version before re-opening it to production traffic; and always have a roll-back plan in case issues arise in production that were not caught in previous testing.
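
As one concrete form the smoke test mentioned above could take, here is a minimal sketch using only the Python standard library. The staging URL and endpoint paths are hypothetical placeholders; an implementation would substitute the health-check endpoints of its own core and custom components.

```python
"""Sketch: minimal post-deployment smoke test against a staging environment.

The base URL and endpoints are illustrative assumptions, not the official
OpenLMIS API; adapt them to the implementation's own services.
"""
import sys
import urllib.request

STAGING_URL = "https://staging.example.org"  # hypothetical staging host

# Hypothetical health-check endpoints, one per critical component.
ENDPOINTS = ["/", "/health"]

failures = []
for path in ENDPOINTS:
    url = STAGING_URL + path
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            status = resp.status
    except Exception as exc:  # timeout, connection refused, HTTP 4xx/5xx, etc.
        failures.append((url, str(exc)))
        continue
    if status >= 500:
        failures.append((url, f"HTTP {status}"))

if failures:
    for url, reason in failures:
        print(f"FAIL {url}: {reason}", file=sys.stderr)
    sys.exit(1)  # non-zero exit signals the pipeline to halt the rollout

print("Smoke test passed: all endpoints responded.")
```

Wiring a script like this into the CI/CD pipeline means a failed check halts the rollout before traffic is re-opened to production, complementing the manual regression testing described above.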

...