...

We will need to add a reporting docker-compose file that stands up the reporting stack, together with templates for each service, to the openlmis-ref-distro repository.

We would like to implement the reporting stack without creating new Docker images wherever possible; therefore, we propose to create one repository for each component of the reporting stack:

...

creating a reporting folder in the openlmis-ref-distro repo that will contain the docker-compose.yml file and associated template files. Tooling that we build to load templates into the reporting stack will live in separate tool repo(s).

The docker-compose file will need to reference Docker images for the following services, and the repository will need to contain templates for them; a minimal compose sketch follows the list:

  • NiFi
    • Description: This repository will contain all code related to running a production-quality, versioned NiFi environment. It will include NiFi-specific templates and API calls to auto-load those templates when the Docker container is running.
    • Official Website: https://nifi.apache.org/
    • Docker Container(s):
  • Kafka (openlmis-kafka)
  • Druid (openlmis-druid)
    • Description: This repository will contain all code related to running a production-quality, versioned Druid cluster that is used as the reporting storage engine. The linked Docker container acts as an example single-node cluster.
    • Official Website: http://druid.io/
    • Docker Container(s):
  • PostgreSQL (openlmis-postgresql)
  • Superset (openlmis-superset)
  • Zookeeper (openlmis-zookeeper)
    • Description: Apache Zookeeper is a centralized service for maintaining configuration information, naming, providing distributed synchronization, and providing group services. This repository will contain all code related to running a production-quality version of Apache Zookeeper.
    • Official Website: https://zookeeper.apache.org/
    • Docker Container(s):
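
To make the proposal more concrete, below is a minimal, hedged sketch of what the reporting docker-compose.yml could look like. All image names, tags, ports, credentials, and volume paths are illustrative assumptions, not decisions; the real file would reference the images (or Dockerfiles) produced by the repositories listed above.

```yaml
# Hypothetical sketch of openlmis-ref-distro/reporting/docker-compose.yml.
# Image names, tags, ports and volume paths are illustrative assumptions.
version: "3.3"
services:
  zookeeper:
    image: zookeeper:3.4               # placeholder image/tag
    ports:
      - "2181:2181"

  kafka:
    image: wurstmeister/kafka          # placeholder image
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      # additional KAFKA_ADVERTISED_* settings are usually required
    depends_on:
      - zookeeper

  nifi:
    image: apache/nifi:1.7.1           # placeholder tag
    ports:
      - "8080:8080"                    # NiFi UI / REST API
    volumes:
      - ./templates/nifi:/templates    # hypothetical mount for flow templates

  postgres:
    image: postgres:10                 # placeholder tag
    environment:
      POSTGRES_PASSWORD: changeme      # placeholder credential

  druid:
    image: openlmis/druid-single-node  # hypothetical image
    depends_on:
      - zookeeper
      - postgres

  superset:
    image: amancevice/superset         # placeholder image
    ports:
      - "8088:8088"                    # Superset UI
    volumes:
      - ./templates/superset:/templates  # hypothetical mount for dashboard exports
    depends_on:
      - druid
```

Note that a production Druid deployment is split into several services (coordinator, broker, historical, middle manager), so the single druid entry above would likely expand into multiple compose services.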

Configuration

  • We may need to load NiFi flow templates and execute other API calls to set values in these templates (see the NiFi sketch after this list).
  • We may need to create and configure Kafka topics using command-line calls; alternatively, we may do this via NiFi (see the Kafka sketch after this list).
  • We may need to load dashboard templates through the Superset API (see the Superset sketch after this list).
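
For the NiFi item, the sketch below shows how a flow template could be auto-loaded through the NiFi REST API (uploading to the root process group). The host/port, the absence of authentication, and the template file path are assumptions; the endpoint paths should be verified against the NiFi version we deploy.

```python
"""Hypothetical sketch: load a NiFi flow template via the NiFi REST API."""
import requests

NIFI_API = "http://localhost:8080/nifi-api"   # assumption: local, unsecured NiFi


def upload_template(template_path):
    # Look up the root process group id, then upload the template XML into it.
    root = requests.get(f"{NIFI_API}/process-groups/root").json()
    root_id = root["id"]
    with open(template_path, "rb") as f:
        resp = requests.post(
            f"{NIFI_API}/process-groups/{root_id}/templates/upload",
            files={"template": f},
        )
    resp.raise_for_status()
    return resp.text  # NiFi responds with an entity describing the uploaded template


if __name__ == "__main__":
    upload_template("templates/nifi/requisition_flow.xml")  # hypothetical template file
```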
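
For the Kafka item, one possible approach is to shell out to the kafka-topics.sh tool that ships inside the Kafka container. The topic names, partition/replication settings, and the --zookeeper flag (newer Kafka versions use --bootstrap-server instead) are assumptions.

```python
"""Hypothetical sketch: create the Kafka topics the reporting flows expect."""
import subprocess

TOPICS = ["requisitions", "orders", "facilities"]  # hypothetical topic names


def create_topic(name):
    # kafka-topics.sh ships inside the Kafka container; invoke it via docker-compose exec.
    subprocess.run(
        [
            "docker-compose", "exec", "-T", "kafka",
            "kafka-topics.sh", "--create",
            "--topic", name,
            "--partitions", "1",
            "--replication-factor", "1",
            "--zookeeper", "zookeeper:2181",
        ],
        check=True,
    )


if __name__ == "__main__":
    for topic in TOPICS:
        create_topic(topic)
```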
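
For the Superset item, the bullet above mentions the Superset API; as an alternative, the sketch below uses the Superset CLI inside the container to import a dashboard export. The mounted path and the exact command name and flags (import_dashboards vs. import-dashboards, depending on the Superset version) are assumptions.

```python
"""Hypothetical sketch: import a Superset dashboard export into the running container."""
import subprocess


def import_dashboards(path_in_container):
    # Assumes the compose file mounts ./templates/superset into the container at /templates.
    subprocess.run(
        [
            "docker-compose", "exec", "-T", "superset",
            "superset", "import_dashboards", "-p", path_in_container,
        ],
        check=True,
    )


if __name__ == "__main__":
    import_dashboards("/templates/stock_dashboard.json")  # hypothetical export file
```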

OpenLMIS Reporting Repository Template

...