Stand-up and Configure the Reporting Stack

This document identifies the steps needed to stand up and configure the reporting stack for a new server.

NOTE: Before you get started, the OpenLMIS server must have a user with the username "admin".

NOTE: Step-by-step instructions can be found on the page Set up the reporting stack step-by-step.

Update the Config and Start the Services

Clone the OpenLMIS-ref-distro GitHub repository into your local development environment. We assume your environment already has Docker installed.

git clone https://github.com/openlmis/openlmis-ref-distro

Once cloned, update your settings.env file, as is standard practice, to identify the IP address that Consul will run on. Make sure to update SCALYR_API_KEY if you plan to use Scalyr.
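For illustration, the relevant settings.env entries might look like the following. The variable names here are assumptions, so verify them against your copy of settings.env:

```shell
# settings.env (excerpt) -- placeholder values; variable names are assumptions
CONSUL_HOST=192.168.1.50          # IP address Consul will run on
SCALYR_API_KEY=your-scalyr-api-key  # only needed if shipping logs to Scalyr
```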

Update the following configuration variables that are specific for the reporting stack:

  1. Review and update the .env variables

    1. Update all passwords (Note: if the database username or password is updated, update SQLALCHEMY_DATABASE_URI to match)

    2. Update the domain names for Superset and NiFi in the /etc/hosts file

    3. If using SSL, change the *_ENABLE_SSL variables to true and add your SSL certificate and chain to the /openlmis-ref-distro/reporting/config/services/nginx/tls directory
  2. Update the OpenLMIS system URLs in the Superset config.py file. They currently point to UAT.openlmis.org
    1. Open /openlmis-ref-distro/reporting/config/services/superset/superset_config.py
    2. Change the following variables to point to your instance of OpenLMIS:
      1. base_url
      2. access_token_url
      3. authorize_url
      4. HTTP_HEADERS allow-from variable
  3. Start the services:

    # Destroy any existing volumes
    docker-compose down -v
    # Build and start the services
    docker-compose up --build
    # If not using Scalyr, you should use:
    # docker-compose up --build -d --scale scalyr=0
  4. After 2 or 3 minutes, open your web browser and navigate to the following URLs:
    1. NiFi can be accessed at {BaseUrl}/nifi
    2. Superset can be accessed at {BaseUrl}/login
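If you changed the reporting database credentials in the .env file, SQLALCHEMY_DATABASE_URI must be rebuilt to match. A minimal sketch, assuming a Postgres URI of the usual form; the user, password, host, and database names below are placeholders, not the ref-distro defaults:

```shell
# Rebuild SQLALCHEMY_DATABASE_URI from its parts -- all values are placeholders
DB_USER=superset
DB_PASS=new-strong-password
DB_HOST=db
DB_PORT=5432
DB_NAME=superset
SQLALCHEMY_DATABASE_URI="postgresql://${DB_USER}:${DB_PASS}@${DB_HOST}:${DB_PORT}/${DB_NAME}"
echo "${SQLALCHEMY_DATABASE_URI}"
```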

Load in Data from the NiFi Flows

Now we need to update NiFi with the appropriate base URL and secrets so it can properly extract and store information.

  1. Open NiFi at {BaseUrl}/nifi
  2. Log in through nginx using the credentials defined in the previous step
  3. Four process groups will load automatically, fully configured and running.
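To sanity-check the admin credentials before editing the flows, you can build the same OAuth token request the flows make against the OpenLMIS API. This is a hedged sketch: BASE_URL and the password are placeholders, and the user-client:changeme OAuth client is the OpenLMIS demo default, which your server may have changed:

```shell
# Construct the OpenLMIS OAuth token URL -- all values are placeholders
BASE_URL="https://your-openlmis-server"
ADMIN_USER=admin
ADMIN_PASS=your-admin-password
TOKEN_URL="${BASE_URL}/api/oauth/token?grant_type=password&username=${ADMIN_USER}&password=${ADMIN_PASS}"
echo "${TOKEN_URL}"
# Against a live server, POST it with the OAuth client credentials:
#   curl -s -X POST -u user-client:changeme "${TOKEN_URL}"
```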


NB: If this is the first time running the reporting stack and you'd like data to view immediately, follow the steps below:

  1. Stop all process groups
  2. Edit the baseURL, admin_username, and admin_password variables in the requisitions, reference data, and permissions process groups by right-clicking anywhere in the process group and selecting Variables
  3. Edit the first processor in each process group, changing the Scheduling Strategy under the SCHEDULING tab from CRON driven to Timer driven, and set the Run Schedule to 100000 sec
  4. In the Requisitions connector, edit the processor titled Get requisitions from /api and set its Remote URL property to ${baseUrl}/api/requisitions/search?access_token=${access_token}
  5. Start all process groups, starting the materialized views process group last, since it refreshes the views based on the data from the other three process groups
  6. Revert the Scheduling Strategy changes back to CRON driven, so that data is pulled in every day, and revert the Remote URL change for requisitions
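For reference, the CRON driven strategy in step 6 uses NiFi's Quartz-style cron syntax. As an illustration only (note each processor's original schedule before changing it), a schedule that pulls data every day at 2 AM would look like:

```
0 0 2 * * ?
```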


Loading Charts and Dashboards

Now we need to load Superset so it can get the data from the Postgres database and display the charts and dashboards.

  1. Click on the provider, then the Sign In button, which will prompt you for authentication and authorization.
  2. Go to the Dashboards tab, which lists the available dashboards


Backing Up Charts and Dashboards

After making changes to a dashboard, administrators may wish to back up its definition to a flat file. This can be done by exporting the dashboard to a .json file from Superset's dashboard list.



Some browsers prevent the download of .json files by default. If you aren't presented with a Save As dialog, ensure that your browser isn't blocking popups.


Notes about Importing the Backup File

If you export dashboard 123 built off of datasource 456 and try to load it into a fresh instance of Superset, you'll have a lot of broken pointers, because that fresh instance of Superset will try to start with ids of 1 for both.

For the reporting stack, we built everything off of a “fresh” instance of superset, so when the content is imported it doesn’t encounter this issue.
It should be possible to edit the .json file by hand to fix the broken pointers, although we have not tried this yet and don't know precisely where they'll be encountered.
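As a starting point for such hand-editing, the datasource references in an export can be located with a simple search. This sketch runs against a toy file, since the real export format varies by Superset version and is an assumption here:

```shell
# Write a toy dashboard export (the real structure is version-dependent),
# then list the datasource ids it references so they can be remapped by hand.
cat > /tmp/dashboard-export.json <<'EOF'
{"dashboards": [{"id": 123, "slices": [{"datasource_id": 456}]}]}
EOF
grep -o '"datasource_id": *[0-9]*' /tmp/dashboard-export.json
```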




OpenLMIS: the global initiative for powerful LMIS software