This document identifies the steps needed to stand up and configure the reporting stack for a new server.

...

  1. Review and update the .env variables

    1. Update all passwords (note: if the database username or password is updated, also update SQLALCHEMY_DATABASE_URI to match)

    2. Update the domain names for Superset and NiFi in the /etc/hosts file

    3. If using SSL, change the *_ENABLE_SSL variables to true and add your SSL certificate and chain to the /openlmis-ref-distro/reporting/config/services/nginx/tls directory
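As an illustration of steps 1.1–1.3, the relevant .env entries and /etc/hosts lines might look like the sketch below. Every value is a placeholder, and apart from SQLALCHEMY_DATABASE_URI and the *_ENABLE_SSL pattern named above, the variable names and domains are assumptions — check your own .env file for the exact names:

```shell
# .env (hypothetical values -- use your own strong passwords)
SQLALCHEMY_DATABASE_URI=postgresql://postgres:ChangeMe123@db:5432/superset
NIFI_ENABLE_SSL=true    # one of the *_ENABLE_SSL variables

# /etc/hosts (hypothetical domain names for Superset and NiFi)
# 127.0.0.1   superset.example.org nifi.example.org
```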
  2. Update the OpenLMIS system URLs in the Superset config.py file; they currently point to UAT.openlmis.org
    1. Open /openlmis-ref-distro/reporting/config/services/superset/superset_config.py
    2. Change the following variables to point to your instance of OpenLMIS:
      1. base_url
      2. access_token_url
      3. authorize_url
      4. the allow-from value in HTTP_HEADERS
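For reference, the edits in step 2 might look like the following excerpt of superset_config.py. This is a sketch only: the variable names come from the list above, but openlmis.example.org and the /api/oauth/* paths are placeholders, not confirmed values — substitute the URLs of your own OpenLMIS instance:

```python
# Excerpt of superset_config.py (sketch): point Superset's OAuth settings
# at your OpenLMIS instance. openlmis.example.org and the /api/oauth/*
# paths below are placeholders/assumptions, not confirmed values.
base_url = "https://openlmis.example.org"
access_token_url = base_url + "/api/oauth/token"      # assumed path
authorize_url = base_url + "/api/oauth/authorize"     # assumed path
# Allow OpenLMIS pages at this origin to embed Superset content.
HTTP_HEADERS = {"X-Frame-Options": "allow-from " + base_url}
```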
  3. Start the services:

    Code Block
    # Stop any running containers and remove existing volumes
    docker-compose down -v
    # Build and start the services
    docker-compose up --build
    # If not using Scalyr, run instead:
    # docker-compose up --build -d --scale scalyr=0


  4. After 2–3 minutes, open your web browser and navigate to the URLs:
    1. NiFi can be accessed at {BaseUrl}/nifi
    2. Superset can be accessed at {BaseUrl}/login
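The step-4 URLs are simply paths appended to your base URL. A minimal sketch, where reports.example.org is a placeholder for your own domain:

```shell
# Hypothetical base URL -- substitute your own domain.
BASE_URL="https://reports.example.org"
NIFI_URL="${BASE_URL}/nifi"        # NiFi UI
SUPERSET_URL="${BASE_URL}/login"   # Superset login page
echo "${NIFI_URL}"
echo "${SUPERSET_URL}"
# To check readiness, you could poll these with curl until they return HTTP 200.
```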

...

  1. Open NiFi at {BaseUrl}/nifi
  2. Log in to nginx using the credentials defined in the previous step
  3. Four process groups will load automatically, fully configured and running.


Info

NB: If this is the first time running the reporting stack and you would like data to view immediately, follow the steps below:

  1. Stop all process groups
  2. Edit the baseURL, admin_username, and admin_password variables in the requisitions, reference data, and permissions process groups by right-clicking anywhere in the process group and selecting Variables
  3. Edit the first processor in each of the process groups, changing its Scheduling

...

  1. Strategy under the SCHEDULING tab from CRON driven to Timer driven, and set the Run Schedule to 100000 sec
  2. In the Requisitions connector, edit the processor titled Get requisitions from /api and add ${baseUrl}/api/requisitions/search?access_token=${access_token} to the Remote URL property
  3. Start all process groups, with the materialized views process group last, since it refreshes the views based on data from the other three process groups
  4. Revert the Scheduling Strategy back to CRON driven so that data is pulled in every day, and revert the Remote URL change for requisitions
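For reference, the two scheduling configurations being toggled in steps 1 and 4 might look like the following. The Quartz cron expression shown is an assumed daily-at-midnight schedule for illustration, not necessarily the one shipped with the stack — note what your processors use before changing it:

```
# Temporary settings for the initial data pull (step 1):
Scheduling Strategy: Timer driven
Run Schedule:        100000 sec

# Settings to restore for the daily pull (step 4):
Scheduling Strategy: CRON driven
Run Schedule:        0 0 0 * * ?   (Quartz syntax: every day at midnight -- assumed)
```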


Loading Charts and Dashboards

...

  1. Click on the provider, then the Sign In button; you will be prompted for authentication and authorization.
  2. Go to the Dashboards tab, which lists the available dashboards

           

Backing Up Charts and Dashboards

After making changes to a dashboard, administrators may wish to back up its definition to a flat file. This may be done as shown below.
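As a sketch of one possible approach: if your Superset version ships the export_dashboards CLI command, the definition can be written to a file from inside the running container. The service name "superset" and the CLI command are assumptions here — verify both against your own docker-compose file and Superset version:

```shell
# Sketch: export dashboard definitions to a flat file from inside the
# running Superset container. The "superset" service name and the
# export_dashboards CLI command are assumptions -- check your version.
docker-compose exec superset \
  superset export_dashboards -f /tmp/dashboards.json
# Copy the export out of the container to the host for safekeeping.
docker cp "$(docker-compose ps -q superset)":/tmp/dashboards.json ./dashboards-backup.json
```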



...