
Target release:

Document status:

Document owner: Mary Jo Kochendorfer

Technical Lead: Josh Zamor

The reporting and analytics vision is to support OpenLMIS users with both routine reporting and ad-hoc analysis. Users need a way to extract and visualize data. This specification does not define the exact list of reports needed, but rather the types of reports (map, offline, etc.) and the functionality desired. Please see the Out of Scope section for a clear explanation of what this spec does not include, such as alerts (via email/SMS) and integrations with systems like DHIS2.

This wiki page is an attempt to define the reporting and analysis needs for OpenLMIS users based on the following generalized user personas identified by the community.  


The goal is to gain buy-in and agreement from the OpenLMIS community on the needs, requirements and definitions for reporting and analytics within OpenLMIS. This spec does not focus on the technical solution, as there may be multiple approaches to meeting the desired behaviors; its focus is to outline what end users will need and want. It will support the decision-making process for implementing reporting and analytics within OpenLMIS.

To support this conversation, we define reporting, dashboards and analytics in the following ways:

Reports (also called routine or built-in reporting) contain detailed data in a tabular format and typically display numbers and text only, though they can use visualizations to highlight key data. Some key characteristics of a report include:

  • It presents numbers and text in a table.
  • It can contain visualizations (like a map, chart or graph), but these are only used to highlight findings in the data.
  • It is optimized for printing and exporting to a digital document format such as CSV, Word or PDF.
  • It is geared towards people who prefer to read data, for example, lawyers, who would rather read text over interpreting visualizations, and accountants, who are comfortable working with raw numbers.
  • It is rendered from a defined query/template.
  • It does not require high technical capacity to run or access.
  • It is usually viewed or run on a routine basis (more than once a year).
  • It usually drives users to an action.


The designer is an engineer.

Routine reporting typically applies to more than one implementation.

A dashboard is a data visualization tool that displays the current status of metrics and key performance indicators (KPIs) for OpenLMIS end users. Dashboards consolidate and arrange numbers, metrics and sometimes performance scorecards on a single screen. Some key characteristics of a dashboard:

  • All the visualizations fit on a single computer screen — scrolling to see more violates the definition of a dashboard.
  • It shows the most important performance indicators / performance measures to be monitored.
  • Interactivity such as filtering and drill-down can be used in a dashboard; however, those types of actions should not be required to see which performance indicators are under performing.
  • It is not designed exclusively for high-level decision makers; rather, it should be usable by the general workforce, as effective dashboards are easy to understand and use.
  • The displayed data is automatically updated without any assistance from the user. The frequency of updates varies by organization and purpose; the most effective dashboards have data updated at least daily.
  • It does not require high technical capacity to access.


The designer is an engineer.

A dashboard typically applies to more than one implementation.

Analytics and ad-hoc analysis leverage tools that offer the ability to select ad-hoc date ranges, pick different products, or drill down to more detailed data, making them more dynamic than reports or dashboards. Some key characteristics of data analytics include:

  • The user can define desired data element combinations (via a programming language or wizard).
  • It is not routine; a given combination of data elements is run infrequently (e.g. once a year).
  • It fits on one screen, but there may be scroll bars for tables with too many rows or charts with too many data points.
  • It is highly interactive and usually provides functionality like filtering and drill downs.
  • It is primarily used to find correlations, trends, outliers (anomalies), patterns, and business conditions in data.
  • The data used in an ad-hoc analysis is generally historical, though in some cases real-time data is analyzed.
  • An outcome of ad-hoc analysis might be to define/identify performance indicators for use in reporting and/or dashboards.
  • It is typically relied on by technically savvy users like data analysts and researchers with knowledge of and experience with report programming languages.


The designer needs to be technically proficient with their BI&A tool of choice (e.g. Excel, DHIS2, Tableau, etc.). Per-seat licensing for report designers and consumers may apply depending on tool choice.

Analytics are typically more implementation-specific, with less re-use across implementations.

The definitions above were adapted from a DashboardInsights article.

User Personas

Each persona below is summarized by tech aptitude, scope of supervision, kind of reports, access (how/when), frequency**, and report types.

Store Manager (facility in-charge/administrator)

  • Tech aptitude: Low
  • Scope of supervision: Responsible for only one geographic facility, but may have multiple storerooms
  • Kind of reports: Routine reporting
  • Access: Application or email ( ? )
  • Frequency: Monthly or more (if available)
  • Report types: Offline reports (PDF, Excel); printable reports; dashboard

Intermediate Store Manager (could be district, regional or provincial: district/regional/provincial health officer, program coordinator, pharmacist)

  • Tech aptitude: Low; can navigate Excel (relies on an existing format)
  • Scope of supervision: Supervises a subset of facilities or a specific zone; facilities must be mapped to zones
  • Kind of reports: Routine reporting
  • Access: Email and application; can be delayed (days)
  • Frequency: Monthly or weekly depending on replenishment schedules; daily around due dates for requisitions
  • Report types: Offline reports; tabulated/paginated printable reports; dashboards; maps (reporting rate, stock outs, etc.); filterable results; searchable results; aggregate reports with drill-down to granular items

Central Personnel or Central Program Personnel

  • Tech aptitude: Medium; strong Excel skills (can do filters, charts and graphs); no programming experience; knows DHIS2
  • Scope of supervision: National supervision of all geographic facilities OR national supervision of one program
  • Kind of reports: Routine reporting, plus some ad-hoc (where VAN is rolled out)
  • Access: Application; immediate
  • Frequency: Monthly (depending on replenishment schedules); bi-annually or annually (for forecasting/planning)
  • Report types: Ad-hoc donor visualizations; maps (stock outs, CCE performance); tabulated/paginated reports; Excel export of results; filterable results; searchable results; aggregate reports with drill-down to granular items; printable reports; offline reports (potentially for program leads)

Technical Administrator (can be from the MIS department of the MOH; sometimes there is both a Technical Administrator and a Managerial Administrator)

  • Tech aptitude: Medium-high; database and programming experience; some knowledge of the data model; may have some knowledge of other reporting tools (Tableau, etc.)
  • Scope of supervision: Creates new reports and templates and conducts ad-hoc reporting
  • Kind of reports: Ad-hoc requests from MOH and stakeholders ("I want to see...")
  • Access: Upon request
  • Report types: Ad-hoc requests from MOH and stakeholders; establishes new reports and creates visualizations (including the types above such as maps, printable, etc.); may extract data from OpenLMIS into other reporting tools. Open question: how much of this is built-in functionality in OpenLMIS versus in a connected tool?

Implementer (most likely a contractor/vendor supporting the system)

  • Tech aptitude: High; probably knows Java and SQL; probably can configure DHIS2
  • Scope of supervision: May create new reports or customize and alter existing reports based on requests from stakeholders
  • Kind of reports: Routine (help running, troubleshooting or customizing routine reports)
  • Access: Upon request (either at time of implementation or as a support request)
  • Report types: Would use and customize all of the above

Stakeholders outside OpenLMIS

  • Tech aptitude: Varies
  • Scope of supervision: May represent directorates (public health, any interest programs), stakeholders, partners and donors like UNICEF, GAVI, etc.
  • Kind of reports: Ad-hoc
  • Access: Reports may be shared with these users outside OpenLMIS as needed, whether on paper or by forwarding an email containing a PDF
  • Report types: PDF

**If the implementer is using stock management, the frequency of reports would increase with transaction data.

User Stories

  • Create user personas and outline objectives (OLMIS-2077)
  • Research of technical approach/options (OLMIS-2078)

Open Questions

Below is a list of questions to be addressed as a result of this requirements document:

1. How much money does an MOH want to spend on resources to support creating report templates? We want to establish guardrails on what types of budgets MOHs have to support the creation of reports. This will help us assess reporting tool options; for example, if one solution only allows for highly-specialized resources which cost a lot, we may not go with it. FYI, Josh Zamor.
In progress.
2. What type of skill sets can we expect implementers to have to create the report templates for routine reporting?
In progress.

Example reports

Community members are encouraged to share key reports and examples for the team to review and keep in mind moving forward. The examples below assume the perspective of a national-level program manager who wants visibility into the activity of their program by region/district, but who would also like the option of granular, facility-by-facility visibility.

1 - Stockout days 

  • Ideally would be able to view on a map geographically 
  • Could show avg. # of stockout days in a district – aggregated data from all facilities in that district to give a picture of the overall stockout rate for that district 
  • User could click on the facility list and see the specific facilities reporting stockouts 
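The aggregation and drill-down behavior described above can be sketched in a few lines. This is purely illustrative (plain Python over an assumed list of per-facility records), not OpenLMIS code:

```python
from collections import defaultdict

# Hypothetical per-facility records: (district, facility, stockout_days)
records = [
    ("District A", "Facility 1", 5),
    ("District A", "Facility 2", 11),
    ("District B", "Facility 3", 0),
]

def avg_stockout_days_by_district(records):
    """Average stockout days per district, aggregated across its facilities."""
    by_district = defaultdict(list)
    for district, _facility, days in records:
        by_district[district].append(days)
    return {d: sum(v) / len(v) for d, v in by_district.items()}

def facilities_with_stockouts(records, district):
    """Drill-down: facilities in a district reporting any stockout days."""
    return [f for d, f, days in records if d == district and days > 0]
```

With the sample data, `avg_stockout_days_by_district(records)` yields `{"District A": 8.0, "District B": 0.0}`, and clicking into District A would surface Facility 1 and Facility 2.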

2 - Stock levels/Consumption (these are two different indicators)

  • Nice to view on a map geographically
  • Shows SOH at beginning of period and SOH at the end of period for each facility 
  • Graph could show min/max levels to indicate that stock delivery is adequate to bring stock levels up to max, if they’re ordering too much, or too little 
  • Can show that a facility stock level is below the min or near stocking out 
  • Should show value of commodity, even if the commodity is provided to the consumer for free, in order to reinforce idea that stocks have real value

3 - Timeliness and completeness of reporting

  • Could show aggregate for the district of whether district is reporting on time
  • Similar to Stockout Report, user could click on the facility list and see the specific facilities which were not reporting on time 

4 - On Time and full deliveries 

  • Could show aggregate data for a district/region for on time and complete deliveries that month. 
  • Similar to Stockout Report, user could click on the facility list and see the specific facilities which reported delayed or incomplete deliveries

5 - Inventory aging

  • Show items near to expiry (configurable date range, since length of the pipeline will determine how critical a given range is)
  • Show items with VVM stage 2 status

6 - Expiry/wastage/Loss (based on reason codes)

  • Show items that expired/damaged/lost and require disposal (units and value)
  • Show loss rate; total lost / total usable stock used during period as percent (units and value) 
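The loss-rate definition above (total lost / total usable stock used during the period, as a percent) can be written as a minimal sketch; the function name and signature are illustrative assumptions, not OpenLMIS API:

```python
def loss_rate(total_lost, total_usable_used):
    """Loss rate for a period: total lost / total usable stock used, as a percent.

    Works for either units or monetary value, as long as both
    quantities use the same basis.
    """
    if total_usable_used == 0:
        return 0.0  # avoid division by zero when nothing was used
    return 100.0 * total_lost / total_usable_used
```

For example, 5 units lost against 200 usable units used gives a 2.5% loss rate.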

7 - Stocked according to plan (traffic light or similar indicator)--this is similar to but distinct from #2

  • Show stocks that are within planned parameters (min/max or EOP/max, with config tolerance thresholds) for all commodities managed
  • Show stocks that are below threshold for min/EOP
  • Show stocks that are above threshold for max
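A traffic-light indicator over min/max parameters with configurable tolerance, as described above, could be classified as follows. The color mapping (red = below min/EOP, yellow = above max, green = within plan) is an assumption for illustration:

```python
def stock_status(soh, min_level, max_level, tolerance=0.0):
    """Traffic-light style stock status against planned min/max parameters.

    tolerance is a configurable fraction (e.g. 0.1 = 10%) applied to each
    bound. Color mapping is an illustrative assumption:
    red = below the min/EOP threshold, yellow = above the max threshold,
    green = within planned parameters.
    """
    if soh < min_level * (1 - tolerance):
        return "red"
    if soh > max_level * (1 + tolerance):
        return "yellow"
    return "green"
```

For example, with min 100 and max 500: SOH 50 is red, 300 is green, 600 is yellow, and 95 is still green with a 10% tolerance.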

8 - Forecast accuracy

  • Show the ratio of actual consumption by product during a particular period compared to the consumption forecasted for the same period (central level dashboard)
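The forecast-accuracy ratio above reduces to a simple division; this sketch (names are illustrative, not OpenLMIS code) makes the interpretation explicit:

```python
def forecast_accuracy_ratio(actual_consumption, forecast_consumption):
    """Ratio of actual consumption to forecasted consumption for a period.

    1.0 means the forecast was exact; values above 1.0 mean consumption
    exceeded the forecast, values below 1.0 mean it fell short.
    """
    if forecast_consumption == 0:
        raise ValueError("forecast_consumption must be non-zero")
    return actual_consumption / forecast_consumption
```

For example, actual consumption of 90 against a forecast of 100 gives a ratio of 0.9.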

9 - Emergency order rate

  • Show number of unplanned orders/deliveries in period compared to planned deliveries

10 - Performance Score Card

  • Compare HF/district/region against all peers for period and over time with a balanced scorecard comprising critical KPIs (configurable)
  • This is both for self-assessment (how are we doing?) and for supervision/oversight (how are they doing?)

Some other ideas we tossed around but didn’t discuss as thoroughly were:


Financial Reports

  • A program manager would use this report to measure how much money a facility should have (relating to cost recovery).
  • A program manager could also see a graph depicting the total costs of requisitions for their area (a clearer picture of where their money is going).
  • Could see whether a facility is in the red/black based on cost recovery analysis.
  • Should show value of inventory, value of goods sold/issued/dispensed, and value of goods lost.

Emergency Orders

  • Good to show on a map: district level making emergency orders to provincial level, with potential risk of stockouts.

Out of Scope

  • Please note this spec does not include in-application alerts (like an alert to an approver that a requisition is ready for approval); these are addressed within each functional area (like Requisitions) based on business logic.
  • This spec does not include the scope of exporting data into another system like DHIS2. If an implementation of OpenLMIS decides to use another application for all reporting needs, then the following feature set would not be used by that implementer.


  1. One of the key takeaways the technical team would like to get out of this is:

    • Assuming a person needs a report that doesn't exist, are they able to build it themselves, and what technical skills do they possess to help them achieve that?

    Since most of our current personas don't typically have relevant report writing proficiency, I think we should add some new personas that typify the sort of contractors that might be hired to write the reports for them.

    1. Josh Zamor, ok. I was thinking that would be the 'implementer' persona.  We could add the context that there is a need for a technical resource to support the development of report templates.  For instance, Ben and SolDevelo will be building those report templates for Malawi as implementers.

      1. The Product Committee meeting on March 28, 2017 discussed the personas and provided feedback. We updated the wiki page, especially the Technical Administrator persona, to capture that.

        We did identify that there is usually someone in-country who has the technical proficiency to conduct ad-hoc reporting upon demand. They understand the data model, SQL, and tools (maybe Excel or Tableau) well enough to make ad-hoc reports as needed. My understanding, however, is that this person might be doing that reporting and visualization in tools outside the OpenLMIS software, and is probably not programming their new reports or visualizations into the software system when they do ad-hoc reporting. So it may be more important for OpenLMIS to provide a "data export" capability that spits out a CSV or a format suitable for Tableau or other tools to ingest. This would happen on an as-needed, ad-hoc, case-by-case basis. The Technical Administrator may use different tools at different times, and different people might play this persona role over time.

        1. OpenLMIS should be able to support a routine and configurable batch file upload to another application as part of the interface with either a health information mediator or a data aggregator like DHIS2. I'm probably stating the obvious since I'm new to this conversation, and I do see the out of scope bullet above for this particular spec, but the thread was here, soooo...